Sample records for biomedical signal processing

  1. Biomedical signal and image processing.

    PubMed

    Cerutti, Sergio; Baselli, Giuseppe; Bianchi, Anna; Caiani, Enrico; Contini, Davide; Cubeddu, Rinaldo; Dercole, Fabio; Rienzo, Luca; Liberati, Diego; Mainardi, Luca; Ravazzani, Paolo; Rinaldi, Sergio; Signorini, Maria; Torricelli, Alessandro

    2011-01-01

    Generally, physiological modeling and biomedical signal processing constitute two important paradigms of biomedical engineering (BME): their fundamental concepts are taught starting from undergraduate studies and are dealt with more completely in the last years of graduate curricula, as well as in Ph.D. courses. Traditionally, these two cultural aspects were separated, with the first more oriented to physiological issues and how to model them, and the second more dedicated to the development of processing tools or algorithms to enhance useful information from clinical data. A practical consequence was that those who did models did not do signal processing and vice versa. However, in recent years, the need for closer integration between signal processing and modeling of the relevant biological systems emerged very clearly [1], [2]. This is not only true for training purposes (i.e., to properly prepare the new professional members of BME) but also for the development of newly conceived research projects in which the integration between biomedical signal and image processing (BSIP) and modeling plays a crucial role. To give simple examples, topics such as brain-computer or brain-machine interfaces, neuroengineering, nonlinear dynamical analysis of the cardiovascular (CV) system, integration of sensory-motor characteristics aimed at building advanced prostheses and rehabilitation tools, and wearable devices for vital sign monitoring all require an intelligent fusion of modeling and signal processing competences that are certainly peculiar to our discipline of BME.

  2. Software for biomedical engineering signal processing laboratory experiments.

    PubMed

    Tompkins, Willis J; Wilson, J

    2009-01-01

    In the early 1990's we developed a special computer program called UW DigiScope to provide a mechanism for anyone interested in biomedical digital signal processing to study the field without requiring any instrument other than a personal computer. Many digital filtering and pattern recognition algorithms are used in processing biomedical signals, yet students generally have very limited opportunity for hands-on access to the mechanisms of digital signal processing. In a typical course, filters are designed non-interactively, which gives the student little insight into the design constraints of such filters or into their actual performance characteristics. UW DigiScope 3.0 is the first major update since version 2.0 was released in 1994. This paper provides details on how the new MATLAB-based version works with signals, including the filter design tool that is the programming interface between UW DigiScope and processing algorithms.
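
    The kind of hands-on filter experimentation described above can be illustrated with a short Python/SciPy sketch. This is an illustration of the concept only, not UW DigiScope's MATLAB interface; the sampling rate, cutoff and filter order are assumed values.

      # Minimal sketch: design a digital filter, inspect its actual frequency
      # response, and apply it to a noisy test signal (assumed parameters).
      import numpy as np
      from scipy import signal

      fs = 360.0                                   # assumed sampling rate in Hz
      b, a = signal.butter(4, 40.0, btype='low', fs=fs)   # 4th-order low-pass, 40 Hz

      # Inspect the design: the magnitude response reveals the performance
      # characteristics (roll-off, stop-band attenuation) of the chosen constraints.
      w, h = signal.freqz(b, a, fs=fs)
      att_60hz = 20 * np.log10(abs(h[np.argmin(np.abs(w - 60.0))]))
      print("attenuation at 60 Hz: %.1f dB" % att_60hz)

      # Apply the filter (zero-phase, to avoid group delay) to a noisy sinusoid.
      t = np.arange(0, 2.0, 1.0 / fs)
      x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.randn(t.size)
      y = signal.filtfilt(b, a, x)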

  3. Biomedical signal acquisition, processing and transmission using smartphone

    NASA Astrophysics Data System (ADS)

    Roncagliolo, Pablo; Arredondo, Luis; González, Agustín

    2007-11-01

    This article describes technical aspects involved in programming a system for the acquisition, processing and transmission of biomedical signals using mobile devices. This work is aligned with the ongoing development of new technologies for diagnosis and treatment, based on the feasibility of continuously measuring different variables such as electrocardiographic signals, blood pressure, oxygen concentration, pulse or simply temperature. The contribution of this technology lies in its portability and low cost, which allow its widespread use. Specifically, this work analyzes the feasibility of acquiring and processing signals on a standard smartphone. The results show that such devices now have enough processing capacity to run signal acquisition systems. Together with external servers, these systems make it possible to imagine a near future in which continuous measurement of biomedical variables is no longer restricted to hospitals but becomes increasingly common in daily life and at home.

  4. BioSig: The Free and Open Source Software Library for Biomedical Signal Processing

    PubMed Central

    Vidaurre, Carmen; Sander, Tilmann H.; Schlögl, Alois

    2011-01-01

    BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals. PMID:21437227

  5. BioSig: the free and open source software library for biomedical signal processing.

    PubMed

    Vidaurre, Carmen; Sander, Tilmann H; Schlögl, Alois

    2011-01-01

    BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals.

  6. A fast discrete S-transform for biomedical signal processing.

    PubMed

    Brown, Robert A; Frayne, Richard

    2008-01-01

    Determining the frequency content of a signal is a basic operation in signal and image processing. The S-transform provides both the true frequency and globally referenced phase measurements characteristic of the Fourier transform and also generates local spectra, as does the wavelet transform. Due to this combination, the S-transform has been successfully demonstrated in a variety of biomedical signal and image processing tasks. However, the computational demands of the S-transform have limited its application in medicine to this point in time. This abstract introduces the fast S-transform, a more efficient discrete implementation of the classic S-transform with dramatically reduced computational requirements.
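
    For reference, the classic (pre-acceleration) discrete S-transform is compact enough to state as a NumPy sketch. This is the computationally demanding baseline, whose cost grows at least quadratically with signal length, that the paper's fast algorithm improves upon; it is not the fast S-transform itself, and scaling conventions vary across implementations.

      # Classic discrete S-transform (Stockwell): each frequency "voice" is the
      # spectrum shifted by n bins, weighted by a Gaussian, and inverted to time.
      import numpy as np

      def s_transform(x):
          """Return the S-transform, shape (N//2 + 1, N): rows = voices, cols = time."""
          x = np.asarray(x, dtype=float)
          N = x.size
          X = np.fft.fft(x)
          m = np.arange(N)
          m_signed = np.where(m <= N // 2, m, m - N)   # signed frequency offsets
          S = np.zeros((N // 2 + 1, N), dtype=complex)
          S[0, :] = x.mean()                           # zero-frequency voice
          for n in range(1, N // 2 + 1):
              gauss = np.exp(-2.0 * np.pi ** 2 * m_signed ** 2 / n ** 2)
              S[n, :] = np.fft.ifft(np.roll(X, -n) * gauss)
          return S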

  7. Noise-assisted data processing with empirical mode decomposition in biomedical signals.

    PubMed

    Karagiannis, Alexandros; Constantinou, Philip

    2011-01-01

    In this paper, a methodology is described for investigating the performance of empirical mode decomposition (EMD) on biomedical signals, and especially on the electrocardiogram (ECG). Synthetic ECG signals corrupted with white Gaussian noise are employed, and time series of various lengths are processed with EMD in order to extract the intrinsic mode functions (IMFs). A statistical significance test is implemented to identify IMFs with high-level noise components and exclude them from denoising procedures. Simulation results reveal that a decrease in processing time is achieved by introducing a preprocessing stage prior to the application of EMD to biomedical time series. Furthermore, the variation in the number of IMFs according to the type of preprocessing stage is studied as a function of SNR and time-series length. The application of the methodology to MIT-BIH ECG records is also presented in order to verify the findings on real ECG signals.
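
    A minimal sketch of the partial-reconstruction step is given below. It assumes the third-party PyEMD package for computing the IMFs, and a simple correlation-with-the-raw-signal criterion stands in for the paper's statistical significance test; the threshold value is an arbitrary illustration.

      # EMD-based denoising by excluding noise-dominated IMFs (simplified stand-in
      # for the statistical significance test described in the abstract).
      import numpy as np
      from PyEMD import EMD      # third-party package, installed as "EMD-signal"

      def emd_denoise(x, corr_threshold=0.2):
          x = np.asarray(x, dtype=float)
          imfs = EMD().emd(x)                        # shape: (n_imfs, n_samples)
          keep = []
          for imf in imfs:
              # IMFs that correlate weakly with the raw signal are treated as
              # noise-dominated and excluded from the reconstruction.
              rho = 0.0 if imf.std() == 0 else np.corrcoef(imf, x)[0, 1]
              keep.append(abs(rho) >= corr_threshold)
          return imfs[np.array(keep)].sum(axis=0)

      # Example: a synthetic low-frequency wave corrupted with white Gaussian noise.
      fs = 250.0
      t = np.arange(0, 4.0, 1.0 / fs)
      noisy = np.sin(2 * np.pi * 1.2 * t) + 0.4 * np.random.randn(t.size)
      denoised = emd_denoise(noisy)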

  8. Eventogram: A Visual Representation of Main Events in Biomedical Signals.

    PubMed

    Elgendi, Mohamed

    2016-09-22

    Biomedical signals carry valuable physiological information, yet many researchers have difficulty interpreting and analyzing long-term, one-dimensional, quasi-periodic biomedical signals. Traditionally, biomedical signals are analyzed and visualized using periodogram, spectrogram, and wavelet methods. However, these methods do not offer an informative visualization of the main events within the processed signal. This paper attempts to provide an event-related framework that overcomes the drawbacks of the traditional visualization methods and describes the main events within the biomedical signal in terms of duration and morphology. Electrocardiogram and photoplethysmogram signals are used in the analysis to demonstrate the differences between the traditional visualization methods, and their performance is compared against the proposed method, referred to as the "eventogram" in this paper. The proposed method is based on two event-related moving averages that visualize the main time-domain events in the processed biomedical signals. The traditional visualization methods were unable to find dominant events in the processed signals, while the eventogram was able to visualize dominant events in terms of duration and morphology. Moreover, eventogram-based detection algorithms succeeded in detecting the main events in different biomedical signals with a sensitivity and positive predictivity above 95%. The output of the eventogram captured unique patterns and signatures of physiological events, which could be used to visualize and identify abnormal waveforms in any quasi-periodic signal.

  9. A low-cost biomedical signal transceiver based on a Bluetooth wireless system.

    PubMed

    Fazel-Rezai, Reza; Pauls, Mark; Slawinski, David

    2007-01-01

    Most current wireless biomedical signal transceivers use range-limited communication. This work presents a low-cost biomedical signal transceiver that uses Bluetooth wireless technology. The design is implemented in a modular form to be adaptable to different types of biomedical signals. The signal front end obtains and processes incoming signals, which are then transmitted via a microcontroller and wireless module. Near real-time receive software in LabVIEW was developed to demonstrate the system capability. The completed transmitter prototype successfully transmits ECG signals and is able to send multiple signals simultaneously. The sampling rate of the transmitter is fast enough to send up to thirteen ECG signals simultaneously, with an error rate below 0.1% over distances exceeding 65 meters. A low-cost wireless biomedical transceiver has many applications, such as real-time monitoring of patients with a known condition in non-clinical settings.

  10. TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach.

    PubMed

    Elgendi, Mohamed

    2016-11-02

    Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method referred to for the first time as two event-related moving averages ("TERMA") involves event-related moving averages and detects events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high accuracy detection of biomedical events. Results recommend that the window sizes for the two moving averages (W1 and W2) have to follow the inequality (8 × W1) ≥ W2 ≥ (2 × W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions.
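
    The two-moving-averages idea and the window-size constraint above can be made concrete with a small Python sketch. The window lengths below are illustrative values in the ECG range (they satisfy the stated inequality), and the squared-amplitude feature and block rules are simplifications, not the authors' reference implementation.

      # Event detection with two event-related moving averages: a short window
      # tracks candidate events, a longer window provides the adaptive threshold.
      import numpy as np

      def moving_average(x, w):
          return np.convolve(x, np.ones(w) / w, mode='same')

      def terma_detect(x, fs, w1_sec=0.097, w2_sec=0.611):
          # W1 ~ expected event duration, W2 ~ expected cycle duration;
          # these illustrative defaults satisfy (8 * W1) >= W2 >= (2 * W1).
          w1 = max(1, int(round(w1_sec * fs)))
          w2 = max(1, int(round(w2_sec * fs)))
          y = np.asarray(x, dtype=float) ** 2            # simple energy feature
          ma_event, ma_cycle = moving_average(y, w1), moving_average(y, w2)
          mask = ma_event > ma_cycle                     # blocks of interest
          padded = np.r_[False, mask, False]
          edges = np.flatnonzero(np.diff(padded.astype(int)))
          blocks = zip(edges[::2], edges[1::2])          # (start, end) index pairs
          # Reject blocks shorter than W1 and report the peak inside each block.
          return [s + int(np.argmax(y[s:e])) for s, e in blocks if e - s >= w1]

      # Usage on a band-pass filtered ECG trace (assumed input) sampled at 360 Hz:
      # r_peaks = terma_detect(filtered_ecg, fs=360.0)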

  11. TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach

    PubMed Central

    Elgendi, Mohamed

    2016-01-01

    Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method referred to for the first time as two event-related moving averages (“TERMA”) involves event-related moving averages and detects events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high accuracy detection of biomedical events. Results recommend that the window sizes for the two moving averages (W1 and W2) have to follow the inequality (8×W1)≥W2≥(2×W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions. PMID:27827852

  12. An ultra low energy biomedical signal processing system operating at near-threshold.

    PubMed

    Hulzink, J; Konijnenburg, M; Ashouei, M; Breeschoten, A; Berset, T; Huisken, J; Stuyt, J; de Groot, H; Barat, F; David, J; Van Ginderdeuren, J

    2011-12-01

    This paper presents a voltage-scalable digital signal processing system designed for the use in a wireless sensor node (WSN) for ambulatory monitoring of biomedical signals. To fulfill the requirements of ambulatory monitoring, power consumption, which directly translates to the WSN battery lifetime and size, must be kept as low as possible. The proposed processing platform is an event-driven system with resources to run applications with different degrees of complexity in an energy-aware way. The architecture uses effective system partitioning to enable duty cycling, single instruction multiple data (SIMD) instructions, power gating, voltage scaling, multiple clock domains, multiple voltage domains, and extensive clock gating. It provides an alternative processing platform where the power and performance can be scaled to adapt to the application need. A case study on a continuous wavelet transform (CWT)-based heart-beat detection shows that the platform not only preserves the sensitivity and positive predictivity of the algorithm but also achieves the lowest energy/sample for ElectroCardioGram (ECG) heart-beat detection publicly reported today.
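
    A floating-point sketch of single-scale CWT-based heart-beat detection of the kind such a platform runs is given below. The Ricker wavelet, the 30 ms scale, the threshold rule and the refractory period are all illustrative assumptions, and the paper's fixed-point, near-threshold hardware implementation is not modelled.

      # Single-scale continuous wavelet transform (CWT) beat detection sketch.
      import numpy as np

      def ricker(points, a):
          # Ricker ("Mexican hat") wavelet with width parameter a.
          t = np.arange(points) - (points - 1) / 2.0
          amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
          return amp * (1.0 - (t / a) ** 2) * np.exp(-t ** 2 / (2.0 * a ** 2))

      def cwt_beat_detect(ecg, fs, scale_sec=0.03, refractory_sec=0.25):
          ecg = np.asarray(ecg, dtype=float)
          a = max(1.0, scale_sec * fs)                   # scale matched to QRS width
          coeffs = np.convolve(ecg, ricker(int(10 * a), a), mode='same')
          energy = coeffs ** 2
          threshold = 4.0 * energy.mean()                # crude global threshold
          refractory = int(refractory_sec * fs)
          beats, last = [], -refractory
          for i, e in enumerate(energy):
              # Mark the first supra-threshold sample of each beat, then hold off
              # for a refractory period to avoid double detections.
              if e > threshold and i - last > refractory:
                  beats.append(i)
                  last = i
          return beats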

  13. Digital signal processing by virtual instrumentation of a MEMS magnetic field sensor for biomedical applications.

    PubMed

    Juárez-Aguirre, Raúl; Domínguez-Nicolás, Saúl M; Manjarrez, Elías; Tapia, Jesús A; Figueras, Eduard; Vázquez-Leal, Héctor; Aguilera-Cortés, Luz A; Herrera-May, Agustín L

    2013-11-05

    We present a signal processing system with virtual instrumentation of a MEMS sensor to detect magnetic flux density for biomedical applications. This system consists of a magnetic field sensor, electronic components implemented on a printed circuit board (PCB), a data acquisition (DAQ) card, and a virtual instrument. It allows the development of a semi-portable prototype with the capacity to filter small electromagnetic interference signals through digital signal processing. The virtual instrument includes an algorithm to implement different configurations of infinite impulse response (IIR) filters. The PCB contains a precision instrumentation amplifier, a demodulator, a low-pass filter (LPF) and a buffer with operational amplifier. The proposed prototype is used for real-time non-invasive monitoring of magnetic flux density in the thoracic cage of rats. The response of the rat respiratory magnetogram displays a similar behavior as the rat electromyogram (EMG).

  14. Digital Signal Processing by Virtual Instrumentation of a MEMS Magnetic Field Sensor for Biomedical Applications

    PubMed Central

    Juárez-Aguirre, Raúl; Domínguez-Nicolás, Saúl M.; Manjarrez, Elías; Tapia, Jesús A.; Figueras, Eduard; Vázquez-Leal, Héctor; Aguilera-Cortés, Luz A.; Herrera-May, Agustín L.

    2013-01-01

    We present a signal processing system with virtual instrumentation of a MEMS sensor to detect magnetic flux density for biomedical applications. This system consists of a magnetic field sensor, electronic components implemented on a printed circuit board (PCB), a data acquisition (DAQ) card, and a virtual instrument. It allows the development of a semi-portable prototype with the capacity to filter small electromagnetic interference signals through digital signal processing. The virtual instrument includes an algorithm to implement different configurations of infinite impulse response (IIR) filters. The PCB contains a precision instrumentation amplifier, a demodulator, a low-pass filter (LPF) and a buffer with operational amplifier. The proposed prototype is used for real-time non-invasive monitoring of magnetic flux density in the thoracic cage of rats. The response of the rat respiratory magnetogram displays a similar behavior as the rat electromyogram (EMG). PMID:24196434

  15. Bioinspired Polarization Imaging Sensors: From Circuits and Optics to Signal Processing Algorithms and Biomedical Applications

    PubMed Central

    York, Timothy; Powell, Samuel B.; Gao, Shengkui; Kahan, Lindsey; Charanya, Tauseef; Saha, Debajit; Roberts, Nicholas W.; Cronin, Thomas W.; Marshall, Justin; Achilefu, Samuel; Lake, Spencer P.; Raman, Baranidharan; Gruev, Viktor

    2015-01-01

    In this paper, we present recent work on bioinspired polarization imaging sensors and their applications in biomedicine. In particular, we focus on three different aspects of these sensors. First, we describe the electro–optical challenges in realizing a bioinspired polarization imager, and in particular, we provide a detailed description of a recent low-power complementary metal–oxide–semiconductor (CMOS) polarization imager. Second, we focus on signal processing algorithms tailored for this new class of bioinspired polarization imaging sensors, such as calibration and interpolation. Third, the emergence of these sensors has enabled rapid progress in characterizing polarization signals and environmental parameters in nature, as well as several biomedical areas, such as label-free optical neural recording, dynamic tissue strength analysis, and early diagnosis of flat cancerous lesions in a murine colorectal tumor model. We highlight results obtained from these three areas and discuss future applications for these sensors. PMID:26538682
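
    The core per-pixel computation for this class of division-of-focal-plane polarization imagers can be sketched directly from the four analyzer orientations (0, 45, 90 and 135 degrees); the calibration and interpolation steps discussed in the paper are omitted here, and the function name is illustrative.

      # Linear Stokes parameters, degree and angle of linear polarization per pixel.
      import numpy as np

      def linear_stokes(i0, i45, i90, i135):
          i0, i45, i90, i135 = (np.asarray(a, dtype=float) for a in (i0, i45, i90, i135))
          s0 = 0.5 * (i0 + i45 + i90 + i135)       # total intensity
          s1 = i0 - i90                            # horizontal vs. vertical preference
          s2 = i45 - i135                          # +45 vs. -45 degree preference
          dolp = np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-12)
          aop = 0.5 * np.arctan2(s2, s1)           # angle of polarization, radians
          return s0, s1, s2, dolp, aop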

  16. Interpretation of the Lempel-Ziv complexity measure in the context of biomedical signal analysis.

    PubMed

    Aboy, Mateo; Hornero, Roberto; Abásolo, Daniel; Alvarez, Daniel

    2006-11-01

    Lempel-Ziv complexity (LZ) and derived LZ algorithms have been extensively used to solve information theoretic problems such as coding and lossless data compression. In recent years, LZ has been widely used in biomedical applications to estimate the complexity of discrete-time signals. Despite its popularity as a complexity measure for biosignal analysis, the question of LZ interpretability and its relationship to other signal parameters and to other metrics has not been previously addressed. We have carried out an investigation aimed at gaining a better understanding of the LZ complexity itself, especially regarding its interpretability as a biomedical signal analysis technique. Our results indicate that LZ is particularly useful as a scalar metric to estimate the bandwidth of random processes and the harmonic variability in quasi-periodic signals.
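
    For reference, the LZ76 complexity interpreted in the paper can be computed with a short exhaustive-parsing sketch. Binarization around the median and the log-based normalization below are common conventions rather than the only possible choices.

      # Lempel-Ziv (1976) complexity of a biosignal after median binarization.
      import numpy as np

      def lz76_phrase_count(s):
          """Number of phrases in the LZ76 exhaustive parsing of a symbol string."""
          n, c, i, k = len(s), 0, 0, 1
          while i + k <= n:
              # The candidate phrase s[i:i+k] remains "reproducible" while it occurs
              # somewhere before its own last character; otherwise it is a new phrase.
              if s[i:i + k] in s[:i + k - 1]:
                  k += 1
              else:
                  c += 1
                  i += k
                  k = 1
          return c + (1 if k > 1 else 0)    # count a trailing, unfinished phrase

      def lz_complexity(x):
          x = np.asarray(x, dtype=float)
          binary = ''.join('1' if v > np.median(x) else '0' for v in x)
          n = len(binary)
          # Normalize by n / log2(n), the expected growth for a random sequence.
          return lz76_phrase_count(binary) * np.log2(n) / n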

  17. [Computers in biomedical research: I. Analysis of bioelectrical signals].

    PubMed

    Vivaldi, E A; Maldonado, P

    2001-08-01

    A personal computer equipped with an analog-to-digital conversion card is able to input, store and display signals of biomedical interest. These signals can additionally be submitted to ad-hoc software for analysis and diagnosis. Data acquisition is based on the sampling of a signal at a given rate and amplitude resolution. The automation of signal processing involves syntactic aspects (data transduction, conditioning and reduction) and semantic aspects (feature extraction to describe and characterize the signal, and diagnostic classification). The analytical approach that is at the basis of computer programming allows for the successful resolution of apparently complex tasks. Two basic principles involved are the definition of simple fundamental functions that are then iterated and the modular subdivision of tasks. These two principles are illustrated, respectively, by presenting the algorithm that detects relevant elements for the analysis of a polysomnogram, and the task flow in systems that automate electrocardiographic reports.
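
    The acquisition step described above, sampling at a given rate and amplitude resolution, can be made concrete with a tiny idealized A/D conversion sketch; the input range, bit depth and rates are assumptions for illustration.

      # Idealized analog-to-digital conversion: decimation to the target sampling
      # rate followed by uniform quantization over an assumed input range.
      import numpy as np

      def digitize(analog, fs_analog, fs_target, n_bits=12, v_range=(-5.0, 5.0)):
          step = int(round(fs_analog / fs_target))
          sampled = np.asarray(analog, dtype=float)[::step]          # sampling
          lo, hi = v_range
          levels = 2 ** n_bits
          codes = np.clip(np.round((sampled - lo) / (hi - lo) * (levels - 1)),
                          0, levels - 1).astype(int)                 # quantization
          return codes

      # Example: a densely simulated "analog" sine digitized at 500 Hz, 12 bits.
      t = np.arange(0, 1.0, 1.0 / 10000.0)
      codes = digitize(np.sin(2 * np.pi * 5 * t), 10000.0, 500.0)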

  18. Defense Applications of Signal Processing

    DTIC Science & Technology

    1999-08-27

    class of multiscale autoregressive moving average (MARMA) processes. These are generalisations of ARMA models in time series analysis, and they contain...including the two theoretical sinusoidal components. Analysis of the amplitude and frequency time series provided some novel insight into the real...communication channels, underwater acoustic signals, radar systems, economic time series and biomedical signals [7]. The alpha stable (aS) distribution has

  19. Systems engineering principles for the design of biomedical signal processing systems.

    PubMed

    Faust, Oliver; Acharya U, Rajendra; Sputh, Bernhard H C; Min, Lim Choo

    2011-06-01

    Systems engineering aims to produce reliable systems which function according to specification. In this paper we follow a systems engineering approach to design a biomedical signal processing system. We discuss requirements capturing, specification definition, implementation and testing of a classification system. These steps are executed as formally as possible. The requirements, which motivate the system design, are based on diabetes research. The main requirement for the classification system is to be a reliable component of a machine which controls diabetes. Reliability is very important, because uncontrolled diabetes may lead to hyperglycaemia (raised blood sugar) and over a period of time may cause serious damage to many of the body systems, especially the nerves and blood vessels. In a second step, these requirements are refined into a formal CSP‖B model. The formal model expresses the system functionality in a clear and semantically strong way. Subsequently, the proven system model was translated into an implementation. This implementation was tested with use cases and failure cases. Formal modeling and automated model checking gave us deep insight into the system functionality. This insight enabled us to create a reliable and trustworthy implementation. With extensive tests we established trust in the reliability of the implementation. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  20. [Application of the mixed programming with Labview and Matlab in biomedical signal analysis].

    PubMed

    Yu, Lu; Zhang, Yongde; Sha, Xianzheng

    2011-01-01

    This paper introduces the method of mixed programming with Labview and Matlab, and applies this method in a pulse wave pre-processing and feature detecting system. The method has been proved suitable, efficient and accurate, which has provided a new kind of approach for biomedical signal analysis.

  1. Powerline noise elimination in biomedical signals via blind source separation and wavelet analysis.

    PubMed

    Akwei-Sekyere, Samuel

    2015-01-01

    The distortion of biomedical signals by powerline noise from recording devices can reduce data quality and confound interpretation. Usually, powerline noise in biomedical recordings is removed with band-stop filters. However, due to the instability of biomedical signals, the distribution of the components filtered out may not be centered at 50/60 Hz. As a result, self-correction methods are needed to optimize the performance of these filters. Since powerline noise is additive in nature, it is intuitive to model the powerline noise in a raw recording and subtract it from the raw data in order to obtain a relatively clean signal. This paper proposes a method that follows this approach by decomposing the recorded signal and extracting the powerline noise via blind source separation and wavelet analysis. The performance of this algorithm was compared with that of a 4th-order band-stop Butterworth filter, empirical mode decomposition, independent component analysis and a combination of empirical mode decomposition with independent component analysis. The proposed method was able to remove sinusoidal components within the powerline noise frequency range with higher fidelity than the above techniques, especially at low signal-to-noise ratio.
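
    The comparison baseline named above, a 4th-order band-stop Butterworth filter, is concrete enough to sketch. The 50 Hz centre frequency and the 2 Hz-wide stop band are assumed values, and zero-phase filtering (which doubles the effective order) is used for convenience.

      # Band-stop Butterworth baseline for mains interference removal.
      import numpy as np
      from scipy import signal

      def bandstop_mains(x, fs, mains=50.0, half_width=1.0, order=4):
          low, high = mains - half_width, mains + half_width
          sos = signal.butter(order, [low, high], btype='bandstop', fs=fs, output='sos')
          return signal.sosfiltfilt(sos, x)            # zero-phase filtering

      # Example: remove 50 Hz interference from a synthetic recording.
      fs = 500.0
      t = np.arange(0, 5.0, 1.0 / fs)
      x = np.sin(2 * np.pi * 7 * t) + 0.8 * np.sin(2 * np.pi * 50 * t)
      clean = bandstop_mains(x, fs)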

  2. Microcontroller-based wireless recorder for biomedical signals.

    PubMed

    Chien, C-N; Hsu, H-W; Jang, J-K; Rau, C-L; Jaw, F-S

    2005-01-01

    A portable multichannel system is described for the wireless recording of biomedical signals. Instead of using the conventional time-division analog-modulation method, digital multiplexing was applied to increase the number of signal channels to four. Detailed design considerations and the functional allocation of the system are discussed. The front-end unit was modularly designed to condition the input signal in an optimal manner. The microcontroller then handled the tasks of data conversion and wireless transmission, as well as providing simple preprocessing such as waveform averaging or rectification. The low-power nature of this microcontroller affords the benefit of battery operation and, hence, patient isolation. Finally, a single-chip receiver, compatible with the RF transmitter of the microcontroller, was used to implement a compact interface with the host computer. An application of this portable recorder to low-back pain studies is shown. The device can simultaneously record one ECG and two surface EMG channels wirelessly, and is thus helpful in relieving patients' anxiety during clinical measurement. Such an approach, microcontroller-based wireless measurement, could be an important trend in biomedical instrumentation, and we hope that this paper will be useful to other colleagues.

  3. Telemedicine optoelectronic biomedical data processing system

    NASA Astrophysics Data System (ADS)

    Prosolovska, Vita V.

    2010-08-01

    The telemedicine optoelectronic biomedical data processing system is designed to share medical information for health monitoring and for timely, rapid response to crises. The system includes the main blocks: a bioprocessor, an analog-to-digital converter for biomedical images, an optoelectronic module for image processing, an optoelectronic module for parallel recording and storage of biomedical images, and a matrix screen for displaying biomedical images. The rated temporal characteristics of the blocks are defined by the particular triggering optoelectronic couple in the analog-to-digital converters and by the imaging time of the matrix screen. The element base for the hardware implementation of the developed matrix screen is integrated optoelectronic couples produced by selective epitaxy.

  4. Accelerating Biomedical Signal Processing Using GPU: A Case Study of Snore Sound Feature Extraction.

    PubMed

    Guo, Jian; Qian, Kun; Zhang, Gongxuan; Xu, Huijie; Schuller, Björn

    2017-12-01

    The advent of 'Big Data' and 'Deep Learning' offers both a great challenge and a huge opportunity for personalised health care. In machine learning-based biomedical data analysis, feature extraction is a key step for 'feeding' the subsequent classifiers. With increasing volumes of biomedical data, extracting features from these 'big' data is an intensive and time-consuming task. In this case study, we employ a Graphics Processing Unit (GPU) via Python to extract features from a large corpus of snore sound data. Those features can subsequently be imported into many well-known deep learning training frameworks without any format processing. The snore sound data were collected from several hospitals (20 subjects, with 770-990 MB per subject - in total 17.20 GB). Experimental results show that our GPU-based processing significantly speeds up the feature extraction phase, by up to seven times compared to the previous CPU system.
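
    The batched GPU pattern described above can be sketched in a few lines. PyTorch is used here purely as a convenient GPU array backend (it is not claimed to be the authors' tool chain), random frames stand in for the snore-sound corpus, and the two features are arbitrary examples.

      # GPU-batched spectral feature extraction: process thousands of audio frames
      # in one batched FFT instead of looping over them on the CPU.
      import torch

      device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

      def batch_features(frames, n_fft=512):
          """frames: (batch, frame_len) float tensor of audio frames."""
          frames = frames.to(device)
          window = torch.hann_window(frames.shape[1], device=device)
          spec = torch.fft.rfft(frames * window, n=n_fft)    # batched FFT on GPU
          power = spec.abs() ** 2
          # Simple frame-level features: log-energy and spectral centroid (in bins).
          log_energy = torch.log(power.sum(dim=1) + 1e-10)
          bins = torch.arange(power.shape[1], device=device, dtype=power.dtype)
          centroid = (power * bins).sum(dim=1) / (power.sum(dim=1) + 1e-10)
          return torch.stack([log_energy, centroid], dim=1).cpu()

      feats = batch_features(torch.randn(10000, 400))   # 10k frames in one pass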

  5. Analysis of the sleep quality of elderly people using biomedical signals.

    PubMed

    Moreno-Alsasua, L; Garcia-Zapirain, B; Mendez-Zorrilla, A

    2015-01-01

    This paper presents a technical solution that analyses sleep signals captured by biomedical sensors to find possible disorders during rest. Specifically, the method evaluates electrooculogram (EOG) signals, skin conductance (GSR), air flow (AS), and body temperature. Next, a quantitative sleep quality analysis determines significant changes in the biological signals and any similarities between them in a given time period. Filtering techniques such as the Fourier transform method and IIR filters process the signals and identify significant variations. Once these changes have been identified, all significant data are compared and a quantitative and statistical analysis is carried out to determine the level of a person's rest. To evaluate correlations and significant differences, a statistical analysis was performed, showing correlation between the EOG and AS signals (p=0.005), the EOG and GSR signals (p=0.037) and, finally, the EOG and body temperature (p=0.04). Doctors could use this information to monitor changes within a patient.

  6. Optimal wavelets for biomedical signal compression.

    PubMed

    Nielsen, Mogens; Kamavuako, Ernest Nlandu; Andersen, Michael Midtgaard; Lucas, Marie-Françoise; Farina, Dario

    2006-07-01

    Signal compression is gaining importance in biomedical engineering due to the potential applications in telemedicine. In this work, we propose a novel scheme of signal compression based on signal-dependent wavelets. To adapt the mother wavelet to the signal for the purpose of compression, it is necessary to define (1) a family of wavelets that depend on a set of parameters and (2) a quality criterion for wavelet selection (i.e., wavelet parameter optimization). We propose the use of an unconstrained parameterization of the wavelet for wavelet optimization. A natural performance criterion for compression is the minimization of the signal distortion rate given the desired compression rate. For coding the wavelet coefficients, we adopted the embedded zerotree wavelet coding algorithm, although any coding scheme may be used with the proposed wavelet optimization. As a representative example of application, the coding/encoding scheme was applied to surface electromyographic signals recorded from ten subjects. The distortion rate strongly depended on the mother wavelet (for example, for 50% compression rate, optimal wavelet, mean ± SD, 5.46 ± 1.01%; worst wavelet, 12.76 ± 2.73%). Thus, optimization significantly improved performance with respect to previous approaches based on classic wavelets. The algorithm can be applied to any signal type since the optimal wavelet is selected on a signal-by-signal basis. Examples of application to ECG and EEG signals are also reported.
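
    The compression bookkeeping described above can be illustrated with a PyWavelets sketch that uses a fixed, classic wavelet and plain magnitude thresholding; the paper's signal-dependent wavelet parameterization and embedded zerotree coder are not reproduced, and the percentage RMS figure below is one common distortion measure.

      # Wavelet compression with a fixed wavelet: keep the largest coefficients,
      # reconstruct, and report the resulting distortion.
      import numpy as np
      import pywt

      def compress(x, wavelet='db4', level=5, keep_ratio=0.5):
          coeffs = pywt.wavedec(x, wavelet, level=level)
          flat = np.concatenate([np.abs(c) for c in coeffs])
          # Keep only the largest (keep_ratio * 100)% of coefficients by magnitude.
          thr = np.quantile(flat, 1.0 - keep_ratio)
          return [np.where(np.abs(c) >= thr, c, 0.0) for c in coeffs]

      def distortion_percent(x, coeffs, wavelet='db4'):
          x = np.asarray(x, dtype=float)
          rec = pywt.waverec(coeffs, wavelet)[:x.size]
          return 100.0 * np.sqrt(np.sum((x - rec) ** 2) / np.sum(x ** 2))

      # Example: 50% compression of a synthetic surface-EMG-like burst signal.
      rng = np.random.default_rng(0)
      emg = rng.standard_normal(4096) * np.sin(np.linspace(0, 3 * np.pi, 4096)) ** 2
      print(distortion_percent(emg, compress(emg, keep_ratio=0.5)))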

  7. Surface Electromyography Signal Processing and Classification Techniques

    PubMed Central

    Chowdhury, Rubana H.; Reaz, Mamun B. I.; Ali, Mohd Alauddin Bin Mohd; Bakar, Ashrif A. A.; Chellappan, Kalaivani; Chang, Tae. G.

    2013-01-01

    Electromyography (EMG) signals are becoming increasingly important in many applications, including clinical/biomedical applications, prosthesis or rehabilitation devices, human-machine interactions, and more. However, noisy EMG signals are the major hurdle to be overcome in order to achieve improved performance in the above applications. Detection, processing and classification analysis in electromyography (EMG) is very desirable because it allows a more standardized and precise evaluation of neurophysiological, rehabilitative and assistive technological findings. This paper reviews two prominent areas: first, the pre-processing methods for eliminating possible artifacts via appropriate preparation at the time of recording EMG signals; and second, a brief explanation of the different methods for processing and classifying EMG signals. This study then compares the numerous methods of analyzing EMG signals in terms of their performance. The crux of this paper is to review the most recent developments and research studies related to the issues mentioned above. PMID:24048337

  8. Interpretation of the auto-mutual information rate of decrease in the context of biomedical signal analysis. Application to electroencephalogram recordings.

    PubMed

    Escudero, Javier; Hornero, Roberto; Abásolo, Daniel

    2009-02-01

    The mutual information (MI) is a measure of both linear and nonlinear dependences. It can be applied to a time series and a time-delayed version of the same sequence to compute the auto-mutual information function (AMIF). Moreover, the AMIF rate of decrease (AMIFRD) with increasing time delay in a signal is correlated with its entropy and has been used to characterize biomedical data. In this paper, we aimed at gaining insight into the dependence of the AMIFRD on several signal processing concepts and at illustrating its application to biomedical time series analysis. Thus, we have analysed a set of synthetic sequences with the AMIFRD. The results show that the AMIF decreases more quickly as bandwidth increases and that the AMIFRD becomes more negative as there is more white noise contaminating the time series. Additionally, this metric detected changes in the nonlinear dynamics of a signal. Finally, in order to illustrate the analysis of real biomedical signals with the AMIFRD, this metric was applied to electroencephalogram (EEG) signals acquired with eyes open and closed and to ictal and non-ictal intracranial EEG recordings.
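
    The AMIF and a crude rate-of-decrease estimate can be sketched with a histogram-based MI estimator. The bin count, the number of fitted lags and the plain linear-slope definition of the AMIFRD below are simplifying assumptions rather than the exact choices made in the paper.

      # Auto-mutual information function over increasing delays, and its initial
      # rate of decrease estimated as the slope of a straight-line fit.
      import numpy as np

      def mutual_information(x, y, bins=16):
          joint, _, _ = np.histogram2d(x, y, bins=bins)
          pxy = joint / joint.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

      def amif(x, max_lag=20):
          x = np.asarray(x, dtype=float)
          return np.array([mutual_information(x[:-k], x[k:])
                           for k in range(1, max_lag + 1)])

      def amif_rate_of_decrease(x, fit_lags=5):
          values = amif(x, max_lag=fit_lags)
          lags = np.arange(1, fit_lags + 1)
          return np.polyfit(lags, values, 1)[0]   # more negative = faster decrease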

  9. Snore related signals processing in a private cloud computing system.

    PubMed

    Qian, Kun; Guo, Jian; Xu, Huijie; Zhu, Zhaomeng; Zhang, Gongxuan

    2014-09-01

    Snore related signals (SRS) have been demonstrated in recent years to carry important information about the obstruction site and degree in the upper airway of Obstructive Sleep Apnea-Hypopnea Syndrome (OSAHS) patients. To make this acoustic signal analysis method more accurate and robust, processing of big SRS data is inevitable. As an emerging concept and technology, cloud computing has motivated numerous researchers and engineers to exploit applications in both academia and industry, and it has the potential to enable large-scale applications in biomedical engineering. Considering the security and transfer requirements of biomedical data, we designed a system based on private cloud computing to process SRS. We then set up comparative experiments processing a 5-hour audio recording of an OSAHS patient on a personal computer, a server and a private cloud computing system to demonstrate the efficiency of the proposed infrastructure.

  10. An Ultra-Low Power Turning Angle Based Biomedical Signal Compression Engine with Adaptive Threshold Tuning.

    PubMed

    Zhou, Jun; Wang, Chao

    2017-08-06

    Intelligent sensing is drastically changing our everyday life including healthcare by biomedical signal monitoring, collection, and analytics. However, long-term healthcare monitoring generates tremendous data volume and demands significant wireless transmission power, which imposes a big challenge for wearable healthcare sensors usually powered by batteries. Efficient compression engine design to reduce wireless transmission data rate with ultra-low power consumption is essential for wearable miniaturized healthcare sensor systems. This paper presents an ultra-low power biomedical signal compression engine for healthcare data sensing and analytics in the era of big data and sensor intelligence. It extracts the feature points of the biomedical signal by window-based turning angle detection. The proposed approach has low complexity and thus low power consumption while achieving a large compression ratio (CR) and good quality of reconstructed signal. Near-threshold design technique is adopted to further reduce the power consumption on the circuit level. Besides, the angle threshold for compression can be adaptively tuned according to the error between the original signal and reconstructed signal to address the variation of signal characteristics from person to person or from channel to channel to meet the required signal quality with optimal CR. For demonstration, the proposed biomedical compression engine has been used and evaluated for ECG compression. It achieves an average CR of 71.08% and percentage root-mean-square difference (PRD) of 5.87% while consuming only 39 nW. Compared to several state-of-the-art ECG compression engines, the proposed design has significantly lower power consumption while achieving similar CR and PRD, making it suitable for long-term wearable miniaturized sensor systems to sense and collect healthcare data for remote data analytics.
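
    A floating-point sketch of the turning-angle idea is given below: keep samples where the local slope changes sharply, reconstruct by linear interpolation, and report CR and PRD. The slope-difference test and the fixed threshold are simplifications of the paper's window-based angle detection and adaptive threshold tuning, and the 39 nW near-threshold hardware is not modelled.

      # Turning-angle style compression: retain direction-change feature points,
      # reconstruct by linear interpolation, and compute CR and PRD.
      import numpy as np

      def turning_points(x, angle_threshold=0.2):
          x = np.asarray(x, dtype=float)
          d = np.diff(x)
          # A large change in local slope marks a feature (turning) point.
          idx = np.flatnonzero(np.abs(np.diff(d)) > angle_threshold) + 1
          return np.unique(np.r_[0, idx, x.size - 1])   # always keep the end points

      def cr_and_prd(x, angle_threshold=0.2):
          x = np.asarray(x, dtype=float)
          idx = turning_points(x, angle_threshold)
          rec = np.interp(np.arange(x.size), idx, x[idx])
          cr = 100.0 * (1.0 - idx.size / x.size)                    # percent saved
          prd = 100.0 * np.sqrt(np.sum((x - rec) ** 2) / np.sum(x ** 2))
          return cr, prd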

  11. Stream computing for biomedical signal processing: A QRS complex detection case-study.

    PubMed

    Murphy, B M; O'Driscoll, C; Boylan, G B; Lightbody, G; Marnane, W P

    2015-01-01

    Recent developments in "Big Data" have brought significant gains in the ability to process large amounts of data on commodity server hardware. Stream computing is a relatively new paradigm in this area, addressing the need to process data in real time with very low latency. While this approach has been developed for dealing with large scale data from the world of business, security and finance, there is a natural overlap with clinical needs for physiological signal processing. In this work we present a case study of streams processing applied to a typical physiological signal processing problem: QRS detection from ECG data.

  12. Cross-Approximate Entropy parallel computation on GPUs for biomedical signal analysis. Application to MEG recordings.

    PubMed

    Martínez-Zarzuela, Mario; Gómez, Carlos; Díaz-Pernas, Francisco Javier; Fernández, Alberto; Hornero, Roberto

    2013-10-01

    Cross-Approximate Entropy (Cross-ApEn) is a useful measure to quantify the statistical dissimilarity of two time series. In spite of the advantage of Cross-ApEn over its one-dimensional counterpart (Approximate Entropy), only a few studies have applied it to biomedical signals, mainly due to its high computational cost. In this paper, we propose a fast GPU-based implementation of Cross-ApEn that makes its use over a large amount of multidimensional data feasible. The scheme is fully scalable and thus maximizes the use of the GPU regardless of the number of neural signals being processed. The approach consists in processing many trials or epochs simultaneously, independently of their origin. In the case of MEG data, these trials can come from different input channels or subjects. The proposed implementation achieves an average speedup greater than 250× against a CPU parallel version running on a processor containing six cores. A dataset of 30 subjects containing 148 MEG channels (49 epochs of 1024 samples per channel) can be analyzed with our development in about 30 min. The same processing takes 5 days on six cores and 15 days when running on a single core. The speedup is much larger if compared to a basic sequential Matlab® implementation, which would need 58 days per subject. To our knowledge, this is the first contribution of Cross-ApEn measure computation using GPUs. This study demonstrates that this hardware is, to date, the best option for the signal processing of biomedical data with Cross-ApEn. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
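
    For reference, Cross-ApEn itself (the quantity accelerated on the GPU) can be written as a plain NumPy baseline. Here m = 2, r = 0.2 and z-scoring of both series are conventional choices, discarding unmatched templates is one common way of avoiding log(0), and the quadratic template-comparison cost shown is exactly what the paper parallelizes.

      # Plain (CPU) Cross-Approximate Entropy of two equal-length series.
      import numpy as np

      def _phi(u, v, m, r):
          n = len(u) - m + 1
          # Embedding matrices: row i holds the m consecutive samples starting at i.
          xu = np.array([u[i:i + m] for i in range(n)])
          xv = np.array([v[i:i + m] for i in range(n)])
          # Chebyshev distance between every template of u and every template of v.
          d = np.max(np.abs(xu[:, None, :] - xv[None, :, :]), axis=2)
          c = (d <= r).mean(axis=1)
          c = c[c > 0]                       # discard unmatched templates
          return np.mean(np.log(c)) if c.size else float('nan')

      def cross_apen(u, v, m=2, r=0.2):
          u = (np.asarray(u, float) - np.mean(u)) / np.std(u)
          v = (np.asarray(v, float) - np.mean(v)) / np.std(v)
          return _phi(u, v, m, r) - _phi(u, v, m + 1, r)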

  13. An Ultra-Low Power Turning Angle Based Biomedical Signal Compression Engine with Adaptive Threshold Tuning

    PubMed Central

    Zhou, Jun; Wang, Chao

    2017-01-01

    Intelligent sensing is drastically changing our everyday life including healthcare by biomedical signal monitoring, collection, and analytics. However, long-term healthcare monitoring generates tremendous data volume and demands significant wireless transmission power, which imposes a big challenge for wearable healthcare sensors usually powered by batteries. Efficient compression engine design to reduce wireless transmission data rate with ultra-low power consumption is essential for wearable miniaturized healthcare sensor systems. This paper presents an ultra-low power biomedical signal compression engine for healthcare data sensing and analytics in the era of big data and sensor intelligence. It extracts the feature points of the biomedical signal by window-based turning angle detection. The proposed approach has low complexity and thus low power consumption while achieving a large compression ratio (CR) and good quality of reconstructed signal. Near-threshold design technique is adopted to further reduce the power consumption on the circuit level. Besides, the angle threshold for compression can be adaptively tuned according to the error between the original signal and reconstructed signal to address the variation of signal characteristics from person to person or from channel to channel to meet the required signal quality with optimal CR. For demonstration, the proposed biomedical compression engine has been used and evaluated for ECG compression. It achieves an average CR of 71.08% and percentage root-mean-square difference (PRD) of 5.87% while consuming only 39 nW. Compared to several state-of-the-art ECG compression engines, the proposed design has significantly lower power consumption while achieving similar CR and PRD, making it suitable for long-term wearable miniaturized sensor systems to sense and collect healthcare data for remote data analytics. PMID:28783079

  14. Graphene oxide based contacts as probes of biomedical signals

    NASA Astrophysics Data System (ADS)

    Hallfors, N. G.; Devarajan, A.; Farhat, I. A. H.; Abdurahman, A.; Liao, K.; Gater, D. L.; Elnaggar, M. I.; Isakovic, A. F.

    We have developed a series of graphene oxide (GOx) on polymer contacts and have demonstrated that they are useful for the collection of standard biomedically relevant signals, such as the electrocardiogram (ECG). The process is wet solution-based and allows the basic physical parameters of GOx, such as its electrical and optical properties, to be controlled and tuned simply by choosing the number of GOx layers. Our GOx characterization measurements show the spectral (FTIR, XPS, IR absorbance) features most relevant to such performance and point toward likely explanations of the mechanisms controlling the physical properties relevant to contact performance. Structural (X-ray topography) and surface (AFM, SEM) characterization indicates to what degree these contacts can be considered homogeneous and therefore provides information on yield and repeatability. We compare the ECG signals recorded by standard commercial probes (Ag/AgCl) and by GOx probes and find only minor differences; resolving them may lead to a whole new way of performing ECG data collection, including wearable electronics and IoT-friendly ECG monitoring. We acknowledge support from Mubadala-SRC AC4ES and from SRC 2011-KJ-2190. We thank J. B. Warren and G. L. Carr (BNL) for assistance.

  15. BioSigPlot: an opensource tool for the visualization of multi-channel biomedical signals with Matlab.

    PubMed

    Boudet, Samuel; Peyrodie, Laurent; Gallois, Philippe; de l'Aulnoit, Denis Houzé; Cao, Hua; Forzy, Gérard

    2013-01-01

    This paper presents Matlab-based software (MathWorks Inc.) called BioSigPlot for the visualization of multi-channel biomedical signals, particularly the EEG. The tool is designed for researchers in both engineering and medicine who have to collaborate to visualize and analyze signals. It aims to provide a highly customizable interface for signal processing experimentation in order to plot several kinds of signals while integrating the common tools for physicians. The main advantages compared to other existing programs are multi-dataset display, synchronization with video and online processing. In addition, the program uses object-oriented programming, so the interface can be controlled by both graphical controls and command lines. It can be used as an EEGLAB plug-in but, since it is not limited to EEG, it is distributed separately. BioSigPlot is distributed free of charge (http://biosigplot.sourceforge.net), under the terms of the GNU Public License for non-commercial use and open source development.

  16. The detection and analysis of point processes in biological signals

    NASA Technical Reports Server (NTRS)

    Anderson, D. J.; Correia, M. J.

    1977-01-01

    A pragmatic approach to the detection and analysis of discrete events in biomedical signals is taken. Examples from both clinical and basic research are provided. Introductory sections discuss not only discrete events which are easily extracted from recordings by conventional threshold detectors but also events embedded in other information carrying signals. The primary considerations are factors governing event-time resolution and the effects limits to this resolution have on the subsequent analysis of the underlying process. The analysis portion describes tests for qualifying the records as stationary point processes and procedures for providing meaningful information about the biological signals under investigation. All of these procedures are designed to be implemented on laboratory computers of modest computational capacity.

  17. Noninvasive evaluation of mental stress using by a refined rough set technique based on biomedical signals.

    PubMed

    Liu, Tung-Kuan; Chen, Yeh-Peng; Hou, Zone-Yuan; Wang, Chao-Chih; Chou, Jyh-Horng

    2014-06-01

    Evaluating and treating stress can substantially benefit people with health problems. Currently, mental stress is evaluated using medical questionnaires. However, the accuracy of this evaluation method is questionable because of variations caused by factors such as cultural differences and individual subjectivity. Measuring biomedical signals is an effective way of estimating mental stress that overcomes this problem. However, the relationship between levels of mental stress and biomedical signals remains poorly understood. A refined rough set algorithm is proposed to determine the relationship between mental stress and biomedical signals; this algorithm combines rough set theory with a hybrid Taguchi-genetic algorithm and is called RS-HTGA. Two parameters were used for evaluating the performance of the proposed RS-HTGA method. A dataset obtained from a practice clinic comprising 362 cases (196 male, 166 female) was adopted to evaluate the performance of the proposed approach. The empirical results indicate that the proposed method can achieve acceptable accuracy in medical practice. Furthermore, the proposed method was successfully used to identify the relationship between mental stress levels and biomedical signals. In addition, a comparison between the RS-HTGA and a support vector machine (SVM) method indicated that both methods yield good results. The total averages for sensitivity, specificity, and precision were greater than 96%, indicating that both algorithms produce highly accurate results, but a substantial difference in discrimination existed for people with Phase 0 stress: the SVM algorithm achieved 89%, whereas the RS-HTGA achieved 96%. Therefore, the RS-HTGA is superior to the SVM algorithm. The kappa test results for both algorithms were greater than 0.936, indicating high accuracy and consistency. The areas under the receiver operating characteristic curve for both the RS-HTGA and the SVM method were greater than 0.77, indicating

  18. Techniques of EMG signal analysis: detection, processing, classification and applications

    PubMed Central

    Hussain, M.S.; Mohd-Yasin, F.

    2006-01-01

    Electromyography (EMG) signals can be used for clinical/biomedical applications, Evolvable Hardware Chip (EHW) development, and modern human-computer interaction. EMG signals acquired from muscles require advanced methods for detection, decomposition, processing, and classification. The purpose of this paper is to illustrate the various methodologies and algorithms for EMG signal analysis and to provide efficient and effective ways of understanding the signal and its nature. We further point out some of the hardware implementations using EMG, focusing on applications related to prosthetic hand control, grasp recognition, and human-computer interaction. A comparison study is also given to show the performance of various EMG signal analysis methods. This paper provides researchers with a good understanding of EMG signals and their analysis procedures. This knowledge will help them develop more powerful, flexible, and efficient applications. PMID:16799694

  19. Large-scale combining signals from both biomedical literature and the FDA Adverse Event Reporting System (FAERS) to improve post-marketing drug safety signal detection.

    PubMed

    Xu, Rong; Wang, QuanQiu

    2014-01-15

    Independent data sources can be used to augment post-marketing drug safety signal detection. The vast amount of publicly available biomedical literature contains rich side effect information for drugs at all clinical stages. In this study, we present a large-scale signal boosting approach that combines over 4 million records in the US Food and Drug Administration (FDA) Adverse Event Reporting System (FAERS) and over 21 million biomedical articles. The datasets are comprised of 4,285,097 records from FAERS and 21,354,075 MEDLINE articles. We first extracted all drug-side effect (SE) pairs from FAERS. Our study implemented a total of seven signal ranking algorithms. We then compared these different ranking algorithms before and after they were boosted with signals from MEDLINE sentences or abstracts. Finally, we manually curated all drug-cardiovascular (CV) pairs that appeared in both data sources and investigated whether our approach can detect many true signals that have not been included in FDA drug labels. We extracted a total of 2,787,797 drug-SE pairs from FAERS with a low initial precision of 0.025. The ranking algorithm combined signals from both FAERS and MEDLINE, significantly improving the precision from 0.025 to 0.371 for top-ranked pairs, representing a 13.8 fold elevation in precision. We showed by manual curation that drug-SE pairs that appeared in both data sources were highly enriched with true signals, many of which have not yet been included in FDA drug labels. We have developed an efficient and effective drug safety signal ranking and strengthening approach. We demonstrate that combining information from FAERS and the biomedical literature on a large scale can significantly contribute to drug safety surveillance.

  20. Large-scale combining signals from both biomedical literature and the FDA Adverse Event Reporting System (FAERS) to improve post-marketing drug safety signal detection

    PubMed Central

    2014-01-01

    Background Independent data sources can be used to augment post-marketing drug safety signal detection. The vast amount of publicly available biomedical literature contains rich side effect information for drugs at all clinical stages. In this study, we present a large-scale signal boosting approach that combines over 4 million records in the US Food and Drug Administration (FDA) Adverse Event Reporting System (FAERS) and over 21 million biomedical articles. Results The datasets are comprised of 4,285,097 records from FAERS and 21,354,075 MEDLINE articles. We first extracted all drug-side effect (SE) pairs from FAERS. Our study implemented a total of seven signal ranking algorithms. We then compared these different ranking algorithms before and after they were boosted with signals from MEDLINE sentences or abstracts. Finally, we manually curated all drug-cardiovascular (CV) pairs that appeared in both data sources and investigated whether our approach can detect many true signals that have not been included in FDA drug labels. We extracted a total of 2,787,797 drug-SE pairs from FAERS with a low initial precision of 0.025. The ranking algorithm combined signals from both FAERS and MEDLINE, significantly improving the precision from 0.025 to 0.371 for top-ranked pairs, representing a 13.8 fold elevation in precision. We showed by manual curation that drug-SE pairs that appeared in both data sources were highly enriched with true signals, many of which have not yet been included in FDA drug labels. Conclusions We have developed an efficient and effective drug safety signal ranking and strengthening approach. We demonstrate that combining information from FAERS and the biomedical literature on a large scale can significantly contribute to drug safety surveillance. PMID:24428898

  1. Software system for data management and distributed processing of multichannel biomedical signals.

    PubMed

    Franaszczuk, P J; Jouny, C C

    2004-01-01

    The presented software is designed for efficient utilization of a cluster of PC computers for the signal analysis of multichannel physiological data. The system consists of three main components: 1) a library of input and output procedures, 2) a database storing additional information about data location in the storage system, and 3) a user interface for selecting data for analysis, choosing programs for analysis, and distributing computation and output data across cluster nodes. The system allows multichannel time series data in multiple binary formats to be processed. Descriptions of the data format, channels and time of recording are included in separate text files. Definition and selection of multiple channel montages is possible. Epochs for analysis can be selected both manually and automatically. New signal processing procedures can be implemented with minimal programming overhead for input/output processing and the user interface. The number of cluster nodes used for computation and the amount of storage can be changed with no major modification to the software. Current implementations include the time-frequency analysis of multiday, multichannel intracranial EEG recordings from epileptic patients as well as evoked-response analyses of repeated cognitive tasks.
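
    The distribution pattern described above, independent epochs farmed out to compute nodes, can be sketched with the Python standard library. A local process pool stands in for the PC cluster, and the per-epoch analysis and array shapes are placeholders rather than the system's actual processing modules.

      # Epoch-parallel processing sketch: each selected epoch is analyzed
      # independently, so the work distributes trivially across workers or nodes.
      import numpy as np
      from concurrent.futures import ProcessPoolExecutor

      def analyze_epoch(epoch):
          """Placeholder per-epoch analysis: a power spectrum per channel."""
          return np.abs(np.fft.rfft(epoch, axis=1)) ** 2

      def process_epochs(epochs, workers=4):
          with ProcessPoolExecutor(max_workers=workers) as pool:
              return list(pool.map(analyze_epoch, epochs))

      if __name__ == '__main__':
          demo = [np.random.randn(32, 1024) for _ in range(16)]   # 16 epochs, 32 channels
          spectra = process_epochs(demo)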

  2. Strategies for Derisking Translational Processes for Biomedical Technologies.

    PubMed

    Abou-El-Enein, Mohamed; Duda, Georg N; Gruskin, Elliott A; Grainger, David W

    2017-02-01

    Inefficient translational processes for technology-oriented biomedical research have led to some prominent and frequent failures in the development of many leading drug candidates, several designated investigational drugs, and some medical devices, as well as documented patient harm and postmarket product withdrawals. Derisking this process, particularly in the early stages, should increase translational efficiency and streamline resource utilization, especially in an academic setting. In this opinion article, we identify a 12-step guideline for reducing risks typically associated with translating medical technologies as they move toward prototypes, preclinical proof of concept, and possible clinical testing. Integrating the described 12-step process should prove valuable for improving how early-stage academic biomedical concepts are cultivated, culled, and manicured toward intended clinical applications. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Compensatory neurofuzzy model for discrete data classification in biomedical

    NASA Astrophysics Data System (ADS)

    Ceylan, Rahime

    2015-03-01

    Biomedical data are separated into two main categories: signals and discrete data. Accordingly, studies in this area concern either biomedical signal classification or biomedical discrete data classification. Artificial intelligence models exist for the classification of ECG, EMG or EEG signals. In the same way, the literature contains many models for the classification of discrete data obtained as sample values, such as results of blood analysis or biopsy in the medical process. Not every algorithm achieves a high accuracy rate on the classification of both signals and discrete data. In this study, a compensatory neurofuzzy network model is presented for the classification of discrete data in the biomedical pattern recognition area. The compensatory neurofuzzy network is a hybrid, binary classifier in which the parameters of the fuzzy systems are updated by the backpropagation algorithm. The realized classifier model was evaluated on two benchmark datasets (the Wisconsin Breast Cancer dataset and the Pima Indian Diabetes dataset). Experimental studies show that the compensatory neurofuzzy network model achieved a 96.11% accuracy rate in classification of the breast cancer dataset, and a 69.08% accuracy rate was obtained in experiments on the diabetes dataset with only 10 iterations.

  4. Semantic biomedical resource discovery: a Natural Language Processing framework.

    PubMed

    Sfakianaki, Pepi; Koumakis, Lefteris; Sfakianakis, Stelios; Iatraki, Galatia; Zacharioudakis, Giorgos; Graf, Norbert; Marias, Kostas; Tsiknakis, Manolis

    2015-09-30

    A plethora of publicly available biomedical resources currently exist and their number is increasing at a fast rate. In parallel, specialized repositories are being developed, indexing numerous clinical and biomedical tools. The main drawback of such repositories is the difficulty in locating appropriate resources for a clinical or biomedical decision task, especially for users who are not Information Technology experts. Moreover, although NLP research in the clinical domain has been active since the 1960s, progress in the development of NLP applications has been slow and lags behind progress in the general NLP domain. The aim of the present study is to investigate the use of semantics for annotating biomedical resources with domain-specific ontologies and to exploit Natural Language Processing methods to empower non-Information Technology expert users to efficiently search for biomedical resources using natural language. A Natural Language Processing engine which can "translate" free text into targeted queries, automatically transforming a clinical research question into a request description that contains only terms of ontologies, has been implemented. The implementation is based on information extraction techniques for text in natural language, guided by integrated ontologies. Furthermore, knowledge from robust text mining methods has been incorporated to map descriptions into suitable domain ontologies in order to ensure that the biomedical resource descriptions are domain oriented and to enhance the accuracy of service discovery. The framework is freely available as a web application at ( http://calchas.ics.forth.gr/ ). For our experiments, a range of clinical questions were established based on descriptions of clinical trials from the ClinicalTrials.gov registry as well as recommendations from clinicians. Domain experts manually identified the available tools in a tools repository which are suitable for addressing the clinical questions at hand, either
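
    The general idea of turning a free-text clinical question into ontology terms can be illustrated with a very small sketch. The real framework relies on integrated ontologies and information-extraction techniques; the labels, codes and matching strategy below are invented placeholders, not part of the described system.

    # Illustrative sketch only: map a free-text clinical question onto ontology
    # terms by simple label matching. The labels and codes are made-up placeholders.
    ontology_labels = {
        "breast cancer": "ONTO:0001",
        "chemotherapy": "ONTO:0002",
        "survival analysis": "ONTO:0003",
    }

    def annotate(question: str):
        """Return ontology terms whose labels appear verbatim in the question."""
        text = question.lower()
        return {label: code for label, code in ontology_labels.items() if label in text}

    query = "Which tools support survival analysis for breast cancer patients under chemotherapy?"
    print(annotate(query))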

  5. Advances in biomedical engineering and biotechnology during 2013-2014.

    PubMed

    Liu, Feng; Wang, Ying; Burkhart, Timothy A; González Penedo, Manuel Francisco; Ma, Shaodong

    2014-01-01

    The 3rd International Conference on Biomedical Engineering and Biotechnology (iCBEB 2014), held in Beijing from the 25th to the 28th of September 2014, is an annual conference that intends to provide an opportunity for researchers and practitioners around the world to present the most recent advances and future challenges in the fields of biomedical engineering, biomaterials, bioinformatics and computational biology, biomedical imaging and signal processing, biomechanical engineering and biotechnology, amongst others. The papers published in this issue are selected from this conference, which witnesses the advances in biomedical engineering and biotechnology during 2013-2014.

  6. Bluetooth telemedicine processor for multichannel biomedical signal transmission via mobile cellular networks.

    PubMed

    Rasid, Mohd Fadlee A; Woodward, Bryan

    2005-03-01

    One of the emerging issues in m-Health is how best to exploit the mobile communications technologies that are now almost globally available. The challenge is to produce a system to transmit a patient's biomedical signals directly to a hospital for monitoring or diagnosis, using an unmodified mobile telephone. The paper focuses on the design of a processor, which samples signals from sensors on the patient. It then transmits digital data over a Bluetooth link to a mobile telephone that uses the General Packet Radio Service. The modular design adopted is intended to provide a "future-proofed" system, whose functionality may be upgraded by modifying the software.

  7. New frontiers in biomedical science and engineering during 2014-2015.

    PubMed

    Liu, Feng; Lee, Dong-Hoon; Lagoa, Ricardo; Kumar, Sandeep

    2015-01-01

    The International Conference on Biomedical Engineering and Biotechnology (ICBEB) is an international meeting held once a year. This, the fourth International Conference on Biomedical Engineering and Biotechnology (ICBEB2015), will be held in Shanghai, China, during August 18th-21st, 2015. This annual conference intends to provide an opportunity for researchers and practitioners at home and abroad to present the most recent frontiers and future challenges in the fields of biomedical science, biomedical engineering, biomaterials, bioinformatics and computational biology, biomedical imaging and signal processing, biomechanical engineering and biotechnology, etc. The papers published in this issue are selected from this Conference, which witness the advances in biomedical engineering and biotechnology during 2014-2015.

  8. Refined multiscale fuzzy entropy based on standard deviation for biomedical signal analysis.

    PubMed

    Azami, Hamed; Fernández, Alberto; Escudero, Javier

    2017-11-01

    Multiscale entropy (MSE) has been a prevalent algorithm to quantify the complexity of biomedical time series. Recent developments in the field have tried to alleviate the problem of undefined MSE values for short signals. Moreover, there has been a recent interest in using other statistical moments than the mean, i.e., variance, in the coarse-graining step of the MSE. Building on these trends, here we introduce the so-called refined composite multiscale fuzzy entropy based on the standard deviation (RCMFE_σ) and mean (RCMFE_μ) to quantify the dynamical properties of spread and mean, respectively, over multiple time scales. We demonstrate the dependency of the RCMFE_σ and RCMFE_μ, in comparison with other multiscale approaches, on several straightforward signal processing concepts using a set of synthetic signals. The results evidenced that the RCMFE_σ and RCMFE_μ values are more stable and reliable than the classical multiscale entropy ones. We also inspect the ability of using the standard deviation as well as the mean in the coarse-graining process using magnetoencephalograms in Alzheimer's disease and publicly available electroencephalograms recorded from focal and non-focal areas in epilepsy. Our results indicated that when the RCMFE_μ cannot distinguish different types of dynamics of a particular time series at some scale factors, the RCMFE_σ may do so, and vice versa. The results showed that RCMFE_σ-based features lead to higher classification accuracies in comparison with the RCMFE_μ-based ones. We also made freely available all the Matlab codes used in this study at http://dx.doi.org/10.7488/ds/1477.
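
    The coarse-graining step that distinguishes the mean-based and standard-deviation-based variants can be sketched in a few lines. This is only an illustration of that single step (the fuzzy entropy computed afterwards is omitted); the authors' full MATLAB implementation is at the DOI cited above.

    # Sketch of the coarse-graining step at scale factor tau, using either the
    # mean (as in classical MSE / RCMFE_mu) or the standard deviation (RCMFE_sigma).
    import numpy as np

    def coarse_grain(x, tau, moment="mean"):
        """Split x into non-overlapping windows of length tau and summarize each."""
        x = np.asarray(x, dtype=float)
        n = (len(x) // tau) * tau
        windows = x[:n].reshape(-1, tau)
        if moment == "mean":
            return windows.mean(axis=1)
        if moment == "std":
            return windows.std(axis=1, ddof=1)
        raise ValueError("moment must be 'mean' or 'std'")

    rng = np.random.default_rng(0)
    signal = rng.normal(size=1000)
    print(coarse_grain(signal, tau=5, moment="mean")[:3])
    print(coarse_grain(signal, tau=5, moment="std")[:3])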

  9. Real-Time Processing Library for Open-Source Hardware Biomedical Sensors.

    PubMed

    Molina-Cantero, Alberto J; Castro-García, Juan A; Lebrato-Vázquez, Clara; Gómez-González, Isabel M; Merino-Monge, Manuel

    2018-03-29

    Applications involving data acquisition from sensors need samples at a preset frequency rate, the filtering out of noise and/or analysis of certain frequency components. We propose a novel software architecture based on open-software hardware platforms which allows programmers to create data streams from input channels and easily implement filters and frequency analysis objects. The performance of the different classes, in terms of allocated memory size and execution time (number of clock cycles), was analyzed on the low-cost platform Arduino Genuino. In addition, 11 people took part in an experiment in which they had to implement several exercises and complete a usability test. Sampling rates under 250 Hz (typical for many biomedical applications) make it feasible to implement filters, sliding windows and Fourier analysis operating in real time. Participants rated software usability at 70.2 out of 100, and the ease of use when implementing several signal processing applications was rated at just over 4.4 out of 5. Participants showed their intention to use this software because it was perceived as useful and very easy to use. The performance of the library showed that it may be appropriate for implementing small biomedical real-time applications or for human movement monitoring, even on a simple open-source hardware device like the Arduino Genuino. The general perception about this library is that it is easy to use and intuitive.

  10. An enhanced approach for biomedical image restoration using image fusion techniques

    NASA Astrophysics Data System (ADS)

    Karam, Ghada Sabah; Abbas, Fatma Ismail; Abood, Ziad M.; Kadhim, Kadhim K.; Karam, Nada S.

    2018-05-01

    Biomedical images are generally noisy and slightly blurred due to the physical mechanisms of the acquisition process, so noise and poor contrast are among the common degradations in biomedical images. The idea of biomedical image enhancement is to improve the quality of the image for early diagnosis. In this paper we use the Wavelet Transform to remove Gaussian noise from biomedical images, a Positron Emission Tomography (PET) image and a Radiography (Radio) image, in different color spaces (RGB, HSV, YCbCr), and we perform fusion of the denoised images resulting from the above denoising techniques using an image addition method. Quantitative performance metrics such as signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR), and Mean Square Error (MSE) are then computed, since these statistical measurements help in the assessment of fidelity and image quality. The results showed that our approach can be applied to biomedical images across these color spaces.
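
    The fidelity metrics mentioned above (MSE, SNR, PSNR) follow directly from their standard definitions. The sketch below computes them for a reference image and a degraded copy; it is not the authors' code, and the test images are synthetic placeholders.

    # Sketch of the fidelity metrics used to assess denoised/fused images.
    import numpy as np

    def mse(ref, test):
        ref, test = np.asarray(ref, float), np.asarray(test, float)
        return np.mean((ref - test) ** 2)

    def psnr(ref, test, max_value=255.0):
        m = mse(ref, test)
        return float("inf") if m == 0 else 10.0 * np.log10(max_value ** 2 / m)

    def snr(ref, test):
        noise = np.asarray(ref, float) - np.asarray(test, float)
        return 10.0 * np.log10(np.sum(np.asarray(ref, float) ** 2) / np.sum(noise ** 2))

    reference = np.full((64, 64), 128.0)
    degraded = reference + np.random.default_rng(1).normal(0, 5, size=(64, 64))
    print(f"MSE={mse(reference, degraded):.2f}, PSNR={psnr(reference, degraded):.2f} dB")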

  11. Biomedical image analysis and processing in clouds

    NASA Astrophysics Data System (ADS)

    Bednarz, Tomasz; Szul, Piotr; Arzhaeva, Yulia; Wang, Dadong; Burdett, Neil; Khassapov, Alex; Chen, Shiping; Vallotton, Pascal; Lagerstrom, Ryan; Gureyev, Tim; Taylor, John

    2013-10-01

    Cloud-based Image Analysis and Processing Toolbox project runs on the Australian National eResearch Collaboration Tools and Resources (NeCTAR) cloud infrastructure and allows access to biomedical image processing and analysis services to researchers via remotely accessible user interfaces. By providing user-friendly access to cloud computing resources and new workflow-based interfaces, our solution enables researchers to carry out various challenging image analysis and reconstruction tasks. Several case studies will be presented during the conference.

  12. Biomedical digital assistant for ubiquitous healthcare.

    PubMed

    Lee, Tae-Soo; Hong, Joo-Hyun; Cho, Myeong-Chan

    2007-01-01

    The concept of ubiquitous healthcare service, which emerged as one of measures to solve healthcare problems in aged society, means that patients can receive services such as prevention, diagnosis, therapy and prognosis management at any time and in any place with the help of advanced information and communication technology. This service requires not only biomedical digital assistant that can monitor continuously the patients' health condition regardless of time and place, but also wired and wireless communication devices and telemedicine servers that provide doctors with data on patients' present health condition. In order to implement a biomedical digital assistant that is portable and wearable to patients, the present study developed a device that minimizes size, weight and power consumption, measures ECG and PPG signals, and even monitors moving patients' state. The biomedical sensor with the function of wireless communication was designed to be highly portable and wearable, to be operable 24 hours with small-size batteries, and to monitor the subject's heart rate, step count and respiratory rate in his daily life. The biomedical signal receiving device was implemented in two forms, PDA and cellular phone. The movement monitoring device embedded in the battery pack of a cellular phone does not have any problem in operating 24 hours, but the real-time biomedical signal receiving device implemented with PDA operated up to 6 hours due to the limited battery capacity of PDA. This problem is expected to be solved by reducing wireless communication load through improving the processing and storage functions of the sensor. The developed device can transmit a message on the patient's emergency to the remote server through the cellular phone network, and is expected to play crucial roles in the health management of chronic-aged patients in their daily life.

  13. Micro/Nanostructured Films and Adhesives for Biomedical Applications.

    PubMed

    Lee, Jungkyu K; Kang, Sung Min; Yang, Sung Ho; Cho, Woo Kyung

    2015-12-01

    The advanced technologies available for micro/nanofabrication have opened new avenues for interdisciplinary approaches to solve the unmet medical needs of regenerative medicine and biomedical devices. This review highlights the recent developments in micro/nanostructured adhesives and films for biomedical applications, including waterproof seals for wounds or surgery sites, drug delivery, sensing human body signals, and optical imaging of human tissues. We describe in detail the fabrication processes required to prepare the adhesives and films, such as tape-based adhesives, nanofilms, and flexible and stretchable film-based electronic devices. We also discuss their biomedical functions, performance in vitro and in vivo, and the future research needed to improve the current systems.

  14. Camera systems in human motion analysis for biomedical applications

    NASA Astrophysics Data System (ADS)

    Chin, Lim Chee; Basah, Shafriza Nisha; Yaacob, Sazali; Juan, Yeap Ewe; Kadir, Aida Khairunnisaa Ab.

    2015-05-01

    Human Motion Analysis (HMA) systems have been one of the major interests among researchers in the fields of computer vision, artificial intelligence and biomedical engineering and sciences. This is due to their wide and promising biomedical applications, namely bio-instrumentation for human-computer interfacing, surveillance systems for monitoring human behaviour, and biomedical signal and image processing for diagnosis and rehabilitation applications. This paper provides an extensive review of the camera system of HMA and its taxonomy, including camera types, camera calibration and camera configuration. The review focuses on evaluating camera system considerations of the HMA system specifically for biomedical applications. This review is important as it provides guidelines and recommendations for researchers and practitioners in selecting a camera system of the HMA system for biomedical applications.

  15. Astronaut Kenneth Reightler processes biomedical samples in SPACEHAB

    NASA Image and Video Library

    1994-02-09

    STS060-301-003 (3-11 Feb 1994) --- Astronaut Kenneth S. Reightler, STS-60 pilot, processes biomedical samples in a centrifuge aboard the SPACEHAB module. Reightler joined four other NASA astronauts and a Russian cosmonaut for eight days of research aboard the Space Shuttle Discovery.

  16. Real-Time Processing Library for Open-Source Hardware Biomedical Sensors

    PubMed Central

    Castro-García, Juan A.; Lebrato-Vázquez, Clara

    2018-01-01

    Applications involving data acquisition from sensors need samples at a preset frequency rate, the filtering out of noise and/or analysis of certain frequency components. We propose a novel software architecture based on open-software hardware platforms which allows programmers to create data streams from input channels and easily implement filters and frequency analysis objects. The performance of the different classes, in terms of allocated memory size and execution time (number of clock cycles), was analyzed on the low-cost platform Arduino Genuino. In addition, 11 people took part in an experiment in which they had to implement several exercises and complete a usability test. Sampling rates under 250 Hz (typical for many biomedical applications) make it feasible to implement filters, sliding windows and Fourier analysis operating in real time. Participants rated software usability at 70.2 out of 100, and the ease of use when implementing several signal processing applications was rated at just over 4.4 out of 5. Participants showed their intention to use this software because it was perceived as useful and very easy to use. The performance of the library showed that it may be appropriate for implementing small biomedical real-time applications or for human movement monitoring, even on a simple open-source hardware device like the Arduino Genuino. The general perception about this library is that it is easy to use and intuitive. PMID:29596394

  17. Refined Composite Multiscale Dispersion Entropy and its Application to Biomedical Signals.

    PubMed

    Azami, Hamed; Rostaghi, Mostafa; Abasolo, Daniel; Escudero, Javier

    2017-12-01

    We propose a novel complexity measure to overcome the deficiencies of the widespread and powerful multiscale entropy (MSE), including that MSE values may be undefined for short signals and that MSE is too slow for real-time applications. We introduce multiscale dispersion entropy (MDE) as a very fast and powerful method to quantify the complexity of signals. MDE is based on our recently developed dispersion entropy (DisEn), which has a computation cost of O(N), compared with O(N²) for the sample entropy used in MSE. We also propose the refined composite MDE (RCMDE) to improve the stability of MDE. We evaluate MDE, RCMDE, and refined composite MSE (RCMSE) on synthetic signals and three biomedical datasets. The MDE, RCMDE, and RCMSE methods show similar results, although the MDE and RCMDE are faster, lead to more stable results, and discriminate different types of physiological signals better than MSE and RCMSE. For noisy short and long time series, MDE and RCMDE are noticeably more stable than MSE and RCMSE, respectively. For short signals, MDE and RCMDE, unlike MSE and RCMSE, do not lead to undefined values. The proposed MDE and RCMDE are significantly faster than MSE and RCMSE, especially for long signals, and lead to larger differences between physiological conditions known to alter the complexity of the physiological recordings. MDE and RCMDE are expected to be useful for the analysis of physiological signals thanks to their ability to distinguish different types of dynamics. The MATLAB codes used in this paper are freely available at http://dx.doi.org/10.7488/ds/1982.

  18. Biomedical ground lead system

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The design and verification tests for the biomedical ground lead system of Apollo biomedical monitors are presented. Major efforts were made to provide a low impedance path to ground, reduce noise and artifact of ECG signals, and limit the current flowing in the ground electrode of the system.

  19. Ultralow-power electronics for biomedical applications.

    PubMed

    Chandrakasan, Anantha P; Verma, Naveen; Daly, Denis C

    2008-01-01

    The electronics of a general biomedical device consist of energy delivery, analog-to-digital conversion, signal processing, and communication subsystems. Each of these blocks must be designed for minimum energy consumption. Specific design techniques, such as aggressive voltage scaling, dynamic power-performance management, and energy-efficient signaling, must be employed to adhere to the stringent energy constraint. The constraint itself is set by the energy source, so energy harvesting holds tremendous promise toward enabling sophisticated systems without straining user lifestyle. Further, once harvested, efficient delivery of the low-energy levels, as well as robust operation in the aggressive low-power modes, requires careful understanding and treatment of the specific design limitations that dominate this realm. We outline the performance and power constraints of biomedical devices, and present circuit techniques to achieve complete systems operating down to power levels of microwatts. In all cases, approaches that leverage advanced technology trends are emphasized.

  20. Pathophysiologic mechanisms of biomedical nanomaterials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Liming, E-mail: wangliming@ihep.ac.cn; Chen, Chunying, E-mail: chenchy@nanoctr.cn

    Nanomaterials (NMs) have been widely used in biomedical fields, daily consumer products, and even the food industry. It is crucial to understand the safety and biomedical efficacy of NMs. In this review, we summarize the recent progress on the physiological and pathological effects of NMs at several levels: the protein-nano interface, NM-subcellular structures, and cell–cell interaction. We focus on detailed information about the nano-bio interaction, especially protein adsorption, intracellular trafficking, biological barriers, and signaling pathways, as well as the associated mechanisms mediated by nanomaterials. We also introduce related analytical methods that are meaningful and helpful for biomedical effect studies in the future. We believe that knowledge about the pathophysiologic effects of NMs is not only significant for the rational design of medical NMs but also helps predict their safety and further improve their applications in the future. - Highlights: • Rapid protein adsorption onto nanomaterials that affects biomedical effects • Nanomaterials and their interaction with biological membranes, intracellular trafficking and specific cellular effects • Nanomaterials and their interaction with biological barriers • The signaling pathways mediated by nanomaterials and related biomedical effects • Novel techniques for studying translocation and biomedical effects of NMs.

  1. Rotation covariant image processing for biomedical applications.

    PubMed

    Skibbe, Henrik; Reisert, Marco

    2013-01-01

    With the advent of novel biomedical 3D image acquisition techniques, the efficient and reliable analysis of volumetric images has become more and more important. The amount of data is enormous and demands automated processing. The applications are manifold, ranging from image enhancement, image reconstruction, and image description to object/feature detection and high-level contextual feature extraction. In most scenarios, it is expected that geometric transformations alter the output in a mathematically well-defined manner. In this paper we place the emphasis on 3D translations and rotations. Many algorithms rely on intensity or low-order tensorial-like descriptions to fulfill this demand. This paper proposes a general mathematical framework based on mathematical concepts and theories transferred from mathematical physics and harmonic analysis into the domain of image analysis and pattern recognition. Based on two basic operations, spherical tensor differentiation and spherical tensor multiplication, we show how to design a variety of 3D image processing methods in an efficient way. The framework has already been applied to several biomedical applications ranging from feature and object detection tasks to image enhancement and image restoration techniques. In this paper, the proposed methods are applied on a variety of different 3D data modalities stemming from medical and biological sciences.

  2. Rotation Covariant Image Processing for Biomedical Applications

    PubMed Central

    Reisert, Marco

    2013-01-01

    With the advent of novel biomedical 3D image acquisition techniques, the efficient and reliable analysis of volumetric images has become more and more important. The amount of data is enormous and demands automated processing. The applications are manifold, ranging from image enhancement, image reconstruction, and image description to object/feature detection and high-level contextual feature extraction. In most scenarios, it is expected that geometric transformations alter the output in a mathematically well-defined manner. In this paper we place the emphasis on 3D translations and rotations. Many algorithms rely on intensity or low-order tensorial-like descriptions to fulfill this demand. This paper proposes a general mathematical framework based on mathematical concepts and theories transferred from mathematical physics and harmonic analysis into the domain of image analysis and pattern recognition. Based on two basic operations, spherical tensor differentiation and spherical tensor multiplication, we show how to design a variety of 3D image processing methods in an efficient way. The framework has already been applied to several biomedical applications ranging from feature and object detection tasks to image enhancement and image restoration techniques. In this paper, the proposed methods are applied on a variety of different 3D data modalities stemming from medical and biological sciences. PMID:23710255

  3. Generation of open biomedical datasets through ontology-driven transformation and integration processes.

    PubMed

    Carmen Legaz-García, María Del; Miñarro-Giménez, José Antonio; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2016-06-03

    Biomedical research usually requires combining large volumes of data from multiple heterogeneous sources, which makes the integrated exploitation of such data difficult. The Semantic Web paradigm offers a natural technological space for data integration and exploitation by generating content readable by machines. Linked Open Data is a Semantic Web initiative that promotes the publication and sharing of data in machine-readable semantic formats. We present an approach for the transformation and integration of heterogeneous biomedical data with the objective of generating open biomedical datasets in Semantic Web formats. The transformation of the data is based on the mappings between the entities of the data schema and the ontological infrastructure that provides the meaning to the content. Our approach permits different types of mappings and includes the possibility of defining complex transformation patterns. Once the mappings are defined, they can be automatically applied to datasets to generate logically consistent content, and the mappings can be reused in further transformation processes. The results of our research are (1) a common transformation and integration process for heterogeneous biomedical data; (2) the application of Linked Open Data principles to generate interoperable, open, biomedical datasets; (3) a software tool, called SWIT, that implements the approach. In this paper we also describe how we have applied SWIT in different biomedical scenarios and some lessons learned. We have presented an approach that is able to generate open biomedical repositories in Semantic Web formats. SWIT is able to apply the Linked Open Data principles in the generation of the datasets, thus allowing for linking their content to external repositories and creating linked open datasets. SWIT datasets may contain data from multiple sources and schemas, thus becoming integrated datasets.
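
    The core idea of applying a schema-to-ontology mapping to emit semantic content can be shown with a small sketch using the rdflib library. This only illustrates the general pattern, not SWIT itself; the namespace URIs, field names and mapping are invented for the example.

    # Illustration of the general idea only: apply a mapping from source-schema
    # fields to ontology terms to emit RDF. URIs and field names are invented.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF, RDFS

    EX = Namespace("http://example.org/onto/")
    records = [{"id": "p1", "type": "Patient", "label": "patient 1"}]
    mapping = {"Patient": EX.Patient}          # schema entity -> ontology class

    g = Graph()
    for rec in records:
        subject = URIRef(f"http://example.org/data/{rec['id']}")
        g.add((subject, RDF.type, mapping[rec["type"]]))
        g.add((subject, RDFS.label, Literal(rec["label"])))

    print(g.serialize(format="turtle"))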

  4. Biomedical sensor design using analog compressed sensing

    NASA Astrophysics Data System (ADS)

    Balouchestani, Mohammadreza; Krishnan, Sridhar

    2015-05-01

    The main drawback of current healthcare systems is the location-specific nature of the system due to the use of fixed/wired biomedical sensors. Since biomedical sensors are usually driven by a battery, power consumption is the most important factor determining the life of a biomedical sensor. They are also restricted by size, cost, and transmission capacity. Therefore, it is important to reduce the load of sampling by merging the sampling and compression steps to reduce storage usage, transmission times, and power consumption in order to expand current healthcare systems to Wireless Healthcare Systems (WHSs). In this work, we present an implementation of a low-power biomedical sensor using an analog Compressed Sensing (CS) framework for sparse biomedical signals that addresses both the energy and telemetry bandwidth constraints of wearable and wireless Body-Area Networks (BANs). This architecture enables continuous data acquisition and compression of biomedical signals that are suitable for a variety of diagnostic and treatment purposes. At the transmitter side, an analog-CS framework is applied at the sensing step, before the Analog to Digital Converter (ADC), in order to generate the compressed version of the input analog bio-signal. At the receiver side, a reconstruction algorithm based on the Restricted Isometry Property (RIP) condition is applied in order to reconstruct the original bio-signals from the compressed bio-signals with high probability and sufficient accuracy. We examine the proposed algorithm with healthy and neuropathy surface Electromyography (sEMG) signals. The proposed algorithm achieves a good Average Recognition Rate (ARR) of 93% and a reconstruction accuracy of 98.9%. In addition, the proposed architecture reduces total computation time from 32 to 11.5 seconds at a sampling rate of 29% of the Nyquist rate, with a Percentage Residual Difference (PRD) of 26% and a Root Mean Squared Error (RMSE) of 3%.
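
    A purely digital sketch of the compressed-sensing idea described above can be written in a few lines: take random projections of a sparse signal and recover it with a greedy solver. This does not reproduce the paper's analog front end or its RIP-based reconstruction; the dimensions are arbitrary and the solver is scikit-learn's OrthogonalMatchingPursuit, used here only as a stand-in.

    # Digital sketch of compressed sensing: random measurements of a sparse
    # signal, recovered by greedy orthogonal matching pursuit.
    import numpy as np
    from sklearn.linear_model import OrthogonalMatchingPursuit

    rng = np.random.default_rng(0)
    n, m, k = 256, 80, 5                 # signal length, measurements, sparsity
    x = np.zeros(n)
    x[rng.choice(n, k, replace=False)] = rng.normal(size=k)   # k-sparse signal

    Phi = rng.normal(size=(m, n)) / np.sqrt(m)                # random sensing matrix
    y = Phi @ x                                               # compressed measurements

    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
    omp.fit(Phi, y)
    x_hat = omp.coef_
    print("relative error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))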

  5. Optical signal processing

    NASA Technical Reports Server (NTRS)

    Casasent, D.

    1978-01-01

    The article discusses several optical configurations used for signal processing. Electronic-to-optical transducers are outlined, noting fixed window transducers and moving window acousto-optic transducers. Folded spectrum techniques are considered, with reference to wideband RF signal analysis, fetal electroencephalogram analysis, engine vibration analysis, signal buried in noise, and spatial filtering. Various methods for radar signal processing are described, such as phased-array antennas, the optical processing of phased-array data, pulsed Doppler and FM radar systems, a multichannel one-dimensional optical correlator, correlations with long coded waveforms, and Doppler signal processing. Means for noncoherent optical signal processing are noted, including an optical correlator for speech recognition and a noncoherent optical correlator.

  6. Natural Language Processing Methods and Systems for Biomedical Ontology Learning

    PubMed Central

    Liu, Kaihong; Hogan, William R.; Crowley, Rebecca S.

    2010-01-01

    While the biomedical informatics community widely acknowledges the utility of domain ontologies, there remain many barriers to their effective use. One important requirement of domain ontologies is that they must achieve a high degree of coverage of the domain concepts and concept relationships. However, the development of these ontologies is typically a manual, time-consuming, and often error-prone process. Limited resources result in missing concepts and relationships as well as difficulty in updating the ontology as knowledge changes. Methodologies developed in the fields of natural language processing, information extraction, information retrieval and machine learning provide techniques for automating the enrichment of an ontology from free-text documents. In this article, we review existing methodologies and developed systems, and discuss how existing methods can benefit the development of biomedical ontologies. PMID:20647054

  7. IEEE International Symposium on Biomedical Imaging.

    PubMed

    2017-01-01

    The IEEE International Symposium on Biomedical Imaging (ISBI) is a scientific conference dedicated to mathematical, algorithmic, and computational aspects of biological and biomedical imaging, across all scales of observation. It fosters knowledge transfer among different imaging communities and contributes to an integrative approach to biomedical imaging. ISBI is a joint initiative from the IEEE Signal Processing Society (SPS) and the IEEE Engineering in Medicine and Biology Society (EMBS). The 2018 meeting will include tutorials, and a scientific program composed of plenary talks, invited special sessions, challenges, as well as oral and poster presentations of peer-reviewed papers. High-quality papers are requested containing original contributions to the topics of interest including image formation and reconstruction, computational and statistical image processing and analysis, dynamic imaging, visualization, image quality assessment, and physical, biological, and statistical modeling. Accepted 4-page regular papers will be published in the symposium proceedings published by IEEE and included in IEEE Xplore. To encourage attendance by a broader audience of imaging scientists and offer additional presentation opportunities, ISBI 2018 will continue to have a second track featuring posters selected from 1-page abstract submissions without subsequent archival publication.

  8. An Analog Circuit Approximation of the Discrete Wavelet Transform for Ultra Low Power Signal Processing in Wearable Sensor Nodes

    PubMed Central

    Casson, Alexander J.

    2015-01-01

    Ultra low power signal processing is an essential part of all sensor nodes, and particularly so in emerging wearable sensors for biomedical applications. Analog signal processing has an important role in these low power, low voltage, low frequency applications, and there is a key drive to decrease the power consumption of existing analog domain signal processing and to map more signal processing approaches into the analog domain. This paper presents an analog domain signal processing circuit which approximates the output of the Discrete Wavelet Transform (DWT) for use in ultra low power wearable sensors. Analog filters are used for the DWT filters and it is demonstrated how these generate analog domain DWT-like information that embeds information from Butterworth and Daubechies maximally flat mother wavelet responses. The Analog DWT is realised in hardware via gmC circuits, designed to operate from a 1.3 V coin cell battery, and provide DWT-like signal processing using under 115 nW of power when implemented in a 0.18 μm CMOS process. Practical examples demonstrate the effective use of the new Analog DWT on ECG (electrocardiogram) and EEG (electroencephalogram) signals recorded from humans. PMID:26694414
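
    For reference, the conventional digital DWT that the analog circuit approximates can be computed with the PyWavelets package. The wavelet choice, sampling rate and test signal below are arbitrary placeholders, not those used in the paper.

    # Conventional digital DWT for comparison with the analog approximation above.
    import numpy as np
    import pywt

    fs = 256                                   # Hz, assumed sampling rate
    t = np.arange(0, 4, 1 / fs)
    ecg_like = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)

    coeffs = pywt.wavedec(ecg_like, wavelet="db4", level=4)   # approximation + 4 detail bands
    for i, c in enumerate(coeffs):
        band = "approximation" if i == 0 else f"detail level {5 - i}"
        print(f"{band}: {len(c)} coefficients")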

  9. An Analog Circuit Approximation of the Discrete Wavelet Transform for Ultra Low Power Signal Processing in Wearable Sensor Nodes.

    PubMed

    Casson, Alexander J

    2015-12-17

    Ultra low power signal processing is an essential part of all sensor nodes, and particularly so in emerging wearable sensors for biomedical applications. Analog signal processing has an important role in these low power, low voltage, low frequency applications, and there is a key drive to decrease the power consumption of existing analog domain signal processing and to map more signal processing approaches into the analog domain. This paper presents an analog domain signal processing circuit which approximates the output of the Discrete Wavelet Transform (DWT) for use in ultra low power wearable sensors. Analog filters are used for the DWT filters and it is demonstrated how these generate analog domain DWT-like information that embeds information from Butterworth and Daubechies maximally flat mother wavelet responses. The Analog DWT is realised in hardware via g(m)C circuits, designed to operate from a 1.3 V coin cell battery, and provide DWT-like signal processing using under 115 nW of power when implemented in a 0.18 μm CMOS process. Practical examples demonstrate the effective use of the new Analog DWT on ECG (electrocardiogram) and EEG (electroencephalogram) signals recorded from humans.

  10. Signal and image processing for early detection of coronary artery diseases: A review

    NASA Astrophysics Data System (ADS)

    Mobssite, Youness; Samir, B. Belhaouari; Mohamad Hani, Ahmed Fadzil B.

    2012-09-01

    Today, biomedical signal- and image-based detection is a basic step in diagnosing heart diseases, in particular coronary artery diseases. The goal of this work is to provide non-invasive early detection of coronary artery disease (CAD) by analyzing images and ECG signals in a combined approach to extract features, and to further classify and quantify the severity of CAD using a B-splines method, with the aim of creating a prototype of screening biomedical imaging for coronary arteries to help cardiologists decide the kind of treatment needed to reduce or control the risk of heart attack.
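
    As a minimal, hedged illustration of the kind of B-spline processing the abstract alludes to, the sketch below smooths a noisy 1-D signal with a cubic B-spline fit in SciPy. It stands in for, and does not reproduce, the authors' feature extraction pipeline; the synthetic "beat" and smoothing factor are placeholders.

    # Minimal B-spline smoothing of a 1-D signal with SciPy (illustrative only).
    import numpy as np
    from scipy.interpolate import splev, splrep

    t = np.linspace(0, 1, 200)
    noisy_beat = np.exp(-((t - 0.5) ** 2) / 0.002) + 0.05 * np.random.default_rng(2).normal(size=t.size)

    tck = splrep(t, noisy_beat, k=3, s=0.1)    # cubic B-spline fit with smoothing factor s
    smooth_beat = splev(t, tck)
    print("max abs deviation:", np.abs(smooth_beat - noisy_beat).max())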

  11. Material Processing and Design of Biodegradable Metal Matrix Composites for Biomedical Applications.

    PubMed

    Yang, Jingxin; Guo, Jason L; Mikos, Antonios G; He, Chunyan; Cheng, Guang

    2018-06-04

    In recent years, biodegradable metallic materials have played an important role in biomedical applications. However, as is typical for metallic materials, their structure, general properties, preparation technology and biocompatibility are hard to change. Furthermore, biodegradable metals are susceptible to excessive degradation and subsequent disruption of their mechanical integrity; this phenomenon limits the utility of these biomaterials. Therefore, the use of degradable metals as the base material to prepare metal matrix composite materials is an excellent alternative for solving the problems described above. Biodegradable metals can thus be successfully combined with other materials to form biodegradable metallic matrix composites for biomedical applications and functions. The present article describes the processing methods currently available to design biodegradable metal matrix composites for biomedical applications and provides an overview of the currently existing biodegradable metal systems. Finally, the manuscript presents and discusses the challenges and future research directions for the development of biodegradable metallic matrix composites for biomedical purposes.

  12. Sensor, signal, and image informatics - state of the art and current topics.

    PubMed

    Lehmann, T M; Aach, T; Witte, H

    2006-01-01

    The number of articles published annually in the fields of biomedical signal and image acquisition and processing is increasing. Based on selected examples, this survey aims at comprehensively demonstrating the recent trends and developments. Four articles are selected for biomedical data acquisition, covering topics such as dose saving in CT, C-arm X-ray imaging systems for volume imaging, and the replacement of dose-intensive CT-based diagnostics with harmonic ultrasound imaging. Regarding biomedical signal analysis (BSA), the four selected articles discuss the equivalence of different time-frequency approaches for signal analysis, an application to cochlear implants, where time-frequency analysis is applied for controlling the replacement system, recent trends for fusion of different modalities, and the role of BSA as part of brain-machine interfaces. To cover the broad spectrum of publications in the field of biomedical image processing, six papers are highlighted. Important topics are content-based image retrieval in medical applications, automatic classification of tongue photographs from traditional Chinese medicine, brain perfusion analysis in single photon emission computed tomography (SPECT), model-based visualization of vascular trees, and virtual surgery, where enhanced visualization and haptic feedback techniques are combined with a sphere-filled model of the organ. The selected papers emphasize the five fields forming the chain of biomedical data processing: (1) data acquisition, (2) data reconstruction and pre-processing, (3) data handling, (4) data analysis, and (5) data visualization. Fields 1 and 2 form sensor informatics, while fields 2 to 5 form signal or image informatics with respect to the nature of the data considered. Biomedical data acquisition and pre-processing, as well as data handling, analysis and visualization, aim at providing reliable tools for decision support that improve the quality of health care. Comprehensive evaluation of the

  13. ParaBTM: A Parallel Processing Framework for Biomedical Text Mining on Supercomputers.

    PubMed

    Xing, Yuting; Wu, Chengkun; Yang, Xi; Wang, Wei; Zhu, En; Yin, Jianping

    2018-04-27

    A prevailing way of extracting valuable information from biomedical literature is to apply text mining methods on unstructured texts. However, the massive amount of literature that needs to be analyzed poses a big data challenge to the processing efficiency of text mining. In this paper, we address this challenge by introducing parallel processing on a supercomputer. We developed paraBTM, a runnable framework that enables parallel text mining on the Tianhe-2 supercomputer. It employs a low-cost yet effective load balancing strategy to maximize the efficiency of parallel processing. We evaluated the performance of paraBTM on several datasets, utilizing three types of named entity recognition tasks as demonstration. Results show that, in most cases, the processing efficiency can be greatly improved with parallel processing, and the proposed load balancing strategy is simple and effective. In addition, our framework can be readily applied to other tasks of biomedical text mining besides NER.
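
    The load-balancing idea behind such parallel text mining can be illustrated with a simple sketch: assign the longest documents first to the currently least-loaded worker, then process each batch in a separate process. This is only an illustration under that assumption, not paraBTM's actual strategy on the Tianhe-2, and the "mining" step is a trivial placeholder.

    # Greedy longest-first load balancing plus multiprocess execution (sketch only).
    from multiprocessing import Pool

    def balance(docs, n_workers):
        """Assign documents to workers greedily, longest first, least-loaded worker first."""
        batches = [[] for _ in range(n_workers)]
        loads = [0] * n_workers
        for doc in sorted(docs, key=len, reverse=True):
            i = loads.index(min(loads))
            batches[i].append(doc)
            loads[i] += len(doc)
        return batches

    def mine(batch):
        # Placeholder for a named-entity-recognition pass over one batch.
        return sum(text.lower().count("gene") for text in batch)

    if __name__ == "__main__":
        documents = ["gene regulation in cells", "a short note", "gene-gene interaction study" * 10]
        with Pool(2) as pool:
            counts = pool.map(mine, balance(documents, 2))
        print("entity mentions per worker:", counts)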

  14. UMLS content views appropriate for NLP processing of the biomedical literature vs. clinical text.

    PubMed

    Demner-Fushman, Dina; Mork, James G; Shooshan, Sonya E; Aronson, Alan R

    2010-08-01

    Identification of medical terms in free text is a first step in such Natural Language Processing (NLP) tasks as automatic indexing of the biomedical literature and extraction of patients' problem lists from the text of clinical notes. Many tools developed to perform these tasks use biomedical knowledge encoded in the Unified Medical Language System (UMLS) Metathesaurus. We continue our exploration of automatic approaches to the creation of subsets (UMLS content views) which can support NLP processing of either the biomedical literature or clinical text. We found that suppression of highly ambiguous terms in the conservative AutoFilter content view can partially replace manual filtering for literature applications, and suppression of two-character mappings in the same content view achieves 89.5% precision at 78.6% recall for clinical applications. Published by Elsevier Inc.
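
    The kind of suppression described above can be pictured with a toy filter over a candidate term list: drop two-character strings and terms that map to many distinct concepts. The terms, ambiguity counts and threshold are invented; this is not the actual content view construction.

    # Toy sketch: suppress two-character strings and highly ambiguous terms.
    candidate_terms = {
        "ca": 7,                      # string -> number of distinct concepts it maps to
        "myocardial infarction": 1,
        "cold": 5,
        "hypertension": 1,
    }
    AMBIGUITY_THRESHOLD = 4

    content_view = [
        term for term, senses in candidate_terms.items()
        if len(term) > 2 and senses < AMBIGUITY_THRESHOLD
    ]
    print(content_view)   # ['myocardial infarction', 'hypertension']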

  15. Applied digital signal processing systems for vortex flowmeter with digital signal processing.

    PubMed

    Xu, Ke-Jun; Zhu, Zhi-Hai; Zhou, Yang; Wang, Xiao-Fen; Liu, San-Shan; Huang, Yun-Zhi; Chen, Zhi-Yuan

    2009-02-01

    Spectral analysis is combined with digital filtering to process the vortex sensor signal, reducing the effect of low-frequency disturbances from pipe vibrations and increasing the turndown ratio. Using a digital signal processing chip, two kinds of digital signal processing systems were developed to implement these algorithms: one is an integrative system, and the other is a separated system. A limiting amplifier is designed into the analog input conditioning circuit to accommodate large amplitude variations of the sensor signal. Several technical measures are taken to improve the accuracy of the output pulse, speed up the response time of the meter, and reduce fluctuation of the output signal. The experimental results demonstrate the validity of the digital signal processing systems.
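
    The combination of digital filtering and spectral analysis can be sketched as follows: reject the low-frequency vibration band, then take the spectral peak as the vortex shedding frequency. The sampling rate, cutoff and test frequencies are invented for the example and are not the meter's actual parameters.

    # Sketch: high-pass filtering plus spectral peak picking for a vortex signal.
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 2000.0                                    # Hz, assumed sampling rate
    t = np.arange(0, 2, 1 / fs)
    sensor = np.sin(2 * np.pi * 180 * t) + 2.0 * np.sin(2 * np.pi * 8 * t)   # vortex + vibration

    b, a = butter(4, 40 / (fs / 2), btype="highpass")   # suppress low-frequency disturbance
    clean = filtfilt(b, a, sensor)

    spectrum = np.abs(np.fft.rfft(clean))
    freqs = np.fft.rfftfreq(clean.size, 1 / fs)
    print("estimated vortex frequency:", freqs[np.argmax(spectrum)], "Hz")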

  16. Wearable Biomedical Measurement Systems for Assessment of Mental Stress of Combatants in Real Time

    PubMed Central

    Seoane, Fernando; Mohino-Herranz, Inmaculada; Ferreira, Javier; Alvarez, Lorena; Buendia, Ruben; Ayllón, David; Llerena, Cosme; Gil-Pita, Roberto

    2014-01-01

    The Spanish Ministry of Defense, through its Future Combatant program, has sought to develop technology aids with the aim of extending combatants' operational capabilities. Within this framework, the ATREC project funded by the “Coincidente” program aims at analyzing diverse biometrics to assess the stress levels of combatants by real-time monitoring. This project combines multiple disciplines and fields, including wearable instrumentation, textile technology, signal processing, pattern recognition and psychological analysis of the obtained information. In this work the ATREC project is described, including the different execution phases, the wearable biomedical measurement systems, the experimental setup, and the biomedical signal analysis and speech processing performed. The preliminary results obtained from the data analysis collected during the first phase of the project are presented, indicating the good classification performance exhibited when using features obtained from electrocardiographic recordings and electrical bioimpedance measurements from the thorax. These results suggest that cardiac and respiration activity offer better biomarkers for assessment of stress than speech, galvanic skin response or skin temperature when recorded with wearable biomedical measurement systems. PMID:24759113

  17. Wearable biomedical measurement systems for assessment of mental stress of combatants in real time.

    PubMed

    Seoane, Fernando; Mohino-Herranz, Inmaculada; Ferreira, Javier; Alvarez, Lorena; Buendia, Ruben; Ayllón, David; Llerena, Cosme; Gil-Pita, Roberto

    2014-04-22

    The Spanish Ministry of Defense, through its Future Combatant program, has sought to develop technology aids with the aim of extending combatants' operational capabilities. Within this framework, the ATREC project funded by the "Coincidente" program aims at analyzing diverse biometrics to assess the stress levels of combatants by real-time monitoring. This project combines multiple disciplines and fields, including wearable instrumentation, textile technology, signal processing, pattern recognition and psychological analysis of the obtained information. In this work the ATREC project is described, including the different execution phases, the wearable biomedical measurement systems, the experimental setup, and the biomedical signal analysis and speech processing performed. The preliminary results obtained from the data analysis collected during the first phase of the project are presented, indicating the good classification performance exhibited when using features obtained from electrocardiographic recordings and electrical bioimpedance measurements from the thorax. These results suggest that cardiac and respiration activity offer better biomarkers for assessment of stress than speech, galvanic skin response or skin temperature when recorded with wearable biomedical measurement systems.
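
    One cardiac feature of the kind used for such stress classification is the mean heart rate derived from R-peak detection. The sketch below computes it on a synthetic impulse train; the "ECG", sampling rate and detector settings are placeholders and do not reproduce the ATREC processing chain.

    # Mean heart rate from detected R peaks (illustrative only).
    import numpy as np
    from scipy.signal import find_peaks

    fs = 250.0                                   # Hz, assumed sampling rate
    t = np.arange(0, 10, 1 / fs)
    ecg = np.zeros_like(t)
    ecg[(np.arange(12) * fs * 0.8).astype(int)] = 1.0      # R peaks every 0.8 s (75 bpm)

    peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.4 * fs))
    rr_intervals = np.diff(peaks) / fs
    print("mean heart rate:", 60.0 / rr_intervals.mean(), "bpm")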

  18. A 1V low power second-order delta-sigma modulator for biomedical signal application.

    PubMed

    Hsu, Chih-Han; Tang, Kea-Tiong

    2013-01-01

    This paper presents the design and implementation of a low-power delta-sigma modulator for biomedical applications in a standard 90 nm CMOS technology. The delta-sigma modulator is implemented as a 2nd-order feedforward architecture. A low quiescent current operational transconductance amplifier (OTA) is utilized to reduce power consumption. This delta-sigma modulator operates from a 1 V power supply and achieves a 64.87 dB signal-to-noise-and-distortion ratio (SNDR) over a 10 kHz bandwidth with an oversampling ratio (OSR) of 64. The power consumption is 17.14 µW, and the figure-of-merit (FOM) is 0.60 pJ/conv.
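
    To illustrate what a second-order, single-bit delta-sigma modulator does at the behavioral level, the sketch below simulates a textbook discrete-time model and crudely decimates its bitstream. Note that this uses a generic CIFB topology, not the paper's feedforward circuit, and the clock rate, input amplitude and OSR are illustrative assumptions only.

    # Behavioral model of a generic second-order, single-bit delta-sigma modulator.
    import numpy as np

    def mod2(u):
        """Second-order 1-bit modulator: returns the +/-1 bitstream for input u."""
        x1 = x2 = 0.0
        v = np.empty_like(u)
        for n, un in enumerate(u):
            v[n] = 1.0 if x2 >= 0 else -1.0    # 1-bit quantizer
            x2 += x1 - 2.0 * v[n]              # second integrator (uses old x1)
            x1 += un - v[n]                    # first integrator
        return v

    fs, f_in, osr = 512_000, 1_000, 64         # assumed clock, test tone, oversampling ratio
    t = np.arange(8192) / fs
    u = 0.5 * np.sin(2 * np.pi * f_in * t)     # keep input well below full scale
    bits = mod2(u)
    # Crude decimation: average over OSR samples to recover the in-band signal.
    decimated = bits[: len(bits) // osr * osr].reshape(-1, osr).mean(axis=1)
    print("recovered samples:", np.round(decimated[:4], 3))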

  19. Model-based Bayesian filtering of cardiac contaminants from biomedical recordings.

    PubMed

    Sameni, R; Shamsollahi, M B; Jutten, C

    2008-05-01

    Electrocardiogram (ECG) and magnetocardiogram (MCG) signals are among the most considerable sources of noise for other biomedical signals. In some recent works, a Bayesian filtering framework has been proposed for denoising the ECG signals. In this paper, it is shown that this framework may be effectively used for removing cardiac contaminants such as the ECG, MCG and ballistocardiographic artifacts from different biomedical recordings such as the electroencephalogram, electromyogram and also for canceling maternal cardiac signals from fetal ECG/MCG. The proposed method is evaluated on simulated and real signals.

  20. Wireless platforms for the monitoring of biomedical variables

    NASA Astrophysics Data System (ADS)

    Bianco, Román; Laprovitta, Agustín; Misa, Alberto; Toselli, Eduardo; Castagnola, Juan Luis

    2007-11-01

    The present paper aims to analyze and compare two wireless platforms for the monitoring of biomedical variables. They must obtain the vital signals of the patients, transmit them through a radio frequency link, and centralize them for processing, storage and monitoring in real time. The implementation of this system provides two important benefits: the patient enjoys greater comfort during hospitalization, and the doctors can know the state of the biomedical variables of each patient simultaneously. In order to achieve the objective of this work, two communication systems for wireless data transmission were developed and implemented. A CC1000 transceiver was used in the first system and a Bluetooth module was used in the other system.

  1. A low power biomedical signal processor ASIC based on hardware software codesign.

    PubMed

    Nie, Z D; Wang, L; Chen, W G; Zhang, T; Zhang, Y T

    2009-01-01

    A low-power biomedical digital signal processor ASIC based on a hardware/software codesign methodology is presented in this paper. The codesign methodology was used to achieve higher system performance and design flexibility. The hardware implementation includes a low-power 32-bit RISC CPU (ARM7TDMI), a low-power AHB-compatible bus, and a scalable digital co-processor optimized for low-power Fast Fourier Transform (FFT) calculations. The co-processor can be scaled for 8-point, 16-point and 32-point FFTs, taking approximately 50, 100 and 150 clock cycles, respectively. The complete design was intensively simulated using the ARM DSM model and was emulated on the ARM Versatile platform before being committed to silicon. The multi-million-gate ASIC was fabricated using SMIC 0.18 µm mixed-signal CMOS 1P6M technology. The die area measures 5,000 µm x 2,350 µm. The power consumption is approximately 3.6 mW at a 1.8 V power supply and a 1 MHz clock rate. The power consumption for FFT calculations is less than 1.5% of that of a conventional embedded software-based solution.
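
    As a software reference for the transform the co-processor accelerates, the sketch below is a plain recursive radix-2 FFT suitable for 8-, 16- or 32-point inputs. It does not model the ASIC's fixed-point datapath or cycle-level behaviour.

    # Recursive radix-2 decimation-in-time FFT (software reference only).
    import cmath

    def fft(x):
        """Compute the FFT of x; len(x) must be a power of two."""
        n = len(x)
        if n == 1:
            return list(x)
        even, odd = fft(x[0::2]), fft(x[1::2])
        twiddled = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
        return [even[k] + twiddled[k] for k in range(n // 2)] + \
               [even[k] - twiddled[k] for k in range(n // 2)]

    samples = [1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0]   # 8-point example
    print([round(abs(c), 3) for c in fft(samples)])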

  2. [Scientometrics and bibliometrics of biomedical engineering periodicals and papers].

    PubMed

    Zhao, Ping; Xu, Ping; Li, Bingyan; Wang, Zhengrong

    2003-09-01

    This investigation was made to reveal the current status, research trends and research level of biomedical engineering in mainland China by means of scientometrics, and to assess the quality of four domestic publications by bibliometrics. We identified all articles of the four related publications by searching Chinese and foreign databases from 1997 to 2001. All articles collected or cited by these databases were retrieved and statistically analyzed to determine the relevant distributions, including databases, years, authors, institutions, subject headings and subheadings. The sources of supporting funds and the related articles were analyzed too. The results showed that two journals were cited by two foreign databases and five Chinese databases simultaneously. The output of the Journal of Biomedical Engineering was the highest. Its quantity of original papers cited by EI and CA and the total number of papers sponsored by funds were higher than those of the others, but the quantity and percentage per year of biomedical articles cited by EI decreased overall. Core authors and institutions in mainland China have emerged in the field of biomedical engineering. Their research topics were mainly concentrated on ten subject headings, which included biocompatible materials, computer-assisted signal processing, electrocardiography, computer-assisted image processing, biomechanics, algorithms, electroencephalography, automatic data processing, mechanical stress, hemodynamics, mathematical computing, microcomputers, theoretical models, etc. The main subheadings were concentrated on instrumentation, physiopathology, diagnosis, therapy, ultrasonography, physiology, analysis, surgery, pathology, methods, etc.

  3. Teaching biomedical design through a university-industry partnership.

    PubMed

    Khuon, Lunal; Zum, Karl R; Zurn, Jane B; Herrera, Gerald M

    2016-08-01

    This paper describes a course that, as a result of a university-industry partnership, emphasizes bringing industry experts into the classroom to teach biomedical design. Full-time faculty and industry engineers and entrepreneurs teach the senior technical elective course, Biomedical System Design. This hands-on senior course in biomedical system design places varied but connected emphasis on understanding the biological signal source, electronics design, safety, patient use, medical device qualifications, and good manufacturing practices.

  4. Pattern recognition and expert image analysis systems in biomedical image processing (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Oosterlinck, A.; Suetens, P.; Wu, Q.; Baird, M.; F. M., C.

    1987-09-01

    This paper gives an overview of pattern recognition (P.R.) techniques used in biomedical image processing and problems related to the different P.R. solutions. The use of knowledge-based systems to overcome P.R. difficulties is also described. This is illustrated by a common example of a biomedical image processing application.

  5. Solid state light engines for bioanalytical instruments and biomedical devices

    NASA Astrophysics Data System (ADS)

    Jaffe, Claudia B.; Jaffe, Steven M.

    2010-02-01

    Lighting subsystems to drive 21st century bioanalysis and biomedical diagnostics face stringent requirements. Industrywide demands for speed, accuracy and portability mean illumination must be intense as well as spectrally pure, switchable, stable, durable and inexpensive. Ideally a common lighting solution could service these needs for numerous research and clinical applications. While this is a noble objective, the current technology of arc lamps, lasers, LEDs and most recently light pipes have intrinsic spectral and angular traits that make a common solution untenable. Clearly a hybrid solution is required to service the varied needs of the life sciences. Any solution begins with a critical understanding of the instrument architecture and specifications for illumination regarding power, illumination area, illumination and emission wavelengths and numerical aperture. Optimizing signal to noise requires careful optimization of these parameters within the additional constraints of instrument footprint and cost. Often the illumination design process is confined to maximizing signal to noise without the ability to adjust any of the above parameters. A hybrid solution leverages the best of the existing lighting technologies. This paper will review the design process for this highly constrained, but typical optical optimization scenario for numerous bioanalytical instruments and biomedical devices.

  6. Detection and Processing Techniques of FECG Signal for Fetal Monitoring

    PubMed Central

    2009-01-01

    Fetal electrocardiogram (FECG) signal contains potentially precise information that could assist clinicians in making more appropriate and timely decisions during labor. The ultimate reason for the interest in FECG signal analysis is in clinical diagnosis and biomedical applications. The extraction and detection of the FECG signal from composite abdominal signals with powerful and advance methodologies are becoming very important requirements in fetal monitoring. The purpose of this review paper is to illustrate the various methodologies and developed algorithms on FECG signal detection and analysis to provide efficient and effective ways of understanding the FECG signal and its nature for fetal monitoring. A comparative study has been carried out to show the performance and accuracy of various methods of FECG signal analysis for fetal monitoring. Finally, this paper further focused some of the hardware implementations using electrical signals for monitoring the fetal heart rate. This paper opens up a passage for researchers, physicians, and end users to advocate an excellent understanding of FECG signal and its analysis procedures for fetal heart rate monitoring system. PMID:19495912
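
    One family of methods reviewed for extracting the FECG from composite abdominal recordings is blind source separation. The sketch below runs independent component analysis on synthetic two-channel mixtures of a "maternal" and a "fetal" surrogate; it is only an illustration of that class of method, not a validated extraction pipeline, and all signals and mixing coefficients are invented.

    # ICA on synthetic abdominal mixtures (illustration of blind source separation).
    import numpy as np
    from sklearn.decomposition import FastICA

    fs = 500.0
    t = np.arange(0, 10, 1 / fs)
    maternal = np.sign(np.sin(2 * np.pi * 1.2 * t))        # ~72 bpm surrogate
    fetal = 0.3 * np.sign(np.sin(2 * np.pi * 2.3 * t))     # ~138 bpm surrogate
    sources = np.c_[maternal, fetal]

    mixing = np.array([[1.0, 0.4], [0.7, 0.9]])            # two abdominal channels
    abdominal = sources @ mixing.T + 0.01 * np.random.default_rng(3).normal(size=(t.size, 2))

    ica = FastICA(n_components=2, random_state=0, max_iter=1000)
    estimated = ica.fit_transform(abdominal)                # columns ~ separated sources
    print("estimated source matrix shape:", estimated.shape)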

  7. Hybrid photonic signal processing

    NASA Astrophysics Data System (ADS)

    Ghauri, Farzan Naseer

    This thesis proposes research on novel hybrid photonic signal processing systems in the areas of optical communications, test and measurement, RF signal processing and extreme environment optical sensors. It will be shown that use of innovative hybrid techniques allows design of photonic signal processing systems with superior performance parameters and enhanced capabilities. These applications can be divided into domains of analog-digital hybrid signal processing applications and free-space and fiber-coupled hybrid optical sensors. The analog-digital hybrid signal processing applications include a high-performance analog-digital hybrid MEMS variable optical attenuator that can simultaneously provide high dynamic range as well as high resolution attenuation controls; an analog-digital hybrid MEMS beam profiler that allows high-power watt-level laser beam profiling and also provides both submicron-level high resolution and wide area profiling coverage; and all-optical transversal RF filters that operate on the principle of broadband optical spectral control using MEMS and/or acousto-optic tunable filter (AOTF) devices, which can provide continuous, digital or hybrid signal time delay and weight selection. The hybrid optical sensors presented in the thesis are extreme environment pressure sensors and dual temperature-pressure sensors. The sensors employ hybrid free-space and fiber-coupled techniques for remotely monitoring a system under simultaneous extreme temperatures and pressures.

  8. BioLemmatizer: a lemmatization tool for morphological processing of biomedical text

    PubMed Central

    2012-01-01

    Background The wide variety of morphological variants of domain-specific technical terms contributes to the complexity of performing natural language processing of the scientific literature related to molecular biology. For morphological analysis of these texts, lemmatization has been actively applied in recent biomedical research. Results In this work, we developed a domain-specific lemmatization tool, BioLemmatizer, for the morphological analysis of biomedical literature. The tool focuses on the inflectional morphology of English and is based on the general English lemmatization tool MorphAdorner. The BioLemmatizer is further tailored to the biological domain through incorporation of several published lexical resources. It retrieves lemmas based on the use of a word lexicon, and defines a set of rules that transform a word to a lemma if it is not encountered in the lexicon. An innovative aspect of the BioLemmatizer is the use of a hierarchical strategy for searching the lexicon, which enables the discovery of the correct lemma even if the input Part-of-Speech information is inaccurate. The BioLemmatizer achieves an accuracy of 97.5% in lemmatizing an evaluation set prepared from the CRAFT corpus, a collection of full-text biomedical articles, and an accuracy of 97.6% on the LLL05 corpus. The contribution of the BioLemmatizer to accuracy improvement of a practical information extraction task is further demonstrated when it is used as a component in a biomedical text mining system. Conclusions The BioLemmatizer outperforms other tools when compared with eight existing lemmatizers. The BioLemmatizer is released as open source software and can be downloaded from http://biolemmatizer.sourceforge.net. PMID:22464129
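
    To make the lexicon-first, rules-as-fallback strategy concrete, here is a toy Python sketch of a hierarchical lemma lookup. The lexicon entries and suffix rules are invented placeholders, not BioLemmatizer's actual resources, and the hierarchy shown (exact word plus POS, then POS-agnostic, then rules) is only a simplified reading of the approach described above.

        # Toy sketch of a lexicon-first lemmatizer with rule-based fallback.
        # Lexicon contents and suffix rules are illustrative assumptions.
        LEXICON = {
            ("analyses", "NNS"): "analysis",
            ("genes", "NNS"): "gene",
            ("binding", "VBG"): "bind",
        }

        SUFFIX_RULES = [        # (suffix, replacement) used when no lexicon hit
            ("ies", "y"),
            ("ses", "s"),
            ("s", ""),
            ("ing", ""),
        ]

        def lemmatize(word, pos=None):
            """Hierarchical lookup: exact (word, POS), then any POS, then rules."""
            w = word.lower()
            if pos and (w, pos) in LEXICON:
                return LEXICON[(w, pos)]
            # POS-agnostic pass tolerates an inaccurate part-of-speech tag.
            for (lex_word, _), lemma in LEXICON.items():
                if lex_word == w:
                    return lemma
            for suffix, repl in SUFFIX_RULES:
                if w.endswith(suffix) and len(w) > len(suffix) + 2:
                    return w[: -len(suffix)] + repl
            return w

        print(lemmatize("analyses", "VBZ"))   # wrong POS, POS-agnostic pass recovers it
        print(lemmatize("receptors", "NNS"))  # handled by a suffix rule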

  9. Signal Processing, Analysis, & Display

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lager, Darrell; Azevado, Stephen

    1986-06-01

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG - a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.
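
    Two of the operations listed above (Butterworth digital filtering and auto spectral density with a Hamming window) are illustrated below with SciPy. This is a generic Python sketch of the kinds of manipulations SIG performs, not SIG itself; the test signal, cutoff and window length are assumptions.

        # Sketch of two representative SIG-style operations done with SciPy:
        # low-pass Butterworth filtering and Welch auto spectral density.
        import numpy as np
        from scipy.signal import butter, filtfilt, welch, get_window

        fs = 1000.0                                   # assumed sample rate, Hz
        t = np.arange(0, 2.0, 1.0 / fs)
        x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)

        # 4th-order low-pass Butterworth at 100 Hz, applied forward-backward.
        b, a = butter(4, 100.0, btype="low", fs=fs)
        y = filtfilt(b, a, x)

        # Auto spectral density with a Hamming window (one of SIG's window options).
        f, pxx = welch(y, fs=fs, window=get_window("hamming", 512), nperseg=512)
        print("peak frequency (Hz):", f[np.argmax(pxx)])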

  10. Biomedical technology transfer applications of NASA science and technology

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The identification and solution of research and clinical problems in cardiovascular medicine which were investigated by means of biomedical data transfer are reported. The following are sample areas that were focused upon by the Stanford University Biomedical Technology Transfer Team: electrodes for hemiplegia research; vectorcardiogram computer analysis; respiration and phonation electrodes; radiotelemetry of intracranial pressure; and audiotransformation of the electrocardiographic signal. It is concluded that this biomedical technology transfer is significantly aiding present research in cardiovascular medicine.

  11. Digital signal processing the Tevatron BPM signals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cancelo, G.; James, E.; Wolbers, S.

    2005-05-01

    The Beam Position Monitor (TeV BPM) readout system at Fermilab's Tevatron has been updated and is currently being commissioned. The new BPMs use new analog and digital hardware to achieve better beam position measurement resolution. The new system reads signals from both ends of the existing directional stripline pickups to provide simultaneous proton and antiproton measurements. The signals provided by the two ends of the BPM pickups are processed by analog band-pass filters and sampled by 14-bit ADCs at 74.3 MHz. A crucial part of this work has been the design of digital filters that process the signal. This paper describes the digital processing and estimation techniques used to optimize the beam position measurement. The BPM electronics must operate in narrow-band and wide-band modes to enable measurements of closed-orbit and turn-by-turn positions. The filtering and timing conditions of the signals are tuned accordingly for the operational modes. The analysis and the optimized result for each mode are presented.

  12. A review of signals used in sleep analysis

    PubMed Central

    Roebuck, A; Monasterio, V; Gederi, E; Osipov, M; Behar, J; Malhotra, A; Penzel, T; Clifford, GD

    2014-01-01

    This article presents a review of signals used for measuring physiology and activity during sleep and techniques for extracting information from these signals. We examine both clinical needs and biomedical signal processing approaches across a range of sensor types. Issues with recording and analysing the signals are discussed, together with their applicability to various clinical disorders. Both univariate and data fusion (exploiting the diverse characteristics of the primary recorded signals) approaches are discussed, together with a comparison of automated methods for analysing sleep. PMID:24346125

  13. Digital Signal Processing Based Biotelemetry Receivers

    NASA Technical Reports Server (NTRS)

    Singh, Avtar; Hines, John; Somps, Chris

    1997-01-01

    This is an attempt to develop a biotelemetry receiver using digital signal processing technology and techniques. The receiver developed in this work is based on recovering signals that have been encoded using either Pulse Position Modulation (PPM) or Pulse Code Modulation (PCM) technique. A prototype has been developed using state-of-the-art digital signal processing technology. A Printed Circuit Board (PCB) is being developed based on the technique and technology described here. This board is intended to be used in the UCSF Fetal Monitoring system developed at NASA. The board is capable of handling a variety of PPM and PCM signals encoding data such as ECG, temperature, and pressure. A signal processing program has also been developed to analyze the received ECG signal to determine heart rate. This system provides a base for using digital signal processing in biotelemetry receivers and other similar applications.
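
    The heart-rate computation mentioned at the end of the abstract can be illustrated with a generic band-pass plus peak-detection approach, sketched below in Python. This is not the NASA/UCSF implementation; the sampling rate, band edges, thresholds and the synthetic test signal are assumptions.

        # Minimal sketch of heart-rate estimation from a decoded ECG channel.
        import numpy as np
        from scipy.signal import butter, filtfilt, find_peaks

        def heart_rate_bpm(ecg, fs):
            # Emphasize the QRS band (roughly 5-15 Hz) before peak picking.
            b, a = butter(2, [5.0, 15.0], btype="band", fs=fs)
            qrs = filtfilt(b, a, ecg)
            peaks, _ = find_peaks(qrs, distance=int(0.3 * fs),
                                  height=0.5 * np.max(qrs))
            if len(peaks) < 2:
                return float("nan")
            rr = np.diff(peaks) / fs                  # R-R intervals in seconds
            return 60.0 / np.mean(rr)

        fs = 250.0                                    # assumed telemetry sample rate
        t = np.arange(0, 10, 1 / fs)
        ecg = np.sin(2 * np.pi * 1.2 * t) ** 21       # crude stand-in for an ECG train
        print("estimated heart rate:", round(heart_rate_bpm(ecg, fs), 1), "bpm")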

  14. SignalPlant: an open signal processing software platform.

    PubMed

    Plesinger, F; Jurco, J; Halamek, J; Jurak, P

    2016-07-01

    The growing technical standard of acquisition systems allows the acquisition of large records, often reaching gigabytes or more in size, as is the case with whole-day electroencephalograph (EEG) recordings, for example. Although current 64-bit software for signal processing is able to process (e.g. filter, analyze, etc.) such data, visual inspection and labeling will probably suffer from rather long latency during the rendering of large portions of recorded signals. For this reason, we have developed SignalPlant, a stand-alone application for signal inspection, labeling and processing. The main motivation was to supply investigators with a tool allowing fast and interactive work with large multichannel records produced by EEG, electrocardiograph and similar devices. The rendering latency was compared with EEGLAB, and SignalPlant proved significantly faster when displaying an image from a large number of samples (e.g. 163 times faster for 75 × 10^6 samples). The presented SignalPlant software is available free and does not depend on any other computation software. Furthermore, it can be extended with plugins by third parties, ensuring its adaptability to future research tasks and new data formats.
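
    For context on why rendering latency matters, the sketch below shows min/max decimation, a common generic technique for drawing very long records quickly by reducing each block of samples to the two values a pixel column can actually display. Whether SignalPlant uses this exact scheme is not claimed here; the block size, array length and data type are assumptions.

        # Generic sketch (not SignalPlant's renderer): min/max decimation for
        # fast waveform display of long records.
        import numpy as np

        def minmax_decimate(signal, n_columns):
            """Return per-column (min, max) pairs for fast waveform rendering."""
            usable = (len(signal) // n_columns) * n_columns
            blocks = signal[:usable].reshape(n_columns, -1)
            return blocks.min(axis=1), blocks.max(axis=1)

        rng = np.random.default_rng(0)
        samples = rng.standard_normal(75_000_000, dtype=np.float32)  # long record
        lo, hi = minmax_decimate(samples, n_columns=1920)            # one pair per pixel
        print(lo.shape, hi.shape)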

  15. Biomedical Biopolymers, their Origin and Evolution in Biomedical Sciences: A Systematic Review

    PubMed Central

    Yadav, Harsh; Shah, Veena Gowri; Shah, Gaurav; Dhaka, Gaurav

    2015-01-01

    Biopolymers offer a plethora of applications in the pharmaceutical and medical fields. A material that can be used for biomedical applications like wound healing, drug delivery and tissue engineering should possess certain properties like biocompatibility, biodegradation to non-toxic products, low antigenicity, high bio-activity, processability to complicated shapes with appropriate porosity, ability to support cell growth and proliferation, and appropriate mechanical properties, as well as maintaining mechanical strength. This paper reviews biodegradable biopolymers focusing on their potential in biomedical applications. The biopolymers most commonly used and most abundantly available have been described with focus on the properties relevant to biomedical importance. PMID:26501034

  16. Mining biomedical images towards valuable information retrieval in biomedical and life sciences

    PubMed Central

    Ahmed, Zeeshan; Zeeshan, Saman; Dandekar, Thomas

    2016-01-01

    Biomedical images are helpful sources for scientists and practitioners in drawing significant hypotheses, exemplifying approaches and describing experimental results in published biomedical literature. In recent decades, there has been an enormous increase in the production and publication of heterogeneous biomedical images, which creates a need for bioimaging platforms that extract and analyze the text and content of biomedical images in order to build effective information retrieval systems. In this review, we summarize technologies related to data mining of figures. We describe and compare the potential of different approaches in terms of their developmental aspects, used methodologies, produced results, achieved accuracies and limitations. Our comparative conclusions include current challenges for bioimaging software with selective image mining, embedded text extraction and processing of complex natural language queries. PMID:27538578

  17. Intelligent processing of acoustic emission signals

    NASA Astrophysics Data System (ADS)

    Sachse, Wolfgang; Grabec, Igor

    1992-07-01

    Recent developments in applying neural-like signal-processing procedures for analyzing acoustic emission signals are summarized. These procedures employ a set of learning signals to develop a memory that can subsequently be utilized to process other signals to recover information about an unknown source. A majority of the current applications to process ultrasonic waveforms are based on multilayered, feed-forward neural networks, trained with some type of back-error propagation rule.

  18. BPSK Demodulation Using Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Garcia, Thomas R.

    1996-01-01

    A digital communications signal is a sinusoidal waveform that is modified by a binary (digital) information signal. The sinusoidal waveform is called the carrier. The carrier may be modified in amplitude, frequency, phase, or a combination of these. In this project a binary phase shift keyed (BPSK) signal is the communication signal. In a BPSK signal the phase of the carrier is set to one of two states, 180 degrees apart, by a binary (i.e., 1 or 0) information signal. A digital signal is a sampled version of a "real world" time continuous signal. The digital signal is generated by sampling the continuous signal at discrete points in time. The rate at which the signal is sampled is called the sampling rate (f_s). The device that performs this operation is called an analog-to-digital (A/D) converter or a digitizer. The digital signal is composed of the sequence of individual values of the sampled BPSK signal. Digital signal processing (DSP) is the modification of the digital signal by mathematical operations. A device that performs this processing is called a digital signal processor. After processing, the digital signal may then be converted back to an analog signal using a digital-to-analog (D/A) converter. The goal of this project is to develop a system that will recover the digital information from a BPSK signal using DSP techniques. The project is broken down into the following steps: (1) Development of the algorithms required to demodulate the BPSK signal; (2) Simulation of the system; and (3) Implementation of a BPSK receiver using digital signal processing hardware.
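
    The core demodulation steps described above (sample the modulated carrier, mix to baseband, low-pass filter, and slice each bit period) can be sketched in a few lines of Python. Carrier frequency, bit rate and sampling rate are illustrative assumptions, carrier and bit synchronization are taken as ideal, and this is not the project's DSP hardware implementation.

        # Hedged sketch of coherent BPSK demodulation with NumPy/SciPy.
        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 48_000.0            # sampling rate (Hz), assumed
        fc = 4_000.0             # carrier frequency (Hz), assumed
        bit_rate = 500.0         # bits per second, assumed
        sps = int(fs / bit_rate) # samples per bit

        bits = np.random.randint(0, 2, 64)
        phase = np.repeat(np.where(bits == 1, 0.0, np.pi), sps)   # 0 or 180 degrees
        t = np.arange(phase.size) / fs
        bpsk = np.cos(2 * np.pi * fc * t + phase) + 0.1 * np.random.randn(t.size)

        # Coherent demodulation: multiply by the (assumed synchronized) carrier ...
        mixed = bpsk * np.cos(2 * np.pi * fc * t)
        # ... low-pass away the 2*fc component ...
        b, a = butter(4, bit_rate * 2, btype="low", fs=fs)
        baseband = filtfilt(b, a, mixed)
        # ... and decide each bit from the sign at the middle of its period.
        decisions = baseband[sps // 2::sps][: bits.size] > 0
        print("bit errors:", int(np.sum(decisions != bits.astype(bool))))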

  19. Detection of Practice Pattern Trends through Natural Language Processing of Clinical Narratives and Biomedical Literature

    PubMed Central

    Chen, Elizabeth S.; Stetson, Peter D.; Lussier, Yves A.; Markatou, Marianthi; Hripcsak, George; Friedman, Carol

    2007-01-01

    Clinical knowledge, best evidence, and practice patterns evolve over time. The ability to track these changes and study practice trends may be valuable for performance measurement and quality improvement efforts. The goal of this study was to assess the feasibility and validity of methods to generate and compare trends in biomedical literature and clinical narrative. We focused on the challenge of detecting trends in medication usage over time for two diseases: HIV/AIDS and asthma. Information about disease-specific medications in published randomized control trials and discharge summaries at New York-Presbyterian Hospital over a ten-year period were extracted using Natural Language Processing. This paper reports on the ability of our semi-automated process to discover disease-drug practice pattern trends and interpretation of findings across the biomedical and clinical text sources. PMID:18693810

  20. SIG. Signal Processing, Analysis, & Display

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, J.; Lager, D.; Azevedo, S.

    1992-01-22

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG; a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.

  2. Signal Processing and Interpretation Using Multilevel Signal Abstractions.

    DTIC Science & Technology

    1986-06-01

    ... mappings expressed in the Fourier domain. Previously proposed causal analysis techniques for diagnosis are based on the analysis of intermediate data ... can be processed either as individual one-dimensional waveforms or as multichannel data for source detection and direction ... microphone data. The signal processing for both spectral analysis of microphone signals and direction determination of acoustic sources involves ...

  3. Mining biomedical images towards valuable information retrieval in biomedical and life sciences.

    PubMed

    Ahmed, Zeeshan; Zeeshan, Saman; Dandekar, Thomas

    2016-01-01

    Biomedical images are helpful sources for scientists and practitioners in drawing significant hypotheses, exemplifying approaches and describing experimental results in published biomedical literature. In recent decades, there has been an enormous increase in the production and publication of heterogeneous biomedical images, which creates a need for bioimaging platforms that extract and analyze the text and content of biomedical images in order to build effective information retrieval systems. In this review, we summarize technologies related to data mining of figures. We describe and compare the potential of different approaches in terms of their developmental aspects, used methodologies, produced results, achieved accuracies and limitations. Our comparative conclusions include current challenges for bioimaging software with selective image mining, embedded text extraction and processing of complex natural language queries. © The Author(s) 2016. Published by Oxford University Press.

  5. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation.

    PubMed

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-03-10

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities is targeted.

  6. An efficient sampling algorithm for uncertain abnormal data detection in biomedical image processing and disease prediction.

    PubMed

    Liu, Fei; Zhang, Xi; Jia, Yan

    2015-01-01

    In this paper, we propose a computer information processing algorithm that can be used for biomedical image processing and disease prediction. A biomedical image is considered a data object in a multi-dimensional space. Each dimension is a feature that can be used for disease diagnosis. We introduce a new concept of the top (k1,k2) outlier. It can be used to detect abnormal data objects in the multi-dimensional space. This technique focuses on uncertain space, where each data object has several possible instances with distinct probabilities. We design an efficient sampling algorithm for the top (k1,k2) outlier in uncertain space. Some improvement techniques are used for acceleration. Experiments show our methods' high accuracy and high efficiency.

  7. MEMD-enhanced multivariate fuzzy entropy for the evaluation of complexity in biomedical signals.

    PubMed

    Azami, Hamed; Smith, Keith; Escudero, Javier

    2016-08-01

    Multivariate multiscale entropy (mvMSE) has been proposed as a combination of the coarse-graining process and multivariate sample entropy (mvSE) to quantify the irregularity of multivariate signals. However, both the coarse-graining process and mvSE may not be reliable for short signals. Although the coarse-graining process can be replaced with multivariate empirical mode decomposition (MEMD), the relative instability of mvSE for short signals remains a problem. Here, we address this issue by proposing the multivariate fuzzy entropy (mvFE) with a new fuzzy membership function. The results using white Gaussian noise show that the mvFE leads to more reliable and stable results, especially for short signals, in comparison with mvSE. Accordingly, we propose MEMD-enhanced mvFE to quantify the complexity of signals. The characteristics of brain regions influenced by partial epilepsy are investigated by focal and non-focal electroencephalogram (EEG) time series. In this sense, the proposed MEMD-enhanced mvFE and mvSE are employed to discriminate focal EEG signals from non-focal ones. The results demonstrate the MEMD-enhanced mvFE values have a smaller coefficient of variation in comparison with those obtained by the MEMD-enhanced mvSE, even for long signals. The results also show that the MEMD-enhanced mvFE has better performance to quantify focal and non-focal signals compared with multivariate multiscale permutation entropy.
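
    To make the role of the fuzzy membership function concrete, the Python sketch below computes a univariate fuzzy entropy with an exponential membership term. The multivariate extension (mvFE) and the MEMD step from the paper are deliberately not reproduced, and the parameter choices (m, r, n) follow common conventions rather than the paper's exact settings.

        # Univariate fuzzy entropy sketch illustrating the exponential fuzzy
        # membership function; parameters are illustrative assumptions.
        import numpy as np

        def fuzzy_entropy(x, m=2, r=0.2, n=2):
            x = np.asarray(x, dtype=float)
            r = r * np.std(x)

            def phi(dim):
                # Zero-mean template vectors of length `dim`.
                templates = np.array([x[i:i + dim] for i in range(len(x) - dim)])
                templates = templates - templates.mean(axis=1, keepdims=True)
                # Chebyshev distance between every pair of templates.
                d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
                # Exponential membership: similarity decays smoothly with distance.
                sim = np.exp(-(d ** n) / r)
                np.fill_diagonal(sim, 0.0)
                return sim.sum() / (len(templates) * (len(templates) - 1))

            return np.log(phi(m)) - np.log(phi(m + 1))

        rng = np.random.default_rng(0)
        print("white noise FuzzyEn:", round(fuzzy_entropy(rng.standard_normal(500)), 3))
        print("sine wave FuzzyEn:  ",
              round(fuzzy_entropy(np.sin(np.linspace(0, 20 * np.pi, 500))), 3))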

  8. How to locate & hire clinical/biomedical engineers, supervisors, managers & biomedical equipment technicians.

    PubMed

    Pacela, A F; Brush, L C

    1993-01-01

    This article has described the process and the resources available for locating and hiring clinical/biomedical engineers, supervisors, managers, and biomedical equipment technicians. First, the employer must determine the qualifications for the position, including job titles, descriptions, pay scales, and certification requirements. Next, the employer must find qualified applicants. The most common way to do this is to use "outside" contacts, such as help-wanted advertising, specialized job placement agencies, schools and colleges, military resources, regional biomedical societies, and nationwide societies. An "inside" search involves limited internal advertising of the position and using personal referrals for candidates. Finally, the employer must screen the applicants. The position description is the obvious first step in this process, but there are other pre-screening techniques, such as employment testing. Interviewing is the most common way to hire for job positions, but the interviewer needs to know about the position and ask the right questions. Post-interview screening is a final step to help determine the best job-person match.

  9. An Ontology-Enabled Natural Language Processing Pipeline for Provenance Metadata Extraction from Biomedical Text (Short Paper).

    PubMed

    Valdez, Joshua; Rueschman, Michael; Kim, Matthew; Redline, Susan; Sahoo, Satya S

    2016-10-01

    Extraction of structured information from biomedical literature is a complex and challenging problem due to the complexity of biomedical domain and lack of appropriate natural language processing (NLP) techniques. High quality domain ontologies model both data and metadata information at a fine level of granularity, which can be effectively used to accurately extract structured information from biomedical text. Extraction of provenance metadata, which describes the history or source of information, from published articles is an important task to support scientific reproducibility. Reproducibility of results reported by previous research studies is a foundational component of scientific advancement. This is highlighted by the recent initiative by the US National Institutes of Health called "Principles of Rigor and Reproducibility". In this paper, we describe an effective approach to extract provenance metadata from published biomedical research literature using an ontology-enabled NLP platform as part of the Provenance for Clinical and Healthcare Research (ProvCaRe). The ProvCaRe-NLP tool extends the clinical Text Analysis and Knowledge Extraction System (cTAKES) platform using both provenance and biomedical domain ontologies. We demonstrate the effectiveness of ProvCaRe-NLP tool using a corpus of 20 peer-reviewed publications. The results of our evaluation demonstrate that the ProvCaRe-NLP tool has significantly higher recall in extracting provenance metadata as compared to existing NLP pipelines such as MetaMap.

  10. Biomedical informatics and translational medicine.

    PubMed

    Sarkar, Indra Neil

    2010-02-26

    Biomedical informatics involves a core set of methodologies that can provide a foundation for crossing the "translational barriers" associated with translational medicine. To this end, the fundamental aspects of biomedical informatics (e.g., bioinformatics, imaging informatics, clinical informatics, and public health informatics) may be essential in helping improve the ability to bring basic research findings to the bedside, evaluate the efficacy of interventions across communities, and enable the assessment of the eventual impact of translational medicine innovations on health policies. Here, a brief description is provided for a selection of key biomedical informatics topics (Decision Support, Natural Language Processing, Standards, Information Retrieval, and Electronic Health Records) and their relevance to translational medicine. Based on contributions and advancements in each of these topic areas, the article proposes that biomedical informatics practitioners ("biomedical informaticians") can be essential members of translational medicine teams.

  11. Signal processing: opportunities for superconductive circuits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ralston, R.W.

    1985-03-01

    Prime motivators in the evolution of increasingly sophisticated communication and detection systems are the needs for handling ever wider signal bandwidths and higher data processing speeds. These same needs drive the development of electronic device technology. Until recently the superconductive community has been tightly focused on digital devices for high speed computers. The purpose of this paper is to describe opportunities and challenges which exist for both analog and digital devices in a less familiar area, that of wideband signal processing. The function and purpose of analog signal-processing components, including matched filters, correlators and Fourier transformers, will be described and examples of superconductive implementations given. A canonic signal-processing system is then configured using these components in combination with analog/digital converters and digital output circuits to highlight the important issues of dynamic range, accuracy and equivalent computation rate. Superconductive circuits hold promise for processing signals of 10-GHz bandwidth. Signal processing systems, however, can be properly designed and implemented only through a synergistic combination of the talents of device physicists, circuit designers, algorithm architects and system engineers. An immediate challenge to the applied superconductivity community is to begin sharing ideas with these other researchers.

  12. KaBOB: ontology-based semantic integration of biomedical databases.

    PubMed

    Livingston, Kevin M; Bada, Michael; Baumgartner, William A; Hunter, Lawrence E

    2015-04-23

    The ability to query many independent biological databases using a common ontology-based semantic model would facilitate deeper integration and more effective utilization of these diverse and rapidly growing resources. Despite ongoing work moving toward shared data formats and linked identifiers, significant problems persist in semantic data integration in order to establish shared identity and shared meaning across heterogeneous biomedical data sources. We present five processes for semantic data integration that, when applied collectively, solve seven key problems. These processes include making explicit the differences between biomedical concepts and database records, aggregating sets of identifiers denoting the same biomedical concepts across data sources, and using declaratively represented forward-chaining rules to take information that is variably represented in source databases and integrating it into a consistent biomedical representation. We demonstrate these processes and solutions by presenting KaBOB (the Knowledge Base Of Biomedicine), a knowledge base of semantically integrated data from 18 prominent biomedical databases using common representations grounded in Open Biomedical Ontologies. An instance of KaBOB with data about humans and seven major model organisms can be built using on the order of 500 million RDF triples. All source code for building KaBOB is available under an open-source license. KaBOB is an integrated knowledge base of biomedical data representationally based in prominent, actively maintained Open Biomedical Ontologies, thus enabling queries of the underlying data in terms of biomedical concepts (e.g., genes and gene products, interactions and processes) rather than features of source-specific data schemas or file formats. KaBOB resolves many of the issues that routinely plague biomedical researchers intending to work with data from multiple data sources and provides a platform for ongoing data integration and development and for

  13. Intelligent Signal Processing for Active Control

    DTIC Science & Technology

    1992-06-17

    Report documentation fragment: "Intelligent Signal Processing for Active Control"; principal investigator P.A. Ramamoorthy, University of Cincinnati, College of Engineering; Contract No. N0001489-J-1633. Executive Summary: The thrust of this ...

  14. Adaptive filtering in biological signal processing.

    PubMed

    Iyer, V K; Ploysongsang, Y; Ramamoorthy, P A

    1990-01-01

    The high dependence of conventional optimal filtering methods on the a priori knowledge of the signal and noise statistics render them ineffective in dealing with signals whose statistics cannot be predetermined accurately. Adaptive filtering methods offer a better alternative, since the a priori knowledge of statistics is less critical, real time processing is possible, and the computations are less expensive for this approach. Adaptive filtering methods compute the filter coefficients "on-line", converging to the optimal values in the least-mean square (LMS) error sense. Adaptive filtering is therefore apt for dealing with the "unknown" statistics situation and has been applied extensively in areas like communication, speech, radar, sonar, seismology, and biological signal processing and analysis for channel equalization, interference and echo canceling, line enhancement, signal detection, system identification, spectral analysis, beamforming, modeling, control, etc. In this review article adaptive filtering in the context of biological signals is reviewed. An intuitive approach to the underlying theory of adaptive filters and its applicability are presented. Applications of the principles in biological signal processing are discussed in a manner that brings out the key ideas involved. Current and potential future directions in adaptive biological signal processing are also discussed.
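
    The "compute the filter coefficients on-line" idea at the heart of this review can be illustrated with a minimal least-mean-square (LMS) adaptive noise canceller, sketched below in Python. The signal model (a slow rhythm plus mains interference), filter length and step size are illustrative assumptions, not taken from the article.

        # Minimal LMS adaptive noise canceller sketch.
        import numpy as np

        def lms_cancel(primary, reference, n_taps=16, mu=0.01):
            """Adaptively remove the part of `primary` correlated with `reference`."""
            w = np.zeros(n_taps)
            cleaned = np.zeros_like(primary)
            for k in range(n_taps, len(primary)):
                x = reference[k - n_taps:k][::-1]    # most recent reference samples
                y = w @ x                            # current noise estimate
                e = primary[k] - y                   # error = desired signal estimate
                w += 2 * mu * e * x                  # LMS coefficient update
                cleaned[k] = e
            return cleaned

        fs = 500
        t = np.arange(0, 4, 1 / fs)
        signal = np.sin(2 * np.pi * 1.0 * t)                 # slow "biological" rhythm
        interference = np.sin(2 * np.pi * 50 * t)            # mains interference
        primary = signal + 0.8 * interference
        reference = np.sin(2 * np.pi * 50 * t + 0.3)         # correlated noise reference
        cleaned = lms_cancel(primary, reference)
        print("interference power before/after:",
              round(np.var(primary - signal), 3), round(np.var(cleaned - signal), 3))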

  15. Multichannel biomedical time series clustering via hierarchical probabilistic latent semantic analysis.

    PubMed

    Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary

    2014-11-01

    Biomedical time series clustering that automatically groups a collection of time series according to their internal similarity is of importance for medical record management and inspection such as bio-signals archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, i.e., the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), which was originally developed for visual motion analysis to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to the variations of parameters including length of local segments and dictionary size. Although the experimental evaluation used the multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for multichannel biomedical time series clustering according to their structural similarity, which has many applications in biomedical time series management. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  16. Signal propagation in cortical networks: a digital signal processing approach.

    PubMed

    Rodrigues, Francisco Aparecido; da Fontoura Costa, Luciano

    2009-01-01

    This work reports a digital signal processing approach to representing and modeling transmission and combination of signals in cortical networks. The signal dynamics is modeled in terms of diffusion, which allows the information processing undergone between any pair of nodes to be fully characterized in terms of a finite impulse response (FIR) filter. Diffusion without and with time decay are investigated. All filters underlying the cat and macaque cortical organization are found to be of low-pass nature, allowing the cortical signal processing to be summarized in terms of the respective cutoff frequencies (a high cutoff frequency meaning little alteration of signals through their intermixing). Several findings are reported and discussed, including the fact that the incorporation of temporal activity decay tends to provide more diversified cutoff frequencies. Different filtering intensity is observed for each community in those networks. In addition, the brain regions involved in object recognition tend to present the highest cutoff frequencies for both the cat and macaque networks.
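
    The general idea can be sketched as follows: repeated diffusion steps between a pair of nodes yield an FIR impulse response, and its frequency response gives a cutoff frequency. The Python sketch below uses a random graph, a simple column normalization and an arbitrary filter length as stand-ins; it is a hedged illustration of the concept, not the cat or macaque connectivity analysis.

        # Hedged sketch: network diffusion between two nodes viewed as an FIR filter.
        import numpy as np
        from scipy.signal import freqz

        rng = np.random.default_rng(1)
        n_nodes, n_taps = 30, 32
        A = (rng.random((n_nodes, n_nodes)) < 0.2).astype(float)
        np.fill_diagonal(A, 0.0)
        W = A / np.maximum(A.sum(axis=0), 1.0)         # column-normalized diffusion step

        src, dst = 0, 5
        h = np.empty(n_taps)
        state = np.zeros(n_nodes)
        state[src] = 1.0                               # unit impulse injected at src
        for k in range(n_taps):
            h[k] = state[dst]                          # impulse response tap k at dst
            state = W @ state                          # one diffusion step

        w, H = freqz(h, worN=512)
        mag = np.abs(H) / max(np.abs(H).max(), 1e-12)
        cutoff = w[np.argmax(mag < 1 / np.sqrt(2))]    # first crossing below -3 dB
        print("estimated cutoff (rad/sample):", round(float(cutoff), 3))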

  17. Biomedical ultrasonoscope

    NASA Technical Reports Server (NTRS)

    Lee, R. D. (Inventor)

    1979-01-01

    The combination of a "C" mode scan electronics in a portable, battery powered biomedical ultrasonoscope having "A" and "M" mode scan electronics, the latter including a clock generator for generating clock pulses, a cathode ray tube having X, Y and Z axis inputs, a sweep generator connected between the clock generator and the X axis input of the cathode ray tube for generating a cathode ray sweep signal synchronized by the clock pulses, and a receiver adapted to be connected to the Z axis input of the cathode ray tube. The "C" mode scan electronics comprises a plurality of transducer elements arranged in a row and adapted to be positioned on the skin of the patient's body for converting a pulsed electrical signal to a pulsed ultrasonic signal, radiating the ultrasonic signal into the patient's body, picking up the echoes reflected from interfaces in the patient's body and converting the echoes to electrical signals; a plurality of transmitters, each transmitter being coupled to a respective transducer for transmitting a pulsed electrical signal thereto and for transmitting the converted electrical echo signals directly to the receiver, a sequencer connected between the clock generator and the plurality of transmitters and responsive to the clock pulses for firing the transmitters in cyclic order; and a staircase voltage generator connected between the clock generator and the Y axis input of the cathode ray tube for generating a staircase voltage having steps synchronized by the clock pulses.

  18. AnyWave: a cross-platform and modular software for visualizing and processing electrophysiological signals.

    PubMed

    Colombet, B; Woodman, M; Badier, J M; Bénar, C G

    2015-03-15

    The importance of digital signal processing in clinical neurophysiology is growing steadily, involving clinical researchers and methodologists. There is a need for crossing the gap between these communities by providing efficient delivery of newly designed algorithms to end users. We have developed such a tool which both visualizes and processes data and, additionally, acts as a software development platform. AnyWave was designed to run on all common operating systems. It provides access to a variety of data formats and it employs high fidelity visualization techniques. It also allows using external tools as plug-ins, which can be developed in languages including C++, MATLAB and Python. In the current version, plug-ins allow computation of connectivity graphs (non-linear correlation h2) and time-frequency representation (Morlet wavelets). The software is freely available under the LGPL3 license. AnyWave is designed as an open, highly extensible solution, with an architecture that permits rapid delivery of new techniques to end users. We have developed AnyWave software as an efficient neurophysiological data visualizer able to integrate state of the art techniques. AnyWave offers an interface well suited to the needs of clinical research and an architecture designed for integrating new tools. We expect this software to strengthen the collaboration between clinical neurophysiologists and researchers in biomedical engineering and signal processing. Copyright © 2015 Elsevier B.V. All rights reserved.
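
    The kind of Morlet-wavelet time-frequency map mentioned among the plug-ins can be sketched generically as below. This hand-rolled Python implementation is not AnyWave's code, and the sampling rate, frequency grid, wavelet width and test signal are assumptions.

        # Generic Morlet-wavelet time-frequency sketch.
        import numpy as np

        def morlet_tfr(x, fs, freqs, n_cycles=6.0):
            """Return |wavelet transform| with shape (len(freqs), len(x))."""
            power = np.empty((len(freqs), len(x)))
            for i, f in enumerate(freqs):
                sigma_t = n_cycles / (2 * np.pi * f)            # temporal width
                t = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
                wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
                wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))  # unit energy
                power[i] = np.abs(np.convolve(x, wavelet, mode="same"))
            return power

        fs = 256.0
        t = np.arange(0, 4, 1 / fs)
        eeg_like = np.sin(2 * np.pi * 10 * t) * (t < 2) + np.sin(2 * np.pi * 25 * t) * (t >= 2)
        tfr = morlet_tfr(eeg_like, fs, freqs=np.arange(2, 40, 1.0))
        print("time-frequency map shape:", tfr.shape)   # (frequencies, samples)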

  19. Fundamental and applied studies in nanoparticle biomedical imaging, stabilization, and processing

    NASA Astrophysics Data System (ADS)

    Pansare, Vikram J.

    Nanoparticle carrier systems are gaining importance in the rapidly expanding field of biomedical whole animal imaging where they provide long circulating, real time imaging capability. This thesis presents a new paradigm in imaging whereby long wavelength fluorescent or photoacoustically active contrast agents are embedded in the hydrophobic core of nanocarriers formed by Flash NanoPrecipitation. The long wavelength allows for improved optical penetration depth. Compared to traditional contrast agents where fluorophores are placed on the surface, this allows for improved signal, increased stability, and molecular targeting capabilities. Several types of long wavelength hydrophobic dyes based on acene, cyanine, and bacteriochlorin scaffolds are utilized and animal results obtained for nanocarrier systems used in both fluorescent and photoacoustic imaging modes. Photoacoustic imaging is particularly promising due to its high resolution, excellent penetration depth, and ability to provide real-time functional information. Fundamental studies in nanoparticle stabilization are also presented for two systems: model alumina nanoparticles and charge stabilized polystyrene nanoparticles. Motivated by the need for stable suspensions of alumina-based nanocrystals for security printing applications, results are presented for the adsorption of various small molecule charged hydrophobes onto the surface of alumina nanoparticles. Results are also presented for the production of charge stabilized polystyrene nanoparticles via Flash NanoPrecipitation, allowing for the independent control of polymer molecular weight and nanoparticle size, which is not possible by traditional emulsion polymerization routes. Lastly, methods for processing nanoparticle systems are explored. The increasing use of nanoparticle therapeutics in the pharmaceutical industry has necessitated the development of scalable, industrially relevant processing methods. Ultrafiltration is particularly well suited for

  20. Frequency domain laser velocimeter signal processor: A new signal processing scheme

    NASA Technical Reports Server (NTRS)

    Meyers, James F.; Clemmons, James I., Jr.

    1987-01-01

    A new scheme for processing signals from laser velocimeter systems is described. The technique utilizes the capabilities of advanced digital electronics to yield a smart instrument that is able to configure itself, based on the characteristics of the input signals, for optimum measurement accuracy. The signal processor is composed of a high-speed 2-bit transient recorder for signal capture and a combination of adaptive digital filters with energy and/or zero crossing detection signal processing. The system is designed to accept signals with frequencies up to 100 MHz with standard deviations up to 20 percent of the average signal frequency. Results from comparative simulation studies indicate measurement accuracies 2.5 times better than with a high-speed burst counter, from signals with as few as 150 photons per burst.

  1. Zinc Oxide Nanomaterials for Biomedical Fluorescence Detection

    PubMed Central

    Hahm, Jong-in

    2014-01-01

    One-dimensional zinc oxide nanomaterials have been recently developed into novel, extremely effective, optical signal-enhancing bioplatforms. Their usefulness has been demonstrated in various biomedical fluorescence assays. Fluorescence is extensively used in biology and medicine as a sensitive and noninvasive detection method for tracking and analyzing biological molecules. Achieving high sensitivity via improving signal-to-noise ratio is of paramount importance in fluorescence-based, trace-level detection. Recent advances in the development of optically superior one-dimensional materials have contributed to this important biomedical area of detection. This review article will discuss major research developments that have so far been made in this emerging and exciting topical field. The discussion will cover a broad range of subjects including synthesis of zinc oxide nanorods (ZnO NRs), various properties differentiating them as suitable optical biodetection platforms, their demonstrated applicability in DNA and protein detection, and the nanomaterial characteristics relevant for biomolecular fluorescence enhancement. This review will then summarize the current status of ZnO NR-based biodetection and further elaborate future utility of ZnO NR platforms for advanced biomedical assays, based on their proven advantages. Lastly, present challenges experienced in this topical area will be identified and focal subject areas for future research will be suggested as well. PMID:24730276

  2. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation †

    PubMed Central

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-01-01

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities is targeted. PMID:28287448

  3. Customizing cell signaling using engineered genetic logic circuits.

    PubMed

    Wang, Baojun; Buck, Martin

    2012-08-01

    Cells live in an ever-changing environment and continuously sense, process and react to environmental signals using their inherent signaling and gene regulatory networks. Recently, there have been great advances on rewiring the native cell signaling and gene networks to program cells to sense multiple noncognate signals and integrate them in a logical manner before initiating a desired response. Here, we summarize the current state-of-the-art of engineering synthetic genetic logic circuits to customize cellular signaling behaviors, and discuss their promising applications in biocomputing, environmental, biotechnological and biomedical areas as well as the remaining challenges in this growing field. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Software and Algorithms for Biomedical Image Data Processing and Visualization

    NASA Technical Reports Server (NTRS)

    Talukder, Ashit; Lambert, James; Lam, Raymond

    2004-01-01

    A new software equipped with novel image processing algorithms and graphical-user-interface (GUI) tools has been designed for automated analysis and processing of large amounts of biomedical image data. The software, called PlaqTrak, has been specifically used for analysis of plaque on teeth of patients. New algorithms have been developed and implemented to segment teeth of interest from surrounding gum, and a real-time image-based morphing procedure is used to automatically overlay a grid onto each segmented tooth. Pattern recognition methods are used to classify plaque from surrounding gum and enamel, while ignoring glare effects due to the reflection of camera light and ambient light from enamel regions. The PlaqTrak system integrates these components into a single software suite with an easy-to-use GUI (see Figure 1) that allows users to do an end-to-end run of a patient's record, including tooth segmentation of all teeth, grid morphing of each segmented tooth, and plaque classification of each tooth image. The automated and accurate processing of the captured images to segment each tooth [see Figure 2(a)] and then detect plaque on a tooth-by-tooth basis is a critical component of the PlaqTrak system to do clinical trials and analysis with minimal human intervention. These features offer distinct advantages over other competing systems that analyze groups of teeth or synthetic teeth. PlaqTrak divides each segmented tooth into eight regions using an advanced graphics morphing procedure [see results on a chipped tooth in Figure 2(b)], and a pattern recognition classifier is then used to locate plaque [red regions in Figure 2(d)] and enamel regions. The morphing allows analysis within regions of teeth, thereby facilitating detailed statistical analysis such as the amount of plaque present on the biting surfaces on teeth. This software system is applicable to a host of biomedical applications, such as cell analysis and life detection, or robotic applications, such

  5. Signal processing: opportunities for superconductive circuits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ralston, R.W.

    1985-03-01

    Prime motivators in the evolution of increasingly sophisticated communication and detection systems are the needs for handling ever wider signal bandwidths and higher data-processing speeds. These same needs drive the development of electronic device technology. Until recently the superconductive community has been tightly focused on digital devices for high speed computers. The purpose of this paper is to describe opportunities and challenges which exist for both analog and digital devices in a less familiar area, that of wideband signal processing. The function and purpose of analog signal-processing components, including matched filters, correlators and Fourier transformers, will be described and examples of superconductive implementations given. A canonic signal-processing system is then configured using these components and digital output circuits to highlight the important issues of dynamic range, accuracy and equivalent computation rate. (Reprints)

  6. Disambiguating ambiguous biomedical terms in biomedical narrative text: an unsupervised method.

    PubMed

    Liu, H; Lussier, Y A; Friedman, C

    2001-08-01

    With the growing use of Natural Language Processing (NLP) techniques for information extraction and concept indexing in the biomedical domain, a method that quickly and efficiently assigns the correct sense of an ambiguous biomedical term in a given context is needed concurrently. The current status of word sense disambiguation (WSD) in the biomedical domain is that handcrafted rules are used based on contextual material. The disadvantages of this approach are (i) generating WSD rules manually is a time-consuming and tedious task, (ii) maintenance of rule sets becomes increasingly difficult over time, and (iii) handcrafted rules are often incomplete and perform poorly in new domains comprised of specialized vocabularies and different genres of text. This paper presents a two-phase unsupervised method to build a WSD classifier for an ambiguous biomedical term W. The first phase automatically creates a sense-tagged corpus for W, and the second phase derives a classifier for W using the derived sense-tagged corpus as a training set. A formative experiment was performed, which demonstrated that classifiers trained on the derived sense-tagged corpora achieved an overall accuracy of about 97%, with greater than 90% accuracy for each individual ambiguous term.

  7. Data-Driven Sampling Matrix Boolean Optimization for Energy-Efficient Biomedical Signal Acquisition by Compressive Sensing.

    PubMed

    Wang, Yuhao; Li, Xin; Xu, Kai; Ren, Fengbo; Yu, Hao

    2017-04-01

    Compressive sensing is widely used in biomedical applications, and the sampling matrix plays a critical role on both quality and power consumption of signal acquisition. It projects a high-dimensional vector of data into a low-dimensional subspace by matrix-vector multiplication. An optimal sampling matrix can ensure accurate data reconstruction and/or high compression ratio. Most existing optimization methods can only produce real-valued embedding matrices that result in large energy consumption during data acquisition. In this paper, we propose an efficient method that finds an optimal Boolean sampling matrix in order to reduce the energy consumption. Compared to random Boolean embedding, our data-driven Boolean sampling matrix can improve the image recovery quality by 9 dB. Moreover, in terms of sampling hardware complexity, it reduces the energy consumption by 4.6× and the silicon area by 1.9× over the data-driven real-valued embedding.
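    As an illustration of the measurement model above (and not of the authors' Boolean optimization method), the sketch below applies a random 0/1 sampling matrix to a sparse vector and recovers it with a generic orthogonal matching pursuit; the dimensions, sparsity level and the random matrix are assumptions standing in for the optimized data-driven matrix described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse signal: n samples, only k nonzero coefficients.
n, m, k = 256, 64, 5
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)

# Boolean sampling matrix (0/1 entries) -- a random stand-in for the
# optimized data-driven matrix proposed in the paper.
Phi = rng.integers(0, 2, size=(m, n)).astype(float)
y = Phi @ x                      # compressive measurements

def omp(A, y, k):
    """Generic orthogonal matching pursuit for k-sparse recovery."""
    residual, support = y.copy(), []
    norms = np.linalg.norm(A, axis=0) + 1e-12
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual) / norms))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(Phi, y, k)
print("relative reconstruction error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```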

  8. Affective assessment of computer users based on processing the pupil diameter signal.

    PubMed

    Ren, Peng; Barreto, Armando; Gao, Ying; Adjouadi, Malek

    2011-01-01

    Detecting affective changes of computer users is a current challenge in human-computer interaction, which is being addressed with the help of biomedical engineering concepts. This article presents a new approach to recognize the affective state ("relaxation" vs. "stress") of a computer user from analysis of his/her pupil diameter variations caused by sympathetic activation. Wavelet denoising and Kalman filtering methods are first used to remove abrupt changes in the raw Pupil Diameter (PD) signal. Then three features are extracted from the preprocessed PD signal for the affective state classification. Finally, a random tree classifier is implemented, achieving an accuracy of 86.78%. In these experiments the Eye Blink Frequency (EBF) is also recorded and used for affective state classification, but the results show that the PD is a more promising physiological signal for affective assessment.
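    A minimal sketch of one preprocessing stage mentioned above: a scalar random-walk Kalman filter smoothing a noisy pupil-diameter trace. The noise variances, the simulated trace and the blink artifact are illustrative assumptions, not values or data from the study.

```python
import numpy as np

def kalman_smooth(z, q=1e-4, r=1e-2):
    """Scalar random-walk Kalman filter; the state is the smoothed pupil diameter.
    q: assumed process-noise variance, r: assumed measurement-noise variance."""
    x_hat = np.zeros_like(z)
    x, p = z[0], 1.0
    for i, meas in enumerate(z):
        p = p + q                      # predict
        k = p / (p + r)                # Kalman gain
        x = x + k * (meas - x)         # update with the new measurement
        p = (1.0 - k) * p
        x_hat[i] = x
    return x_hat

# Illustrative pupil-diameter trace (mm) with noise and a simulated blink artifact.
t = np.linspace(0, 60, 3000)
pd = 3.5 + 0.3 * np.sin(2 * np.pi * 0.05 * t) + 0.05 * np.random.randn(t.size)
pd[500:505] = 2.0                      # abrupt drop caused by a blink
smoothed = kalman_smooth(pd)
```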

  9. Boronic acid recognition of non-interacting carbohydrates for biomedical applications: increasing fluorescence signals of minimally interacting aldoses and sucralose†

    PubMed Central

    Resendez, Angel; Halim, Md Abdul; Singh, Jasmeet; Webb, Dominic-Luc

    2017-01-01

    To address carbohydrates that are commonly used in biomedical applications with low binding affinities for boronic acid based detection systems, two chemical modification methods were utilized to increase sensitivity. Modified carbohydrates were analyzed using a two component fluorescent probe based on boronic acid-appended viologen–HPTS (4,4′-o-BBV). Carbohydrates normally giving poor signals (fucose, l-rhamnose, xylose) were subjected to sodium borohydride (NaBH4) reduction in ambient conditions for 1 h yielding the corresponding sugar alcohols from fucose, l-rhamnose and xylose in essentially quantitative yields. Compared to original aldoses, apparent binding affinities were increased 4–25-fold. The chlorinated sweetener and colon permeability marker sucralose (Splenda), otherwise undetectable by boronic acids, was dechlorinated to a detectable derivative by reactive oxygen and hydroxide intermediates by the Fenton reaction or by H2O2 and UV light. This method is specific to sucralose as other common sugars, such as sucrose, do not contain any carbon-chlorine bonds. Significant fluorescence response was obtained for chemically modified sucralose with the 4,4′-o-BBV–HPTS probe system. This proof of principle can be applied to biomedical applications, such as gut permeability, malabsorption, etc. PMID:29130464

  10. Boronic acid recognition of non-interacting carbohydrates for biomedical applications: increasing fluorescence signals of minimally interacting aldoses and sucralose.

    PubMed

    Resendez, Angel; Halim, Md Abdul; Singh, Jasmeet; Webb, Dominic-Luc; Singaram, Bakthan

    2017-11-22

    To address carbohydrates that are commonly used in biomedical applications with low binding affinities for boronic acid based detection systems, two chemical modification methods were utilized to increase sensitivity. Modified carbohydrates were analyzed using a two component fluorescent probe based on boronic acid-appended viologen-HPTS (4,4'-o-BBV). Carbohydrates normally giving poor signals (fucose, l-rhamnose, xylose) were subjected to sodium borohydride (NaBH4) reduction in ambient conditions for 1 h yielding the corresponding sugar alcohols from fucose, l-rhamnose and xylose in essentially quantitative yields. Compared to original aldoses, apparent binding affinities were increased 4-25-fold. The chlorinated sweetener and colon permeability marker sucralose (Splenda), otherwise undetectable by boronic acids, was dechlorinated to a detectable derivative by reactive oxygen and hydroxide intermediates by the Fenton reaction or by H2O2 and UV light. This method is specific to sucralose as other common sugars, such as sucrose, do not contain any carbon-chlorine bonds. Significant fluorescence response was obtained for chemically modified sucralose with the 4,4'-o-BBV-HPTS probe system. This proof of principle can be applied to biomedical applications, such as gut permeability, malabsorption, etc.

  11. Signal processing in ultrasound. [for diagnostic medicine

    NASA Technical Reports Server (NTRS)

    Le Croissette, D. H.; Gammell, P. M.

    1978-01-01

    Signal is the term used to denote the characteristic in the time or frequency domain of the probing energy of the system. Processing of this signal in diagnostic ultrasound occurs as the signal travels through the ultrasonic and electrical sections of the apparatus. The paper discusses current signal processing methods, postreception processing, display devices, real-time imaging, and quantitative measurements in noninvasive cardiology. The possibility of using deconvolution in a single transducer system is examined, and some future developments using digital techniques are outlined.

  12. Optical Profilometers Using Adaptive Signal Processing

    NASA Technical Reports Server (NTRS)

    Hall, Gregory A.; Youngquist, Robert; Mikhael, Wasfy

    2006-01-01

    A method of adaptive signal processing has been proposed as the basis of a new generation of interferometric optical profilometers for measuring surfaces. The proposed profilometers would be portable, hand-held units. Sizes could be thus reduced because the adaptive-signal-processing method would make it possible to substitute lower-power coherent light sources (e.g., laser diodes) for white light sources and would eliminate the need for most of the optical components of current white-light profilometers. The adaptive-signal-processing method would make it possible to attain scanning ranges of the order of decimeters in the proposed profilometers.

  13. A Suggested Model for Building Robust Biomedical Implants Registries.

    PubMed

    Aloufi, Bader; Alshagathrah, Fahad; Househ, Mowafa

    2017-01-01

    Registries are an essential source of information for clinical and non-clinical decision-makers because they provide evidence for post-market clinical follow-up and early detection of safety signals for biomedical implants. Yet, many of today's biomedical implants registries are facing a variety of challenges relating to a poorly designed dataset, the reliability of inputted data and low clinician and patient participation. The purpose of this paper is to present a best practice model for the implementation and use of biomedical implants registries to monitor the safety and effectiveness of implantable medical devices. Based on a literature review and an analysis of multiple national relevant registries, we identified six factors that address contemporary challenges and are believed to be the keys for building a successful biomedical implants registry, which include: sustainable development, international comparability, data reliability, purposeful design, ease of patient participation, and collaborative development at the national level.

  14. siGnum: graphical user interface for EMG signal analysis.

    PubMed

    Kaur, Manvinder; Mathur, Shilpi; Bhatia, Dinesh; Verma, Suresh

    2015-01-01

    Electromyography (EMG) signals that represent the electrical activity of muscles can be used for various clinical and biomedical applications. These are complicated and highly varying signals that are dependent on anatomical location and physiological properties of the muscles. EMG signals acquired from the muscles require advanced methods for detection, decomposition and processing. This paper proposes a novel Graphical User Interface (GUI), siGnum, developed in MATLAB, that applies efficient and effective techniques to the processing of raw EMG signals and decomposes them in a simpler manner. It could be used independently of MATLAB software by employing a deploy tool. This would enable researchers to gain a good understanding of EMG signals and the analysis procedures that can be utilized for more powerful, flexible and efficient applications in the near future.

  15. Monitoring biomedical literature for post-market safety purposes by analyzing networks of text-based coded information.

    PubMed

    Botsis, Taxiarchis; Foster, Matthew; Kreimeyer, Kory; Pandey, Abhishek; Forshee, Richard

    2017-01-01

    Literature review is critical but time-consuming in the post-market surveillance of medical products. We focused on the safety signal of intussusception after the vaccination of infants with the Rotashield Vaccine in 1999 and retrieved all PubMed abstracts for rotavirus vaccines published after January 1, 1998. We used the Event-based Text-mining of Health Electronic Records system, the MetaMap tool, and the National Center for Biomedical Ontologies Annotator to process the abstracts and generate coded terms stamped with the date of publication. Data were analyzed in the Pattern-based and Advanced Network Analyzer for Clinical Evaluation and Assessment to evaluate the intussusception-related findings before and after the release of the new rotavirus vaccines in 2006. The tight connection of intussusception with the historical signal in the first period and the absence of any safety concern for the new vaccines in the second period were verified. We demonstrated the feasibility for semi-automated solutions that may assist medical reviewers in monitoring biomedical literature.

  16. Optical signal processing

    NASA Astrophysics Data System (ADS)

    Vanderlugt, A.

    1993-07-01

    A quasi-realtime adaptive processing system was used to correct the multipath distortion found in wideband digital radios. The measured power spectral density of the input signal was used to adaptively select one of eight equalization filters which reduce the residual distortion to less than 3.6 dB even for the most severe channel distortion. A related adaptive system was used for signal excision in which we removed narrowband interference from wideband signals with minimum signal distortion. An 8x8 acousto-optic switch in a multimode fiber-optic system was built. Insertion loss is approximately 2-4 dB, signal-to-crosstalk ratio is better than 25 dB, and the reconfiguration time is 880 nsec. Short pulses were detected by using the Fresnel transform. Pulses as short as the theoretical limit of 20 nanoseconds were detected for this system, and separated by as little as 60 nanoseconds or by as much as 17 nanoseconds. All possible acousto-optic scanning configurations were considered and classified into four basic types. A consistent set of design relationships for each of the scanning configurations was developed and presented in both tabular and graphic forms from which a preliminary design is obtained.

  17. EXACT2: the semantics of biomedical protocols

    PubMed Central

    2014-01-01

    Background The reliability and reproducibility of experimental procedures is a cornerstone of scientific practice. There is a pressing technological need for the better representation of biomedical protocols to enable other agents (human or machine) to better reproduce results. A framework that ensures that all information required for the replication of experimental protocols is captured is essential to achieving reproducibility. Methods We have developed the ontology EXACT2 (EXperimental ACTions) that is designed to capture the full semantics of biomedical protocols required for their reproducibility. To construct EXACT2 we manually inspected hundreds of published and commercial biomedical protocols from several areas of biomedicine. After establishing a clear pattern for extracting the required information we utilized text-mining tools to translate the protocols into a machine amenable format. We have verified the utility of EXACT2 through the successful processing of previously 'unseen' (not used for the construction of EXACT2) protocols. Results The paper reports on a fundamentally new version EXACT2 that supports the semantically-defined representation of biomedical protocols. The ability of EXACT2 to capture the semantics of biomedical procedures was verified through a text mining use case, in which EXACT2 is used as a reference model for text mining tools to identify terms pertinent to experimental actions, and their properties, in biomedical protocols expressed in natural language. An EXACT2-based framework for the translation of biomedical protocols to a machine amenable format is proposed. Conclusions The EXACT2 ontology is sufficient to record, in a machine processable form, the essential information about biomedical protocols. EXACT2 defines explicit semantics of experimental actions, and can be used by various computer applications. It can serve as a reference model for the translation of biomedical protocols in natural language into a semantically

  18. RASSP signal processing architectures

    NASA Astrophysics Data System (ADS)

    Shirley, Fred; Bassett, Bob; Letellier, J. P.

    1995-06-01

    The rapid prototyping of application specific signal processors (RASSP) program is an ARPA/tri-service effort to dramatically improve the process by which complex digital systems, particularly embedded signal processors, are specified, designed, documented, manufactured, and supported. The domain of embedded signal processing was chosen because it is important to a variety of military and commercial applications as well as for the challenge it presents in terms of complexity and performance demands. The principal effort is being performed by two major contractors, Lockheed Sanders (Nashua, NH) and Martin Marietta (Camden, NJ). For both, improvements in methodology are to be exercised and refined through the performance of individual 'Demonstration' efforts. The Lockheed Sanders' Demonstration effort is to develop an infrared search and track (IRST) processor. In addition, both contractors' results are being measured by a series of externally administered (by Lincoln Labs) six-month Benchmark programs that measure process improvement as a function of time. The first two Benchmark programs are designing and implementing a synthetic aperture radar (SAR) processor. Our demonstration team is using commercially available VME modules from Mercury Computer to assemble a multiprocessor system scalable from one to hundreds of Intel i860 microprocessors. Custom modules for the sensor interface and display driver are also being developed. This system implements either proprietary or Navy owned algorithms to perform the compute-intensive IRST function in real time in an avionics environment. Our Benchmark team is designing custom modules using commercially available processor ship sets, communication submodules, and reconfigurable logic devices. One of the modules contains multiple vector processors optimized for fast Fourier transform processing. Another module is a fiberoptic interface that accepts high-rate input data from the sensors and provides video-rate output data to a

  19. A modular framework for biomedical concept recognition

    PubMed Central

    2013-01-01

    Background Concept recognition is an essential task in biomedical information extraction, presenting several complex and unsolved challenges. The development of such solutions is typically performed in an ad-hoc manner or using general information extraction frameworks, which are not optimized for the biomedical domain and normally require the integration of complex external libraries and/or the development of custom tools. Results This article presents Neji, an open source framework optimized for biomedical concept recognition built around four key characteristics: modularity, scalability, speed, and usability. It integrates modules for biomedical natural language processing, such as sentence splitting, tokenization, lemmatization, part-of-speech tagging, chunking and dependency parsing. Concept recognition is provided through dictionary matching and machine learning with normalization methods. Neji also integrates an innovative concept tree implementation, supporting overlapped concept names and respective disambiguation techniques. The most popular input and output formats, namely Pubmed XML, IeXML, CoNLL and A1, are also supported. On top of the built-in functionalities, developers and researchers can implement new processing modules or pipelines, or use the provided command-line interface tool to build their own solutions, applying the most appropriate techniques to identify heterogeneous biomedical concepts. Neji was evaluated against three gold standard corpora with heterogeneous biomedical concepts (CRAFT, AnEM and NCBI disease corpus), achieving high performance results on named entity recognition (F1-measure for overlap matching: species 95%, cell 92%, cellular components 83%, gene and proteins 76%, chemicals 65%, biological processes and molecular functions 63%, disorders 85%, and anatomical entities 82%) and on entity normalization (F1-measure for overlap name matching and correct identifier included in the returned list of identifiers: species 88

  20. Poisson pre-processing of nonstationary photonic signals: Signals with equality between mean and variance.

    PubMed

    Poplová, Michaela; Sovka, Pavel; Cifra, Michal

    2017-01-01

    Photonic signals are broadly exploited in communication and sensing and they typically exhibit Poisson-like statistics. In a common scenario where the intensity of the photonic signals is low and one needs to remove a nonstationary trend of the signals for any further analysis, one faces an obstacle: due to the dependence between the mean and variance typical for a Poisson-like process, information about the trend remains in the variance even after the trend has been subtracted, possibly yielding artifactual results in further analyses. Commonly available detrending or normalizing methods cannot cope with this issue. To alleviate this issue we developed a suitable pre-processing method for the signals that originate from a Poisson-like process. In this paper, a Poisson pre-processing method for nonstationary time series with Poisson distribution is developed and tested on computer-generated model data and experimental data of chemiluminescence from human neutrophils and mung seeds. The presented method transforms a nonstationary Poisson signal into a stationary signal with a Poisson distribution while preserving the type of photocount distribution and phase-space structure of the signal. The importance of the suggested pre-processing method is shown in Fano factor and Hurst exponent analysis of both computer-generated model signals and experimental photonic signals. It is demonstrated that our pre-processing method is superior to standard detrending-based methods whenever further signal analysis is sensitive to variance of the signal.
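    The mean-variance coupling that motivates this pre-processing can be reproduced in a few lines: subtracting a trend from a Poisson-distributed photocount record leaves the trend visible in the windowed variance, because the variance of a Poisson process equals its mean. The sketch below only illustrates that problem; it is not the authors' pre-processing transform, and the rate trend and window size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# Nonstationary Poisson photocount signal: slowly rising mean rate.
n = 20000
trend = np.linspace(2.0, 20.0, n)
counts = rng.poisson(trend)

# Naive detrending: subtract the (here perfectly known) trend.
detrended = counts - trend

# The windowed variance still follows the trend, since var = mean for Poisson data.
win = 500
var_per_window = detrended[: n - n % win].reshape(-1, win).var(axis=1)
print(var_per_window[:5])   # roughly 2 counts^2 at the start
print(var_per_window[-5:])  # roughly 20 counts^2 at the end -> trend leaks into the variance
```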

  1. Poisson pre-processing of nonstationary photonic signals: Signals with equality between mean and variance

    PubMed Central

    Poplová, Michaela; Sovka, Pavel

    2017-01-01

    Photonic signals are broadly exploited in communication and sensing and they typically exhibit Poisson-like statistics. In a common scenario where the intensity of the photonic signals is low and one needs to remove a nonstationary trend of the signals for any further analysis, one faces an obstacle: due to the dependence between the mean and variance typical for a Poisson-like process, information about the trend remains in the variance even after the trend has been subtracted, possibly yielding artifactual results in further analyses. Commonly available detrending or normalizing methods cannot cope with this issue. To alleviate this issue we developed a suitable pre-processing method for the signals that originate from a Poisson-like process. In this paper, a Poisson pre-processing method for nonstationary time series with Poisson distribution is developed and tested on computer-generated model data and experimental data of chemiluminescence from human neutrophils and mung seeds. The presented method transforms a nonstationary Poisson signal into a stationary signal with a Poisson distribution while preserving the type of photocount distribution and phase-space structure of the signal. The importance of the suggested pre-processing method is shown in Fano factor and Hurst exponent analysis of both computer-generated model signals and experimental photonic signals. It is demonstrated that our pre-processing method is superior to standard detrending-based methods whenever further signal analysis is sensitive to variance of the signal. PMID:29216207

  2. Packet loss mitigation for biomedical signals in healthcare telemetry.

    PubMed

    Garudadri, Harinath; Baheti, Pawan K

    2009-01-01

    In this work, we propose an effective application layer solution for packet loss mitigation in the context of Body Sensor Networks (BSN) and healthcare telemetry. Packet losses occur due to many reasons including excessive path loss, interference from other wireless systems, handoffs, congestion, system loading, etc. A call for action is in order, as packet losses can have extremely adverse impact on many healthcare applications relying on BAN and WAN technologies. Our approach for packet loss mitigation is based on Compressed Sensing (CS), an emerging signal processing concept, wherein significantly fewer sensor measurements than that suggested by Shannon/Nyquist sampling theorem can be used to recover signals with arbitrarily fine resolution. We present simulation results demonstrating graceful degradation of performance with increasing packet loss rate. We also compare the proposed approach with retransmissions. The CS based packet loss mitigation approach was found to maintain up to 99% beat-detection accuracy at packet loss rates of 20%, with a constant latency of less than 2.5 seconds.

  3. Time Resolved Microfluorescence In Biomedical Diagnosis

    NASA Astrophysics Data System (ADS)

    Schneckenburger, Herbert

    1985-12-01

    A measuring system combining subnanosecond laser-induced fluorescence with microscopic signal detection was installed and used for diverse projects in the biomedical and environmental fields. These projects range from tumor diagnosis and enzymatic analysis to measurements of the activity of methanogenic bacteria, which affect biogas production and waste water cleaning. The advantages of this method and its practical applicability are discussed.

  4. Time-Resolved Microfluorescence In Biomedical Diagnosis

    NASA Astrophysics Data System (ADS)

    Schneckenburger, Herbert

    1985-02-01

    A measuring system combining subnanosecond laser-induced fluorescence with microscopic signal detection was installed and used for diverse projects in the biomedical and environmental fields. These projects range from tumor diagnosis and enzymatic analysis to measurements of the activity of methanogenic bacteria, which affect biogas production and waste water cleaning. The advantages of this method and its practical applicability are discussed.

  5. A corpus of full-text journal articles is a robust evaluation tool for revealing differences in performance of biomedical natural language processing tools

    PubMed Central

    2012-01-01

    Background We introduce the linguistic annotation of a corpus of 97 full-text biomedical publications, known as the Colorado Richly Annotated Full Text (CRAFT) corpus. We further assess the performance of existing tools for performing sentence splitting, tokenization, syntactic parsing, and named entity recognition on this corpus. Results Many biomedical natural language processing systems demonstrated large differences between their previously published results and their performance on the CRAFT corpus when tested with the publicly available models or rule sets. Trainable systems differed widely with respect to their ability to build high-performing models based on this data. Conclusions The finding that some systems were able to train high-performing models based on this corpus is additional evidence, beyond high inter-annotator agreement, that the quality of the CRAFT corpus is high. The overall poor performance of various systems indicates that considerable work needs to be done to enable natural language processing systems to work well when the input is full-text journal articles. The CRAFT corpus provides a valuable resource to the biomedical natural language processing community for evaluation and training of new models for biomedical full text publications. PMID:22901054

  6. Invariance algorithms for processing NDE signals

    NASA Astrophysics Data System (ADS)

    Mandayam, Shreekanth; Udpa, Lalita; Udpa, Satish S.; Lord, William

    1996-11-01

    Signals that are obtained in a variety of nondestructive evaluation (NDE) processes capture information not only about the characteristics of the flaw, but also reflect variations in the specimen's material properties. Such signal changes may be viewed as anomalies that could obscure defect related information. An example of this situation occurs during in-line inspection of gas transmission pipelines. The magnetic flux leakage (MFL) method is used to conduct noninvasive measurements of the integrity of the pipe-wall. The MFL signals contain information both about the permeability of the pipe-wall and the dimensions of the flaw. Similar operational effects can be found in other NDE processes. This paper presents algorithms to render NDE signals invariant to selected test parameters, while retaining defect related information. Wavelet transform based neural network techniques are employed to develop the invariance algorithms. The invariance transformation is shown to be a necessary pre-processing step for subsequent defect characterization and visualization schemes. Results demonstrating the successful application of the method are presented.

  7. Bistatic SAR: Signal Processing and Image Formation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wahl, Daniel E.; Yocky, David A.

    This report describes the significant processing steps that were used to take the raw recorded digitized signals from the bistatic synthetic aperture RADAR (SAR) hardware built for the NCNS Bistatic SAR project to a final bistatic SAR image. In general, the processing steps herein are applicable to bistatic SAR signals that include the direct-path signal and the reflected signal. The steps include preprocessing, data extraction to form a phase history, and finally, image formation. Various plots and values are shown at most steps to illustrate the processing for a bistatic COSMO SkyMed collection gathered on June 10, 2013 on Kirtland Air Force Base, New Mexico.

  8. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig.

    PubMed

    Sahoo, Satya S; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A; Lhatoo, Samden D

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to rapid increase in the volume, rate of data generation, and variety of neuroscience data. This "neuroscience Big data" represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volume of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability-the ability to efficiently process increasing volumes of data; (b) Adaptability-the toolkit can be deployed across different computing configurations; and (c) Ease of programming-the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 Data nodes. The evaluation results demonstrate that the toolkit

  9. Psychoacoustic processing of test signals

    NASA Astrophysics Data System (ADS)

    Kadlec, Frantisek

    2003-10-01

    For the quantitative evaluation of electroacoustic system properties and for psychoacoustic testing it is possible to utilize harmonic signals with fixed frequency, sweeping signals, random signals or their combination. This contribution deals with the design of various test signals with emphasis on audible perception. During the digital generation of signals, some additional undesirable frequency components and noise are produced, which are dependent on signal amplitude and sampling frequency. A mathematical analysis describes the origin of this distortion. By proper selection of signal frequency and amplitude it is possible to minimize those undesirable components. An additional step is to minimize the audible perception of this signal distortion by the application of additional noise (dither). For signals intended for listening tests a dither with triangular or Gaussian probability density function was found to be most effective. Signals modified this way may be further improved by the application of noise shaping, which transposes those undesirable products into frequency regions where they are perceived less, according to psychoacoustic principles. The efficiency of individual processing steps was confirmed both by measurements and by listening tests. [Work supported by the Czech Science Foundation.]
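    A minimal sketch of the dithering step described above: triangular-PDF (TPDF) dither spanning one LSB is added before requantization so that the quantization error is decorrelated from the signal. The bit depth, test-tone frequency and dither span are illustrative assumptions, and the sketch omits the noise-shaping stage.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize_with_tpdf_dither(x, bits=8):
    """Requantize a signal in [-1, 1) to `bits` bits with +/- 1 LSB TPDF dither."""
    lsb = 2.0 / (2 ** bits)
    # Sum of two independent uniforms -> triangular PDF spanning +/- 1 LSB.
    dither = (rng.uniform(-0.5, 0.5, x.shape) + rng.uniform(-0.5, 0.5, x.shape)) * lsb
    return np.round((x + dither) / lsb) * lsb

fs = 48000
t = np.arange(fs) / fs
tone = 0.25 * np.sin(2 * np.pi * 997 * t)   # 997 Hz test tone
quantized = quantize_with_tpdf_dither(tone, bits=8)
```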

  10. Signal quality and Bayesian signal processing in neurofeedback based on real-time fMRI.

    PubMed

    Koush, Yury; Zvyagintsev, Mikhail; Dyck, Miriam; Mathiak, Krystyna A; Mathiak, Klaus

    2012-01-02

    Real-time fMRI allows analysis and visualization of the brain activity online, i.e. within one repetition time. It can be used in neurofeedback applications where subjects attempt to control an activation level in a specified region of interest (ROI) of their brain. The signal derived from the ROI is contaminated with noise and artifacts, namely with physiological noise from breathing and heart beat, scanner drift, motion-related artifacts and measurement noise. We developed a Bayesian approach to reduce noise and to remove artifacts in real-time using a modified Kalman filter. The system performs several signal processing operations: subtraction of constant and low-frequency signal components, spike removal and signal smoothing. Quantitative feedback signal quality analysis was used to estimate the quality of the neurofeedback time series and performance of the applied signal processing on different ROIs. The signal-to-noise ratio (SNR) across the entire time series and the group event-related SNR (eSNR) were significantly higher for the processed time series in comparison to the raw data. Applied signal processing improved the t-statistic increasing the significance of blood oxygen level-dependent (BOLD) signal changes. Accordingly, the contrast-to-noise ratio (CNR) of the feedback time series was improved as well. In addition, the data revealed increase of localized self-control across feedback sessions. The new signal processing approach provided reliable neurofeedback, performed precise artifacts removal, reduced noise, and required minimal manual adjustments of parameters. Advanced and fast online signal processing algorithms considerably increased the quality as well as the information content of the control signal which in turn resulted in higher contingency in the neurofeedback loop. Copyright © 2011 Elsevier Inc. All rights reserved.

  11. Process dissociation and mixture signal detection theory.

    PubMed

    DeCarlo, Lawrence T

    2008-11-01

    The process dissociation procedure was developed in an attempt to separate different processes involved in memory tasks. The procedure naturally lends itself to a formulation within a class of mixture signal detection models. The dual process model is shown to be a special case. The mixture signal detection model is applied to data from a widely analyzed study. The results suggest that a process other than recollection may be involved in the process dissociation procedure.
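    For orientation, the standard dual-process equations that the mixture signal detection model generalizes are I = R + (1 - R)A for inclusion and E = (1 - R)A for exclusion, which can be solved for the controlled (R) and automatic (A) estimates. The proportions below are illustrative, not data from the study analyzed in the paper.

```python
def process_dissociation(p_inclusion, p_exclusion):
    """Classic dual-process estimates from the process dissociation procedure:
    I = R + (1 - R) * A   (inclusion condition)
    E = (1 - R) * A       (exclusion condition)
    """
    R = p_inclusion - p_exclusion                              # controlled (recollection) estimate
    A = p_exclusion / (1.0 - R) if R < 1 else float("nan")     # automatic estimate
    return R, A

# Illustrative response proportions.
R, A = process_dissociation(p_inclusion=0.70, p_exclusion=0.30)
print(f"recollection R = {R:.2f}, automatic component A = {A:.2f}")
```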

  12. Designer cell signal processing circuits for biotechnology

    PubMed Central

    Bradley, Robert W.; Wang, Baojun

    2015-01-01

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  13. RysannMD: A biomedical semantic annotator balancing speed and accuracy.

    PubMed

    Cuzzola, John; Jovanović, Jelena; Bagheri, Ebrahim

    2017-07-01

    Recently, both researchers and practitioners have explored the possibility of semantically annotating large and continuously evolving collections of biomedical texts such as research papers, medical reports, and physician notes in order to enable their efficient and effective management and use in clinical practice or research laboratories. Such annotations can be automatically generated by biomedical semantic annotators - tools that are specifically designed for detecting and disambiguating biomedical concepts mentioned in text. The biomedical community has already presented several solid automated semantic annotators. However, the existing tools are either strong in their disambiguation capacity, i.e., the ability to identify the correct biomedical concept for a given piece of text among several candidate concepts, or they excel in their processing time, i.e., work very efficiently, but none of the semantic annotation tools reported in the literature has both of these qualities. In this paper, we present RysannMD (Ryerson Semantic Annotator for Medical Domain), a biomedical semantic annotation tool that strikes a balance between processing time and performance while disambiguating biomedical terms. In other words, RysannMD provides reasonable disambiguation performance when choosing the right sense for a biomedical term in a given context, and does that in a reasonable time. To examine how RysannMD stands with respect to the state of the art biomedical semantic annotators, we have conducted a series of experiments using standard benchmarking corpora, including both gold and silver standards, and four modern biomedical semantic annotators, namely cTAKES, MetaMap, NOBLE Coder, and Neji. The annotators were compared with respect to the quality of the produced annotations measured against gold and silver standards using precision, recall, and F1 measure and speed, i.e., processing time. In the experiments, RysannMD achieved the best median F1 measure across the

  14. Process Dissociation and Mixture Signal Detection Theory

    ERIC Educational Resources Information Center

    DeCarlo, Lawrence T.

    2008-01-01

    The process dissociation procedure was developed in an attempt to separate different processes involved in memory tasks. The procedure naturally lends itself to a formulation within a class of mixture signal detection models. The dual process model is shown to be a special case. The mixture signal detection model is applied to data from a widely…

  15. Novel sonar signal processing tool using Shannon entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quazi, A.H.

    1996-06-01

    Traditionally, conventional signal processing extracts information from sonar signals using amplitude, signal energy or frequency domain quantities obtained using spectral analysis techniques. The objective is to investigate an alternate approach which is entirely different from that of traditional signal processing. This alternate approach is to utilize the Shannon entropy as a tool for the processing of sonar signals with emphasis on detection, classification, and localization leading to superior sonar system performance. Traditionally, sonar signals are processed coherently, semi-coherently, and incoherently, depending upon the a priori knowledge of the signals and noise. Here, the detection, classification, and localization technique will be based on the concept of the entropy of the random process. Under a constant energy constraint, the entropy of a received process bearing a finite number of sample points is maximum when hypothesis H0 (that the received process consists of noise alone) is true and decreases when a correlated signal is present (H1). Therefore, the strategy used for detection is: (I) calculate the entropy of the received data; then, (II) compare the entropy with the maximum value; and, finally, (III) make a decision: H1 is assumed if the difference is large compared to a pre-assigned threshold, and H0 is assumed otherwise. The test statistic is the difference between the entropies under H0 and H1. Here, we show simulated results for detecting stationary and non-stationary signals in noise, and results on detection of defects in a Plexiglas bar using an ultrasonic experiment conducted by Hughes. © 1996 American Institute of Physics.
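    A minimal sketch of the detection strategy outlined in steps (I)-(III) above, using an assumed histogram-based entropy estimator and a unit-variance normalization to stand in for the constant energy constraint; the record length, binning and test tone are illustrative choices, not values from the paper.

```python
import numpy as np

def entropy_bits(x, edges):
    """Histogram estimate of the Shannon entropy (in bits) over fixed bin edges."""
    hist, _ = np.histogram(x, bins=edges)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
n = 8192
edges = np.linspace(-5, 5, 65)          # fixed binning so both records are comparable

noise = rng.normal(size=n)                                                   # H0: noise alone
signal = 1.5 * np.sin(2 * np.pi * 0.01 * np.arange(n)) + rng.normal(size=n)  # H1: signal + noise

# Constant-energy constraint: normalize both records to unit variance.
noise /= noise.std()
signal /= signal.std()

h0, h1 = entropy_bits(noise, edges), entropy_bits(signal, edges)
print(f"H0 entropy {h0:.3f} bits, H1 entropy {h1:.3f} bits")
# Under the energy constraint the Gaussian (noise-only) record maximizes entropy,
# so a drop below the H0 level indicates the presence of a correlated signal.
```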

  16. NASA Biomedical Informatics Capabilities and Needs

    NASA Technical Reports Server (NTRS)

    Johnson-Throop, Kathy A.

    2009-01-01

    To improve on-orbit clinical capabilities by developing and providing operational support for intelligent, robust, reliable, and secure, enterprise-wide and comprehensive health care and biomedical informatics systems with increasing levels of autonomy, for use on Earth, low Earth orbit & exploration class missions. Biomedical Informatics is an emerging discipline that has been defined as the study, invention, and implementation of structures and algorithms to improve communication, understanding and management of medical information. The end objective of biomedical informatics is the coalescing of data, knowledge, and the tools necessary to apply that data and knowledge in the decision-making process, at the time and place that a decision needs to be made.

  17. A survey of signal processing algorithms in brain-computer interfaces based on electrical brain signals.

    PubMed

    Bashashati, Ali; Fatourechi, Mehrdad; Ward, Rabab K; Birch, Gary E

    2007-06-01

    Brain-computer interfaces (BCIs) aim at providing a non-muscular channel for sending commands to the external world using the electroencephalographic activity or other electrophysiological measures of the brain function. An essential factor in the successful operation of BCI systems is the methods used to process the brain signals. In the BCI literature, however, there is no comprehensive review of the signal processing techniques used. This work presents the first such comprehensive survey of all BCI designs using electrical signal recordings published prior to January 2006. Detailed results from this survey are presented and discussed. The following key research questions are addressed: (1) what are the key signal processing components of a BCI, (2) what signal processing algorithms have been used in BCIs and (3) which signal processing techniques have received more attention?

  18. Empirical data on corpus design and usage in biomedical natural language processing.

    PubMed

    Cohen, K Bretonnel; Fox, Lynne; Ogren, Philip V; Hunter, Lawrence

    2005-01-01

    This paper describes the design of six publicly available biomedical corpora. We then present usage data for the six corpora. We show that corpora that are carefully annotated with respect to structural and linguistic characteristics and that are distributed in standard formats are more widely used than corpora that are not. These findings have implications for the design of the next generation of biomedical corpora.

  19. Piezoelectric extraction of ECG signal

    NASA Astrophysics Data System (ADS)

    Ahmad, Mahmoud Al

    2016-11-01

    The monitoring and early detection of abnormalities or variations in the cardiac cycle functionality are very critical practices and have significant impact on the prevention of heart diseases and their associated complications. Currently, in the field of biomedical engineering, there is a growing need for devices capable of measuring and monitoring a wide range of cardiac cycle parameters continuously, effectively and on a real-time basis using easily accessible and reusable probes. In this paper, the revolutionary generation and extraction of the corresponding ECG signal using a piezoelectric transducer, as an alternative to the conventional electrode-based ECG, will be discussed. The piezoelectric transducer picks up the vibrations from the heart beats and converts them into electrical output signals. To this end, piezoelectric and signal processing techniques were employed to extract the ECG-corresponding signal from the piezoelectric output voltage signal. The measured electrode-based and the extracted piezoelectric-based ECG traces are well corroborated. Their peak amplitudes and locations are well aligned with each other.
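    The paper does not spell out its extraction algorithm, so the following is only a plausible sketch of the signal-processing side: band-pass filtering a piezoelectric chest-vibration trace to a typical ECG band and locating beats. The sampling rate, cut-off frequencies, peak threshold and the synthetic input are all assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 500  # assumed sampling rate, Hz

def extract_ecg_like(piezo, fs, low=0.5, high=40.0):
    """Band-pass the piezoelectric output to an assumed 0.5-40 Hz ECG band."""
    b, a = butter(4, [low, high], btype="band", fs=fs)
    return filtfilt(b, a, piezo)

# Illustrative piezo trace: ~1 Hz beat-like pulses plus baseline drift and noise.
t = np.arange(0, 10, 1 / fs)
piezo = (np.exp(-((t % 1.0) - 0.2) ** 2 / 0.001)
         + 0.5 * np.sin(2 * np.pi * 0.2 * t)
         + 0.05 * np.random.randn(t.size))

ecg_like = extract_ecg_like(piezo, fs)
peaks, _ = find_peaks(ecg_like, distance=int(0.5 * fs), height=0.3)
print("estimated heart rate:", 60 * len(peaks) / t[-1], "bpm")
```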

  20. Signal processing method and system for noise removal and signal extraction

    DOEpatents

    Fu, Chi Yung; Petrich, Loren

    2009-04-14

    A signal processing method and system combining smooth level wavelet pre-processing together with artificial neural networks all in the wavelet domain for signal denoising and extraction. Upon receiving a signal corrupted with noise, an n-level decomposition of the signal is performed using a discrete wavelet transform to produce a smooth component and a rough component for each decomposition level. The nth-level smooth component is then inputted into a corresponding neural network pre-trained to filter out noise in that component by pattern recognition in the wavelet domain. Additional rough components, beginning at the highest level, may also be retained and inputted into corresponding neural networks pre-trained to filter out noise in those components also by pattern recognition in the wavelet domain. In any case, an inverse discrete wavelet transform is performed on the combined output from all the neural networks to recover a clean signal back in the time domain.
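    A rough sketch of the claimed pipeline (n-level discrete wavelet decomposition, per-component denoising, inverse transform). Because the patent's pre-trained neural networks are not available here, a conventional soft threshold stands in where they would sit; the wavelet choice, level and threshold rule are assumptions.

```python
import numpy as np
import pywt

def wavelet_domain_denoise(x, wavelet="db4", level=4):
    # [smooth_n, rough_n, ..., rough_1] wavelet decomposition.
    coeffs = pywt.wavedec(x, wavelet, level=level)
    # Noise scale estimated from the finest rough component.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    # Where the patent inserts pre-trained neural networks, this sketch keeps the
    # smooth component and soft-thresholds each rough component instead.
    cleaned = [coeffs[0]] + [pywt.threshold(c, 3 * sigma, mode="soft") for c in coeffs[1:]]
    # Inverse transform recovers the cleaned signal back in the time domain.
    return pywt.waverec(cleaned, wavelet)[: len(x)]

t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)
noisy = clean + 0.3 * np.random.randn(t.size)
recovered = wavelet_domain_denoise(noisy)
```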

  1. Signal processing methods for MFE plasma diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Candy, J.V.; Casper, T.; Kane, R.

    1985-02-01

    The application of various signal processing methods to extract energy storage information from plasma diamagnetism sensors during physics experiments on the Tandem Mirror Experiment-Upgrade (TMX-U) is discussed. We show how these processing techniques can be used to decrease the uncertainty in the corresponding sensor measurements. The algorithms suggested are implemented using SIG, an interactive signal processing package developed at LLNL.

  2. Cubic spline interpolation with overlapped window and data reuse for on-line Hilbert Huang transform biomedical microprocessor.

    PubMed

    Chang, Nai-Fu; Chiang, Cheng-Yi; Chen, Tung-Chien; Chen, Liang-Gee

    2011-01-01

    On-chip implementation of the Hilbert-Huang transform (HHT) has great impact on the analysis of non-linear and non-stationary biomedical signals on wearable or implantable sensors for real-time applications. Cubic spline interpolation (CSI) consumes the most computation in HHT and is the key component of the HHT processor. Traditionally, CSI in HHT is performed after the collection of a large window of samples, and the long latency violates the real-time requirement of these applications. In this work, we propose to process the incoming signals on-line with small, overlapped data windows without sacrificing the interpolation accuracy. 58% of the multiplications and 73% of the divisions in CSI are saved by reusing data between the windows.
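    A minimal sketch of the idea of on-line CSI with small, overlapped windows: the cubic-spline upper envelope (a core CSI step in EMD/HHT) is computed per window and only the central part of each window is kept, so spline edge distortion stays outside the retained region. Window and overlap sizes are illustrative, and the sketch omits the paper's arithmetic-reuse optimization.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import find_peaks

def upper_envelope(x):
    """Cubic-spline upper envelope through local maxima (a core CSI step in EMD/HHT)."""
    peaks, _ = find_peaks(x)
    if len(peaks) < 2:
        return np.full_like(x, x.max())
    return CubicSpline(peaks, x[peaks])(np.arange(len(x)))

def windowed_envelope(x, win=256, overlap=64):
    """On-line style processing: small overlapped windows, keeping only the central
    part of each window to avoid spline edge effects (sizes are illustrative)."""
    env = np.zeros_like(x)
    step = win - 2 * overlap
    for start in range(0, len(x) - win + 1, step):
        seg_env = upper_envelope(x[start:start + win])
        env[start + overlap:start + win - overlap] = seg_env[overlap:win - overlap]
    return env

t = np.linspace(0, 2, 2048)
sig = np.sin(2 * np.pi * 8 * t) * (1 + 0.3 * np.sin(2 * np.pi * 1 * t))
env = windowed_envelope(sig)   # edges outside the covered windows remain zero in this sketch
```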

  3. Biologically-based signal processing system applied to noise removal for signal extraction

    DOEpatents

    Fu, Chi Yung; Petrich, Loren I.

    2004-07-13

    The method and system described herein use a biologically-based signal processing system for noise removal and signal extraction. A wavelet transform may be used in conjunction with a neural network to imitate a biological system. The neural network may be trained using ideal data derived from physical principles or noiseless signals to learn how to remove noise from the signal.

  4. FPGA-Based Filterbank Implementation for Parallel Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Berner, Stephan; DeLeon, Phillip

    1999-01-01

    One approach to parallel digital signal processing decomposes a high bandwidth signal into multiple lower bandwidth (rate) signals by an analysis bank. After processing, the subband signals are recombined into a fullband output signal by a synthesis bank. This paper describes an implementation of the analysis and synthesis banks using field-programmable gate arrays (FPGAs).
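    A software analogue of the analysis/synthesis structure described above, reduced to a two-band Haar filterbank with perfect reconstruction; the FPGA implementation in the paper presumably uses longer filters and more channels, so this is only a structural sketch.

```python
import numpy as np

def analysis_haar(x):
    """Split a signal into low-band and high-band halves (Haar quadrature pair)."""
    x = np.asarray(x, dtype=float)
    if x.size % 2:                      # keep the example simple: even length only
        x = x[:-1]
    low = (x[0::2] + x[1::2]) / np.sqrt(2)
    high = (x[0::2] - x[1::2]) / np.sqrt(2)
    return low, high

def synthesis_haar(low, high):
    """Recombine the two subbands into the fullband signal."""
    x = np.empty(2 * low.size)
    x[0::2] = (low + high) / np.sqrt(2)
    x[1::2] = (low - high) / np.sqrt(2)
    return x

x = np.random.randn(1024)
low, high = analysis_haar(x)            # each subband runs at half the input rate
# ... per-subband processing would happen here, in parallel ...
y = synthesis_haar(low, high)
print("perfect reconstruction:", np.allclose(x, y))
```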

  5. Biomolecular filters for improved separation of output signals in enzyme logic systems applied to biomedical analysis.

    PubMed

    Halámek, Jan; Zhou, Jian; Halámková, Lenka; Bocharova, Vera; Privman, Vladimir; Wang, Joseph; Katz, Evgeny

    2011-11-15

    Biomolecular logic systems processing biochemical input signals and producing "digital" outputs in the form of YES/NO were developed for analysis of physiological conditions characteristic of liver injury, soft tissue injury, and abdominal trauma. Injury biomarkers were used as input signals for activating the logic systems. Their normal physiological concentrations were defined as the logic-0 level, while their pathologically elevated concentrations were defined as logic-1 values. Since the input concentrations applied as logic 0 and 1 values were not sufficiently different, the low and high output values (0 and 1 outputs) were separated by only a short gap, making their discrimination difficult. Coupled enzymatic reactions functioning as a biomolecular signal processing system with a built-in filter property were developed. The filter process involves a partial back-conversion of the optical-output-signal-yielding product, but only at its low concentrations, thus allowing the proper discrimination between 0 and 1 output values.

  6. Extracting biomedical events from pairs of text entities

    PubMed Central

    2015-01-01

    Background Huge amounts of electronic biomedical documents, such as molecular biology reports or genomic papers are generated daily. Nowadays, these documents are mainly available in the form of unstructured free texts, which require heavy processing for their registration into organized databases. This organization is instrumental for information retrieval, enabling to answer the advanced queries of researchers and practitioners in biology, medicine, and related fields. Hence, the massive data flow calls for efficient automatic methods of text-mining that extract high-level information, such as biomedical events, from biomedical text. The usual computational tools of Natural Language Processing cannot be readily applied to extract these biomedical events, due to the peculiarities of the domain. Indeed, biomedical documents contain highly domain-specific jargon and syntax. These documents also describe distinctive dependencies, making text-mining in molecular biology a specific discipline. Results We address biomedical event extraction as the classification of pairs of text entities into the classes corresponding to event types. The candidate pairs of text entities are recursively provided to a multiclass classifier relying on Support Vector Machines. This recursive process extracts events involving other events as arguments. Compared to joint models based on Markov Random Fields, our model simplifies inference and hence requires shorter training and prediction times along with lower memory capacity. Compared to usual pipeline approaches, our model passes over a complex intermediate problem, while making a more extensive usage of sophisticated joint features between text entities. Our method focuses on the core event extraction of the Genia task of BioNLP challenges yielding the best result reported so far on the 2013 edition. PMID:26201478

  7. Processing Electromyographic Signals to Recognize Words

    NASA Technical Reports Server (NTRS)

    Jorgensen, C. C.; Lee, D. D.

    2009-01-01

    A recently invented speech-recognition method applies to words that are articulated by means of the tongue and throat muscles but are otherwise not voiced or, at most, are spoken sotto voce. This method could satisfy a need for speech recognition under circumstances in which normal audible speech is difficult, poses a hazard, is disturbing to listeners, or compromises privacy. The method could also be used to augment traditional speech recognition by providing an additional source of information about articulator activity. The method can be characterized as intermediate between (1) conventional speech recognition through processing of voice sounds and (2) a method, not yet developed, of processing electroencephalographic signals to extract unspoken words directly from thoughts. This method involves computational processing of digitized electromyographic (EMG) signals from muscle innervation acquired by surface electrodes under a subject's chin near the tongue and on the side of the subject's throat near the larynx. After preprocessing, digitization, and feature extraction, EMG signals are processed by a neural-network pattern classifier, implemented in software, that performs the bulk of the recognition task as described.
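    A minimal sketch of the back-end classification stage: generic surface-EMG window features fed to a small neural-network classifier. The feature set, window length, synthetic data and network size are assumptions, not the preprocessing or classifier details of the NASA method.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def emg_features(window):
    """Common surface-EMG features per analysis window: RMS, mean absolute value,
    zero-crossing count and waveform length (an assumed, generic feature set)."""
    zero_crossings = np.sum(window[:-1] * window[1:] < 0)
    return [np.sqrt(np.mean(window ** 2)),
            np.mean(np.abs(window)),
            zero_crossings,
            np.sum(np.abs(np.diff(window)))]

rng = np.random.default_rng(0)

def synth_window(scale):
    """Synthetic 200-sample EMG window; `scale` mimics different muscle activation."""
    return scale * rng.normal(size=200)

# Illustrative two-class data set: "word A" vs "word B" EMG windows.
X = np.array([emg_features(synth_window(s)) for s in ([1.0] * 100 + [2.0] * 100)])
y = np.array([0] * 100 + [1] * 100)

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```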

  8. Bypassing the Limits of L1 Regularization: Convex Sparse Signal Processing Using Non-Convex Regularization

    NASA Astrophysics Data System (ADS)

    Parekh, Ankit

    decomposition technique for an important biomedical signal processing problem: the detection of sleep spindles and K-complexes in human sleep electroencephalography (EEG). We propose a non-linear model for the EEG consisting of three components: (1) a transient (sparse piecewise constant) component, (2) a low-frequency component, and (3) an oscillatory component. The oscillatory component admits a sparse time-frequency representation. Using a convex objective function, we propose a fast non-linear optimization algorithm to estimate the three components in the proposed signal model. The low-frequency and oscillatory components are then used to estimate the K-complexes and sleep spindles respectively. The proposed detection method is shown to outperform several state-of-the-art automated sleep spindles detection methods.

  9. A knowledge-driven approach to biomedical document conceptualization.

    PubMed

    Zheng, Hai-Tao; Borchert, Charles; Jiang, Yong

    2010-06-01

    Biomedical document conceptualization is the process of clustering biomedical documents based on ontology-represented domain knowledge. The result of this process is the representation of the biomedical documents by a set of key concepts and their relationships. Most clustering methods cluster documents based on invariant domain knowledge. The objective of this work is to develop an effective method to cluster biomedical documents based on various user-specified ontologies, so that users can exploit the concept structures of documents more effectively. We develop a flexible framework to allow users to specify the knowledge bases, in the form of ontologies. Based on the user-specified ontologies, we develop a key concept induction algorithm, which uses latent semantic analysis to identify key concepts and cluster documents. A corpus-related ontology generation algorithm is developed to generate the concept structures of documents. Based on two biomedical datasets, we evaluate the proposed method and five other clustering algorithms. The clustering results of the proposed method outperform the five other algorithms, in terms of key concept identification. With respect to the first biomedical dataset, our method has the F-measure values 0.7294 and 0.5294 based on the MeSH ontology and gene ontology (GO), respectively. With respect to the second biomedical dataset, our method has the F-measure values 0.6751 and 0.6746 based on the MeSH ontology and GO, respectively. Both results outperform the five other algorithms in terms of F-measure. Based on the MeSH ontology and GO, the generated corpus-related ontologies show informative conceptual structures. The proposed method enables users to specify the domain knowledge to exploit the conceptual structures of biomedical document collections. In addition, the proposed method is able to extract the key concepts and cluster the documents with a relatively high precision. Copyright 2010 Elsevier B.V. All rights reserved.
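    A minimal sketch of the generic latent-semantic-analysis core mentioned above (TF-IDF, truncated SVD, clustering) on a toy corpus; it omits the user-specified ontologies and the key concept induction algorithm that distinguish the proposed method.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.cluster import KMeans

# Illustrative miniature corpus; the paper works on MeSH/GO-annotated collections.
docs = [
    "p53 mutation and tumor suppressor signalling in cancer cells",
    "apoptosis pathways regulated by p53 in tumors",
    "insulin receptor signalling and glucose metabolism",
    "glucose uptake, insulin resistance and metabolic disease",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs)

# Latent semantic analysis: low-rank projection of the term-document matrix.
lsa = TruncatedSVD(n_components=2, random_state=0)
X_lsa = lsa.fit_transform(X)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_lsa)
print(labels)   # documents 0-1 and 2-3 should fall into separate clusters
```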

  10. Signal processing for smart cards

    NASA Astrophysics Data System (ADS)

    Quisquater, Jean-Jacques; Samyde, David

    2003-06-01

    In 1998, Paul Kocher showed that when a smart card computes cryptographic algorithms, for signatures or encryption, its consumption or its radiations leak information. The keys or the secrets hidden in the card can then be recovered using a differential measurement based on the intercorrelation function. A lot of silicon manufacturers use desynchronization countermeasures to defeat power analysis. In this article we detail a new resynchronization technique. This method can be used to facilitate the use of a neural network to perform code recognition. It becomes possible to reverse engineer software code automatically. Using data and clock separation methods, we show how to optimize the synchronization using signal processing. Then we compare these methods with watermarking methods for 1D and 2D signals. The very latest watermarking detection improvements can be applied to signal processing for smart cards with very few modifications. Bayesian processing is one of the best ways to do Differential Power Analysis, and it is possible to extract a PIN code from a smart card in very few samples. So this article shows the need to continue to set up effective countermeasures for cryptographic processors. Although the idea to use advanced signal processing operators has been commonly known for a long time, no publication explains that results can be obtained. The main idea of differential measurement is to use the cross-correlation of two random variables and to repeat consumption measurements on the processor to be analyzed. We use two processors clocked at the same external frequency and computing the same data. The applications of our design are numerous. Two measurements provide the inputs of a central operator. With the most accurate operator we can improve the signal-to-noise ratio, re-synchronize the acquisition clock with the internal one, or remove jitter. The analysis based on consumption or electromagnetic measurements can be improved using our structure. At first sight
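    A minimal sketch of the classic differential measurement: traces are partitioned by a secret-dependent bit and the difference of mean traces reveals the sample where the leakage occurs. The leakage model, its amplitude and the fact that the partitioning bit is known directly (rather than hypothesized from a key guess) are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic power traces: each trace leaks a small offset proportional to one
# secret-dependent bit at a fixed sample index (illustrative leakage model only).
n_traces, n_samples, leak_idx = 2000, 500, 240
secret_bit = rng.integers(0, 2, n_traces)
traces = rng.normal(size=(n_traces, n_samples))
traces[:, leak_idx] += 0.3 * secret_bit          # 0.3 = assumed leakage amplitude

# Classic DPA: difference of mean traces for the two hypothesised bit values.
diff = traces[secret_bit == 1].mean(axis=0) - traces[secret_bit == 0].mean(axis=0)
print("strongest leakage at sample:", int(np.argmax(np.abs(diff))))  # expected: leak_idx
```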

  11. Acoustic Signal Processing in Photorefractive Optical Systems.

    NASA Astrophysics Data System (ADS)

    Zhou, Gan

    This thesis discusses applications of the photorefractive effect in the context of acoustic signal processing. The devices and systems presented here illustrate the ideas and optical principles involved in holographic processing of acoustic information. The interest in optical processing stems from the similarities between holographic optical systems and contemporary models for massively parallel computation, in particular neural networks. An initial step in acoustic processing is the transformation of acoustic signals into relevant optical forms. A fiber-optic transducer with photorefractive readout transforms acoustic signals into optical images corresponding to their short-time spectrum. The device analyzes complex sound signals and interfaces them with conventional optical correlators. The transducer consists of 130 multimode optical fibers sampling the spectral range of 100 Hz to 5 kHz logarithmically. A physical model of the human cochlea can help us understand some characteristics of human acoustic transduction and signal representation. We construct a life-sized cochlear model using elastic membranes coupled with two fluid-filled chambers, and use a photorefractive novelty filter to investigate its response. The detection sensitivity is determined to be 0.3 angstroms per root Hz at 2 kHz. Qualitative agreement is found between the model response and physiological data. Delay lines map time-domain signals into the space domain and permit holographic processing of temporal information. A parallel optical delay line using dynamic beam coupling in a rotating photorefractive crystal is presented. We experimentally demonstrate a 64-channel device with 0.5 seconds of time delay and 167 Hz bandwidth. Acoustic signal recognition is described in a photorefractive system implementing the time-delay neural network model. The system consists of a photorefractive optical delay line and a holographic correlator programmed in a LiNbO_3 crystal. We demonstrate the recognition

  12. Measurement of OH, NO, O and N atoms in helium plasma jet for ROS/RNS controlled biomedical processes

    NASA Astrophysics Data System (ADS)

    Yonemori, Seiya; Kamakura, Taku; Ono, Ryo

    2014-10-01

    Atmospheric-pressure plasmas are of emerging interest for new plasma applications such as cancer treatment, cell activation and sterilization. In these biomedical processes, reactive oxygen/nitrogen species (ROS/RNS) are thought to play a significant role: the active species impose oxidative stress and induce biomedical reactions. In this study, we measured OH, NO, O and N atoms using laser-induced fluorescence (LIF) and found that the voltage polarity affects the production of particular ROS. When a negative high voltage was applied to the plasma jet, the O atom density was tripled compared to the case of a positive applied voltage, reaching around 3 × 10^15 cm^-3 at maximum. In contrast, the OH and NO densities did not depend on the polarity of the applied voltage and were measured to be on the order of 10^13 and 10^14 cm^-3 at maximum, respectively. ICCD imaging showed that the negative high voltage enhanced secondary emission during plasma bullet propagation, which can contribute to the effective production of particular ROS. Since the ROS/RNS dose can serve as a quantitative criterion for controlling plasma biomedical applications, these measurement results can be applied to in vivo and in vitro plasma biomedical experiments. This study is supported by a Grant-in-Aid for Scientific Research from the Ministry of Education, Culture, Sports, Science and Technology.

  13. Digital processing of signals from femtosecond combs

    NASA Astrophysics Data System (ADS)

    Čížek, Martin; Šmíd, Radek; Buchta, Zdeněk; Mikel, Břetislav; Lazar, Josef; Číp, Ondrej

    2012-01-01

    The presented work is focused on digital processing of beat note signals from a femtosecond optical frequency comb. The levels of mixing products of single spectral components of the comb with CW laser sources are usually very low compared to the products of mixing all the comb components together. RF counters are therefore likely to measure the frequency of the strongest spectral component rather than a weak beat note. The proposed experimental digital signal processing system solves this problem by analyzing the whole spectrum of the output RF signal and using software defined radio (SDR) algorithms. Our efforts concentrate on two main areas. Firstly, we are experimenting with digital signal processing of the RF beat note spectrum produced by the f-2f technique and with fully digital servo-loop stabilization of the fs comb. Secondly, we are using digital servo-loop techniques for locking free-running continuous-wave laser sources to single components of the fs comb spectrum. Software capable of computing and analyzing the beat note RF spectra using the FFT and peak detection was developed. An SDR algorithm performing phase demodulation on the f-2f signal is used as the regulation error signal source for a digital phase-locked loop stabilizing the offset and repetition frequencies of the fs comb.
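
    The spectrum-analysis step described here (FFT of the RF beat note plus peak detection) can be sketched in Python as follows; the sample rate, beat-note frequency, and threshold are invented for illustration and do not correspond to the authors' system:

      # Locate a weak beat note in the RF spectrum via FFT and peak picking
      # (invented sample rate, tone, and threshold).
      import numpy as np
      from scipy.signal import find_peaks

      fs = 1e6                                   # assumed sample rate of the digitized RF signal, Hz
      t = np.arange(200_000) / fs
      rf = (0.05 * np.sin(2 * np.pi * 21_300 * t)                        # weak beat note
            + np.random.default_rng(1).normal(size=t.size))              # broadband background

      spectrum = np.abs(np.fft.rfft(rf * np.hanning(t.size))) ** 2
      freqs = np.fft.rfftfreq(t.size, d=1 / fs)

      peaks, _ = find_peaks(spectrum, height=50 * np.median(spectrum))   # well above noise floor
      print(freqs[peaks])                        # candidate beat-note frequencies, Hz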

  14. The University of Connecticut Biomedical Engineering Mentoring Program for high school students.

    PubMed

    Enderle, John D; Liebler, Christopher M; Haapala, Stephenic A; Hart, James L; Thonakkaraparayil, Naomi T; Romonosky, Laura L; Rodriguez, Francisco; Trumbower, Randy D

    2004-01-01

    For the past four years, the Biomedical Engineering Program at the University of Connecticut has offered a summer mentoring program for high school students interested in biomedical engineering. To offer this program, we have partnered with the UConn Mentor Connection Program, the School of Engineering 2000 Program, and the College of Liberal Arts and Sciences Summer Laboratory Apprentice Program. We typically have approximately 20-25 high school students learning about biomedical engineering each summer. The mentoring aspect of the program exists at many different levels, with the graduate students mentoring the undergraduate students, and these students mentoring the high school students. The program starts with a three-hour lecture on biomedical engineering to properly orient the students. An in-depth paper on an area of biomedical engineering is a required component, as is a PowerPoint presentation on their research. All of the students build a device to record an EKG on a computer using LabView, including signal processing to remove noise. The students learn some rudimentary concepts of electrocardiography and the physiology and anatomy of the heart. The students also learn basic electronics and circuit breadboarding, PSpice, the building of a printed circuit board, the PIC microcontroller, the operation of multimeters and the oscilloscope, soldering, assembly of the EKG device, and writing LabView code to run their device on a PC. The students keep their EKG device, LabView program, and a fully illustrated booklet on the EKG to bring home with them, and hopefully bring back to their high school to share their experiences with other students and teachers. The students also work on several other projects during this summer experience and visit Hartford Hospital to learn about clinical engineering.
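
    The kind of noise removal the students implement in LabView can be sketched in Python with a band-pass plus mains-notch filter on a synthetic EKG-like trace; the cutoff frequencies are typical textbook choices, not the course's exact design:

      # Band-pass plus 60 Hz notch filtering of a synthetic EKG-like trace
      # (typical cutoffs, not the course's exact filter design).
      import numpy as np
      from scipy.signal import butter, sosfiltfilt, iirnotch, filtfilt

      fs = 500.0                                    # assumed sampling rate, Hz
      t = np.arange(0, 10, 1 / fs)
      ecg = np.sin(2 * np.pi * 1.2 * t)             # crude stand-in for an EKG waveform
      noisy = ecg + 0.5 * np.sin(2 * np.pi * 60 * t) + 0.1 * np.random.randn(t.size)

      sos = butter(4, [0.5, 40.0], btype="bandpass", fs=fs, output="sos")
      bandpassed = sosfiltfilt(sos, noisy)          # zero-phase 0.5-40 Hz band-pass

      bn, an = iirnotch(60.0, Q=30, fs=fs)          # 60 Hz mains notch
      clean = filtfilt(bn, an, bandpassed)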

  15. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig

    PubMed Central

    Sahoo, Satya S.; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A.; Lhatoo, Samden D.

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to a rapid increase in the volume, rate of generation, and variety of neuroscience data. This “neuroscience Big data” represents a significant opportunity for the biomedical research community to design experiments using data with greater timescales, larger numbers of attributes, and statistically significant data sizes. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model the long-term effects of brain injuries, and provide new insights into the dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volumes of data, which makes it difficult for researchers to effectively leverage the available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and the Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as electroencephalogram (EEG), electrocardiogram (ECG), and blood oxygen level (SpO2) data, using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability - the ability to efficiently process increasing volumes of data; (b) Adaptability - the toolkit can be deployed across different computing configurations; and (c) Ease of programming - the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 data nodes. The evaluation results demonstrate that

  16. Signal processing of anthropometric data

    NASA Astrophysics Data System (ADS)

    Zimmermann, W. J.

    1983-09-01

    The Anthropometric Measurements Laboratory has accumulated a large body of data from a number of previous experiments. The data are very noisy and therefore require the application of signal processing schemes. Moreover, the data were not regarded as time series measurements but as positional information; hence, they are stored as coordinate points defined by the motion of the human body. The accumulated data define two groups or classes. Some of the data were collected from an experiment designed to measure the flexibility of the limbs, referred to as radial movement. The remaining data were collected from experiments designed to determine the surface of the reach envelope. An interactive signal processing package was designed and implemented. Since the data do not include time, this package does not include a time series element. Presently, processing is restricted to data obtained from the experiments designed to measure flexibility.

  17. Signal processing of anthropometric data

    NASA Technical Reports Server (NTRS)

    Zimmermann, W. J.

    1983-01-01

    The Anthropometric Measurements Laboratory has accumulated a large body of data from a number of previous experiments. The data are very noisy and therefore require the application of signal processing schemes. Moreover, the data were not regarded as time series measurements but as positional information; hence, they are stored as coordinate points defined by the motion of the human body. The accumulated data define two groups or classes. Some of the data were collected from an experiment designed to measure the flexibility of the limbs, referred to as radial movement. The remaining data were collected from experiments designed to determine the surface of the reach envelope. An interactive signal processing package was designed and implemented. Since the data do not include time, this package does not include a time series element. Presently, processing is restricted to data obtained from the experiments designed to measure flexibility.

  18. Multichannel heterodyning for wideband interferometry, correlation and signal processing

    DOEpatents

    Erskine, David J.

    1999-01-01

    A method of signal processing a high bandwidth signal by coherently subdividing it into many narrow bandwidth channels which are individually processed at lower frequencies in a parallel manner. Autocorrelations and correlations can be performed using reference frequencies which may drift slowly with time, reducing the cost of the device. Coordinated adjustment of the channel phases alters the temporal and spectral behavior of the net signal process more precisely than any channel used individually. This is a method of implementing precision long coherent delays, interferometers, and filters for high bandwidth optical or microwave signals using low bandwidth electronics. High bandwidth signals can be recorded, mathematically manipulated, and synthesized.
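
    A purely digital sketch of the channelization idea (the patent describes an analog/optical implementation) is to heterodyne the wide-band signal against several reference tones, low-pass filter, and decimate, so each narrow channel runs at a lower rate; all frequencies below are invented:

      # Digital sketch of channelization: mix the wide-band signal down with a
      # reference tone per channel, low-pass, and decimate (invented frequencies).
      import numpy as np
      from scipy.signal import butter, lfilter

      fs = 1_000_000.0                               # assumed wide-band sample rate, Hz
      t = np.arange(100_000) / fs
      wideband = np.cos(2 * np.pi * 123_000 * t) + np.cos(2 * np.pi * 307_000 * t)

      channel_centers = [100_000, 300_000]           # Hz, one "heterodyne reference" per channel
      b, a = butter(5, 50_000, fs=fs)                # 50 kHz low-pass applied in every channel

      channels = []
      for f0 in channel_centers:
          mixed = wideband * np.exp(-2j * np.pi * f0 * t)   # heterodyne the channel to baseband
          narrow = lfilter(b, a, mixed)                     # keep only the narrow band
          channels.append(narrow[::10])                     # decimate: each channel now at 100 kS/s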

  19. Is complex signal processing for bone conduction hearing aids useful?

    PubMed

    Kompis, Martin; Kurz, Anja; Pfiffner, Flurin; Senn, Pascal; Arnold, Andreas; Caversaccio, Marco

    2014-05-01

    To establish whether complex signal processing is beneficial for users of bone anchored hearing aids. Review and analysis of two studies from our own group, each comparing a speech processor with basic digital signal processing (either Baha Divino or Baha Intenso) and a processor with complex digital signal processing (either Baha BP100 or Baha BP110 power). The main differences between basic and complex signal processing are the number of audiologist-accessible frequency channels and the availability and complexity of the directional multi-microphone noise reduction and loudness compression systems. Both studies show a small, statistically non-significant improvement of speech understanding in quiet with complex digital signal processing. The average improvement for speech in noise is +0.9 dB if speech and noise are both emitted from the front of the listener. If noise is emitted from the rear and speech from the front of the listener, the advantage of the devices with complex digital signal processing over those with basic signal processing increases, on average, to +3.2 dB (range +2.3 … +5.1 dB, p ≤ 0.0032). Complex digital signal processing does indeed improve speech understanding, especially in noise coming from the rear. This finding has been supported by another study published recently by a different research group. When compared to basic digital signal processing, complex digital signal processing can increase the speech understanding of users of bone anchored hearing aids. The benefit is most significant for speech understanding in noise.

  20. Digital fabrication of multi-material biomedical objects.

    PubMed

    Cheung, H H; Choi, S H

    2009-12-01

    This paper describes a multi-material virtual prototyping (MMVP) system for modelling and digital fabrication of discrete and functionally graded multi-material objects for biomedical applications. The MMVP system consists of a DMMVP module, an FGMVP module and a virtual reality (VR) simulation module. The DMMVP module is used to model discrete multi-material (DMM) objects, while the FGMVP module is for functionally graded multi-material (FGM) objects. The VR simulation module integrates these two modules to perform digital fabrication of multi-material objects, which can be subsequently visualized and analysed in a virtual environment to optimize MMLM processes for fabrication of product prototypes. Using the MMVP system, two biomedical objects, including a DMM human spine and an FGM intervertebral disc spacer are modelled and digitally fabricated for visualization and analysis in a VR environment. These studies show that the MMVP system is a practical tool for modelling, visualization, and subsequent fabrication of biomedical objects of discrete and functionally graded multi-materials for biomedical applications. The system may be adapted to control MMLM machines with appropriate hardware for physical fabrication of biomedical objects.

  1. Parallel Processing of Broad-Band PPM Signals

    NASA Technical Reports Server (NTRS)

    Gray, Andrew; Kang, Edward; Lay, Norman; Vilnrotter, Victor; Srinivasan, Meera; Lee, Clement

    2010-01-01

    A parallel-processing algorithm and a hardware architecture to implement the algorithm have been devised for timeslot synchronization in the reception of pulse-position-modulated (PPM) optical or radio signals. As in the cases of some prior algorithms and architectures for parallel, discrete-time, digital processing of signals other than PPM, an incoming broadband signal is divided into multiple parallel narrower-band signals by means of sub-sampling and filtering. The number of parallel streams is chosen so that the frequency content of the narrower-band signals is low enough to enable processing by relatively low-speed complementary metal oxide semiconductor (CMOS) electronic circuitry. The algorithm and architecture are intended to satisfy requirements for time-varying time-slot synchronization and post-detection filtering, with correction of timing errors independent of estimation of timing errors. They are also intended to afford flexibility for dynamic reconfiguration and upgrading. The architecture is implemented in a reconfigurable CMOS processor in the form of a field-programmable gate array. The algorithm and its hardware implementation incorporate three separate time-varying filter banks for three distinct functions: correction of sub-sample timing errors, post-detection filtering, and post-detection estimation of timing errors. The design of the filter bank for correction of timing errors, the method of estimating timing errors, and the design of a feedback-loop filter are governed by a host of parameters, the most critical one, with regard to processing very broadband signals with CMOS hardware, being the number of parallel streams (equivalently, the rate-reduction parameter).
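
    The basic rate-reduction idea, splitting one broadband stream into M slower parallel streams that can each be filtered by low-rate hardware, can be sketched in Python as a polyphase de-interleave; this is a generic illustration, not the flight architecture or its filter banks:

      # De-interleave one broadband stream into M slower parallel branches
      # (generic polyphase sub-sampling; invented data and filter).
      import numpy as np

      M = 8                                    # number of parallel streams (rate-reduction parameter)
      rng = np.random.default_rng(0)
      broadband = rng.normal(size=4096)        # stand-in for the digitized PPM waveform

      pad = (-len(broadband)) % M              # pad so the length divides evenly by M
      x = np.concatenate([broadband, np.zeros(pad)])
      branches = x.reshape(-1, M).T            # branches[k] holds samples k, k+M, k+2M, ...

      taps = np.ones(4) / 4                    # trivial per-branch smoothing filter
      filtered = np.array([np.convolve(br, taps, mode="same") for br in branches])
      print(filtered.shape)                    # (M, len(broadband) // M)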

  2. Signal processing for distributed sensor concept: DISCO

    NASA Astrophysics Data System (ADS)

    Rafailov, Michael K.

    2007-04-01

    The Distributed Sensor Concept (DISCO) is proposed to multiply individual sensor capabilities through cooperative target engagement. DISCO relies on the ability of signal processing software to format, process, transmit and receive sensor data, and to exploit those data in the signal synthesis process. Each sensor's data are synchronized, formatted, signal-to-noise ratio (SNR) enhanced, and distributed inside the sensor network. The signal processing technique for DISCO is the Recursive Adaptive Frame Integration of Limited data (RAFIL) technique, initially proposed [1] as a way to improve the SNR, reduce the data rate and mitigate FPA correlated noise in an individual sensor's digital video-signal processing. In the Distributed Sensor Concept, the RAFIL technique is used in a segmented way, where the constituents of the technique are spatially and/or temporally separated between transmitters and receivers. Those constituents include, though are not limited to, two thresholds - one tuned for optimum probability of detection, the other to manage the required false alarm rate - with limited frame integration placed somewhere between the thresholds, as well as formatters, conventional integrators and more. RAFIL allows a non-linear integration that, along with SNR gain, provides system designers more capability where cost, weight, or power considerations limit system data rate, processing, or memory capability [2]. The DISCO architecture allows flexible optimization of SNR gain, data rates and noise suppression on the sensor's side, and limited integration, re-formatting and the final threshold on the node's side. DISCO with Recursive Adaptive Frame Integration of Limited data may have a flexible architecture that allows segmenting the hardware and software to best suit specific DISCO applications and sensing needs - whether air- or space-based platforms, ground terminals or integration of sensor networks.
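
    A rough Python sketch of the two-threshold, limited-frame-integration idea as described in this abstract is given below; the frame data, threshold values, and integration depth are all invented, and this is not the RAFIL implementation of references [1] and [2]:

      # Two-threshold limited frame integration on synthetic frames
      # (invented thresholds, depth, and target; not the RAFIL implementation).
      import numpy as np

      rng = np.random.default_rng(2)
      frames = rng.normal(size=(16, 64, 64))           # 16 noisy sensor frames
      frames[:, 30, 40] += 2.5                         # weak persistent target at pixel (30, 40)

      t1 = 1.0                                         # first threshold, tuned for detection
      exceedances = (frames > t1).astype(int)          # per-frame binary decisions

      n_int = 8                                        # limited integration depth
      integrated = exceedances[:n_int].sum(axis=0)     # count exceedances over n_int frames

      t2 = 7                                           # second threshold, controls false alarms
      detections = integrated >= t2
      print(np.argwhere(detections))                   # pixels declared as detections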

  3. Ridge extraction from the time-frequency representation (TFR) of signals based on an image processing approach: application to the analysis of uterine electromyogram AR TFR.

    PubMed

    Terrien, Jérémy; Marque, Catherine; Germain, Guy

    2008-05-01

    Time-frequency representations (TFRs) of signals are increasingly being used in biomedical research. Analysis of such representations is sometimes difficult, however, and is often reduced to the extraction of ridges, or local energy maxima. In this paper, we describe a new ridge extraction method based on the image processing technique of active contours, or snakes. We have tested our method on several synthetic signals and on the analysis of the uterine electromyogram, or electrohysterogram (EHG), recorded during gestation in monkeys. We have also evaluated a postprocessing algorithm that is especially suited for EHG analysis. Parameters are evaluated on real EHG signals from different gestational periods. The presented method gives good results when applied to synthetic as well as EHG signals. We have been able to obtain smaller ridge extraction errors when compared to two other methods specially developed for EHG. The gradient vector flow (GVF) snake method appears to be a good ridge extraction tool, which could be used on TFRs of mono- or multicomponent signals with good results.
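
    For readers unfamiliar with ridge extraction, the baseline notion (before any snake-based refinement) can be sketched in Python by following the frequency of maximum energy in each time slice of a spectrogram; this is a simple argmax tracker on a synthetic chirp, not the GVF-snake method evaluated in the paper:

      # Baseline ridge extraction: frequency of maximum energy per time slice
      # of a spectrogram (synthetic chirp; not the GVF-snake method).
      import numpy as np
      from scipy.signal import spectrogram, chirp

      fs = 500.0
      t = np.arange(0, 8, 1 / fs)
      x = chirp(t, f0=5, f1=40, t1=8, method="linear") + 0.3 * np.random.randn(t.size)

      f, tt, S = spectrogram(x, fs=fs, nperseg=256, noverlap=192)
      ridge = f[np.argmax(S, axis=0)]      # local energy maximum in each time column
      print(ridge[:10])                    # estimated instantaneous frequency, Hz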

  4. A signal processing method for the friction-based endpoint detection system of a CMP process

    NASA Astrophysics Data System (ADS)

    Chi, Xu; Dongming, Guo; Zhuji, Jin; Renke, Kang

    2010-12-01

    A signal processing method for the friction-based endpoint detection system of a chemical mechanical polishing (CMP) process is presented. The method uses wavelet threshold denoising to reduce the noise contained in the measured original signal, extracts the Kalman filter innovation from the denoised signal as the feature signal, and judges the CMP endpoint based on the behavior of the Kalman filter innovation sequence during the CMP process. Applying this signal processing method, endpoint detection experiments for the Cu CMP process were carried out. The results show that the method can identify the endpoint of the Cu CMP process.
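
    The two processing steps described, wavelet threshold denoising followed by monitoring of the Kalman filter innovation, can be sketched in Python as follows; the friction-like signal, wavelet choice, and filter tuning are invented stand-ins rather than the authors' settings:

      # Wavelet threshold denoising followed by a scalar Kalman filter whose
      # innovation jumps at a level shift (invented data and tuning).
      import numpy as np
      import pywt

      rng = np.random.default_rng(3)
      signal = np.concatenate([np.ones(500), 0.6 * np.ones(500)])    # level drop = "endpoint"
      measured = signal + 0.2 * rng.normal(size=signal.size)

      # 1) wavelet threshold denoising
      coeffs = pywt.wavedec(measured, "db4", level=5)
      sigma = np.median(np.abs(coeffs[-1])) / 0.6745                 # robust noise estimate
      thr = sigma * np.sqrt(2 * np.log(measured.size))
      coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
      denoised = pywt.waverec(coeffs, "db4")[: measured.size]

      # 2) scalar Kalman filter; watch the innovation (measurement minus prediction)
      x_hat, p, q, r = denoised[0], 1.0, 1e-5, 0.01
      innovations = []
      for z in denoised:
          p = p + q                           # predict
          k = p / (p + r)                     # Kalman gain
          innovations.append(z - x_hat)       # innovation before the update
          x_hat = x_hat + k * (z - x_hat)     # update the state estimate
          p = (1 - k) * p

      endpoint = int(np.argmax(np.abs(innovations)))
      print("endpoint flagged near sample", endpoint)   # expect roughly 500 for this example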

  5. Asynchronous signal-dependent non-uniform sampler

    NASA Astrophysics Data System (ADS)

    Can-Cimino, Azime; Chaparro, Luis F.; Sejdić, Ervin

    2014-05-01

    Analog sparse signals resulting from biomedical and sensing network applications are typically non-stationary with frequency-varying spectra. By ignoring that the maximum frequency of their spectra is changing, uniform sampling of sparse signals collects unnecessary samples in quiescent segments of the signal. A more appropriate sampling approach would be signal-dependent. Moreover, in many of these applications, power consumption and analog processing are issues of great importance that need to be considered. In this paper we present a signal-dependent non-uniform sampler that uses a Modified Asynchronous Sigma Delta Modulator, which consumes low power and can be processed using analog procedures. Using prolate spheroidal wave functions (PSWFs), interpolation of the original signal is performed, thus giving asynchronous analog-to-digital and digital-to-analog conversion. Stable solutions are obtained by using modulated PSWFs. The advantage of the adapted asynchronous sampler is that the frequency range of the sparse signal is taken into account, avoiding aliasing. Moreover, it requires saving only the zero-crossing times of the non-uniform samples, or their differences, and the reconstruction can be done using their quantized values and a PSWF-based interpolation. The range of frequencies analyzed can be changed, and the sampler can be implemented as a bank of filters for an unknown range of frequencies. The performance of the proposed algorithm is illustrated with an electroencephalogram (EEG) signal.

  6. A review of biocompatible metal injection moulding process parameters for biomedical applications.

    PubMed

    Hamidi, M F F A; Harun, W S W; Samykano, M; Ghani, S A C; Ghazalli, Z; Ahmad, F; Sulong, A B

    2017-09-01

    Biocompatible metals have been revolutionizing the biomedical field, predominantly in human implant applications, where these metals are widely used as substitutes for, or to restore the function of, degenerated tissues or organs. Powder metallurgy techniques, specifically the metal injection moulding (MIM) process, have been employed for the fabrication of controlled porous structures used for dental and orthopaedic surgical implants. The porous metal implant allows bony tissue ingrowth on the implant surface, thereby enhancing fixation and recovery. This paper elaborates a systematic classification of various biocompatible metals from the perspective of the MIM process as used in the medical industries. In this study, three biocompatible metal families are reviewed: stainless steels, cobalt alloys, and titanium alloys. The applications of MIM technology in biomedicine, focusing primarily on the MIM process setting parameters, are discussed thoroughly. This paper should be of value to investigators who are interested in the state of the art of metal powder metallurgy, particularly MIM technology for biocompatible metal implant design and development. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Multichannel heterodyning for wideband interferometry, correlation and signal processing

    DOEpatents

    Erskine, D.J.

    1999-08-24

    A method is disclosed of signal processing a high bandwidth signal by coherently subdividing it into many narrow bandwidth channels which are individually processed at lower frequencies in a parallel manner. Autocorrelations and correlations can be performed using reference frequencies which may drift slowly with time, reducing the cost of the device. Coordinated adjustment of the channel phases alters the temporal and spectral behavior of the net signal process more precisely than any channel used individually. This is a method of implementing precision long coherent delays, interferometers, and filters for high bandwidth optical or microwave signals using low bandwidth electronics. High bandwidth signals can be recorded, mathematically manipulated, and synthesized. 50 figs.

  8. Hot topics: Signal processing in acoustics

    NASA Astrophysics Data System (ADS)

    Gaumond, Charles F.

    2005-09-01

    Signal Processing in Acoustics is a multidisciplinary group of people who work in many areas of acoustics. We have chosen two areas that have shown exciting new applications of signal processing to acoustics or have yielded exciting and important results from the use of signal processing. In this session, two hot topics are presented: the use of noise-like acoustic fields to determine sound propagation structure, and the use of localization to determine animal behaviors. The first topic shows the application of correlation to geo-acoustic fields to determine the Green's function for propagation through the Earth. These results can then be further used to solve geo-acoustic inverse problems. The first topic also shows the application of correlation to oceanic noise fields to determine the Green's function through the ocean. These results also have utility for oceanic inverse problems. The second topic shows exciting results from the detection, localization, and tracking of marine mammals by two different groups. Results from the detection and localization of bullfrogs are shown, too. Each of these studies contributed to the knowledge of animal behavior. [Work supported by ONR.]
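
    The first hot topic, recovering propagation information by correlating noise-like fields, can be illustrated with a toy Python sketch in which the cross-correlation of a diffuse noise field recorded at two sensors peaks at their propagation delay; all numbers are arbitrary:

      # Cross-correlating a diffuse noise field recorded at two sensors recovers
      # their propagation delay (toy numbers).
      import numpy as np
      from scipy.signal import correlate

      fs = 1000.0
      rng = np.random.default_rng(4)
      field = rng.normal(size=60_000)                  # diffuse "ambient" noise field
      delay = 25                                       # propagation between sensors, in samples

      rec_a = field[delay:] + 0.5 * rng.normal(size=field.size - delay)   # sensor A
      rec_b = field[:-delay] + 0.5 * rng.normal(size=field.size - delay)  # sensor B hears it later

      xcorr = correlate(rec_b - rec_b.mean(), rec_a - rec_a.mean(), mode="full", method="fft")
      lag = np.argmax(xcorr) - (len(rec_a) - 1)
      print("recovered travel time:", lag / fs, "s")   # close to delay / fs = 0.025 s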

  9. Digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.; Stanley, W. D.; Harrington, R. F.

    1980-01-01

    A microprocessor based digital signal processing unit has been proposed to replace analog sections of a microwave radiometer. A brief introduction to the radiometer system involved and a description of problems encountered in the use of digital techniques in radiometer design are discussed. An analysis of the digital signal processor as part of the radiometer is then presented.

  10. Text Mining in Biomedical Domain with Emphasis on Document Clustering

    PubMed Central

    2017-01-01

    Objectives With the exponential increase in the number of articles published every year in the biomedical domain, there is a need to build automated systems to extract unknown information from the articles published. Text mining techniques enable the extraction of unknown knowledge from unstructured documents. Methods This paper reviews text mining processes in detail and the software tools available to carry out text mining. It also reviews the roles and applications of text mining in the biomedical domain. Results Text mining processes, such as search and retrieval of documents, pre-processing of documents, natural language processing, methods for text clustering, and methods for text classification are described in detail. Conclusions Text mining techniques can facilitate the mining of vast amounts of knowledge on a given topic from published biomedical research articles and draw meaningful conclusions that are not possible otherwise. PMID:28875048

  11. Text Mining in Biomedical Domain with Emphasis on Document Clustering.

    PubMed

    Renganathan, Vinaitheerthan

    2017-07-01

    With the exponential increase in the number of articles published every year in the biomedical domain, there is a need to build automated systems to extract unknown information from the articles published. Text mining techniques enable the extraction of unknown knowledge from unstructured documents. This paper reviews text mining processes in detail and the software tools available to carry out text mining. It also reviews the roles and applications of text mining in the biomedical domain. Text mining processes, such as search and retrieval of documents, pre-processing of documents, natural language processing, methods for text clustering, and methods for text classification are described in detail. Text mining techniques can facilitate the mining of vast amounts of knowledge on a given topic from published biomedical research articles and draw meaningful conclusions that are not possible otherwise.

  12. Strengthening of biomedical Ni-free Co-Cr-Mo alloy by multipass "low-strain-per-pass" thermomechanical processing.

    PubMed

    Mori, Manami; Yamanaka, Kenta; Sato, Shigeo; Tsubaki, Shinki; Satoh, Kozue; Kumagai, Masayoshi; Imafuku, Muneyuki; Shobu, Takahisa; Chiba, Akihiko

    2015-12-01

    Further strengthening of biomedical Co-Cr-Mo alloys is desired, owing to the demand for improvements to their durability in applications such as artificial hip joints, spinal rods, bone plates, and screws. Here, we present a strategy, multipass "low-strain-per-pass" thermomechanical processing, for achieving high-strength biomedical Co-Cr-Mo alloys with sufficient ductility. The process primarily consists of multipass hot deformation, which involves the repeated introduction of relatively small amounts of strain into the alloy at elevated temperatures. The concept was verified by performing hot rolling of a Co-28 Cr-6 Mo-0.13N (mass%) alloy, and its strengthening mechanisms were examined. Strength increased monotonically with hot-rolling reduction, eventually reaching an exceptionally high 0.2% proof stress of 1,400 MPa. Synchrotron X-ray diffraction (XRD) line-profile analysis revealed a drastic increase in the dislocation density with increasing hot-rolling reduction and suggested that the significant strengthening was primarily driven by the increased dislocation density, while the contribution of grain refinement was minor. In addition, extra strengthening originating from planar defects (stacking faults/deformation twins) became apparent at greater hot-rolling reductions. The results obtained in this work help in reconsidering the existing strengthening strategy for these alloys; thus, a feasible manufacturing route using conventional hot deformation processing, such as forging, rolling, swaging, and drawing, is realized. The results suggest a novel microstructural design concept and feasible manufacturing route for high-strength Co-Cr-Mo alloys using conventional hot deformation processing. The present strategy focuses on strengthening through the introduction of a high density of lattice defects rather than grain refinement via dynamic recrystallization (DRX). The hot-rolled samples obtained by our

  13. SAR processing using SHARC signal processing systems

    NASA Astrophysics Data System (ADS)

    Huxtable, Barton D.; Jackson, Christopher R.; Skaron, Steve A.

    1998-09-01

    Synthetic aperture radar (SAR) is uniquely suited to help solve the Search and Rescue problem since it can be utilized either day or night and through both dense fog and thick cloud cover. Other papers in this session, and in this session in 1997, describe the various SAR image processing algorithms that are being developed and evaluated within the Search and Rescue Program. All of these approaches to using SAR data require substantial amounts of digital signal processing: for the SAR image formation, and possibly for the subsequent image processing. In recognition of the demanding processing that will be required for an operational Search and Rescue Data Processing System (SARDPS), NASA/Goddard Space Flight Center and NASA/Stennis Space Center are conducting a technology demonstration utilizing SHARC multi-chip modules from Boeing to perform SAR image formation processing.

  14. TOPICAL REVIEW: A survey of signal processing algorithms in brain computer interfaces based on electrical brain signals

    NASA Astrophysics Data System (ADS)

    Bashashati, Ali; Fatourechi, Mehrdad; Ward, Rabab K.; Birch, Gary E.

    2007-06-01

    Brain computer interfaces (BCIs) aim at providing a non-muscular channel for sending commands to the external world using the electroencephalographic activity or other electrophysiological measures of the brain function. An essential factor in the successful operation of BCI systems is the methods used to process the brain signals. In the BCI literature, however, there is no comprehensive review of the signal processing techniques used. This work presents the first such comprehensive survey of all BCI designs using electrical signal recordings published prior to January 2006. Detailed results from this survey are presented and discussed. The following key research questions are addressed: (1) what are the key signal processing components of a BCI, (2) what signal processing algorithms have been used in BCIs and (3) which signal processing techniques have received more attention?
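
    A minimal Python sketch of one common BCI signal-processing chain of the kind surveyed (band-pass filtering, log-variance band-power features, and a linear discriminant classifier) is shown below; the synthetic epochs, frequency band, and class structure are invented for illustration:

      # Band-pass filter EEG epochs, take log-variance features, classify with LDA
      # (synthetic epochs; illustrative band and class structure).
      import numpy as np
      from scipy.signal import butter, filtfilt
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      fs, n_epochs, n_channels, n_samples = 250.0, 40, 8, 500
      rng = np.random.default_rng(5)
      epochs = rng.normal(size=(n_epochs, n_channels, n_samples))
      labels = np.repeat([0, 1], n_epochs // 2)
      epochs[labels == 1, 0] *= 2.0                    # class 1: stronger channel-0 activity

      b, a = butter(4, [8.0, 30.0], btype="bandpass", fs=fs)   # mu/beta band
      filtered = filtfilt(b, a, epochs, axis=-1)

      features = np.log(filtered.var(axis=-1))         # log band power per channel
      clf = LinearDiscriminantAnalysis().fit(features[::2], labels[::2])    # even epochs: train
      print("accuracy:", clf.score(features[1::2], labels[1::2]))           # odd epochs: test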

  15. Signal Processing Methods Monitor Cranial Pressure

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Dr. Norden Huang, of Goddard Space Flight Center, invented a set of algorithms (called the Hilbert-Huang Transform, or HHT) for analyzing nonlinear and nonstationary signals that developed into a user-friendly signal processing technology for analyzing time-varying processes. At an auction managed by Ocean Tomo Federal Services LLC, licenses of 10 U.S. patents and 1 domestic patent application related to HHT were sold to DynaDx Corporation, of Mountain View, California. DynaDx is now using the licensed NASA technology for medical diagnosis and prediction of brain blood flow-related problems, such as stroke, dementia, and traumatic brain injury.
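
    The Hilbert spectral step at the heart of the HHT can be sketched in Python with scipy's analytic-signal routine; a full HHT would first decompose the signal into intrinsic mode functions via empirical mode decomposition, which is omitted in this mono-component example:

      # Hilbert spectral step: analytic signal -> envelope and instantaneous frequency
      # (mono-component synthetic chirp; EMD step of the full HHT omitted).
      import numpy as np
      from scipy.signal import hilbert, chirp

      fs = 400.0
      t = np.arange(0, 5, 1 / fs)
      x = chirp(t, f0=2, f1=10, t1=5, method="linear")   # frequency sweeps from 2 to 10 Hz

      analytic = hilbert(x)
      amplitude = np.abs(analytic)                       # instantaneous envelope
      phase = np.unwrap(np.angle(analytic))
      inst_freq = np.diff(phase) / (2 * np.pi) * fs      # instantaneous frequency, Hz
      print(inst_freq[::200])                            # climbs roughly from 2 to 10 Hz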

  16. User's manual SIG: a general-purpose signal processing program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lager, D.; Azevedo, S.

    1983-10-25

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Many of the basic operations one would perform on digitized data are contained in the core SIG package. Out of these core commands, more powerful signal processing algorithms may be built. Many different operations on time- and frequency-domain signals can be performed by SIG. They include operations on the samples of a signal, such as adding a scalar to each sample, operations on the entire signal such as digital filtering, and operations on two or more signals such as adding two signals. Signals may be simulated, such as a pulse train or a random waveform. Graphics operations display signals and spectra.

  17. Active substrates improving sensitivity in biomedical fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Le Moal, E.; Leveque-Fort, S.; Fort, E.; Lacharme, J.-P.; Fontaine-Aupart, M.-P.; Ricolleau, C.

    2005-08-01

    Fluorescence is widely used as a spectroscopic tool and for biomedical imaging, in particular for DNA chips. In some cases, the detection of very low molecular concentrations and the precise localization of biomarkers are limited by the weakness of the fluorescence signal. We present a new method, based on sample substrates, that improves fluorescence detection sensitivity. These active substrates consist of glass slides covered with metal (gold or silver) and dielectric (alumina) films and can be used directly with a common microscope setup. Fluorescence enhancement affects both the excitation and decay rates and is strongly dependent on the distance to the metal surface. Furthermore, fluorescence collection is improved since the fluorophore emission lobes are advantageously modified close to a reflective surface. Finally, additional improvements are achieved by structuring the metallic layer. The substrate morphology was mapped by atomic force microscopy (AFM). The substrates' optical properties were studied using mono- and bi-photonic fluorescence microscopy with time resolution. An original setup was implemented for measuring the spatial radiation pattern. The detection improvement was then tested on commercial devices. Several biomedical applications are presented. Enhancements of two orders of magnitude are achieved for DNA chips, and the signal-to-noise ratio is greatly increased for cell imaging.

  18. Book: Marine Bioacoustic Signal Processing and Analysis

    DTIC Science & Technology

    2011-09-30

    ... physicists, and mathematicians. However, more and more biologists and psychologists are starting to use advanced signal processing techniques and ... chapters than it should be, since the project must be finished by Dec. 31. I have started setting aside 2 hours of uninterrupted time per workday to work

  19. System for monitoring non-coincident, nonstationary process signals

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W.

    2005-01-04

    An improved system for monitoring non-coincident, non-stationary process signals. The mean, variance, and length of a reference signal are defined by an automated system, followed by identification of the leading and falling edges of a monitored signal and the length of the monitored signal. The monitored signal is compared to the reference signal, and the monitored signal is resampled in accordance with the reference signal. The reference signal is then correlated with the resampled monitored signal such that the reference signal and the resampled monitored signal are coincident in time with each other. The resampled monitored signal is then compared to the reference signal to determine whether it is within a set of predesignated operating conditions.
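
    The monitoring idea, resampling the monitored signal to the reference length and then correlating the two so they are coincident in time before comparison, can be sketched in Python as follows; the signals and the operating-condition tolerance are invented:

      # Resample the monitored signal to the reference length, then align it by the
      # cross-correlation peak before comparison (invented signals and tolerance).
      import numpy as np
      from scipy.signal import resample

      reference = np.sin(np.linspace(0, 2 * np.pi, 1000))            # reference transient
      monitored = np.sin(np.linspace(0, 2 * np.pi, 1300))            # same shape, different length
      monitored = np.roll(monitored, 40) + 0.02 * np.random.default_rng(6).normal(size=1300)

      resampled = resample(monitored, len(reference))                # match lengths

      xcorr = np.correlate(resampled - resampled.mean(),
                           reference - reference.mean(), mode="full")
      lag = np.argmax(xcorr) - (len(reference) - 1)
      aligned = np.roll(resampled, -lag)                             # now coincident in time

      residual = aligned - reference
      print("within operating limits:", np.max(np.abs(residual)) < 0.2)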

  20. Navigating the Path to a Biomedical Science Career

    NASA Astrophysics Data System (ADS)

    Zimmerman, Andrea McNeely

    The number of biomedical PhD scientists being trained and graduated far exceeds the number of academic faculty positions and academic research jobs. If this trend is compelling biomedical PhD scientists to increasingly seek career paths outside of academia, then more should be known about their intentions, desires, training experiences, and career path navigation. Therefore, the purpose of this study was to understand the process through which biomedical PhD scientists are trained and supported for navigating future career paths. In addition, the study sought to determine whether career development support efforts and opportunities should be redesigned to account for the proportion of PhD scientists following non-academic career pathways. Guided by the social cognitive career theory (SCCT) framework this study sought to answer the following central research question: How does a southeastern tier 1 research university train and support its biomedical PhD scientists for navigating their career paths? Key findings are: Many factors influence PhD scientists' career sector preference and job search process, but the most influential were relationships with faculty, particularly the mentor advisor; Planned activities are a significant aspect of the training process and provide skills for career success; and Planned activities provided skills necessary for a career, but influential factors directed the career path navigated. Implications for practice and future research are discussed.

  1. Modeling laser velocimeter signals as triply stochastic Poisson processes

    NASA Technical Reports Server (NTRS)

    Mayo, W. T., Jr.

    1976-01-01

    Previous models of laser Doppler velocimeter (LDV) systems have not adequately described dual-scatter signals in a manner useful for analysis and simulation of low-level photon-limited signals. At low photon rates, an LDV signal at the output of a photomultiplier tube is a compound nonhomogeneous filtered Poisson process, whose intensity function is another (slower) Poisson process with the nonstationary rate and frequency parameters controlled by a random flow (slowest) process. In the present paper, generalized Poisson shot noise models are developed for low-level LDV signals. Theoretical results useful in detection error analysis and simulation are presented, along with measurements of burst amplitude statistics. Computer generated simulations illustrate the difference between Gaussian and Poisson models of low-level signals.
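
    The low-photon picture described here can be illustrated with a short Python sketch that draws Poisson photon counts whose rate follows a Doppler-modulated Gaussian burst; this captures only the doubly stochastic layer (the random-flow layer is omitted), and every parameter is invented:

      # Photon counts per bin follow a Poisson law whose rate is a Doppler-modulated
      # Gaussian burst (invented parameters; the random-flow layer is omitted).
      import numpy as np

      rng = np.random.default_rng(7)
      fs = 1e7                                     # bin rate, Hz
      t = np.arange(-5e-5, 5e-5, 1 / fs)

      f_doppler = 2e5                              # Doppler frequency of one particle, Hz
      envelope = np.exp(-(t / 2e-5) ** 2)          # Gaussian transit through the probe volume
      rate = 5e5 * envelope * (1 + 0.8 * np.cos(2 * np.pi * f_doppler * t)) + 1e4   # photons/s

      counts = rng.poisson(rate / fs)              # photon counts per bin (shot-noise limited)
      print(counts.sum(), "photons in the burst")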

  2. Contemporary ultrasonic signal processing approaches for nondestructive evaluation of multilayered structures

    NASA Astrophysics Data System (ADS)

    Zhang, Guang-Ming; Harvey, David M.

    2012-03-01

    Various signal processing techniques have been used for the enhancement of defect detection and defect characterisation. Cross-correlation, filtering, autoregressive analysis, deconvolution, neural networks, wavelet transforms and sparse signal representations have all been applied in attempts to analyse ultrasonic signals. In ultrasonic nondestructive evaluation (NDE) applications, a large number of materials have multilayered structures. NDE of multilayered structures leads to some specific problems, such as limited penetration, echo overlap, high attenuation and low signal-to-noise ratio. The signals recorded from a multilayered structure are a special class of signals comprised of a limited number of echoes. Such signals can be assumed to have a sparse representation in a proper signal dictionary. Recently, a number of digital signal processing techniques have been developed that exploit this sparsity constraint. This paper presents a review of research to date, showing recent developments in signal processing techniques for ultrasonic NDE. A few typical ultrasonic signal processing techniques used for NDE of multilayered structures are elaborated. The practical applications and limitations of different signal processing methods in ultrasonic NDE of multilayered structures are analysed.
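
    The sparse-representation viewpoint can be illustrated with a Python sketch in which a synthetic A-scan is modeled as a few shifted copies of a reference echo and orthogonal matching pursuit recovers their arrival times; the dictionary, waveform, and solver choice are illustrative assumptions, not methods from the reviewed papers:

      # Model a synthetic A-scan as a few shifted copies of a reference echo and
      # recover their arrival times with orthogonal matching pursuit (illustrative).
      import numpy as np
      from sklearn.linear_model import OrthogonalMatchingPursuit

      n = 400
      t = np.arange(n)
      echo = np.exp(-((t - 20) / 6.0) ** 2) * np.cos(0.8 * (t - 20))   # reference echo shape

      D = np.array([np.roll(echo, k) for k in range(n)]).T             # echo at every arrival time

      ascan = 1.0 * D[:, 60] - 0.5 * D[:, 140] + 0.02 * np.random.default_rng(8).normal(size=n)

      omp = OrthogonalMatchingPursuit(n_nonzero_coefs=2).fit(D, ascan)
      print("echo arrival indices:", np.nonzero(omp.coef_)[0])         # expect roughly [60, 140]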

  3. Low cost open data acquisition system for biomedical applications

    NASA Astrophysics Data System (ADS)

    Zabolotny, Wojciech M.; Laniewski-Wollk, Przemyslaw; Zaworski, Wojciech

    2005-09-01

    In biomedical applications it is often necessary to collect measurement data from different devices. This is relatively easy if the devices are equipped with a MIB or Ethernet interface; however, they often feature only an asynchronous serial link, and sometimes the measured values are available only as analog signals. The system presented in this paper is a low cost alternative to commercially available data acquisition systems. The hardware and software architecture of the system is fully open, so it is possible to customize it for particular needs. The presented system offers various possibilities for connecting it to a computer-based data processing unit, e.g. using the USB or Ethernet ports. Both interfaces also allow many such systems to be used in parallel to increase the number of serial and analog inputs. The open source software used in the system makes it possible to process the acquired data with standard tools like MATLAB, Scilab or Octave, or with a dedicated, user-supplied application.

  4. Segregation of biomedical waste in a South Indian tertiary care hospital.

    PubMed

    Sengodan, Vetrivel Chezian

    2014-07-01

    Hospital wastes pose a significant public health hazard if not properly managed. Hence, it is necessary to develop and adopt optimal waste management systems in hospitals. Biomedical waste generated in Coimbatore Medical College Hospital was color coded (blue, yellow, and red), and the data were analyzed retrospectively on a daily basis for 3 years (January 2010-December 2012). Effective segregation protocols significantly reduced the biomedical waste generated from 2011 to 2012. While the red category accounted for the largest share of biomedical waste (>50%), the yellow category accounted for the least. Per unit (per bed per day), the total biomedical waste generated was 68.5, 68.8, and 61.3 grams in 2010, 2011, and 2012, respectively. Segregation of biomedical waste at the source of generation is the first and essential step in biomedical waste management. Continuous training, fixing responsibility on the nursing staff, and constant supervision are the key criteria in implementing a biomedical waste segregation process, which can significantly reduce the per unit biomedical waste generated. We highly recommend that all hospitals adopt our protocol and effectively implement it to reduce the generation of biomedical waste.

  5. A high performance biometric signal and image processing method to reveal blood perfusion towards 3D oxygen saturation mapping

    NASA Astrophysics Data System (ADS)

    Imms, Ryan; Hu, Sijung; Azorin-Peris, Vicente; Trico, Michaël; Summers, Ron

    2014-03-01

    Non-contact imaging photoplethysmography (PPG) is a recent development in the field of physiological data acquisition, currently undergoing a large amount of research to characterize and define the range of its capabilities. Contact-based PPG techniques have been used widely in clinical scenarios for a number of years to obtain direct information about the degree of oxygen saturation of patients. With the advent of imaging techniques, there is strong potential to access additional information such as multi-dimensional blood perfusion and saturation mapping. The further development of effective opto-physiological monitoring techniques depends upon novel modelling techniques coupled with improved sensor design and effective signal processing methodologies. The biometric signal and imaging processing platform (bSIPP) provides a comprehensive set of features for the extraction and analysis of recorded iPPG data, enabling direct comparison with other biomedical diagnostic tools such as ECG and EEG. Additionally, utilizing information about the nature of tissue structure has enabled the generation of an engineering model describing the behaviour of light during its travel through biological tissue. This enables the relative oxygen saturation and blood perfusion in different layers of the tissue to be estimated, which has the potential to be a useful diagnostic tool.
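
    A minimal Python sketch of the generic iPPG processing chain (spatially average a skin region of interest, band-pass the trace around plausible heart rates, and read the pulse rate from the spectral peak) is given below; the synthetic "video", frame rate, and band edges are invented, and this is not the bSIPP platform itself:

      # Spatially average a skin ROI per frame, band-pass around plausible heart
      # rates, and take the spectral peak as the pulse rate (synthetic "video").
      import numpy as np
      from scipy.signal import butter, filtfilt

      fps, n_frames = 30.0, 600
      rng = np.random.default_rng(9)
      t = np.arange(n_frames) / fps
      pulse = 0.5 * np.sin(2 * np.pi * 1.2 * t)                      # 72 bpm pulsatile component
      video_roi = 100 + pulse[:, None, None] + rng.normal(scale=2.0, size=(n_frames, 20, 20))

      trace = video_roi.mean(axis=(1, 2))                            # spatial mean per frame
      trace -= trace.mean()

      b, a = butter(3, [0.7, 3.0], btype="bandpass", fs=fps)         # 42-180 bpm pass band
      filtered = filtfilt(b, a, trace)

      spec = np.abs(np.fft.rfft(filtered))
      freqs = np.fft.rfftfreq(n_frames, d=1 / fps)
      print("estimated heart rate:", 60 * freqs[np.argmax(spec)], "bpm")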

  6. Decoding Signal Processing at the Single-Cell Level

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiley, H. Steven

    The ability of cells to detect and decode information about their extracellular environment is critical to generating an appropriate response. In multicellular organisms, cells must decode dozens of signals from their neighbors and extracellular matrix to maintain tissue homeostasis while still responding to environmental stressors. How cells detect and process information from their surroundings through a surprisingly limited number of signal transduction pathways is one of the most important questions in biology. Despite many decades of research, many of the fundamental principles that underlie cell signal processing remain obscure. However, in this issue of Cell Systems, Gillies et al present compelling evidence that the early response gene circuit can act as a linear signal integrator, thus providing significant insight into how cells handle fluctuating signals and noise in their environment.

  7. Supervised Learning Based Hypothesis Generation from Biomedical Literature.

    PubMed

    Sang, Shengtian; Yang, Zhihao; Li, Zongyao; Lin, Hongfei

    2015-01-01

    Nowadays, the amount of biomedical literature is growing at an explosive speed, and much useful knowledge remains undiscovered in it. Researchers can form biomedical hypotheses by mining this literature. In this paper, we propose a supervised learning based approach to generate hypotheses from biomedical literature. This approach splits the traditional processing of hypothesis generation with the classic ABC model into an AB model and a BC model, which are constructed with a supervised learning method. Compared with concept co-occurrence and grammar-engineering-based approaches like SemRep, machine learning based models usually achieve better performance in information extraction (IE) from texts. By combining the two models, the approach then reconstructs the ABC model and generates biomedical hypotheses from the literature. The experimental results on the three classic Swanson hypotheses show that our approach outperforms the SemRep system.

  8. Analysis of acoustic emission signals and monitoring of machining processes

    PubMed

    Govekar; Gradisek; Grabec

    2000-03-01

    Monitoring of a machining process on the basis of sensor signals requires a selection of informative inputs in order to reliably characterize and model the process. In this article, a system for the selection of informative characteristics from the signals of multiple sensors is presented. For signal analysis, methods of spectral analysis and methods of nonlinear time series analysis are used. With the aim of modeling relationships between signal characteristics and the corresponding process state, an adaptive empirical modeler is applied. The application of the system is demonstrated by the characterization of different parameters defining the states of a turning machining process, such as chip form, tool wear, and the onset of chatter vibration. The results show that, in spite of the complexity of the turning process, the state of the process can be well characterized by just a few proper characteristics extracted from a representative sensor signal. The process characterization can be further improved by joining characteristics from multiple sensors and by applying chaotic characteristics.

  9. Removing Background Noise with Phased Array Signal Processing

    NASA Technical Reports Server (NTRS)

    Podboy, Gary; Stephens, David

    2015-01-01

    Preliminary results are presented from a test conducted to determine how well microphone phased array processing software could pull an acoustic signal out of background noise. The array consisted of 24 microphones in an aerodynamic fairing designed to be mounted in-flow. The processing was conducted using Functional Beamforming software developed by OptiNav combined with cross spectral matrix subtraction. The test was conducted in the free jet of the Nozzle Acoustic Test Rig at NASA GRC. The background noise was produced by the interaction of the free-jet flow with the solid surfaces in the flow. The acoustic signals were produced by acoustic drivers. The results show that the phased array processing was able to pull the acoustic signal out of the background noise provided the signal was no more than 20 dB below the background noise level measured using a conventional single microphone equipped with an aerodynamic forebody.
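
    A generic delay-and-sum sketch in Python illustrates what a microphone array buys: summing channels with the source's known delays reinforces the coherent signal while incoherent background noise partially averages out. The array size matches the 24 microphones mentioned above, but everything else is invented, and this is not the Functional Beamforming / cross-spectral-matrix-subtraction processing used in the test:

      # Delay-and-sum: align 24 channels on the known source delays and average;
      # the coherent tone adds up while incoherent noise averages down (toy data).
      import numpy as np

      fs, n_mics, n_samples = 50_000.0, 24, 5000
      rng = np.random.default_rng(10)
      t = np.arange(n_samples) / fs

      tone = 0.2 * np.sin(2 * np.pi * 2000 * t)                  # acoustic-driver signal
      delays = rng.integers(0, 40, size=n_mics)                  # per-microphone delays (samples)
      mics = np.array([np.roll(tone, d) for d in delays]) + rng.normal(size=(n_mics, n_samples))

      beamformed = np.mean([np.roll(mics[m], -delays[m]) for m in range(n_mics)], axis=0)

      def snr_db(x):
          coherent = np.dot(x, tone) / np.dot(tone, tone) * tone   # project onto the known tone
          return 10 * np.log10(np.sum(coherent ** 2) / np.sum((x - coherent) ** 2))

      print("single microphone SNR: %.1f dB" % snr_db(np.roll(mics[0], -delays[0])))
      print("24-microphone array SNR: %.1f dB" % snr_db(beamformed))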

  10. Processing oscillatory signals by incoherent feedforward loops

    NASA Astrophysics Data System (ADS)

    Zhang, Carolyn; Wu, Feilun; Tsoi, Ryan; Shats, Igor; You, Lingchong

    From the timing of amoeba development to the maintenance of stem cell pluripotency, many biological signaling pathways exhibit the ability to differentiate between pulsatile and sustained signals in the regulation of downstream gene expression. While networks underlying this signal decoding are diverse, many are built around a common motif, the incoherent feedforward loop (IFFL), where an input simultaneously activates an output and an inhibitor of the output. With appropriate parameters, this motif can generate temporal adaptation, where the system is desensitized to a sustained input. This property serves as the foundation for distinguishing signals with varying temporal profiles. Here, we use quantitative modeling to examine another property of IFFLs, the ability to process oscillatory signals. Our results indicate that the system's ability to translate pulsatile dynamics is limited by two constraints. The kinetics of IFFL components dictate the input range for which the network can decode pulsatile dynamics. In addition, a match between the network parameters and signal characteristics is required for optimal “counting”. We elucidate one potential mechanism by which information processing occurs in natural networks, with implications in the design of synthetic gene circuits for this purpose. This work was partially supported by the National Science Foundation Graduate Research Fellowship (CZ).
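
    A minimal ODE sketch of an IFFL in Python shows the adaptation property described above: the input activates both the output and an inhibitor of the output, so a sustained step produces a transient pulse that then decays. The rate constants and Hill-type repression term are invented for illustration, not parameters from this work:

      # Incoherent feedforward loop: input X activates output Z and inhibitor Y of Z;
      # a sustained step in X gives a transient pulse in Z that then adapts (toy rates).
      import numpy as np
      from scipy.integrate import odeint

      def iffl(state, t):
          y, z = state
          x = 1.0 if t > 5.0 else 0.1                       # sustained step input at t = 5
          dy = 1.0 * x - 0.5 * y                            # X activates the inhibitor Y
          dz = 2.0 * x / (1.0 + (y / 0.2) ** 2) - 1.0 * z   # X activates Z, Y represses Z
          return [dy, dz]

      t = np.linspace(0, 40, 2000)
      y, z = odeint(iffl, [0.2, 0.1], t).T
      print("peak output:", z.max(), " adapted output:", z[-1])   # pulse, then adapts to a low level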

  11. Processing Oscillatory Signals by Incoherent Feedforward Loops

    PubMed Central

    Zhang, Carolyn; You, Lingchong

    2016-01-01

    From the timing of amoeba development to the maintenance of stem cell pluripotency, many biological signaling pathways exhibit the ability to differentiate between pulsatile and sustained signals in the regulation of downstream gene expression. While the networks underlying this signal decoding are diverse, many are built around a common motif, the incoherent feedforward loop (IFFL), where an input simultaneously activates an output and an inhibitor of the output. With appropriate parameters, this motif can exhibit temporal adaptation, where the system is desensitized to a sustained input. This property serves as the foundation for distinguishing input signals with varying temporal profiles. Here, we use quantitative modeling to examine another property of IFFLs—the ability to process oscillatory signals. Our results indicate that the system’s ability to translate pulsatile dynamics is limited by two constraints. The kinetics of the IFFL components dictate the input range for which the network is able to decode pulsatile dynamics. In addition, a match between the network parameters and input signal characteristics is required for optimal “counting”. We elucidate one potential mechanism by which information processing occurs in natural networks, and our work has implications in the design of synthetic gene circuits for this purpose. PMID:27623175

  12. Community outreach at biomedical research facilities.

    PubMed

    Goldman, M; Hedetniemi, J N; Herbert, E R; Sassaman, J S; Walker, B C

    2000-12-01

    For biomedical researchers to fulfill their responsibility for protecting the environment, they must do more than meet the scientific challenge of reducing the number and volume of hazardous materials used in their laboratories and the engineering challenge of reducing pollution and shifting to cleaner energy sources. They must also meet the public relations challenge of informing and involving their neighbors in these efforts. The experience of the Office of Community Liaison of the National Institutes of Health (NIH) in meeting the latter challenge offers a model and several valuable lessons for other biomedical research facilities to follow. This paper is based on presentations by an expert panel during the Leadership Conference on Biomedical Research and the Environment held 1--2 November 1999 at NIH, Bethesda, Maryland. The risks perceived by community members are often quite different from those identified by officials at the biomedical research facility. The best antidote for misconceptions is more and better information. If community organizations are to be informed participants in the decision-making process, they need a simple but robust mechanism for identifying and evaluating the environmental hazards in their community. Local government can and should be an active and fully informed partner in planning and emergency preparedness. In some cases this can reduce the regulatory burden on the biomedical research facility. In other cases it might simplify and expedite the permitting process or help the facility disseminate reliable information to the community. When a particular risk, real or perceived, is of special concern to the community, community members should be involved in the design, implementation, and evaluation of targeted risk assessment activities. Only by doing so will the community have confidence in the results of those activities. NIH has involved community members in joint efforts to deal with topics as varied as recycling and soil

  13. SIG: a general-purpose signal processing program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lager, D.; Azevedo, S.

    1986-02-01

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. It also accommodates other representations for data such as transfer function polynomials. Signal processing operations include digital filtering, auto/cross spectral density, transfer function/impulse response, convolution, Fourier transform, and inverse Fourier transform. Graphical operations provide display of signals and spectra, including plotting, cursor zoom, families of curves, and multiple viewport plots. SIG provides two user interfaces with a menu mode for occasional users and a command mode for more experienced users. Capability exists for multiple commands per line, command files with arguments, commenting lines, defining commands, automatic execution for each item in a repeat sequence, etc. SIG is presently available for VAX(VMS), VAX (BERKELEY 4.2 UNIX), SUN (BERKELEY 4.2 UNIX), DEC-20 (TOPS-20), LSI-11/23 (TSX), and DEC PRO 350 (TSX). 4 refs., 2 figs.
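
    SIG itself is a legacy VAX/DEC-era package; purely as a present-day illustration of the class of operations it lists (digital filtering, auto spectral density, Fourier transform), here is a short NumPy/SciPy sketch on an invented test signal.

        import numpy as np
        from scipy import signal

        fs = 1000.0                                                # sample rate (Hz)
        t = np.arange(0, 2.0, 1 / fs)
        x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)   # 50 Hz tone in noise

        b, a = signal.butter(4, [40, 60], btype="bandpass", fs=fs) # digital band-pass filter
        y = signal.filtfilt(b, a, x)

        f, Pxx = signal.welch(y, fs=fs, nperseg=512)               # auto spectral density
        Y = np.fft.rfft(y)                                         # Fourier transform
        print("PSD peak near", round(f[np.argmax(Pxx)], 1), "Hz")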

  14. NOBLE - Flexible concept recognition for large-scale biomedical natural language processing.

    PubMed

    Tseytlin, Eugene; Mitchell, Kevin; Legowski, Elizabeth; Corrigan, Julia; Chavan, Girish; Jacobson, Rebecca S

    2016-01-14

    Natural language processing (NLP) applications are increasingly important in biomedical data analysis, knowledge engineering, and decision support. Concept recognition is an important component task for NLP pipelines, and can be either general-purpose or domain-specific. We describe a novel, flexible, and general-purpose concept recognition component for NLP pipelines, and compare its speed and accuracy against five commonly used alternatives on both a biological and clinical corpus. NOBLE Coder implements a general algorithm for matching terms to concepts from an arbitrary vocabulary set. The system's matching options can be configured individually or in combination to yield specific system behavior for a variety of NLP tasks. The software is open source, freely available, and easily integrated into UIMA or GATE. We benchmarked speed and accuracy of the system against the CRAFT and ShARe corpora as reference standards and compared it to MMTx, MGrep, Concept Mapper, cTAKES Dictionary Lookup Annotator, and cTAKES Fast Dictionary Lookup Annotator. We describe key advantages of the NOBLE Coder system and associated tools, including its greedy algorithm, configurable matching strategies, and multiple terminology input formats. These features provide unique functionality when compared with existing alternatives, including state-of-the-art systems. On two benchmarking tasks, NOBLE's performance exceeded commonly used alternatives, performing almost as well as the most advanced systems. Error analysis revealed differences in error profiles among systems. NOBLE Coder is comparable to other widely used concept recognition systems in terms of accuracy and speed. Advantages of NOBLE Coder include its interactive terminology builder tool, ease of configuration, and adaptability to various domains and tasks. NOBLE provides a term-to-concept matching system suitable for general concept recognition in biomedical NLP pipelines.

  15. Segregation of biomedical waste in a South Indian tertiary care hospital

    PubMed Central

    Sengodan, Vetrivel Chezian

    2014-01-01

    Introduction: Hospital waste poses a significant public health hazard if not properly managed. Hence, it is necessary to develop and adopt optimal waste management systems in hospitals. Material and method: Biomedical waste generated in Coimbatore Medical College Hospital was color coded (blue, yellow, and red) and the data were analyzed retrospectively on a daily basis for 3 years (January 2010-December 2012). Results: Effective segregation protocols significantly reduced the biomedical waste generated from 2011 to 2012. The red category accounted for the largest share of biomedical waste (>50%), while the yellow category was the smallest. The total biomedical waste generated per unit (per bed per day) was 68.5, 68.8, and 61.3 grams in 2010, 2011, and 2012, respectively. Discussion: Segregation of biomedical waste at the source of generation is the first and essential step in biomedical waste management. Continuous training, assigning responsibility to nursing staff, and constant supervision are the key criteria in implementing a biomedical waste segregation process, which can significantly reduce the biomedical waste generated per unit. Conclusion: We highly recommend that all hospitals adopt our protocol and implement it effectively to reduce the generation of biomedical waste. PMID:25097419

  16. Demodulation signal processing in multiphoton imaging

    NASA Astrophysics Data System (ADS)

    Fisher, Walter G.; Wachter, Eric A.; Piston, David W.

    2002-06-01

    Multiphoton laser scanning microscopy offers numerous advantages, but sensitivity can be seriously affected by contamination from ambient room light. Typically, this forces experiments to be performed in an absolutely dark room. Since mode-locked lasers are used to generate detectable signals, signal processing can be used to avoid such problems by taking advantage of the pulsed characteristics of such lasers. Demodulation of the fluorescence signal generated at the mode-locked frequency can result in a significant reduction of interference from ambient noise sources. Such demodulation can be readily adapted to existing microscopes by inserting appropriate processor circuitry between the detector and the data collection system, yielding a more robust microscope.
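
    A minimal software sketch of the demodulation idea (the paper inserts dedicated processor circuitry between the detector and the data collection system rather than code like this): mix the detected signal with quadrature references at the mode-lock frequency and low-pass filter, so ambient light far from that frequency is rejected. The sample rate, mode-lock frequency, and light levels below are invented.

        import numpy as np
        from scipy import signal

        fs, f_mod = 400e6, 80e6                      # sample rate and mode-lock frequency (assumed)
        t = np.arange(0, 100e-6, 1 / fs)
        fluor = 0.2 * (1 + np.cos(2 * np.pi * f_mod * t))     # fluorescence modulated at f_mod
        room = 0.5 * (1 + np.sin(2 * np.pi * 120 * t))        # slowly varying ambient light
        detector = fluor + room + 0.05 * np.random.randn(t.size)

        sos = signal.butter(4, 1e6, fs=fs, output="sos")      # low-pass applied after mixing
        i = signal.sosfiltfilt(sos, detector * np.cos(2 * np.pi * f_mod * t))
        q = signal.sosfiltfilt(sos, detector * np.sin(2 * np.pi * f_mod * t))
        amplitude = 2 * np.sqrt(i**2 + q**2)                  # demodulated fluorescence amplitude
        print("recovered modulated component ~", round(float(amplitude.mean()), 3))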

  17. Real-time processing of EMG signals for bionic arm purposes

    NASA Astrophysics Data System (ADS)

    Olid Dominguez, Ferran; Wawrzyniak, Zbigniew M.

    2016-09-01

    This paper addresses the problem of prostheses, which have always been a necessity for human beings. Bio-physiological signals from muscles (electromyographic signals) were collected, analyzed, and processed in order to implement a real-time algorithm capable of differentiating two states of a bionic hand: open and closed. An algorithm for real-time electromyographic signal processing with almost no false positives is presented, and it is shown that proper signal processing is of great importance in bio-physiological experiments.
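
    The following is a hedged sketch of the kind of real-time open/closed decision described, not the authors' algorithm: full-wave rectification, a causal moving-average envelope, and a hysteresis threshold to limit false positives. The window length, thresholds, and synthetic EMG are assumptions.

        import numpy as np

        def emg_state_stream(emg, fs, win_ms=200, on_thr=0.15, off_thr=0.08):
            win = int(fs * win_ms / 1000)
            buf = np.zeros(win)                     # circular buffer for the causal envelope
            state = "open"
            for i, sample in enumerate(emg):
                buf[i % win] = abs(sample)          # full-wave rectification
                envelope = buf.mean()
                if state == "open" and envelope > on_thr:
                    state = "closed"
                elif state == "closed" and envelope < off_thr:
                    state = "open"                  # hysteresis suppresses chattering
                yield state

        fs = 1000
        emg = np.concatenate([0.02 * np.random.randn(fs),      # 1 s of rest
                              0.40 * np.random.randn(fs)])     # 1 s of strong contraction
        states = list(emg_state_stream(emg, fs))
        print("state at 0.5 s:", states[fs // 2], "| state at 1.5 s:", states[fs + fs // 2])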

  18. [Big data, medical language and biomedical terminology systems].

    PubMed

    Schulz, Stefan; López-García, Pablo

    2015-08-01

    A variety of rich terminology systems, such as thesauri, classifications, nomenclatures and ontologies support information and knowledge processing in health care and biomedical research. Nevertheless, human language, manifested as individually written texts, persists as the primary carrier of information, in the description of disease courses or treatment episodes in electronic medical records, and in the description of biomedical research in scientific publications. In the context of the discussion about big data in biomedicine, we hypothesize that the abstraction of the individuality of natural language utterances into structured and semantically normalized information facilitates the use of statistical data analytics to distil new knowledge out of textual data from biomedical research and clinical routine. Computerized human language technologies are constantly evolving and are increasingly ready to annotate narratives with codes from biomedical terminology. However, this depends heavily on linguistic and terminological resources. The creation and maintenance of such resources is labor-intensive. Nevertheless, it is sensible to assume that big data methods can be used to support this process. Examples include the learning of hierarchical relationships, the grouping of synonymous terms into concepts and the disambiguation of homonyms. Although clear evidence is still lacking, the combination of natural language technologies, semantic resources, and big data analytics is promising.

  19. Cancer systems biology: signal processing for cancer research

    PubMed Central

    Yli-Harja, Olli; Ylipää, Antti; Nykter, Matti; Zhang, Wei

    2011-01-01

    In this editorial we introduce the research paradigms of signal processing in the era of systems biology. Signal processing is a field of science traditionally focused on modeling electronic and communications systems, but recently it has turned to biological applications with astounding results. The essence of signal processing is to describe the natural world by mathematical models and then, based on these models, develop efficient computational tools for solving engineering problems. Here, we underline, with examples, the endless possibilities which arise when the battle-hardened tools of engineering are applied to solve the problems that have tormented cancer researchers. Based on this approach, a new field has emerged, called cancer systems biology. Despite its short history, cancer systems biology has already produced several success stories tackling previously impracticable problems. Perhaps most importantly, it has been accepted as an integral part of the major endeavors of cancer research, such as analyzing the genomic and epigenomic data produced by The Cancer Genome Atlas (TCGA) project. Finally, we show that signal processing and cancer research, two fields that are seemingly distant from each other, have merged into a field that is indeed more than the sum of its parts. PMID:21439242

  20. Signal-processing theory for the TurboRogue receiver

    NASA Technical Reports Server (NTRS)

    Thomas, J. B.

    1995-01-01

    Signal-processing theory for the TurboRogue receiver is presented. The signal form is traced from its formation at the GPS satellite, to the receiver antenna, and then through the various stages of the receiver, including extraction of phase and delay. The analysis treats the effects of ionosphere, troposphere, signal quantization, receiver components, and system noise, covering processing in both the 'code mode' when the P code is not encrypted and in the 'P-codeless mode' when the P code is encrypted. As a possible future improvement to the current analog front end, an example of a highly digital front end is analyzed.

  1. Polarization-insensitive techniques for optical signal processing

    NASA Astrophysics Data System (ADS)

    Salem, Reza

    2006-12-01

    This thesis investigates polarization-insensitive methods for optical signal processing. Two signal processing techniques are studied: clock recovery based on two-photon absorption in silicon and demultiplexing based on cross-phase modulation in highly nonlinear fiber. The clock recovery system is tested at an 80 Gb/s data rate for both back-to-back and transmission experiments. The demultiplexer is tested at a 160 Gb/s data rate in a back-to-back experiment. We experimentally demonstrate methods for eliminating polarization dependence in both systems. Our experimental results are confirmed by theoretical and numerical analysis.

  2. Introduction to Radar Signal and Data Processing: The Opportunity

    DTIC Science & Technology

    2006-09-01

    Keywords: radar signal processing, data processing, adaptivity, space-time adaptive processing, knowledge-based systems, CFAR. This paper introduces the lecture series dedicated to knowledge-based radar signal and data processing. Knowledge-based expert system (KBS) is in the realm of

  3. Advances in targeted proteomics and applications to biomedical research

    PubMed Central

    Shi, Tujin; Song, Ehwang; Nie, Song; Rodland, Karin D.; Liu, Tao; Qian, Wei-Jun; Smith, Richard D.

    2016-01-01

    The targeted proteomics technique has emerged as a powerful protein quantification tool in systems biology and biomedical research, and increasingly for clinical applications. The most widely used targeted proteomics approach, selected reaction monitoring (SRM), also known as multiple reaction monitoring (MRM), can be used for quantification of cellular signaling networks and preclinical verification of candidate protein biomarkers. As an extension of our previous review on advances in SRM sensitivity, herein we review recent advances in the methods and technology for further enhancing SRM sensitivity (from 2012 to present) and highlight its broad biomedical applications in human bodily fluids, tissues, and cell lines. Furthermore, we also review two recently introduced targeted proteomics approaches, parallel reaction monitoring (PRM) and data-independent acquisition (DIA) with targeted data extraction on fast-scanning high-resolution accurate-mass (HR/AM) instruments. Such HR/AM targeted quantification, which monitors all target product ions, effectively addresses the SRM limitations in specificity and multiplexing, whereas, compared to SRM, PRM and DIA are still in their infancy with a limited number of applications. Thus, for HR/AM targeted quantification we focus our discussion on method development, data processing and analysis, and its advantages and limitations in targeted proteomics. Finally, general perspectives on the potential of achieving both high sensitivity and high sample throughput for large-scale quantification of hundreds of target proteins are discussed. PMID:27302376

  4. Optical Signal Processing: Poisson Image Restoration and Shearing Interferometry

    NASA Technical Reports Server (NTRS)

    Hong, Yie-Ming

    1973-01-01

    Optical signal processing can be performed in either digital or analog systems. Digital computers and coherent optical systems are discussed as they are used in optical signal processing. Topics include: image restoration; phase-object visualization; image contrast reversal; optical computation; image multiplexing; and fabrication of spatial filters. Digital optical data processing deals with restoration of images degraded by signal-dependent noise. When the input data of an image restoration system are the numbers of photoelectrons received from various areas of a photosensitive surface, the data are Poisson distributed with mean values proportional to the illuminance of the incoherently radiating object and background light. Optical signal processing using coherent optical systems is also discussed. Following a brief review of the pertinent details of Ronchi's diffraction grating interferometer, moire effect, carrier-frequency photography, and achromatic holography, two new shearing interferometers based on them are presented. Both interferometers can produce variable shear.

  5. Algebraic signal processing theory: 2-D spatial hexagonal lattice.

    PubMed

    Püschel, Markus; Rötteler, Martin

    2007-06-01

    We develop the framework for signal processing on a spatial, or undirected, 2-D hexagonal lattice for both an infinite and a finite array of signal samples. This framework includes the proper notions of z-transform, boundary conditions, filtering or convolution, spectrum, frequency response, and Fourier transform. In the finite case, the Fourier transform is called discrete triangle transform. Like the hexagonal lattice, this transform is nonseparable. The derivation of the framework makes it a natural extension of the algebraic signal processing theory that we recently introduced. Namely, we construct the proper signal models, given by polynomial algebras, bottom-up from a suitable definition of hexagonal space shifts using a procedure provided by the algebraic theory. These signal models, in turn, then provide all the basic signal processing concepts. The framework developed in this paper is related to Mersereau's early work on hexagonal lattices in the same way as the discrete cosine and sine transforms are related to the discrete Fourier transform-a fact that will be made rigorous in this paper.

  6. New methods of multimode fiber interferometer signal processing

    NASA Astrophysics Data System (ADS)

    Vitrik, Oleg B.; Kulchin, Yuri N.; Maxaev, Oleg G.; Kirichenko, Oleg V.; Kamenev, Oleg T.; Petrov, Yuri S.

    1995-06-01

    New methods for processing the signals of multimode fiber interferometers are suggested. For the scheme of a single-fiber multimode interferometer with two excited modes, a method based on a special fiber unit is developed. This unit provides interaction of the modes and subsequent spatial filtering of the summed optical field. As a result, the amplitude of the output signal is modulated by the external influence on the interferometer. Stabilization of the interferometer sensitivity is achieved by applying an additional modulation to the output signal. For the scheme of a single-fiber multimode interferometer with excitation of a wide mode spectrum, the intermode interference signal is registered by a photodiode matrix and a special electronic unit then performs correlation processing. To eliminate temperature destabilization, the registered signal is adapted to temperature-induced changes of the interferometer's optical signal. The parameters achieved for the double-mode scheme are: temporal stability of 0.6% per hour and sensitivity to interferometer length deviations of 3.2 nm; for the multimode scheme: temperature stability of 0.5% per K, temporal instability of 0.2% per hour, sensitivity to interferometer length deviations of 20 nm, and a dynamic range of 35 dB.

  7. pySPACE-a signal processing and classification environment in Python.

    PubMed

    Krell, Mario M; Straube, Sirko; Seeland, Anett; Wöhrle, Hendrik; Teiwes, Johannes; Metzen, Jan H; Kirchner, Elsa A; Kirchner, Frank

    2013-01-01

    In neuroscience large amounts of data are recorded to provide insights into cerebral information processing and function. The successful extraction of the relevant signals becomes more and more challenging due to increasing complexities in acquisition techniques and questions addressed. Here, automated signal processing and machine learning tools can help to process the data, e.g., to separate signal and noise. With the presented software pySPACE (http://pyspace.github.io/pyspace), signal processing algorithms can be compared and applied automatically on time series data, either with the aim of finding a suitable preprocessing, or of training supervised algorithms to classify the data. pySPACE originally has been built to process multi-sensor windowed time series data, like event-related potentials from the electroencephalogram (EEG). The software provides automated data handling, distributed processing, modular build-up of signal processing chains and tools for visualization and performance evaluation. Included in the software are various algorithms like temporal and spatial filters, feature generation and selection, classification algorithms, and evaluation schemes. Further, interfaces to other signal processing tools are provided and, since pySPACE is a modular framework, it can be extended with new algorithms according to individual needs. In the presented work, the structural hierarchies are described. It is illustrated how users and developers can interface the software and execute offline and online modes. Configuration of pySPACE is realized with the YAML format, so that programming skills are not mandatory for usage. The concept of pySPACE is to have one comprehensive tool that can be used to perform complete signal processing and classification tasks. It further allows users to define their own algorithms, or to integrate and use already existing libraries.
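
    The snippet below is not pySPACE and does not use its API; it only illustrates, with SciPy and scikit-learn stand-ins, the kind of node chain the paper describes (temporal filter, feature generation, classification) applied to windowed multi-channel data. The data, band limits, and the injected class difference are synthetic assumptions.

        import numpy as np
        from scipy import signal
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(0)
        fs, n_channels, n_trials = 100, 8, 60
        X_raw = rng.standard_normal((n_trials, n_channels, fs))   # 1-s EEG-like windows
        y = rng.integers(0, 2, n_trials)                          # binary labels
        X_raw[y == 1, 0, :] += np.sin(2 * np.pi * 10 * np.arange(fs) / fs)  # class-specific 10 Hz rhythm

        b, a = signal.butter(4, [1, 12], btype="bandpass", fs=fs) # temporal filter "node"
        X_filt = signal.filtfilt(b, a, X_raw, axis=-1)
        features = np.log(X_filt.var(axis=-1))                    # band-power feature "node"

        clf = make_pipeline(StandardScaler(), LinearSVC())        # classification "node"
        clf.fit(features[:40], y[:40])
        print("held-out accuracy:", clf.score(features[40:], y[40:]))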

  8. Seismic signal processing on heterogeneous supercomputers

    NASA Astrophysics Data System (ADS)

    Gokhberg, Alexey; Ermert, Laura; Fichtner, Andreas

    2015-04-01

    The processing of seismic signals - including the correlation of massive ambient noise data sets - represents an important part of a wide range of seismological applications. It is characterized by large data volumes as well as high computational and input/output intensity. Development of efficient approaches towards seismic signal processing on emerging high performance computing systems is therefore essential. Heterogeneous supercomputing systems introduced in recent years provide numerous computing nodes interconnected via high throughput networks, every node containing a mix of processing elements of different architectures, like several sequential processor cores and one or a few graphical processing units (GPU) serving as accelerators. A typical representative of such computing systems is "Piz Daint", a supercomputer of the Cray XC 30 family operated by the Swiss National Supercomputing Center (CSCS), which we used in this research. Heterogeneous supercomputers offer an opportunity for a manifold increase in application performance and are more energy-efficient; however, they have much higher hardware complexity and are therefore much more difficult to program. The programming effort may be substantially reduced by the introduction of modular libraries of software components that can be reused for a wide class of seismology applications. The ultimate goal of this research is the design of a prototype for such a library, suitable for implementing various seismic signal processing applications on heterogeneous systems. As a representative use case we have chosen an ambient noise correlation application. Ambient noise interferometry has developed into one of the most powerful tools to image and monitor the Earth's interior. Future applications will require the extraction of increasingly small details from noise recordings. To meet this demand, more advanced correlation techniques combined with very large data volumes are needed. This poses new computational problems that
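
    The sketch below shows the small numerical kernel at the heart of ambient noise interferometry (the part such applications spend most of their time on): frequency-domain cross-correlation of two station records. The station data, sampling rate, and the injected 2-second delay are synthetic assumptions; a production code would add windowing, whitening, and stacking.

        import numpy as np

        def noise_crosscorrelation(a, b, max_lag):
            n = len(a) + len(b) - 1
            nfft = 1 << (n - 1).bit_length()                     # next power of two for the FFT
            A, B = np.fft.rfft(a, nfft), np.fft.rfft(b, nfft)
            cc = np.fft.irfft(A * np.conj(B), nfft)
            return np.concatenate((cc[-max_lag:], cc[:max_lag + 1]))   # lags -max_lag..+max_lag

        fs = 20.0
        rng = np.random.default_rng(1)
        source = rng.standard_normal(20000)                      # common noise wavefield
        sta1 = source + 0.5 * rng.standard_normal(20000)
        sta2 = np.roll(source, 40) + 0.5 * rng.standard_normal(20000)   # arrives 2 s later
        max_lag = int(10 * fs)
        cc = noise_crosscorrelation(sta2, sta1, max_lag)
        print("sta2 lags sta1 by", (np.argmax(cc) - max_lag) / fs, "s")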

  9. Digital Signal Processing Methods for Ultrasonic Echoes.

    PubMed

    Sinding, Kyle; Drapaca, Corina; Tittmann, Bernhard

    2016-04-28

    Digital signal processing has become an important component of data analysis needed in industrial applications. In particular, for ultrasonic thickness measurements the signal to noise ratio plays a major role in the accurate calculation of the arrival time. For this application a band pass filter is not sufficient since the noise level cannot be significantly decreased such that a reliable thickness measurement can be performed. This paper demonstrates the abilities of two regularization methods - total variation and Tikhonov - to filter acoustic and ultrasonic signals. Both of these methods are compared to frequency-based filtering for digitally produced signals as well as signals produced by ultrasonic transducers. This paper demonstrates the ability of the total variation and Tikhonov filters to accurately recover signals from noisy acoustic signals faster than a band pass filter. Furthermore, the total variation filter has been shown to reduce the noise of a signal significantly for signals with clear ultrasonic echoes. Signal to noise ratios have been increased by over 400% by using a simple parameter optimization. While frequency-based filtering is efficient for specific applications, this paper shows that the reduction of noise in ultrasonic systems can be much more efficient with regularization methods.
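
    As a hedged sketch of one of the two regularization approaches compared (the Tikhonov case; total variation needs an iterative solver and is omitted here), the snippet below minimizes ||y - x||^2 + lam*||Dx||^2 with a first-difference operator D, which reduces to a sparse linear solve. The echo waveform, noise level, and lam are illustrative assumptions.

        import numpy as np
        from scipy.sparse import identity, diags
        from scipy.sparse.linalg import spsolve

        def tikhonov_denoise(y, lam=20.0):
            n = len(y)
            D = diags([1.0, -1.0], [0, 1], shape=(n - 1, n))   # first-difference operator
            A = identity(n) + lam * (D.T @ D)                  # normal equations of the quadratic cost
            return spsolve(A.tocsc(), y)

        fs = 100e6
        t = np.arange(0, 10e-6, 1 / fs)
        echo = np.exp(-((t - 4e-6) / 1e-6) ** 2) * np.sin(2 * np.pi * 2e6 * t)   # toy ultrasonic echo
        noisy = echo + 0.4 * np.random.randn(t.size)
        clean = tikhonov_denoise(noisy)
        print("rms error, noisy vs. Tikhonov-filtered:",
              round(float(np.std(noisy - echo)), 3), round(float(np.std(clean - echo)), 3))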

  10. Radioastronomic signal processing cores for the SKA radio telescope

    NASA Astrophysics Data System (ADS)

    Comoretto, G.; Chiarucci, S.; Belli, C.

    Modern radio telescopes require the processing of wideband signals, with sample rates from tens of MHz to tens of GHz, and are composed of hundreds up to a million individual antennas. Digital signal processing of these signals includes digital receivers (the digital equivalent of the heterodyne receiver), beamformers, channelizers, and spectrometers. FPGAs offer the advantage of relatively low power consumption compared to GPUs or dedicated computers, a wide signal data path, and high interconnectivity. Efficient algorithms have been developed for these applications. Here we will review some of the signal processing cores developed for the SKA telescope. The LFAA beamformer/channelizer architecture is based on an oversampling channelizer, where the channelizer output sampling rate and channel spacing can be set independently. This is useful where an overlap between adjacent channels is required to provide a uniform spectral coverage. The architecture allows for an efficient and distributed channelization scheme, with a final resolution corresponding to a million spectral channels, minimum leakage and high out-of-band rejection. An optimized filter design procedure is used to provide an equiripple response with a very large number of spectral channels. A wideband digital receiver has been designed in order to select the processed bandwidth of the SKA Mid receiver. The receiver extracts a 2.5 MHz bandwidth from a 14 GHz input bandwidth. The design allows for non-integer ratios between the input and output sampling rates, with a resource usage comparable to that of a conventional decimating digital receiver. Finally, some considerations on quantization of radioastronomic signals are presented. Due to the stochastic nature of the signal, quantization using few data bits is possible. Good accuracy and dynamic range are possible even with 2-3 bits, but the nonlinearity in the correlation process must be corrected in post-processing. With at least 6
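
    As a toy illustration of the polyphase filter-bank idea behind such channelizers, the sketch below implements a critically sampled DFT filter bank; the LFAA design discussed above is oversampled, with channel spacing and output rate chosen independently, which this sketch does not attempt. The sample rate, channel count, and test tone are not SKA numbers.

        import numpy as np
        from scipy.signal import firwin

        def channelize(x, M, taps_per_branch=8):
            h = firwin(M * taps_per_branch, 1.0 / M)              # prototype low-pass filter
            hp = h.reshape(taps_per_branch, M)                    # polyphase branches in columns
            xb = x[: (len(x) // M) * M].reshape(-1, M)[:, ::-1]   # commutate input into M branches
            n_out = xb.shape[0] - taps_per_branch + 1
            out = np.empty((n_out, M), dtype=complex)
            for n in range(n_out):
                window = xb[n:n + taps_per_branch][::-1]          # newest block against the first taps
                out[n] = np.fft.ifft(np.sum(window * hp, axis=0)) # branch FIRs + DFT across branches
            return out.T                                          # shape: (channel, coarse time)

        fs, M = 1024.0, 16
        t = np.arange(0, 4.0, 1 / fs)
        x = np.exp(2j * np.pi * 200.0 * t)                        # complex tone at 200 Hz
        power = np.abs(channelize(x, M)).mean(axis=1)
        k = int(np.argmax(power))
        print("strongest channel:", k, "centred at", k * fs / M, "Hz")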

  11. Liquid Argon TPC Signal Formation, Signal Processing and Hit Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baller, Bruce

    2017-03-11

    This document describes the early stage of the reconstruction chain that was developed for the ArgoNeuT and MicroBooNE experiments at Fermilab. These experiments study accelerator neutrino interactions that occur in a Liquid Argon Time Projection Chamber. Reconstructing the properties of particles produced in these interactions requires knowledge of the micro-physics processes that affect the creation and transport of ionization electrons to the readout system. A wire signal deconvolution technique was developed to convert wire signals to a standard form for hit reconstruction, to remove artifacts in the electronics chain and to remove coherent noise.
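
    The snippet below is a hedged illustration of the deconvolution step, not the ArgoNeuT/MicroBooNE code: the digitized wire waveform is modeled as ionization charge convolved with a field-plus-electronics response, and a regularized (Wiener-style) inverse filter in the frequency domain restores narrow unipolar pulses for hit finding. The response shape, noise level, and regularization constant are invented.

        import numpy as np
        from scipy.signal import find_peaks

        rng = np.random.default_rng(0)
        n = 2048
        t = np.arange(n)
        true_charge = np.zeros(n)
        true_charge[[400, 900, 1400]] = [10.0, 8.0, 9.0]          # charge arriving at three ticks

        response = np.exp(-t / 40.0) - np.exp(-t / 10.0)          # toy shaping response
        response /= response.max()
        measured = np.fft.irfft(np.fft.rfft(true_charge) * np.fft.rfft(response), n)
        measured += 0.1 * rng.standard_normal(n)                  # wire noise

        R = np.fft.rfft(response)
        inverse = np.conj(R) / (np.abs(R) ** 2 + 0.02)            # regularized inverse filter
        deconv = np.fft.irfft(np.fft.rfft(measured) * inverse, n)
        hits, _ = find_peaks(deconv, height=0.5 * deconv.max(), distance=20)
        print("true ticks: [400, 900, 1400]  recovered hit ticks:", list(hits))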

  12. Digital signal processing in the radio science stability analyzer

    NASA Technical Reports Server (NTRS)

    Greenhall, C. A.

    1995-01-01

    The Telecommunications Division has built a stability analyzer for testing Deep Space Network installations during flight radio science experiments. The low-frequency part of the analyzer operates by digitizing wave signals with bandwidths between 80 Hz and 45 kHz. Processed outputs include spectra of signal, phase, amplitude, and differential phase; time series of the same quantities; and Allan deviation of phase and differential phase. This article documents the digital signal-processing methods programmed into the analyzer.
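
    A compact sketch of one of the outputs listed, the overlapping Allan deviation computed from phase samples; the simulated phase record and the averaging times are illustrative, and this is not the analyzer's implementation.

        import numpy as np

        def allan_deviation(phase, tau0, m):
            """Overlapping Allan deviation of fractional frequency from phase data x[k]
            sampled every tau0 seconds, at averaging time tau = m * tau0."""
            x = np.asarray(phase, dtype=float)
            d2 = x[2 * m:] - 2 * x[m:-m] + x[:-2 * m]      # second differences of phase
            return np.sqrt(np.mean(d2 ** 2) / (2.0 * (m * tau0) ** 2))

        tau0 = 1.0                                          # one phase sample per second
        phase = np.cumsum(1e-12 * np.random.randn(100000))  # random-walk phase, in seconds
        for m in (1, 10, 100):
            print(f"tau = {m * tau0:5.0f} s   sigma_y = {allan_deviation(phase, tau0, m):.2e}")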

  13. Biomedical patents and ethics: a Canadian solution.

    PubMed

    Gold, E R

    2000-05-01

    World Trade Organization member states are preparing for the upcoming renegotiation of the Agreement on Trade-Related Aspects of Intellectual Property Rights. One of the important elements of that renegotiation is the ethical considerations regarding the patenting of higher life forms and their component parts (e.g. DNA and cell-lines). The interface between the genetic revolution, patentability, and ethical considerations is the subject of this article. The author identifies, explores, and critiques four possible positions Canada may adopt in respect of patentability of biomedical material. First, Canada could do nothing. This approach would mean keeping biomedical materials outside the patent system and outside the stream of commerce. Canada would simply wait for an international consensus to develop before adopting a position of its own. Second, Canada could go it alone. It could implement a policy that balances the incentive effects of patents with the need to incorporate ethical and social values into the decision-making process regarding the use of biomedical materials. In respect of this option, the author proposes a model whereby non-profit bodies would hold the exclusive rights to research, use, and exploit biomedical materials. Third, Canada could follow the United States, Europe, and Japan by providing for almost unrestricted patenting of biomedical materials. This would be the most industry-friendly alternative. The fourth and final option is to use the medicare system to promote discussion of ethical considerations involved in the use of biomedical materials. The power of provincial health agencies may be used as a lever to ensure the discussion of ethical considerations concerning the use of biomedical materials. The author concludes that the fourth and final option is the best alternative for Canada while waiting for an international consensus to emerge.

  14. Finding and accessing diagrams in biomedical publications.

    PubMed

    Kuhn, Tobias; Luong, ThaiBinh; Krauthammer, Michael

    2012-01-01

    Complex relationships in biomedical publications are often communicated by diagrams such as bar and line charts, which are a very effective way of summarizing and communicating multi-faceted data sets. Given the ever-increasing amount of published data, we argue that the precise retrieval of such diagrams is of great value for answering specific and otherwise hard-to-meet information needs. To this end, we demonstrate the use of advanced image processing and classification for identifying bar and line charts by the shape and relative location of the different image elements that make up the charts. With recall and precision close to 90% for the detection of relevant figures, we discuss the use of this technology in an existing biomedical image search engine, and outline how it enables new forms of literature queries over biomedical relationships that are represented in these charts.

  15. pySPACE—a signal processing and classification environment in Python

    PubMed Central

    Krell, Mario M.; Straube, Sirko; Seeland, Anett; Wöhrle, Hendrik; Teiwes, Johannes; Metzen, Jan H.; Kirchner, Elsa A.; Kirchner, Frank

    2013-01-01

    In neuroscience large amounts of data are recorded to provide insights into cerebral information processing and function. The successful extraction of the relevant signals becomes more and more challenging due to increasing complexities in acquisition techniques and questions addressed. Here, automated signal processing and machine learning tools can help to process the data, e.g., to separate signal and noise. With the presented software pySPACE (http://pyspace.github.io/pyspace), signal processing algorithms can be compared and applied automatically on time series data, either with the aim of finding a suitable preprocessing, or of training supervised algorithms to classify the data. pySPACE originally has been built to process multi-sensor windowed time series data, like event-related potentials from the electroencephalogram (EEG). The software provides automated data handling, distributed processing, modular build-up of signal processing chains and tools for visualization and performance evaluation. Included in the software are various algorithms like temporal and spatial filters, feature generation and selection, classification algorithms, and evaluation schemes. Further, interfaces to other signal processing tools are provided and, since pySPACE is a modular framework, it can be extended with new algorithms according to individual needs. In the presented work, the structural hierarchies are described. It is illustrated how users and developers can interface the software and execute offline and online modes. Configuration of pySPACE is realized with the YAML format, so that programming skills are not mandatory for usage. The concept of pySPACE is to have one comprehensive tool that can be used to perform complete signal processing and classification tasks. It further allows users to define their own algorithms, or to integrate and use already existing libraries. PMID:24399965

  16. ISLE (Image and Signal Processing LISP Environment) reference manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sherwood, R.J.; Searfus, R.M.

    1990-01-01

    ISLE is a rapid prototyping system for performing image and signal processing. It is designed to meet the needs of a person developing image and signal processing algorithms in a research environment. The image and signal processing modules in ISLE form a very capable package in themselves. They also provide a rich environment for quickly and easily integrating user-written software modules into the package. ISLE is well suited to applications in which there is a need to develop a processing algorithm in an interactive manner. It is straightforward to develop an algorithm, load it into ISLE, apply the algorithm to an image or signal, display the results, then modify the algorithm and repeat the develop-load-apply-display cycle. ISLE consists of a collection of image and signal processing modules integrated into a cohesive package through a standard command interpreter. The ISLE developers elected to concentrate their effort on developing image and signal processing software rather than developing a command interpreter. A COMMON LISP interpreter was selected for the command interpreter because it already has the features desired in a command interpreter, it supports dynamic loading of modules for customization purposes, it supports run-time parameter and argument type checking, it is very well documented, and it is a commercially supported product. This manual is intended to be a reference manual for the ISLE functions. The functions are grouped into a number of categories and briefly discussed in the Function Summary chapter. The full descriptions of the functions and all their arguments are given in the Function Descriptions chapter. 6 refs.

  17. Assessing the practice of biomedical ontology evaluation: Gaps and opportunities.

    PubMed

    Amith, Muhammad; He, Zhe; Bian, Jiang; Lossio-Ventura, Juan Antonio; Tao, Cui

    2018-04-01

    With the proliferation of heterogeneous health care data in the last three decades, biomedical ontologies and controlled biomedical terminologies play a more and more important role in knowledge representation and management, data integration, natural language processing, as well as decision support for health information systems and biomedical research. Biomedical ontologies and controlled terminologies are intended to assure interoperability. Nevertheless, the quality of biomedical ontologies has hindered their applicability and subsequent adoption in real-world applications. Ontology evaluation is an integral part of ontology development and maintenance. In the biomedicine domain, ontology evaluation is often conducted by third parties as a quality assurance (or auditing) effort that focuses on identifying modeling errors and inconsistencies. In this work, we first organized four categorical schemes of ontology evaluation methods in the existing literature to create an integrated taxonomy. Further, to understand the ontology evaluation practice in the biomedicine domain, we reviewed a sample of 200 ontologies from the National Center for Biomedical Ontology (NCBO) BioPortal-the largest repository for biomedical ontologies-and observed that only 15 of these ontologies have documented evaluation in their corresponding inception papers. We then surveyed the recent quality assurance approaches for biomedical ontologies and their use. We also mapped these quality assurance approaches to the ontology evaluation criteria. It is our anticipation that ontology evaluation and quality assurance approaches will be more widely adopted in the development life cycle of biomedical ontologies. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Topics in Biomedical Optics: Introduction

    NASA Astrophysics Data System (ADS)

    Hebden, Jeremy C.; Boas, David A.; George, John S.; Durkin, Anthony J.

    2003-06-01

    The field of biomedical optics is experiencing tremendous growth. Biomedical technologies contribute to the creation of devices used in various healthcare specialties (ophthalmology, cardiology, anesthesiology, immunology, etc.). Recent research in biomedical optics is discussed. Overviews of meetings held at the 2002 Optical Society of America Biomedical Topical Meetings are presented.

  19. Preliminary development of digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Stanley, W. D.

    1980-01-01

    Topics covered involve a number of closely related tasks including: the development of several control loop and dynamic noise model computer programs for simulating microwave radiometer measurements; computer modeling of an existing stepped frequency radiometer in an effort to determine its optimum operational characteristics; investigation of the classical second order analog control loop to determine its ability to reduce the estimation error in a microwave radiometer; investigation of several digital signal processing unit designs; initiation of efforts to develop required hardware and software for implementation of the digital signal processing unit; and investigation of the general characteristics and peculiarities of digitally processing noiselike microwave radiometer signals.

  20. The biomedical disciplines and the structure of biomedical and clinical knowledge.

    PubMed

    Nederbragt, H

    2000-11-01

    The relation between biomedical knowledge and clinical knowledge is discussed by comparing their respective structures. The knowledge of a disease as a biological phenomenon is constructed by the interaction of facts and theories from the main biomedical disciplines: epidemiology, diagnostics, clinical trials, therapy development and pathogenesis. Although these facts and theories are based on probabilities and extrapolations, the interaction provides a reliable and coherent structure, comparable to a Kuhnian paradigm. In the structure of clinical knowledge, i.e. knowledge of the patient with the disease, not only biomedical knowledge contributes to the structure but also economic and social relations, ethics and personal experience. However, the interaction between each of the participating "knowledges" in clinical knowledge is not based on mutual dependency and accumulation of different arguments from each, as in biomedical knowledge, but on competition and partial exclusion. Therefore, the structure of biomedical knowledge is different from that of clinical knowledge. This difference is used as the basis for a discussion in which the place of technology, evidence-based medicine and the gap between scientific and clinical knowledge are evaluated.

  1. A new visual navigation system for exploring biomedical Open Educational Resource (OER) videos

    PubMed Central

    Zhao, Baoquan; Xu, Songhua; Lin, Shujin; Luo, Xiaonan; Duan, Lian

    2016-01-01

    Objective Biomedical videos as open educational resources (OERs) are increasingly proliferating on the Internet. Unfortunately, seeking personally valuable content from among the vast corpus of quality yet diverse OER videos is nontrivial due to limitations of today’s keyword- and content-based video retrieval techniques. To address this need, this study introduces a novel visual navigation system that facilitates users’ information seeking from biomedical OER videos in mass quantity by interactively offering visual and textual navigational clues that are both semantically revealing and user-friendly. Materials and Methods The authors collected and processed around 25 000 YouTube videos, which collectively last for a total length of about 4000 h, in the broad field of biomedical sciences for our experiment. For each video, its semantic clues are first extracted automatically through computationally analyzing audio and visual signals, as well as text either accompanying or embedded in the video. These extracted clues are subsequently stored in a metadata database and indexed by a high-performance text search engine. During the online retrieval stage, the system renders video search results as dynamic web pages using a JavaScript library that allows users to interactively and intuitively explore video content both efficiently and effectively. Results The authors produced a prototype implementation of the proposed system, which is publicly accessible at https://patentq.njit.edu/oer. To examine the overall advantage of the proposed system for exploring biomedical OER videos, the authors further conducted a user study of a modest scale. The study results encouragingly demonstrate the functional effectiveness and user-friendliness of the new system for facilitating information seeking from and content exploration among massive biomedical OER videos. Conclusion Using the proposed tool, users can efficiently and effectively find videos of interest, precisely locate

  2. A new visual navigation system for exploring biomedical Open Educational Resource (OER) videos.

    PubMed

    Zhao, Baoquan; Xu, Songhua; Lin, Shujin; Luo, Xiaonan; Duan, Lian

    2016-04-01

    Biomedical videos as open educational resources (OERs) are increasingly proliferating on the Internet. Unfortunately, seeking personally valuable content from among the vast corpus of quality yet diverse OER videos is nontrivial due to limitations of today's keyword- and content-based video retrieval techniques. To address this need, this study introduces a novel visual navigation system that facilitates users' information seeking from biomedical OER videos in mass quantity by interactively offering visual and textual navigational clues that are both semantically revealing and user-friendly. The authors collected and processed around 25 000 YouTube videos, which collectively last for a total length of about 4000 h, in the broad field of biomedical sciences for our experiment. For each video, its semantic clues are first extracted automatically through computationally analyzing audio and visual signals, as well as text either accompanying or embedded in the video. These extracted clues are subsequently stored in a metadata database and indexed by a high-performance text search engine. During the online retrieval stage, the system renders video search results as dynamic web pages using a JavaScript library that allows users to interactively and intuitively explore video content both efficiently and effectively. The authors produced a prototype implementation of the proposed system, which is publicly accessible at https://patentq.njit.edu/oer. To examine the overall advantage of the proposed system for exploring biomedical OER videos, the authors further conducted a user study of a modest scale. The study results encouragingly demonstrate the functional effectiveness and user-friendliness of the new system for facilitating information seeking from and content exploration among massive biomedical OER videos. Using the proposed tool, users can efficiently and effectively find videos of interest, precisely locate video segments delivering personally valuable

  3. Evaluating the operational risks of biomedical waste using failure mode and effects analysis.

    PubMed

    Chen, Ying-Chu; Tsai, Pei-Yi

    2017-06-01

    The potential problems and risks of biomedical waste generation have become increasingly apparent in recent years. This study applied a failure mode and effects analysis to evaluate the operational problems and risks of biomedical waste. The microbiological contamination of biomedical waste seldom receives the attention of researchers. In this study, the biomedical waste lifecycle was divided into seven processes: Production, classification, packaging, sterilisation, weighing, storage, and transportation. Twenty main failure modes were identified in these phases and risks were assessed based on their risk priority numbers. The failure modes in the production phase accounted for the highest proportion of the risk priority number score (27.7%). In the packaging phase, the failure mode 'sharp articles not placed in solid containers' had the highest risk priority number score, mainly owing to its high severity rating. The sterilisation process is the main difference in the treatment of infectious and non-infectious biomedical waste. The failure modes in the sterilisation phase were mainly owing to human factors (mostly related to operators). This study increases the understanding of the potential problems and risks associated with biomedical waste, thereby increasing awareness of how to improve the management of biomedical waste to better protect workers, the public, and the environment.

  4. Cognitive and learning sciences in biomedical and health instructional design: A review with lessons for biomedical informatics education.

    PubMed

    Patel, Vimla L; Yoskowitz, Nicole A; Arocha, Jose F; Shortliffe, Edward H

    2009-02-01

    Theoretical and methodological advances in the cognitive and learning sciences can greatly inform curriculum and instruction in biomedicine and also educational programs in biomedical informatics. They do so by addressing issues such as the processes related to comprehension of medical information, clinical problem-solving and decision-making, and the role of technology. This paper reviews these theories and methods from the cognitive and learning sciences and their role in addressing current and future needs in designing curricula, largely using illustrative examples drawn from medical education. The lessons of this past work are also applicable, however, to biomedical and health professional curricula in general, and to biomedical informatics training, in particular. We summarize empirical studies conducted over two decades on the role of memory, knowledge organization and reasoning as well as studies of problem-solving and decision-making in medical areas that inform curricular design. The results of this research contribute to the design of more informed curricula based on empirical findings about how people learn and think, and more specifically, how expertise is developed. Similarly, the study of practice can also help to shape theories of human performance, technology-based learning, and scientific and professional collaboration that extend beyond the domain of medicine. Just as biomedical science has revolutionized health care practice, research in the cognitive and learning sciences provides a scientific foundation for education in biomedicine, the health professions, and biomedical informatics.

  5. New signal processing technique for density profile reconstruction using reflectometry.

    PubMed

    Clairet, F; Ricaud, B; Briolle, F; Heuraux, S; Bottereau, C

    2011-08-01

    Reflectometry profile measurement requires an accurate determination of the plasma reflected signal. Along with a good resolution and a high signal to noise ratio of the phase measurement, adequate data analysis is required. A new data processing method based on a time-frequency tomographic representation is used. It provides a clearer separation between multiple components and improves isolation of the relevant signals. In this paper, this data processing technique is applied to two sets of signals coming from two different reflectometer devices used on the Tore Supra tokamak. For the standard density profile reflectometry, it improves the initialization process and its reliability, providing a more accurate profile determination in the far scrape-off layer with density measurements as low as 10^16 m^-3. For a second reflectometer, which provides measurements in front of a lower hybrid launcher, this method improves the separation of the relevant plasma signal from multi-reflection processes due to the proximity of the plasma.
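
    The paper's method is a time-frequency tomographic representation; as a much simpler stand-in that conveys the idea of isolating the relevant component, the sketch below takes an ordinary spectrogram of a synthetic beat signal, masks out an invented multi-reflection component, and extracts the dominant beat-frequency ridge versus time. All frequencies and amplitudes are assumptions.

        import numpy as np
        from scipy import signal

        fs = 2e6
        t = np.arange(0, 5e-3, 1 / fs)
        beat = np.cos(2 * np.pi * (2e4 * t + 4e6 * t ** 2))   # beat frequency sweeping 20 -> 60 kHz
        spurious = 0.4 * np.cos(2 * np.pi * 3e5 * t)          # multi-reflection component to reject
        x = beat + spurious + 0.2 * np.random.randn(t.size)

        f, tt, S = signal.spectrogram(x, fs=fs, nperseg=1024, noverlap=768)
        band = f < 1e5                                        # band where the relevant beat lives
        ridge = f[band][np.argmax(S[band], axis=0)]           # dominant beat frequency vs. time
        print("beat frequency start/end (kHz):", round(ridge[0] / 1e3, 1), round(ridge[-1] / 1e3, 1))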

  6. Two-dimensional signal processing with application to image restoration

    NASA Technical Reports Server (NTRS)

    Assefi, T.

    1974-01-01

    A recursive technique for modeling and estimating a two-dimensional signal contaminated by noise is presented. A two-dimensional signal is assumed to be an undistorted picture, where the noise introduces the distortion. Both the signal and the noise are assumed to be wide-sense stationary processes with known statistics. Thus, to estimate the two-dimensional signal is to enhance the picture. The picture representing the two-dimensional signal is converted to one dimension by scanning the image horizontally one line at a time. The scanner output becomes a nonstationary random process due to the periodic nature of the scanner operation. Procedures to obtain a dynamical model corresponding to the autocorrelation function of the scanner output are derived. Utilizing the model, a discrete Kalman estimator is designed to enhance the image.
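
    A compact sketch of the estimator concept rather than the paper's exact nonstationary model: each scanned line is treated as a first-order Gauss-Markov (here random-walk) signal observed in white noise and estimated with a scalar discrete Kalman filter. The model coefficient, noise variances, and synthetic line are assumptions.

        import numpy as np

        def kalman_line(z, a=1.0, q=0.01, r=0.5):
            """Scalar Kalman filter for s[k] = a*s[k-1] + w,  z[k] = s[k] + v."""
            s_hat, p = 0.0, 1.0
            out = np.empty_like(z)
            for k, zk in enumerate(z):
                s_pred, p_pred = a * s_hat, a * a * p + q   # time update (prediction)
                K = p_pred / (p_pred + r)                   # Kalman gain
                s_hat = s_pred + K * (zk - s_pred)          # measurement update
                p = (1 - K) * p_pred
                out[k] = s_hat
            return out

        rng = np.random.default_rng(3)
        line = 1.0 + np.cumsum(rng.normal(0, 0.04, 512))    # smooth intensity profile of one scan line
        noisy = line + rng.normal(0, np.sqrt(0.5), 512)
        est = kalman_line(noisy)
        print("mean squared error, noisy vs. filtered:",
              round(float(np.mean((noisy - line) ** 2)), 3), round(float(np.mean((est - line) ** 2)), 3))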

  7. Simplified signal processing for impedance spectroscopy with spectrally sparse sequences

    NASA Astrophysics Data System (ADS)

    Annus, P.; Land, R.; Reidla, M.; Ojarand, J.; Mughal, Y.; Min, M.

    2013-04-01

    The classical method for measurement of the electrical bio-impedance involves excitation with a sinusoidal waveform. Sinusoidal excitation at fixed frequency points enables a wide variety of signal processing options, the most general of them being the Fourier transform. Multiplication with two quadrature waveforms at the desired frequency can easily be accomplished both in the analogue and in the digital domain; even the simplest quadrature square waves can be considered, which reduces the signal processing task in the analogue domain to synchronous switching followed by a low pass filter, and in the digital domain requires only additions. So-called spectrally sparse excitation sequences (SSS), which have recently been introduced into the bio-impedance measurement domain, are a very reasonable choice when simultaneous multifrequency excitation is required. They have many good properties, such as ease of generation and a good crest factor compared to similar multisinusoids. So far, the use of the discrete or fast Fourier transform in the signal processing step has typically been considered. The use of simplified methods would nevertheless reduce the computational burden and enable simpler, less costly, and less energy-hungry signal processing platforms. The accuracy of the measurement with SSS excitation when using different waveforms for quadrature demodulation will be compared in order to evaluate the feasibility of the simplified signal processing. A sigma-delta modulated sinusoid (a binary signal) is considered to be a good alternative for synchronous demodulation.
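
    The sketch below illustrates the simplification discussed: demodulating a measured response with square-wave quadrature references (sign changes and an average only) and comparing the result with ideal sinusoidal demodulation. The excitation frequency, amplitude, phase, and noise level are invented; note that square-wave references also respond to odd harmonics of the excitation, which the single-tone test signal used here does not contain.

        import numpy as np

        fs, f0, n_cycles = 1.0e6, 10e3, 200
        t = np.arange(int(fs / f0 * n_cycles)) / fs
        amp, phi = 0.75, np.deg2rad(30)                       # "unknown" response to recover
        v = amp * np.cos(2 * np.pi * f0 * t + phi) + 0.05 * np.random.randn(t.size)

        # sinusoidal quadrature demodulation (reference case)
        i_sin = 2 * np.mean(v * np.cos(2 * np.pi * f0 * t))
        q_sin = -2 * np.mean(v * np.sin(2 * np.pi * f0 * t))

        # square-wave quadrature demodulation: only sign flips and an average
        i_sq = (np.pi / 2) * np.mean(v * np.sign(np.cos(2 * np.pi * f0 * t)))
        q_sq = -(np.pi / 2) * np.mean(v * np.sign(np.sin(2 * np.pi * f0 * t)))
        # pi/2 = 2 / (4/pi): rescales for the square wave's fundamental amplitude

        for name, i, q in (("sine  ", i_sin, q_sin), ("square", i_sq, q_sq)):
            print(name, "amplitude:", round(float(np.hypot(i, q)), 3),
                  " phase (deg):", round(float(np.degrees(np.arctan2(q, i))), 1))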

  8. Getting to zero the biomedical way in Africa: outcomes of deliberation at the 2013 Biomedical HIV Prevention Forum in Abuja, Nigeria

    PubMed Central

    2014-01-01

    Background Over the last few decades, biomedical HIV prevention research has engaged multiple African stakeholders. There have however been few platforms to enable regional stakeholders to engage with one another. In partnership with the World AIDS Campaign International, the Institute of Public Health of Obafemi Awolowo University, and the National Agency for the Control of AIDS in Nigeria, the New HIV Vaccine and Microbicide Advocacy Society hosted a forum on biomedical HIV prevention research in Africa. Stakeholders present explored evidence related to biomedical HIV prevention research and development in Africa, and made recommendations to inform policy, guidelines and future research agendas. Discussion The BHPF hosted 342 participants. Topics discussed included the use of antiretrovirals for HIV prevention, considerations for biomedical HIV prevention among key populations; HIV vaccine development; HIV cure; community and civil society engagement; and ethical considerations in implementation of biomedical HIV prevention research. Participants identified challenges for implementation of proven efficacious interventions and discovery of other new prevention options for Africa. Concerns raised included limited funding by African governments, lack of cohesive advocacy and policy agenda for biomedical HIV prevention research and development by Africa, varied ethical practices, and limited support for communities’ capacity to actively engage with clinical trial conduct. Participants recommended that the African Government implement the Abuja +12 declaration; that civil society build stronger partnerships with diverse stakeholders, and develop a coherent advocacy agenda that also enhances community research literacy; and researchers and sponsors of trials on the African continent establish a process for determining appropriate standards for trial conduct on the continent. Conclusion By highlighting key considerations for biomedical HIV prevention research and

  9. Getting to zero the biomedical way in Africa: outcomes of deliberation at the 2013 Biomedical HIV Prevention Forum in Abuja, Nigeria.

    PubMed

    Folayan, Morenike Oluwatoyin; Gottemoeller, Megan; Mburu, Rosemary; Brown, Brandon

    2014-01-01

    Over the last few decades, biomedical HIV prevention research has engaged multiple African stakeholders. There have however been few platforms to enable regional stakeholders to engage with one another. In partnership with the World AIDS Campaign International, the Institute of Public Health of Obafemi Awolowo University, and the National Agency for the Control of AIDS in Nigeria, the New HIV Vaccine and Microbicide Advocacy Society hosted a forum on biomedical HIV prevention research in Africa. Stakeholders present explored evidence related to biomedical HIV prevention research and development in Africa, and made recommendations to inform policy, guidelines and future research agendas. The BHPF hosted 342 participants. Topics discussed included the use of antiretrovirals for HIV prevention, considerations for biomedical HIV prevention among key populations; HIV vaccine development; HIV cure; community and civil society engagement; and ethical considerations in implementation of biomedical HIV prevention research. Participants identified challenges for implementation of proven efficacious interventions and discovery of other new prevention options for Africa. Concerns raised included limited funding by African governments, lack of cohesive advocacy and policy agenda for biomedical HIV prevention research and development by Africa, varied ethical practices, and limited support for communities' capacity to actively engage with clinical trial conduct. Participants recommended that the African Government implement the Abuja +12 declaration; that civil society build stronger partnerships with diverse stakeholders, and develop a coherent advocacy agenda that also enhances community research literacy; and researchers and sponsors of trials on the African continent establish a process for determining appropriate standards for trial conduct on the continent. By highlighting key considerations for biomedical HIV prevention research and development in Africa, the forum has

  10. All-optical signal processing using dynamic Brillouin gratings

    PubMed Central

    Santagiustina, Marco; Chin, Sanghoon; Primerov, Nicolay; Ursini, Leonora; Thévenaz, Luc

    2013-01-01

    The manipulation of dynamic Brillouin gratings in optical fibers is demonstrated to be an extremely flexible technique to achieve, with a single experimental setup, several all-optical signal processing functions. In particular, all-optical time differentiation, time integration and true time reversal are theoretically predicted, and then numerically and experimentally demonstrated. The technique can be exploited to process both photonic and ultra-wide band microwave signals, so enabling many applications in photonics and in radio science. PMID:23549159

  11. Technical editing of research reports in biomedical journals.

    PubMed

    Wager, Elizabeth; Middleton, Philippa

    2008-10-08

    Most journals try to improve their articles by technical editing processes such as proof-reading, editing to conform to 'house styles', grammatical conventions and checking accuracy of cited references. Despite the considerable resources devoted to technical editing, we do not know whether it improves the accessibility of biomedical research findings or the utility of articles. This is an update of a Cochrane methodology review first published in 2003. To assess the effects of technical editing on research reports in peer-reviewed biomedical journals, and to assess the level of accuracy of references to these reports. We searched The Cochrane Library Issue 2, 2007; MEDLINE (last searched July 2006); EMBASE (last searched June 2007) and checked relevant articles for further references. We also searched the Internet and contacted researchers and experts in the field. Prospective or retrospective comparative studies of technical editing processes applied to original research articles in biomedical journals, as well as studies of reference accuracy. Two review authors independently assessed each study against the selection criteria and assessed the methodological quality of each study. One review author extracted the data, and the second review author repeated this. We located 32 studies addressing technical editing and 66 surveys of reference accuracy. Only three of the studies were randomised controlled trials. A 'package' of largely unspecified editorial processes applied between acceptance and publication was associated with improved readability in two studies and improved reporting quality in another two studies, while another study showed mixed results after stricter editorial policies were introduced. More intensive editorial processes were associated with fewer errors in abstracts and references. Providing instructions to authors was associated with improved reporting of ethics requirements in one study and fewer errors in references in two studies, but no

  12. The Ontology for Biomedical Investigations.

    PubMed

    Bandrowski, Anita; Brinkman, Ryan; Brochhausen, Mathias; Brush, Matthew H; Bug, Bill; Chibucos, Marcus C; Clancy, Kevin; Courtot, Mélanie; Derom, Dirk; Dumontier, Michel; Fan, Liju; Fostel, Jennifer; Fragoso, Gilberto; Gibson, Frank; Gonzalez-Beltran, Alejandra; Haendel, Melissa A; He, Yongqun; Heiskanen, Mervi; Hernandez-Boussard, Tina; Jensen, Mark; Lin, Yu; Lister, Allyson L; Lord, Phillip; Malone, James; Manduchi, Elisabetta; McGee, Monnie; Morrison, Norman; Overton, James A; Parkinson, Helen; Peters, Bjoern; Rocca-Serra, Philippe; Ruttenberg, Alan; Sansone, Susanna-Assunta; Scheuermann, Richard H; Schober, Daniel; Smith, Barry; Soldatova, Larisa N; Stoeckert, Christian J; Taylor, Chris F; Torniai, Carlo; Turner, Jessica A; Vita, Randi; Whetzel, Patricia L; Zheng, Jie

    2016-01-01

    The Ontology for Biomedical Investigations (OBI) is an ontology that provides terms with precisely defined meanings to describe all aspects of how investigations in the biological and medical domains are conducted. OBI re-uses ontologies that provide a representation of biomedical knowledge from the Open Biological and Biomedical Ontologies (OBO) project and adds the ability to describe how this knowledge was derived. We here describe the state of OBI and several applications that are using it, such as adding semantic expressivity to existing databases, building data entry forms, and enabling interoperability between knowledge resources. OBI covers all phases of the investigation process, such as planning, execution and reporting. It represents information and material entities that participate in these processes, as well as roles and functions. Prior to OBI, it was not possible to use a single internally consistent resource that could be applied to multiple types of experiments for these applications. OBI has made this possible by creating terms for entities involved in biological and medical investigations and by importing parts of other biomedical ontologies such as GO, Chemical Entities of Biological Interest (ChEBI) and Phenotype Attribute and Trait Ontology (PATO) without altering their meaning. OBI is being used in a wide range of projects covering genomics, multi-omics, immunology, and catalogs of services. OBI has also spawned other ontologies (Information Artifact Ontology) and methods for importing parts of ontologies (Minimum information to reference an external ontology term (MIREOT)). The OBI project is an open cross-disciplinary collaborative effort, encompassing multiple research communities from around the globe. To date, OBI has created 2366 classes and 40 relations along with textual and formal definitions. The OBI Consortium maintains a web resource (http://obi-ontology.org) providing details on the people, policies, and issues being addressed

  13. The Ontology for Biomedical Investigations

    PubMed Central

    Bandrowski, Anita; Brinkman, Ryan; Brochhausen, Mathias; Brush, Matthew H.; Chibucos, Marcus C.; Clancy, Kevin; Courtot, Mélanie; Derom, Dirk; Dumontier, Michel; Fan, Liju; Fostel, Jennifer; Fragoso, Gilberto; Gibson, Frank; Gonzalez-Beltran, Alejandra; Haendel, Melissa A.; He, Yongqun; Heiskanen, Mervi; Hernandez-Boussard, Tina; Jensen, Mark; Lin, Yu; Lister, Allyson L.; Lord, Phillip; Malone, James; Manduchi, Elisabetta; McGee, Monnie; Morrison, Norman; Overton, James A.; Parkinson, Helen; Peters, Bjoern; Rocca-Serra, Philippe; Ruttenberg, Alan; Sansone, Susanna-Assunta; Scheuermann, Richard H.; Schober, Daniel; Smith, Barry; Soldatova, Larisa N.; Stoeckert, Christian J.; Taylor, Chris F.; Torniai, Carlo; Turner, Jessica A.; Vita, Randi; Whetzel, Patricia L.; Zheng, Jie

    2016-01-01

    The Ontology for Biomedical Investigations (OBI) is an ontology that provides terms with precisely defined meanings to describe all aspects of how investigations in the biological and medical domains are conducted. OBI re-uses ontologies that provide a representation of biomedical knowledge from the Open Biological and Biomedical Ontologies (OBO) project and adds the ability to describe how this knowledge was derived. We here describe the state of OBI and several applications that are using it, such as adding semantic expressivity to existing databases, building data entry forms, and enabling interoperability between knowledge resources. OBI covers all phases of the investigation process, such as planning, execution and reporting. It represents information and material entities that participate in these processes, as well as roles and functions. Prior to OBI, it was not possible to use a single internally consistent resource that could be applied to multiple types of experiments for these applications. OBI has made this possible by creating terms for entities involved in biological and medical investigations and by importing parts of other biomedical ontologies such as GO, Chemical Entities of Biological Interest (ChEBI) and Phenotype Attribute and Trait Ontology (PATO) without altering their meaning. OBI is being used in a wide range of projects covering genomics, multi-omics, immunology, and catalogs of services. OBI has also spawned other ontologies (Information Artifact Ontology) and methods for importing parts of ontologies (Minimum information to reference an external ontology term (MIREOT)). The OBI project is an open cross-disciplinary collaborative effort, encompassing multiple research communities from around the globe. To date, OBI has created 2366 classes and 40 relations along with textual and formal definitions. The OBI Consortium maintains a web resource (http://obi-ontology.org) providing details on the people, policies, and issues being addressed

  14. Reviewing Manuscripts for Biomedical Journals

    PubMed Central

    Garmel, Gus M

    2010-01-01

    Writing for publication is a complex task. For many professionals, producing a well-executed manuscript conveying one's research, ideas, or educational wisdom is challenging. Authors have varying emotions related to the process of writing for scientific publication. Although not studied, a relationship between an author's enjoyment of the writing process and the product's outcome is highly likely. As with any skill, practice generally results in improvements. Literature focused on preparing manuscripts for publication and the art of reviewing submissions exists. Most journals guard their reviewers' anonymity with respect to the manuscript review process. This is meant to protect them from direct or indirect author demands, which may occur during the review process or in the future. It is generally accepted that author identities are masked in the peer-review process. However, the concept of anonymity for reviewers has been debated recently; many editors consider it problematic that reviewers are not held accountable to the public for their decisions. The review process is often arduous and underappreciated, one reason why biomedical journals acknowledge editors and frequently recognize reviewers who donate their time and expertise in the name of science. This article describes essential elements of a submitted manuscript, with the hopes of improving scientific writing. It also discusses the review process within the biomedical literature, the importance of reviewers to the scientific process, responsibilities of reviewers, and qualities of a good review and reviewer. In addition, it includes useful insights to individuals who read and interpret the medical literature. PMID:20740129

  15. Parallel Signal Processing and System Simulation using aCe

    NASA Technical Reports Server (NTRS)

    Dorband, John E.; Aburdene, Maurice F.

    2003-01-01

    Recently, networked and cluster computation have become very popular for both signal processing and system simulation. The aCe language is ideally suited for parallel signal processing applications and system simulation, since it allows the programmer to explicitly express the computations that can be performed concurrently. In addition, this new C-based parallel language (aCe C) for architecture-adaptive programming allows programmers to implement algorithms and system simulation applications on parallel architectures while providing them with the assurance that future parallel architectures will be able to run their applications with a minimum of modification. In this paper, we focus on some fundamental features of aCe C and present a signal processing application (FFT).

  16. Using rule-based natural language processing to improve disease normalization in biomedical text.

    PubMed

    Kang, Ning; Singh, Bharat; Afzal, Zubair; van Mulligen, Erik M; Kors, Jan A

    2013-01-01

    In order for computers to extract useful information from unstructured text, a concept normalization system is needed to link relevant concepts in a text to sources that contain further information about the concept. Popular concept normalization tools in the biomedical field are dictionary-based. In this study we investigate the usefulness of natural language processing (NLP) as an adjunct to dictionary-based concept normalization. We compared the performance of two biomedical concept normalization systems, MetaMap and Peregrine, on the Arizona Disease Corpus, with and without the use of a rule-based NLP module. Performance was assessed for exact and inexact boundary matching of the system annotations with those of the gold standard and for concept identifier matching. Without the NLP module, MetaMap and Peregrine attained F-scores of 61.0% and 63.9%, respectively, for exact boundary matching, and 55.1% and 56.9% for concept identifier matching. With the aid of the NLP module, the F-scores of MetaMap and Peregrine improved to 73.3% and 78.0% for boundary matching, and to 66.2% and 69.8% for concept identifier matching. For inexact boundary matching, performances further increased to 85.5% and 85.4%, and to 73.6% and 73.3% for concept identifier matching. We have shown the added value of NLP for the recognition and normalization of diseases with MetaMap and Peregrine. The NLP module is general and can be applied in combination with any concept normalization system. Whether its use for concept types other than disease is equally advantageous remains to be investigated.
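
    For reference, exact-boundary matching of system annotations against a gold standard reduces to set intersection over spans; the sketch below (annotations assumed to be (start, end, concept_id) tuples, values hypothetical) shows how such F-scores are computed from precision and recall:

        # Minimal exact-boundary evaluation sketch: compare system annotation spans
        # with gold-standard spans and report precision, recall and F-score.
        def evaluate(system, gold):
            sys_spans = {(s, e) for s, e, _ in system}
            gold_spans = {(s, e) for s, e, _ in gold}
            tp = len(sys_spans & gold_spans)                 # exact boundary matches
            p = tp / len(sys_spans) if sys_spans else 0.0
            r = tp / len(gold_spans) if gold_spans else 0.0
            f = 2 * p * r / (p + r) if p + r else 0.0
            return p, r, f

        system = [(10, 18, "D003924"), (42, 55, "D006973")]  # hypothetical annotations
        gold = [(10, 18, "D003924"), (60, 72, "D001943")]
        print("precision %.2f, recall %.2f, F-score %.2f" % evaluate(system, gold))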

  17. K-mean clustering algorithm for processing signals from compound semiconductor detectors

    NASA Astrophysics Data System (ADS)

    Tada, Tsutomu; Hitomi, Keitaro; Wu, Yan; Kim, Seong-Yun; Yamazaki, Hiromichi; Ishii, Keizo

    2011-12-01

    The K-mean clustering algorithm was employed for processing signal waveforms from TlBr detectors. The signal waveforms were classified based on their shapes, which reflect the charge collection process in the detector. The classified signal waveforms were then processed individually to suppress the pulse height variation of signals due to charge collection loss. The energy resolution obtained for a 137Cs spectrum measured with a 0.5 mm thick TlBr detector was 1.3% FWHM when 500 clusters were employed.
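
    A minimal sketch of the clustering step, assuming the digitized pulse waveforms are available as rows of an array (file name hypothetical) and using scikit-learn's k-means; the per-cluster pulse-height correction shown is only a schematic stand-in for the individual processing described in the paper:

        # Group pulse waveforms by shape with k-means, then apply a per-cluster
        # correction to suppress pulse-height variation (illustrative only).
        import numpy as np
        from sklearn.cluster import KMeans

        waveforms = np.load("tlbr_waveforms.npy")    # hypothetical (n_pulses, n_samples) array
        shapes = waveforms / waveforms.max(axis=1, keepdims=True)   # compare shapes, not amplitudes

        labels = KMeans(n_clusters=500, n_init=10, random_state=0).fit_predict(shapes)

        heights = waveforms.max(axis=1)
        corrected = heights.copy()
        for k in range(500):
            idx = labels == k
            if idx.any():
                # align the cluster's mean pulse height to the global mean
                corrected[idx] = heights[idx] * heights.mean() / heights[idx].mean()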

  18. Analysis of embolic signals with directional dual tree rational dilation wavelet transform.

    PubMed

    Serbes, Gorkem; Aydin, Nizamettin

    2016-08-01

    The dyadic discrete wavelet transform (dyadic-DWT), which is based on fixed integer sampling factor, has been used before for processing piecewise smooth biomedical signals. However, the dyadic-DWT has poor frequency resolution due to the low-oscillatory nature of its wavelet bases and therefore, it is less effective in processing embolic signals (ESs). To process ESs more effectively, a wavelet transform having better frequency resolution than the dyadic-DWT is needed. Therefore, in this study two ESs, containing micro-emboli and artifact waveforms, are analyzed with the Directional Dual Tree Rational-Dilation Wavelet Transform (DDT-RADWT). The DDT-RADWT, which can be directly applied to quadrature signals, is based on rational dilation factors and has adjustable frequency resolution. The analyses are done for both low and high Q-factors. It is proved that, when high Q-factor filters are employed in the DDT-RADWT, clearer representations of ESs can be attained in decomposed sub-bands and artifacts can be successfully separated.
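
    The DDT-RADWT itself is not available in common signal processing packages, but the dyadic-DWT baseline the paper compares against can be sketched with PyWavelets; the quadrature signal file, wavelet choice and decomposition depth below are assumptions:

        # Dyadic DWT sub-band decomposition of a quadrature Doppler recording,
        # shown as the baseline the paper improves upon (not the DDT-RADWT itself).
        import numpy as np
        import pywt

        iq = np.load("embolic_doppler_iq.npy")             # hypothetical complex I/Q signal
        coeffs_i = pywt.wavedec(iq.real, "db8", level=6)   # PyWavelets handles real arrays,
        coeffs_q = pywt.wavedec(iq.imag, "db8", level=6)   # so I and Q are decomposed separately

        for level, (ci, cq) in enumerate(zip(coeffs_i, coeffs_q)):
            energy = np.sum(ci ** 2 + cq ** 2)
            print(f"sub-band {level}: {len(ci)} coefficients, energy {energy:.1f}")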

  19. Biomedical Telectrodes

    NASA Technical Reports Server (NTRS)

    Shepherd, C. K.

    1989-01-01

    Compact transmitters eliminate need for wires to monitors. Biomedical telectrode is small electronic package that attaches to patient in manner similar to small adhesive bandage. Patient wearing biomedical telectrodes moves freely, without risk of breaking or entangling wire connections. Especially beneficial to patients undergoing electrocardiographic monitoring in intensive-care units in hospitals. Eliminates nuisance of coping with wire connections while dressing and going to toilet.

  20. Roles and applications of biomedical ontologies in experimental animal science.

    PubMed

    Masuya, Hiroshi

    2012-01-01

    A huge amount of experimental data from past studies has played a vital role in the development of new knowledge and technologies in biomedical science. The importance of computational technologies for the reuse of data, data integration, and knowledge discoveries has also increased, providing means of processing large amounts of data. In recent years, information technologies related to "ontologies" have played more significant roles in the standardization, integration, and knowledge representation of biomedical information. This review paper outlines the history of data integration in biomedical science and its recent trends in relation to the field of experimental animal science.

  1. Finding and Accessing Diagrams in Biomedical Publications

    PubMed Central

    Kuhn, Tobias; Luong, ThaiBinh; Krauthammer, Michael

    2012-01-01

    Complex relationships in biomedical publications are often communicated by diagrams such as bar and line charts, which are a very effective way of summarizing and communicating multi-faceted data sets. Given the ever-increasing amount of published data, we argue that the precise retrieval of such diagrams is of great value for answering specific and otherwise hard-to-meet information needs. To this end, we demonstrate the use of advanced image processing and classification for identifying bar and line charts by the shape and relative location of the different image elements that make up the charts. With recall and precision close to 90% for the detection of relevant figures, we discuss the use of this technology in an existing biomedical image search engine, and outline how it enables new forms of literature queries over biomedical relationships that are represented in these charts. PMID:23304318

  2. Introducing meta-services for biomedical information extraction

    PubMed Central

    Leitner, Florian; Krallinger, Martin; Rodriguez-Penagos, Carlos; Hakenberg, Jörg; Plake, Conrad; Kuo, Cheng-Ju; Hsu, Chun-Nan; Tsai, Richard Tzong-Han; Hung, Hsi-Chuan; Lau, William W; Johnson, Calvin A; Sætre, Rune; Yoshida, Kazuhiro; Chen, Yan Hua; Kim, Sun; Shin, Soo-Yong; Zhang, Byoung-Tak; Baumgartner, William A; Hunter, Lawrence; Haddow, Barry; Matthews, Michael; Wang, Xinglong; Ruch, Patrick; Ehrler, Frédéric; Özgür, Arzucan; Erkan, Güneş; Radev, Dragomir R; Krauthammer, Michael; Luong, ThaiBinh; Hoffmann, Robert; Sander, Chris; Valencia, Alfonso

    2008-01-01

    We introduce the first meta-service for information extraction in molecular biology, the BioCreative MetaServer (BCMS; ). This prototype platform is a joint effort of 13 research groups and provides automatically generated annotations for PubMed/Medline abstracts. Annotation types cover gene names, gene IDs, species, and protein-protein interactions. The annotations are distributed by the meta-server in both human and machine readable formats (HTML/XML). This service is intended to be used by biomedical researchers and database annotators, and in biomedical language processing. The platform allows direct comparison, unified access, and result aggregation of the annotations. PMID:18834497

  3. Compact biomedical pulsed signal generator for bone tissue stimulation

    DOEpatents

    Kronberg, J.W.

    1993-06-08

    An apparatus for stimulating bone tissue, to stimulate bone growth or treat osteoporosis, by applying directly to the skin of the patient an alternating current electrical signal comprising waveforms known to simulate the piezoelectric constituents in bone. The apparatus may, by moving a switch, stimulate bone growth or treat osteoporosis, as desired. Based on low-power CMOS technology and enclosed in a moisture-resistant case shaped to fit comfortably, two astable multivibrators produce the desired waveforms. The amplitude, pulse width and pulse frequency, and the subpulse width and subpulse frequency of the waveforms are adjustable. The apparatus, preferably powered by a standard 9-volt battery, includes signal amplitude sensors and warning signals to indicate that an output is being produced and that the battery needs to be replaced.

  4. Compact biomedical pulsed signal generator for bone tissue stimulation

    DOEpatents

    Kronberg, James W.

    1993-01-01

    An apparatus for stimulating bone tissue, to stimulate bone growth or treat osteoporosis, by applying directly to the skin of the patient an alternating current electrical signal comprising waveforms known to simulate the piezoelectric constituents in bone. The apparatus may, by moving a switch, stimulate bone growth or treat osteoporosis, as desired. Based on low-power CMOS technology and enclosed in a moisture-resistant case shaped to fit comfortably, two astable multivibrators produce the desired waveforms. The amplitude, pulse width and pulse frequency, and the subpulse width and subpulse frequency of the waveforms are adjustable. The apparatus, preferably powered by a standard 9-volt battery, includes signal amplitude sensors and warning signals to indicate that an output is being produced and that the battery needs to be replaced.

  5. Concept recognition for extracting protein interaction relations from biomedical text

    PubMed Central

    Baumgartner, William A; Lu, Zhiyong; Johnson, Helen L; Caporaso, J Gregory; Paquette, Jesse; Lindemann, Anna; White, Elizabeth K; Medvedeva, Olga; Cohen, K Bretonnel; Hunter, Lawrence

    2008-01-01

    Background: Reliable information extraction applications have been a long sought goal of the biomedical text mining community, a goal that if reached would provide valuable tools to benchside biologists in their increasingly difficult task of assimilating the knowledge contained in the biomedical literature. We present an integrated approach to concept recognition in biomedical text. Concept recognition provides key information that has been largely missing from previous biomedical information extraction efforts, namely direct links to well defined knowledge resources that explicitly cement the concept's semantics. The BioCreative II tasks discussed in this special issue have provided a unique opportunity to demonstrate the effectiveness of concept recognition in the field of biomedical language processing. Results: Through the modular construction of a protein interaction relation extraction system, we present several use cases of concept recognition in biomedical text, and relate these use cases to potential uses by the benchside biologist. Conclusion: Current information extraction technologies are approaching performance standards at which concept recognition can begin to deliver high quality data to the benchside biologist. Our system is available as part of the BioCreative Meta-Server project and on the internet . PMID:18834500

  6. Distributed processing for features improvement in real-time portable medical devices.

    PubMed

    Mercado, Erwin John Saavedra

    2008-01-01

    Portable biomedical devices are being developed and incorporated into daily life. Nevertheless, their standalone capability is diminished by the lack of processing power required for tasks such as, for example, robustness to signal artifacts in EKG monitoring devices. This paper presents a multiprocessor architecture built from simple microcontrollers to provide increased processing performance, better power efficiency and lower cost.

  7. Biomedical Simulation Models of Human Auditory Processes

    NASA Technical Reports Server (NTRS)

    Bicak, Mehmet M. A.

    2012-01-01

    Detailed acoustic engineering models that explore noise propagation mechanisms associated with noise attenuation and transmission paths created when using hearing protectors such as earplugs and headsets in high noise environments. Biomedical finite element (FE) models are developed based on volume Computed Tomography scan data which provides explicit external ear, ear canal, middle ear ossicular bones and cochlea geometry. Results from these studies have enabled a greater understanding of hearing protector to flesh dynamics as well as prioritizing noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for exploration of new design principles and methods in both earplug and earcup applications. These models are currently being used in development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast related impulse noise on human auditory mechanisms and brain tissue.

  8. Commercialising genetically engineered animal biomedical products.

    PubMed

    Sullivan, Eddie J; Pommer, Jerry; Robl, James M

    2008-01-01

    Research over the past two decades has increased the quality and quantity of tools available to produce genetically engineered animals. The number of potentially viable biomedical products from genetically engineered animals is increasing. However, moving from cutting-edge research to development and commercialisation of a biomedical product that is useful and wanted by the public has significant challenges. Even early stage development of genetically engineered animal applications requires consideration of many steps, including quality assurance and quality control, risk management, gap analysis, founder animal establishment, cell banking, sourcing of animals and animal-derived material, animal facilities, product collection facilities and processing facilities. These steps are complicated and expensive. Biomedical applications of genetically engineered animals have had some recent successes and many applications are well into development. As researchers consider applications for their findings, having a realistic understanding of the steps involved in the development and commercialisation of a product produced in genetically engineered animals is useful in determining the risk of genetic modification to the animal vs. the potential public benefit of the application.

  9. A MUSIC-based method for SSVEP signal processing.

    PubMed

    Chen, Kun; Liu, Quan; Ai, Qingsong; Zhou, Zude; Xie, Sheng Quan; Meng, Wei

    2016-03-01

    The research on brain computer interfaces (BCIs) has become a hotspot in recent years because it offers disabled people a means to communicate with the outside world. Steady state visual evoked potential (SSVEP)-based BCIs are more widely used because of their higher signal to noise ratio and greater information transfer rate compared with other BCI techniques. In this paper, a multiple signal classification (MUSIC)-based method was proposed for multi-dimensional SSVEP feature extraction. 2-second data epochs from four electrodes achieved excellent accuracy rates, including idle state detection. In some asynchronous mode experiments, the recognition accuracy reached up to 100%. The experimental results showed that the proposed method attained good frequency resolution. In most situations, the recognition accuracy was higher than canonical correlation analysis, which is a typical method for multi-channel SSVEP signal processing. Also, a virtual keyboard was successfully controlled by different subjects in an unshielded environment, which proved the feasibility of the proposed method for multi-dimensional SSVEP signal processing in practical applications.
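
    A sketch of a classical spectral-MUSIC pseudospectrum, applied per channel and summed to score candidate stimulation frequencies, gives the flavor of the approach; the paper's multi-dimensional formulation may differ, and the epoch file, embedding dimension and candidate frequencies below are assumptions:

        # Score candidate SSVEP frequencies with a MUSIC pseudospectrum built from
        # lagged snapshots of each EEG channel (illustrative parameters).
        import numpy as np

        def music_spectrum(x, fs, freqs, m=32, n_sig=2):
            snaps = np.array([x[i:i + m] for i in range(len(x) - m)])
            r = snaps.T @ snaps / len(snaps)              # m x m covariance matrix
            w, v = np.linalg.eigh(r)                      # eigenvalues in ascending order
            noise = v[:, :-n_sig]                         # noise subspace
            p = []
            for f in freqs:
                a = np.exp(2j * np.pi * f / fs * np.arange(m))   # steering vector
                p.append(1.0 / np.linalg.norm(noise.conj().T @ a) ** 2)
            return np.array(p)

        fs = 250.0
        epoch = np.load("ssvep_epoch.npy")                # hypothetical (4 channels, 2 s) array
        cand = np.array([8.0, 10.0, 12.0, 15.0])          # candidate stimulation frequencies
        score = sum(music_spectrum(ch, fs, cand) for ch in epoch)
        print("detected frequency:", cand[np.argmax(score)], "Hz")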

  10. Digital signal processing based on inverse scattering transform.

    PubMed

    Turitsyna, Elena G; Turitsyn, Sergei K

    2013-10-15

    Through numerical modeling, we illustrate the possibility of a new approach to digital signal processing in coherent optical communications based on the application of the so-called inverse scattering transform. Considering without loss of generality a fiber link with normal dispersion and quadrature phase shift keying signal modulation, we demonstrate how an initial information pattern can be recovered (without direct backward propagation) through the calculation of nonlinear spectral data of the received optical signal.

  11. Advances in targeted proteomics and applications to biomedical research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Tujin; Song, Ehwang; Nie, Song

    Targeted proteomics has emerged as a powerful protein quantification tool in systems biology, biomedical research, and increasingly in clinical applications. The most widely used targeted proteomics approach, selected reaction monitoring (SRM), also known as multiple reaction monitoring (MRM), can be used for quantification of cellular signaling networks and preclinical verification of candidate protein biomarkers. As an extension to our previous review on advances in SRM sensitivity (Shi et al., Proteomics, 12, 1074–1092, 2012), herein we review recent advances in the methods and technology for further enhancing SRM sensitivity (from 2012 to present), and highlight its broad biomedical applications in human bodily fluids, tissue and cell lines. Furthermore, we also review two recently introduced targeted proteomics approaches, parallel reaction monitoring (PRM) and data-independent acquisition (DIA) with targeted data extraction on fast scanning high-resolution accurate-mass (HR/AM) instruments. Such HR/AM targeted quantification with monitoring of all target product ions effectively addresses the SRM limitations in specificity and multiplexing, whereas, when compared to SRM, PRM and DIA are still in their infancy with a limited number of applications. Thus, for HR/AM targeted quantification we focus our discussion on method development, data processing and analysis, and its advantages and limitations in targeted proteomics. Finally, general perspectives on the potential of achieving both high sensitivity and high sample throughput for large-scale quantification of hundreds of target proteins are discussed.

  12. Aggregated Indexing of Biomedical Time Series Data

    PubMed Central

    Woodbridge, Jonathan; Mortazavi, Bobak; Sarrafzadeh, Majid; Bui, Alex A.T.

    2016-01-01

    Remote and wearable medical sensing has the potential to create very large and high dimensional datasets. Medical time series databases must be able to efficiently store, index, and mine these datasets to enable medical professionals to effectively analyze data collected from their patients. Conventional high dimensional indexing methods are a two-stage process. First, a superset of the true matches is efficiently extracted from the database. Second, supersets are pruned by comparing each of their objects to the query object and rejecting any objects falling outside a predetermined radius. This pruning stage heavily dominates the computational complexity of most conventional search algorithms. Therefore, indexing algorithms can be significantly improved by reducing the amount of pruning. This paper presents an online algorithm to aggregate biomedical time series data to significantly reduce the search space (index size) without compromising the quality of search results. The algorithm is built on the observation that biomedical time series signals are composed of cyclical and often similar patterns. It takes in a stream of segments and groups them into highly concentrated collections. Locality Sensitive Hashing (LSH) is used to reduce the overall complexity of the algorithm, allowing it to run online. The output of this aggregation is used to populate an index. The proposed algorithm yields logarithmic growth of the index (with respect to the total number of objects) while keeping sensitivity and specificity simultaneously above 98%. Both memory and runtime complexities of time series search are improved when using aggregated indexes. In addition, data mining tasks, such as clustering, exhibit runtimes that are orders of magnitude faster when run on aggregated indexes. PMID:27617298
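
    The core aggregation idea can be sketched with a random-projection LSH: similar segments hash into the same bucket, and each bucket contributes one aggregated index entry. The segment length, hash size and aggregation rule below are illustrative assumptions, not the parameters of the paper:

        # Hash fixed-length time series segments with sign-of-random-projection LSH
        # so similar cyclical patterns collide, then keep one centroid per bucket.
        from collections import defaultdict
        import numpy as np

        rng = np.random.default_rng(0)
        SEG_LEN, N_BITS = 128, 16
        planes = rng.standard_normal((N_BITS, SEG_LEN))     # random hyperplanes

        def lsh_key(segment):
            z = (segment - segment.mean()) / (segment.std() + 1e-9)  # amplitude-invariant
            return (planes @ z > 0).tobytes()                        # hashable bucket key

        segments = np.load("ecg_segments.npy")              # hypothetical (n, 128) array
        buckets = defaultdict(list)
        for seg in segments:
            buckets[lsh_key(seg)].append(seg)

        index = {k: np.mean(group, axis=0) for k, group in buckets.items()}
        print(f"{len(segments)} segments aggregated into {len(index)} index entries")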

  13. Single photon laser altimeter simulator and statistical signal processing

    NASA Astrophysics Data System (ADS)

    Vacek, Michael; Prochazka, Ivan

    2013-05-01

    Spaceborne altimeters are common instruments onboard deep space rendezvous spacecraft. They provide range and topographic measurements critical to spacecraft navigation. Simultaneously, the receiver part may be utilized for an Earth-to-satellite link, one-way time transfer, and precise optical radiometry. The main advantage of the single photon counting approach is the ability to process signals with a very low signal-to-noise ratio, eliminating the need for large telescopes and a high power laser source. Extremely small, rugged and compact microchip lasers can be employed. The major limiting factor, on the other hand, is the acquisition time needed to gather a sufficient volume of data over repetitive measurements in order to process and evaluate the data appropriately. Statistical signal processing is adopted to detect signals with an average strength much lower than one photon per measurement. A comprehensive simulator design and range signal processing algorithm are presented to identify a mission specific altimeter configuration. Typical mission scenarios (celestial body surface landing and topographical mapping) are simulated and evaluated. The most promising single photon altimeter applications are low-orbit (˜10 km) and low-radial-velocity (several m/s) topographical mapping (asteroids, Phobos and Deimos) and landing altimetry (˜10 km), where range evaluation repetition rates of ˜100 Hz and 0.1 m precision may be achieved. Moon landing and asteroid Itokawa topographical mapping scenario simulations are discussed in more detail.
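
    The statistical detection step amounts to accumulating photon time-of-flight tags from many shots into a histogram and accepting the bin that rises clearly above the Poisson background; the sketch below assumes hypothetical tag data and an illustrative 5-sigma criterion:

        # Histogram-based range detection from repeated single-photon returns.
        import numpy as np

        C = 299_792_458.0                        # speed of light, m/s
        BIN = 1e-9                               # 1 ns range bins (~0.15 m)

        tof = np.load("photon_tof_tags.npy")     # hypothetical photon arrival times [s]
        edges = np.arange(0.0, tof.max() + BIN, BIN)
        counts, _ = np.histogram(tof, bins=edges)

        mu = np.median(counts)                   # background level per bin
        threshold = mu + 5 * np.sqrt(mu + 1)     # ~5-sigma Poisson criterion
        peak = np.argmax(counts)
        if counts[peak] > threshold:
            print(f"range = {0.5 * C * edges[peak]:.2f} m "
                  f"({counts[peak]} counts vs threshold {threshold:.1f})")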

  14. Design of signal reception and processing system of embedded ultrasonic endoscope

    NASA Astrophysics Data System (ADS)

    Li, Ming; Yu, Feng; Zhang, Ruiqiang; Li, Yan; Chen, Xiaodong; Yu, Daoyin

    2009-11-01

    The Embedded Ultrasonic Endoscope, based on an embedded microprocessor and an embedded real-time operating system, sends a micro ultrasonic probe into the coelom through the biopsy channel of an electronic endoscope to obtain the histological features of lesions in the digestive organs by rotary scanning, and acquires pictures of the alimentary canal mucosal surface. At the same time, the ultrasonic signals are processed by the signal reception and processing system, forming images of the full histology of the digestive organs. The signal reception and processing system is an important component of the Embedded Ultrasonic Endoscope. However, the traditional design, which uses multi-level amplifiers and special digital processing circuits to implement signal reception and processing, no longer satisfies the high-performance, miniaturization and low-power requirements of an embedded system, and the high noise introduced by multi-level amplifiers makes the extraction of small signals difficult. Therefore, this paper presents a method of signal reception and processing based on a double variable-gain amplifier and an FPGA, increasing the flexibility and dynamic range of the signal reception and processing system, improving the system noise level, and reducing power consumption. Finally, we set up the embedded experimental system, using a transducer with a center frequency of 8 MHz to scan membrane samples, and displayed the images of the ultrasonic echoes reflected by each layer of the membrane at a frame rate of 5 Hz, verifying the correctness of the system.

  15. A sub-microwatt asynchronous level-crossing ADC for biomedical applications.

    PubMed

    Li, Yongjia; Zhao, Duan; Serdijn, Wouter A

    2013-04-01

    A continuous-time level-crossing analog-to-digital converter (LC-ADC) for biomedical applications is presented. When compared to uniform-sampling (US) ADCs, LC-ADCs generate fewer samples for various sparse biomedical signals. Lower power consumption and reduced design complexity with respect to conventional LC-ADCs are achieved by: 1) replacing the n-bit digital-to-analog converter (DAC) with a 1-bit DAC; 2) splitting the level-crossing detections; and 3) fixing the comparison window. Designed and implemented in 0.18 μm CMOS technology, the proposed ADC occupies a chip area of 220 × 203 μm². Operating from a supply voltage of 0.8 V, the ADC consumes 313-582 nW from 5 Hz to 5 kHz and achieves an ENOB of up to 7.9 bits.
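
    The sample-count saving of a level-crossing converter on sparse signals can be seen from a simple behavioural model (not the circuit of the paper): a sample is emitted only when the input moves by at least one quantization level.

        # Behavioural model of level-crossing sampling versus uniform sampling.
        import numpy as np

        def level_crossing_sample(x, t, n_bits=8):
            levels = np.linspace(x.min(), x.max(), 2 ** n_bits)
            lsb = levels[1] - levels[0]
            samples, last = [], x[0]
            for ti, xi in zip(t, x):
                if abs(xi - last) >= lsb:                          # crossed at least one level
                    last = levels[np.argmin(np.abs(levels - xi))]  # snap to nearest level
                    samples.append((ti, last))
            return samples

        fs = 5000.0
        t = np.arange(0, 1.0, 1 / fs)
        spiky = np.exp(-((t % 0.5 - 0.25) / 0.01) ** 2)            # sparse ECG-like test signal
        samples = level_crossing_sample(spiky, t)
        print(f"{len(t)} uniform samples vs {len(samples)} level-crossing samples")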

  16. Biomedical ontologies: toward scientific debate.

    PubMed

    Maojo, V; Crespo, J; García-Remesal, M; de la Iglesia, D; Perez-Rey, D; Kulikowski, C

    2011-01-01

    Biomedical ontologies have been very successful in structuring knowledge for many different applications, receiving widespread praise for their utility and potential. Yet, the role of computational ontologies in scientific research, as opposed to knowledge management applications, has not been extensively discussed. We aim to stimulate further discussion on the advantages and challenges presented by biomedical ontologies from a scientific perspective. We review various aspects of biomedical ontologies going beyond their practical successes, and focus on some key scientific questions in two ways. First, we analyze and discuss current approaches to improve biomedical ontologies that are based largely on classical, Aristotelian ontological models of reality. Second, we raise various open questions about biomedical ontologies that require further research, analyzing in more detail those related to visual reasoning and spatial ontologies. We outline significant scientific issues that biomedical ontologies should consider, beyond current efforts of building practical consensus between them. For spatial ontologies, we suggest an approach for building "morphospatial" taxonomies, as an example that could stimulate research on fundamental open issues for biomedical ontologies. Analysis of a large number of problems with biomedical ontologies suggests that the field is very much open to alternative interpretations of current work, and in need of scientific debate and discussion that can lead to new ideas and research directions.

  17. Facilitating biomedical researchers' interrogation of electronic health record data: Ideas from outside of biomedical informatics.

    PubMed

    Hruby, Gregory W; Matsoukas, Konstantina; Cimino, James J; Weng, Chunhua

    2016-04-01

    Electronic health records (EHR) are a vital data resource for research uses, including cohort identification, phenotyping, pharmacovigilance, and public health surveillance. To realize the promise of EHR data for accelerating clinical research, it is imperative to enable efficient and autonomous EHR data interrogation by end users such as biomedical researchers. This paper surveys state-of-the-art approaches and key methodological considerations for this purpose. We adapted a previously published conceptual framework for interactive information retrieval, which defines three entities: user, channel, and source, by elaborating on channels for query formulation in the context of facilitating end users to interrogate EHR data. We show that the current progress in biomedical informatics mainly lies in support for query execution and information modeling, primarily due to emphases on infrastructure development for data integration and data access via self-service query tools, but has neglected the user support needed during iterative query formulation processes, which can be costly and error-prone. In contrast, the information science literature has offered elaborate theories and methods for user modeling and query formulation support. The two bodies of literature are complementary, implying opportunities for cross-disciplinary idea exchange. On this basis, we outline directions for future informatics research to improve our understanding of user needs and requirements for facilitating autonomous interrogation of EHR data by biomedical researchers. We suggest that cross-disciplinary translational research between biomedical informatics and information science can benefit our research in facilitating efficient data access in the life sciences. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Implantable biomedical devices on bioresorbable substrates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogers, John A.; Kim, Dae-Hyeong; Omenetto, Fiorenzo

    Provided herein are implantable biomedical devices and methods of administering implantable biomedical devices, making implantable biomedical devices, and using implantable biomedical devices to actuate a target tissue or sense a parameter associated with the target tissue in a biological environment.

  19. Biomedical Interdisciplinary Curriculum Project: BIP (Biomedical Instrumentation Package) User's Manual.

    ERIC Educational Resources Information Center

    Biomedical Interdisciplinary Curriculum Project, Berkeley, CA.

    Described is the Biomedical Instrument Package (BIP) and its use. The BIP was developed for use in understanding colorimetry, sound, electricity, and bioelectric phenomena. It can also be used in a wide range of measurements such as current, voltage, resistance, temperature, and pH. Though it was developed primarily for use in biomedical science…

  20. Secure management of biomedical data with cryptographic hardware.

    PubMed

    Canim, Mustafa; Kantarcioglu, Murat; Malin, Bradley

    2012-01-01

    The biomedical community is increasingly migrating toward research endeavors that are dependent on large quantities of genomic and clinical data. At the same time, various regulations require that such data be shared beyond the initial collecting organization (e.g., an academic medical center). It is of critical importance to ensure that when such data are shared, as well as managed, it is done so in a manner that upholds the privacy of the corresponding individuals and the overall security of the system. In general, organizations have attempted to achieve these goals through deidentification methods that remove explicitly, and potentially, identifying features (e.g., names, dates, and geocodes). However, a growing number of studies demonstrate that deidentified data can be reidentified to named individuals using simple automated methods. As an alternative, it was shown that biomedical data could be shared, managed, and analyzed through practical cryptographic protocols without revealing the contents of any particular record. Yet, such protocols required the inclusion of multiple third parties, which may not always be feasible in the context of trust or bandwidth constraints. Thus, in this paper, we introduce a framework that removes the need for multiple third parties by collocating services to store and to process sensitive biomedical data through the integration of cryptographic hardware. Within this framework, we define a secure protocol to process genomic data and perform a series of experiments to demonstrate that such an approach can be run in an efficient manner for typical biomedical investigations.

  1. Ultrasonic Signal Processing for Structural Health Monitoring

    NASA Astrophysics Data System (ADS)

    Michaels, Jennifer E.; Michaels, Thomas E.

    2004-02-01

    Permanently mounted ultrasonic sensors are a key component of systems under development for structural health monitoring. Signal processing plays a critical role in the viability of such systems due to the difficulty in interpreting signals received from structures of complex geometry. This paper describes a differential feature-based approach to classifying signal changes as either "environmental" or "structural". Data are presented from piezoelectric discs bonded to an aluminum specimen subjected to both environmental changes and introduction of artificial defects. The classifier developed as part of this study was able to correctly identify artificial defects that were not part of the initial training and evaluation data sets. Central to the success of the classifier was the use of the Short Time Cross Correlation to measure coherency between the signal and reference as a function of time.
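
    The short-time cross correlation can be sketched as a windowed, normalized correlation coefficient between the monitored signal and its baseline reference (zero lag only here; window length, hop and file names are assumptions):

        # Local coherence between a current record and a baseline reference,
        # computed in sliding windows.
        import numpy as np

        def short_time_xcorr(sig, ref, win=256, hop=64):
            out = []
            for start in range(0, len(sig) - win + 1, hop):
                s = sig[start:start + win] - sig[start:start + win].mean()
                r = ref[start:start + win] - ref[start:start + win].mean()
                denom = np.linalg.norm(s) * np.linalg.norm(r)
                out.append(float(np.dot(s, r) / denom) if denom else 0.0)
            return np.array(out)

        ref = np.load("baseline_waveform.npy")    # hypothetical baseline ultrasonic record
        sig = np.load("current_waveform.npy")     # hypothetical monitored record
        coherence = short_time_xcorr(sig, ref)
        # locally reduced coherence points to a structural change rather than a
        # uniform environmental shift
        print(coherence.min(), coherence.argmin())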

  2. Functional description of signal processing in the Rogue GPS receiver

    NASA Technical Reports Server (NTRS)

    Thomas, J. B.

    1988-01-01

    Over the past year, two Rogue GPS prototype receivers have been assembled and successfully subjected to a variety of laboratory and field tests. A functional description is presented of signal processing in the Rogue receiver, tracing the signal from RF input to the output values of group delay, phase, and data bits. The receiver can track up to eight satellites, without time multiplexing among satellites or channels, simultaneously measuring both group delay and phase for each of three channels (L1-C/A, L1-P, L2-P). The Rogue signal processing described requires generation of the code for all three channels. The receiver functional design, which emphasized accuracy, reliability, flexibility, and dynamic capability, is summarized. A detailed functional description of signal processing is presented, including C/A-channel and P-channel processing, carrier-aided averaging of group delays, checks for cycle slips, acquisition, and distinctive features.

  3. Signal processor for processing ultrasonic receiver signals

    DOEpatents

    Fasching, George E.

    1980-01-01

    A signal processor is provided which uses an analog integrating circuit in conjunction with a set of digital counters controlled by a precision clock for sampling timing to provide an improved presentation of an ultrasonic transmitter/receiver signal. The signal is sampled relative to the transmitter trigger signal timing at precise times, the selected number of samples are integrated and the integrated samples are transferred and held for recording on a strip chart recorder or converted to digital form for storage. By integrating multiple samples taken at precisely the same time with respect to the trigger for the ultrasonic transmitter, random noise, which is contained in the ultrasonic receiver signal, is reduced relative to the desired useful signal.
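
    The noise reduction from integrating repeated, precisely timed samples follows the usual square-root law: averaging N aligned traces suppresses uncorrelated noise by about sqrt(N) while the repeatable echo is preserved. A synthetic sketch (all numbers illustrative):

        # Averaging N aligned receiver traces to reduce uncorrelated noise.
        import numpy as np

        rng = np.random.default_rng(1)
        n_samples, n_shots = 2000, 64
        t = np.arange(n_samples)
        echo = np.exp(-((t - 800) / 30.0) ** 2)                 # repeatable ultrasonic echo

        traces = echo + 0.5 * rng.standard_normal((n_shots, n_samples))  # noisy single shots
        averaged = traces.mean(axis=0)                          # integrate aligned samples

        snr_single = echo.max() / (traces[0] - echo).std()
        snr_avg = echo.max() / (averaged - echo).std()
        print(f"single-shot SNR = {snr_single:.1f}, averaged SNR = {snr_avg:.1f} "
              f"(expected gain ~ sqrt(64) = 8)")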

  4. Fast, multi-channel real-time processing of signals with microsecond latency using graphics processing units.

    PubMed

    Rath, N; Kato, S; Levesque, J P; Mauel, M E; Navratil, G A; Peng, Q

    2014-04-01

    Fast digital signal processing (DSP) has many applications. Typical hardware options for performing DSP are field-programmable gate arrays (FPGAs), application-specific integrated DSP chips, or general purpose personal computer systems. This paper presents a novel DSP platform that has been developed for feedback control on the HBT-EP tokamak device. The system runs all signal processing exclusively on a Graphics Processing Unit (GPU) to achieve real-time performance with latencies below 8 μs. Signals are transferred into and out of the GPU using PCI Express peer-to-peer direct-memory-access transfers without involvement of the central processing unit or host memory. Tests were performed on the feedback control system of the HBT-EP tokamak using forty 16-bit floating point inputs and outputs each and a sampling rate of up to 250 kHz. Signals were digitized by a D-TACQ ACQ196 module, processing was done on an NVIDIA GTX 580 GPU programmed in CUDA, and analog output was generated by D-TACQ AO32CPCI modules.

  5. Hybrid Discrete Wavelet Transform and Gabor Filter Banks Processing for Features Extraction from Biomedical Images

    PubMed Central

    Lahmiri, Salim; Boukadoum, Mounir

    2013-01-01

    A new methodology for automatic feature extraction from biomedical images and subsequent classification is presented. The approach exploits the spatial orientation of high-frequency textural features of the processed image as determined by a two-step process. First, the two-dimensional discrete wavelet transform (DWT) is applied to obtain the HH high-frequency subband image. Then, a Gabor filter bank is applied to the latter at different frequencies and spatial orientations to obtain a new Gabor-filtered image, whose entropy and uniformity are computed. Finally, the obtained statistics are fed to a support vector machine (SVM) binary classifier. The approach was validated on mammograms, retina, and brain magnetic resonance (MR) images. The obtained classification accuracies show better performance in comparison to common approaches that use only the DWT or Gabor filter banks for feature extraction. PMID:27006906
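
    The pipeline can be sketched with PyWavelets, scikit-image Gabor filters and a scikit-learn SVM; the filter parameters, the histogram-based entropy/uniformity definitions and the data files below are assumptions rather than the exact settings of the paper:

        # DWT -> HH sub-band -> Gabor filter bank -> entropy/uniformity -> SVM.
        import numpy as np
        import pywt
        from skimage.filters import gabor
        from sklearn.svm import SVC

        def features(image):
            _, (_, _, hh) = pywt.dwt2(image, "db4")          # HH high-frequency sub-band
            feats = []
            for theta in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
                real, _ = gabor(hh, frequency=0.2, theta=theta)
                counts, _ = np.histogram(real, bins=64)
                p = counts / counts.sum()
                p_nz = p[p > 0]
                feats.append(-np.sum(p_nz * np.log2(p_nz)))  # entropy
                feats.append(np.sum(p ** 2))                 # uniformity (energy)
            return feats

        images = np.load("roi_images.npy")                   # hypothetical (n, 128, 128) array
        labels = np.load("roi_labels.npy")                   # hypothetical binary labels
        X = np.array([features(im) for im in images])
        clf = SVC(kernel="rbf").fit(X, labels)
        print("training accuracy:", clf.score(X, labels))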

  6. A Diagram Editor for Efficient Biomedical Knowledge Capture and Integration

    PubMed Central

    Yu, Bohua; Jakupovic, Elvis; Wilson, Justin; Dai, Manhong; Xuan, Weijian; Mirel, Barbara; Athey, Brian; Watson, Stanley; Meng, Fan

    2008-01-01

    Understanding the molecular mechanisms underlying complex disorders requires the integration of data and knowledge from different sources including free text literature and various biomedical databases. To facilitate this process, we created the Biomedical Concept Diagram Editor (BCDE) to help researchers distill knowledge from data and literature and aid the process of hypothesis development. A key feature of BCDE is the ability to capture information with a simple drag-and-drop. This is a vast improvement over manual methods of knowledge and data recording and greatly increases the efficiency of the biomedical researcher. BCDE also provides a unique concept matching function to enforce consistent terminology, which enables conceptual relationships deposited by different researchers in the BCDE database to be mined and integrated for intelligible and useful results. We hope BCDE will promote the sharing and integration of knowledge from different researchers for effective hypothesis development. PMID:21347131

  7. Digital processing of RF signals from optical frequency combs

    NASA Astrophysics Data System (ADS)

    Cizek, Martin; Smid, Radek; Buchta, Zdeněk; Mikel, Břetislav; Lazar, Josef; Cip, Ondřej

    2013-01-01

    The presented work is focused on digital processing of beat note signals from a femtosecond optical frequency comb. The levels of the mixing products of single spectral components of the comb with CW laser sources are usually very low compared to the products of mixing all the comb components together. RF counters are therefore more likely to measure the frequency of the strongest spectral component rather than a weak beat note. The proposed experimental digital signal processing system solves this problem by analyzing the whole spectrum of the output RF signal and using software defined radio (SDR) algorithms. Our efforts concentrate on two main areas. First, using digital servo-loop techniques for locking free-running continuous-wave laser sources to single components of the fs comb spectrum. Second, experimenting with digital signal processing of the RF beat note spectrum produced by the f-2f technique used for assessing the offset and repetition frequencies of the comb, resulting in digital servo-loop stabilization of the fs comb. Software capable of computing and analyzing the beat-note RF spectra using FFT and peak detection was developed. An SDR algorithm performing phase demodulation on the f-2f signal is used as a regulation error signal source for a digital phase-locked loop stabilizing the offset frequency of the fs comb.
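
    The spectrum-analysis step (FFT of the digitized RF record followed by peak picking above the noise floor, so that a weak CW-to-comb beat can be located even when it is not the strongest line) can be sketched as follows; the sample rate, thresholds and file name are assumptions:

        # FFT the RF record and pick beat-note peaks above the estimated noise floor.
        import numpy as np
        from scipy.signal import find_peaks

        fs = 250e6                                   # assumed digitizer sample rate
        x = np.load("beatnote_record.npy")           # hypothetical real-valued RF samples

        spec = np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2
        freqs = np.fft.rfftfreq(len(x), 1 / fs)

        floor = np.median(spec)                      # robust noise-floor estimate
        peaks, props = find_peaks(spec, height=50 * floor, distance=1000)
        for p, h in zip(peaks, props["peak_heights"]):
            print(f"beat note at {freqs[p] / 1e6:.3f} MHz, {h / floor:.0f}x above floor")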

  8. The Vector, Signal, and Image Processing Library (VSIPL): an Open Standard for Astronomical Data Processing

    NASA Astrophysics Data System (ADS)

    Kepner, J. V.; Janka, R. S.; Lebak, J.; Richards, M. A.

    1999-12-01

    The Vector/Signal/Image Processing Library (VSIPL) is a DARPA-initiated effort made up of industry, government and academic representatives who have defined an industry standard API for vector, signal, and image processing primitives for real-time signal processing on high performance systems. VSIPL supports a wide range of data types (int, float, complex, ...) and layouts (vectors, matrices and tensors) and is ideal for astronomical data processing. The VSIPL API is intended to serve as an open, vendor-neutral, industry standard interface. The object-based VSIPL API abstracts the memory architecture of the underlying machine by using the concept of memory blocks and views. Early experiments with VSIPL code conversions have been carried out by the High Performance Computing Program team at UCSD. Commercially, several major vendors of signal processors are actively developing implementations. VSIPL has also been explicitly required as part of a recent Rome Labs teraflop procurement. This poster presents the VSIPL API, its functionality and the status of various implementations.

  9. The modeling of MMI structures for signal processing applications

    NASA Astrophysics Data System (ADS)

    Le, Thanh Trung; Cahill, Laurence W.

    2008-02-01

    Microring resonators are promising candidates for photonic signal processing applications. However, almost all resonators that have been reported so far use directional couplers or 2×2 multimode interference (MMI) couplers as the coupling element between the ring and the bus waveguides. In this paper, instead of using 2×2 couplers, novel structures for microring resonators based on 3×3 MMI couplers are proposed. The characteristics of the device are derived using the modal propagation method, and the device parameters are optimized using numerical methods. Optical switches and filters on Silicon on Insulator (SOI) have then been designed and analyzed. This device can become a new basic component for further applications in optical signal processing. The paper concludes with some further examples of photonic signal processing circuits based on MMI couplers.

  10. Nanoparticles for Biomedical Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nune, Satish K.; Gunda, Padmaja; Thallapally, Praveen K.

    2009-11-01

    Background: Synthetic nanoparticles are emerging as versatile tools in biomedical applications, particularly in the area of biomedical imaging. Nanoparticles 1 to 100 nm in diameter possess dimensions comparable to biological functional units. Diverse surface chemistries, unique magnetic properties, tunable absorption and emission properties, and recent advances in the synthesis and engineering of various nanoparticles suggest their potential as probes for early detection of diseases such as cancer. Surface functionalization has further expanded the potential of nanoparticles as probes for molecular imaging. Objective: To summarize emerging research on nanoparticles for biomedical imaging with increased selectivity, reduced non-specific uptake and increased spatial resolution, using stabilizers conjugated with targeting ligands. Methods: This review summarizes recent technological advances in the synthesis of various nanoparticle probes, and surveys methods to improve the targeting of nanoparticles for their applications in biomedical imaging. Conclusion: The structural design of nanomaterials for biomedical imaging continues to expand and diversify. Synthetic methods have aimed to control the size and surface characteristics of nanoparticles to control distribution, half-life and elimination. Although molecular imaging applications using nanoparticles are advancing into clinical applications, challenges such as storage stability and long-term toxicology should continue to be addressed. Keywords: nanoparticle synthesis, surface modification, targeting, molecular imaging, and biomedical imaging.

  11. Neural Parallel Engine: A toolbox for massively parallel neural signal processing.

    PubMed

    Tam, Wing-Kin; Yang, Zhi

    2018-05-01

    Large-scale neural recordings provide detailed information on neuronal activities and can help elicit the underlying neural mechanisms of the brain. However, the computational burden is also formidable when we try to process the huge data stream generated by such recordings. In this study, we report the development of Neural Parallel Engine (NPE), a toolbox for massively parallel neural signal processing on graphical processing units (GPUs). It offers a selection of the most commonly used routines in neural signal processing such as spike detection and spike sorting, including advanced algorithms such as exponential-component-power-component (EC-PC) spike detection and binary pursuit spike sorting. We also propose a new method for detecting peaks in parallel through a parallel compact operation. Our toolbox is able to offer a 5× to 110× speedup compared with its CPU counterparts depending on the algorithms. A user-friendly MATLAB interface is provided to allow easy integration of the toolbox into existing workflows. Previous efforts on GPU neural signal processing only focus on a few rudimentary algorithms, are not well-optimized and often do not provide a user-friendly programming interface to fit into existing workflows. There is a strong need for a comprehensive toolbox for massively parallel neural signal processing. A new toolbox for massively parallel neural signal processing has been created. It can offer significant speedup in processing signals from large-scale recordings up to thousands of channels. Copyright © 2018 Elsevier B.V. All rights reserved.
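
    As a CPU-side reference for the kind of routine the toolbox parallelizes, the sketch below implements a plain threshold spike detector with the common MAD-based noise estimate; this is not the EC-PC algorithm, and the file name and parameters are assumptions:

        # Baseline threshold spike detection on one extracellular channel.
        import numpy as np

        def detect_spikes(x, fs, k=4.5, refractory_ms=1.0):
            sigma = np.median(np.abs(x)) / 0.6745           # robust noise estimate
            above = np.flatnonzero(np.abs(x) > k * sigma)   # threshold crossings
            spikes, last = [], -np.inf
            gap = int(refractory_ms * 1e-3 * fs)
            for i in above:
                if i - last > gap:                          # enforce a refractory period
                    spikes.append(i)
                    last = i
            return np.array(spikes)

        fs = 30000.0
        channel = np.load("channel_042.npy")                # hypothetical extracellular trace
        print(f"{len(detect_spikes(channel, fs))} spikes detected")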

  12. CAVEman: Standardized anatomical context for biomedical data mapping.

    PubMed

    Turinsky, Andrei L; Fanea, Elena; Trinh, Quang; Wat, Stephen; Hallgrímsson, Benedikt; Dong, Xiaoli; Shu, Xueling; Stromer, Julie N; Hill, Jonathan W; Edwards, Carol; Grosenick, Brenda; Yajima, Masumi; Sensen, Christoph W

    2008-01-01

    The authors have created a software system called the CAVEman, for the visual integration and exploration of heterogeneous anatomical and biomedical data. The CAVEman can be applied for both education and research tasks. The main component of the system is a three-dimensional digital atlas of the adult male human anatomy, structured according to the nomenclature of Terminologia Anatomica. The underlying data-indexing mechanism uses standard ontologies to map a range of biomedical data types onto the atlas. The CAVEman system is now used to visualize genetic processes in the context of the human anatomy and to facilitate visual exploration of the data. Through the use of Java™ software, the atlas-based system is portable to virtually any computer environment, including personal computers and workstations. Existing Java tools for biomedical data analysis have been incorporated into the system. The affordability of virtual-reality installations has increased dramatically over the last several years. This creates new opportunities for educational scenarios that model important processes in a patient's body, including gene expression patterns, metabolic activity, the effects of interventions such as drug treatments, and eventually surgical simulations.

  13. Spaceborne synthetic aperture radar signal processing using FPGAs

    NASA Astrophysics Data System (ADS)

    Sugimoto, Yohei; Ozawa, Satoru; Inaba, Noriyasu

    2017-10-01

    Synthetic Aperture Radar (SAR) imagery requires image reproduction through successive signal processing of the received data before images can be browsed and information extracted. The received signal data records of the ALOS-2/PALSAR-2 are stored in the onboard mission data storage and transmitted to the ground. In order to stay within the storage usage and the transmission capacity of the mission data communication networks, the operation duty cycle of the PALSAR-2 is limited. This balance relies strongly on the network availability. The observation operations of present spaceborne SAR systems are rigorously planned by simulating the mission data balance, given conflicting user demands. This problem should be solved such that we do not have to compromise the operations and the potential of next-generation spaceborne SAR systems. One of the solutions is to compress the SAR data through onboard image reproduction and information extraction from the reproduced images. This is also beneficial for fast delivery of information products and event-driven observations by constellations. The Emergence Studio (Sōhatsu kōbō in Japanese), together with the Japan Aerospace Exploration Agency, is developing evaluation models of an FPGA-based signal processing system for onboard SAR image reproduction. The model developed in 2016, namely the "Fast L1 Processor" (FLIP), can reproduce a 10 m resolution single look complex image (Level 1.1) from ALOS/PALSAR raw signal data (Level 1.0). The FLIP running at 200 MHz processes data twice as fast as CPU-based computing at 3.7 GHz. The image processed by the FLIP is in no way inferior to the image processed with 32-bit computing in MATLAB.
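
    The heart of Level 1.0 to Level 1.1 processing is range compression of each echo line by matched filtering against a replica of the transmitted chirp; a full processor adds range cell migration correction and azimuth compression. The sketch below uses illustrative chirp parameters and file names, not PALSAR-2 values:

        # Frequency-domain matched filtering (range compression) of raw SAR lines.
        import numpy as np

        fs = 32e6                       # range sampling rate (assumed)
        pulse_len = 27e-6               # transmitted pulse length (assumed)
        chirp_rate = 5.2e11             # chirp rate in Hz/s (assumed)

        t = np.arange(0, pulse_len, 1 / fs)
        replica = np.exp(1j * np.pi * chirp_rate * (t - pulse_len / 2) ** 2)

        raw = np.load("level0_lines.npy")            # hypothetical (n_lines, n_bins) complex array
        n_fft = raw.shape[1] + len(replica) - 1
        ref_f = np.conj(np.fft.fft(replica, n_fft))  # matched filter: conjugate chirp spectrum

        compressed = np.fft.ifft(np.fft.fft(raw, n_fft, axis=1) * ref_f, axis=1)
        compressed = compressed[:, :raw.shape[1]]    # trim the convolution tail
        print("range-compressed block:", compressed.shape)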

  14. Laser surface texturing of polymers for biomedical applications

    NASA Astrophysics Data System (ADS)

    Riveiro, Antonio; Maçon, Anthony L. B.; del Val, Jesus; Comesaña, Rafael; Pou, Juan

    2018-02-01

    Polymers are materials widely used in biomedical science because of their biocompatibility and good mechanical properties (which, in some cases, are similar to those of human tissues); however, these materials are, in general, chemically and biologically inert. Surface characteristics such as topography (at the macro-, micro-, and nanoscale), surface chemistry, surface energy, charge, and wettability are interrelated properties, and they cooperatively influence the biological performance of materials used for biomedical applications. They regulate the biological response at the implant/tissue interface (e.g., influencing cell adhesion, cell orientation, cell motility, etc.). Several surface processing techniques have been explored to modulate these properties for biomedical applications. Despite their potential, these methods have limitations that restrict their applicability. In this regard, laser-based methods, in particular laser surface texturing (LST), can be an interesting alternative. Several studies have shown the potential of this technique to control the surface properties of biomedical polymers and enhance their biological performance; however, more research is needed to obtain the desired biological response. This work provides a general overview of the basics and applications of LST for the surface modification of polymers currently used in clinical practice (e.g., PEEK, UHMWPE, PP, etc.). The modification of roughness and wettability, and their impact on the biological response, is addressed to offer new insights into the surface modification of biomedical polymers.

  15. Radar transponder apparatus and signal processing technique

    DOEpatents

    Axline, Jr., Robert M.; Sloan, George R.; Spalding, Richard E.

    1996-01-01

    An active, phase-coded, time-grating transponder and a synthetic-aperture radar (SAR) and signal processor means, in combination, allow the recognition and location of the transponder (tag) in the SAR image and allow communication of information messages from the transponder to the SAR. The SAR is an illuminating radar having special processing modifications in an image-formation processor to receive an echo from a remote transponder, after the transponder receives and retransmits the SAR illuminations, and to enhance the transponder's echo relative to surrounding ground clutter by recognizing special transponder modulations in the phase-shifted transponder retransmissions. The remote radio-frequency tag also transmits information to the SAR through a single antenna that also serves to receive the SAR illuminations. Unique tag-modulation and SAR signal processing techniques, in combination, allow the detection and precise geographical location of the tag through the reduction of interfering signals from ground clutter, and allow environmental and status information from said tag to be communicated to said SAR.

  16. Radar transponder apparatus and signal processing technique

    DOEpatents

    Axline, R.M. Jr.; Sloan, G.R.; Spalding, R.E.

    1996-01-23

    An active, phase-coded, time-grating transponder and a synthetic-aperture radar (SAR) and signal processor means, in combination, allow the recognition and location of the transponder (tag) in the SAR image and allow communication of information messages from the transponder to the SAR. The SAR is an illuminating radar having special processing modifications in an image-formation processor to receive an echo from a remote transponder, after the transponder receives and retransmits the SAR illuminations, and to enhance the transponder's echo relative to surrounding ground clutter by recognizing special transponder modulations in the phase-shifted transponder retransmissions. The remote radio-frequency tag also transmits information to the SAR through a single antenna that also serves to receive the SAR illuminations. Unique tag-modulation and SAR signal processing techniques, in combination, allow the detection and precise geographical location of the tag through the reduction of interfering signals from ground clutter, and allow environmental and status information from said tag to be communicated to said SAR. 4 figs.

  17. Smart Polymeric Gels: Redefining the Limits of Biomedical Devices.

    PubMed

    Chaterji, Somali; Kwon, Il Keun; Park, Kinam

    2007-08-01

    This review describes recent progress in the development and applications of smart polymeric gels, especially in the context of biomedical devices. The review has been organized into three separate sections: defining the basis of smart properties in polymeric gels; describing representative stimuli to which these gels respond; and illustrating a sample application area, namely, microfluidics. One of the major limitations in the use of hydrogels in stimuli-responsive applications is the diffusion-rate-limited transduction of signals. This can be obviated by engineering interconnected pores in the polymer structure to form capillary networks in the matrix and by downscaling the size of hydrogels to significantly decrease diffusion paths. Reducing the lag time in the induction of smart responses can be highly useful in biomedical devices, such as sensors and actuators. This review also describes molecular imprinting techniques to fabricate hydrogels for specific molecular recognition of target analytes. Additionally, it describes significant advances in bottom-up nanofabrication strategies involving supramolecular chemistry. Learning to assemble supramolecular structures from nature has led to the rapid prototyping of functional supramolecular devices. In essence, the barriers to the current performance potential of biomedical devices can be lowered or removed by the rapid convergence of interdisciplinary technologies.

  18. Smart Polymeric Gels: Redefining the Limits of Biomedical Devices

    PubMed Central

    Chaterji, Somali; Kwon, Il Keun; Park, Kinam

    2007-01-01

    This review describes recent progress in the development and applications of smart polymeric gels, especially in the context of biomedical devices. The review has been organized into three separate sections: defining the basis of smart properties in polymeric gels; describing representative stimuli to which these gels respond; and illustrating a sample application area, namely, microfluidics. One of the major limitations in the use of hydrogels in stimuli-responsive applications is the diffusion-rate-limited transduction of signals. This can be obviated by engineering interconnected pores in the polymer structure to form capillary networks in the matrix and by downscaling the size of hydrogels to significantly decrease diffusion paths. Reducing the lag time in the induction of smart responses can be highly useful in biomedical devices, such as sensors and actuators. This review also describes molecular imprinting techniques to fabricate hydrogels for specific molecular recognition of target analytes. Additionally, it describes significant advances in bottom-up nanofabrication strategies involving supramolecular chemistry. Learning to assemble supramolecular structures from nature has led to the rapid prototyping of functional supramolecular devices. In essence, the barriers to the current performance potential of biomedical devices can be lowered or removed by the rapid convergence of interdisciplinary technologies. PMID:18670584

  19. Parallel Processing with Digital Signal Processing Hardware and Software

    NASA Technical Reports Server (NTRS)

    Swenson, Cory V.

    1995-01-01

    The assembly and testing of a parallel processing system are described; the system allows a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described, followed by the installation procedure, research topics, and initial program development.

  20. Deterring watermark collusion attacks using signal processing techniques

    NASA Astrophysics Data System (ADS)

    Lemma, Aweke N.; van der Veen, Michiel

    2007-02-01

    A collusion attack is a malicious watermark-removal attack in which the attacker has access to multiple copies of the same content with different watermarks and tries to remove the watermark using averaging. In the literature, several solutions to collusion attacks have been reported. The mainstream solutions aim at designing watermark codes that are inherently resistant to collusion attacks. The other approaches propose signal-processing-based solutions that aim at modifying the watermarked signals in such a way that averaging multiple copies of the content leads to a significant degradation of content quality. In this paper, we present a signal-processing-based technique that may be deployed to deter collusion attacks. We formulate the problem in the context of electronic music distribution, where the content is generally available in the compressed domain. Thus, we first extend the collusion-resistance principles to bit-stream signals and then present an experimental analysis to estimate a bound on the maximum number of modified versions of a content item that satisfy the good-perceptibility requirement on the one hand and the destructive-averaging property on the other.

  1. Single-channel mixed signal blind source separation algorithm based on multiple ICA processing

    NASA Astrophysics Data System (ADS)

    Cheng, Xiefeng; Li, Ji

    2017-01-01

    Taking the separation of the fetal heart sound signal from the mixed signal acquired with an electronic stethoscope as the research background, this paper puts forward a single-channel mixed-signal blind source separation algorithm based on multiple rounds of ICA processing. First, empirical mode decomposition (EMD) decomposes the single-channel mixed signal into multiple orthogonal signal components, which are then processed by ICA; the resulting independent components are called independent sub-components of the mixed signal. Next, by combining these independent sub-components with the single-channel mixed signal, the single channel is expanded into multiple channels, which turns the under-determined blind source separation problem into a well-posed one. A further ICA pass then yields an estimate of the source signal. Finally, if the separation is not satisfactory, the previous separation result is combined with the single-channel mixed signal and the ICA processing is repeated until the desired estimate of the source signal is obtained. Simulation results show that the algorithm achieves a good separation effect for single-channel mixed physiological signals.
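
    A minimal sketch of the decompose-then-expand idea, assuming the third-party PyEMD package for empirical mode decomposition and scikit-learn's FastICA (neither implementation is specified in the paper):

      import numpy as np
      from PyEMD import EMD                      # assumed EMD implementation
      from sklearn.decomposition import FastICA

      def single_channel_bss(x, n_sources=2):
          """Expand a single-channel mixture via EMD, then unmix with ICA."""
          x = np.asarray(x, dtype=float)
          imfs = EMD().emd(x)                    # intrinsic mode functions
          # Stacking the mixture with its IMFs turns the under-determined
          # one-channel problem into a multi-channel (well-posed) one.
          observations = np.vstack([x, imfs]).T  # shape (n_samples, n_channels)
          ica = FastICA(n_components=n_sources, random_state=0)
          return ica.fit_transform(observations).T

      # Toy usage: a heart-sound-like tone mixed with a slower interfering tone.
      t = np.linspace(0, 2, 4000)
      mixture = np.sin(2 * np.pi * 25 * t) + 0.7 * np.sin(2 * np.pi * 3 * t)
      estimated_sources = single_channel_bss(mixture, n_sources=2)
      print(estimated_sources.shape)             # (2, 4000)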

  2. Generation of optical OFDM signals using 21.4 GS/s real time digital signal processing.

    PubMed

    Benlachtar, Yannis; Watts, Philip M; Bouziane, Rachid; Milder, Peter; Rangaraj, Deepak; Cartolano, Anthony; Koutsoyannis, Robert; Hoe, James C; Püschel, Markus; Glick, Madeleine; Killey, Robert I

    2009-09-28

    We demonstrate a field programmable gate array (FPGA) based optical orthogonal frequency division multiplexing (OFDM) transmitter implementing real-time digital signal processing at a sample rate of 21.4 GS/s. The QPSK-OFDM signal is generated using an 8-bit, 128-point inverse fast Fourier transform (IFFT) core performing one transform per clock cycle at a clock speed of 167.2 MHz, and can be deployed with either a direct-detection or a coherent receiver. The hardware design and the main digital signal processing functions are described, and we show that the main performance limitation is due to the low (4-bit) resolution of the digital-to-analog converter (DAC) and the 8-bit resolution of the IFFT core used. We analyze the back-to-back performance of the transmitter generating an 8.36 Gb/s optical single sideband (SSB) OFDM signal using digital up-conversion, suitable for direct detection. Additionally, we use the device to transmit 8.36 Gb/s SSB OFDM signals over 200 km of uncompensated standard single-mode fiber, achieving an overall BER < 10^-3.
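
    The core of such a transmitter, QPSK mapping followed by an IFFT, can be sketched offline in a few lines of NumPy; this mirrors only the baseband signal structure, while the FPGA fixed-point pipeline, guard intervals, digital up-conversion, and DAC quantization are omitted, and the number of data-bearing subcarriers is an assumption:

      import numpy as np

      N_FFT = 128        # IFFT length per OFDM symbol (matches the 128-point core)
      N_DATA = 100       # assumed number of data-bearing subcarriers

      def qpsk_ofdm_symbol(bits):
          """Map 2 * N_DATA bits onto one baseband OFDM symbol via QPSK + IFFT."""
          assert bits.size == 2 * N_DATA
          # QPSK mapping: bit pairs -> (+/-1 +/- 1j) / sqrt(2)
          symbols = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)
          spectrum = np.zeros(N_FFT, dtype=complex)
          spectrum[1:N_DATA + 1] = symbols       # leave DC empty, fill the low bins
          return np.fft.ifft(spectrum)           # time-domain OFDM symbol

      rng = np.random.default_rng(1)
      bits = rng.integers(0, 2, 2 * N_DATA)
      symbol = qpsk_ofdm_symbol(bits)
      print(symbol.shape, round(float(np.mean(np.abs(symbol) ** 2)), 6))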

  3. Utilization of ontology look-up services in information retrieval for biomedical literature.

    PubMed

    Vishnyakova, Dina; Pasche, Emilie; Lovis, Christian; Ruch, Patrick

    2013-01-01

    With the vast amount of biomedical data available, we face the necessity of improving information retrieval processes in the biomedical domain. The use of biomedical ontologies has facilitated the combination of various data sources (e.g., scientific literature, clinical data repositories) by increasing the quality of information retrieval and reducing maintenance efforts. In this context, we developed Ontology Look-up services (OLS), based on the NEWT and MeSH vocabularies. Our services were involved in information retrieval tasks such as gene/disease normalization. The implementation of the OLS services significantly accelerated the extraction of particular biomedical facts by structuring and enriching the data context. Precision in the normalization tasks improved by about 20%.

  4. Smart signal processing for an evolving electric grid

    NASA Astrophysics Data System (ADS)

    Silva, Leandro Rodrigues Manso; Duque, Carlos Augusto; Ribeiro, Paulo F.

    2015-12-01

    Electric grids are interconnected complex systems consisting of generation, transmission, distribution, and active loads, recently called prosumers as they produce and consume electric energy. Additionally, these encompass a vast array of equipment such as machines, power transformers, capacitor banks, power electronic devices, motors, etc. that are continuously evolving in their demand characteristics. Given these conditions, signal processing is becoming an essential assessment tool to enable the engineer and researcher to understand, plan, design, and operate the complex and smart electronic grid of the future. This paper focuses on recent developments associated with signal processing applied to power system analysis in terms of characterization and diagnostics. The following techniques are reviewed and their characteristics and applications discussed: active power system monitoring, sparse representation of power system signal, real-time resampling, and time-frequency (i.e., wavelets) applied to power fluctuations.

  5. Complexity of EEG-signal in Time Domain - Possible Biomedical Application

    NASA Astrophysics Data System (ADS)

    Klonowski, Wlodzimierz; Olejarczyk, Elzbieta; Stepien, Robert

    2002-07-01

    The human brain is a highly complex nonlinear system, so it is not surprising that the methods of Nonlinear Dynamics (or Chaos Theory, as it is commonly called) can be used in the analysis of the EEG signal, which represents the overall activity of the brain. Even if the signal is not chaotic, these methods are a motivating tool for exploring changes in brain activity due to different functional activation states, e.g. different sleep stages, or due to applied therapy, e.g. exposure to chemical agents (drugs) and physical factors (light, magnetic field). The methods supplied by Nonlinear Dynamics reveal signal characteristics that are not revealed by linear methods like the FFT. A better understanding of the principles that govern the dynamics and complexity of the EEG signal can help to find `the signatures' of different physiological and pathological states of the human brain, quantitative characteristics that may find applications in medical diagnostics.

  6. Nuclear sensor signal processing circuit

    DOEpatents

    Kallenbach, Gene A [Bosque Farms, NM; Noda, Frank T [Albuquerque, NM; Mitchell, Dean J [Tijeras, NM; Etzkin, Joshua L [Albuquerque, NM

    2007-02-20

    An apparatus and method are disclosed for a compact and temperature-insensitive nuclear sensor that can be calibrated with a non-hazardous radioactive sample. The nuclear sensor includes a gamma ray sensor that generates tail pulses from radioactive samples. An analog conditioning circuit conditions the tail-pulse signals from the gamma ray sensor, and a tail-pulse simulator circuit generates a plurality of simulated tail-pulse signals. A computer system processes the tail pulses from the gamma ray sensor and the simulated tail pulses from the tail-pulse simulator circuit. The nuclear sensor is calibrated under the control of the computer. The offset is adjusted using the simulated tail pulses. Since the offset is set to zero or near zero, the sensor gain can be adjusted with a non-hazardous radioactive source such as, for example, naturally occurring radiation and potassium chloride.

  7. Dual-process theory and signal-detection theory of recognition memory.

    PubMed

    Wixted, John T

    2007-01-01

    Two influential models of recognition memory, the unequal-variance signal-detection model and a dual-process threshold/detection model, accurately describe the receiver operating characteristic, but only the latter model can provide estimates of recollection and familiarity. Such estimates often accord with those provided by the remember-know procedure, and both methods are now widely used in the neuroscience literature to identify the brain correlates of recollection and familiarity. However, in recent years, a substantial literature has accumulated directly contrasting the signal-detection model against the threshold/detection model, and that literature is almost unanimous in its endorsement of signal-detection theory. A dual-process version of signal-detection theory implies that individual recognition decisions are not process pure, and it suggests new ways to investigate the brain correlates of recognition memory. ((c) 2007 APA, all rights reserved).
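
    A short sketch of how the unequal-variance signal-detection model generates an ROC curve (the parameter values below are illustrative assumptions):

      import numpy as np
      from scipy.stats import norm

      def uvsd_roc(d_prime=1.5, sigma_old=1.25, n_points=200):
          """Hit and false-alarm rates for the unequal-variance SDT model.

          New-item strengths ~ N(0, 1); old-item strengths ~ N(d_prime, sigma_old).
          Sweeping the decision criterion traces out the ROC curve.
          """
          criteria = np.linspace(-4.0, 6.0, n_points)
          false_alarms = norm.sf(criteria, loc=0.0, scale=1.0)
          hits = norm.sf(criteria, loc=d_prime, scale=sigma_old)
          return false_alarms, hits

      fa, hit = uvsd_roc()
      # The z-transformed ROC is linear with slope 1 / sigma_old (0.8 here), the
      # signature usually taken as evidence for the unequal-variance model.
      z_slope = np.polyfit(norm.ppf(fa), norm.ppf(hit), 1)[0]
      print(round(float(z_slope), 2))            # -> 0.8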

  8. Coherent broadband sonar signal processing with the environmentally corrected matched filter

    NASA Astrophysics Data System (ADS)

    Camin, Henry John, III

    The matched filter is the standard approach for coherently processing active sonar signals, where knowledge of the transmitted waveform is used in the detection and parameter estimation of received echoes. Matched filtering of broadband signals provides higher levels of range resolution and reverberation noise suppression than can be realized through narrowband processing. Since theoretical processing gains are proportional to the signal bandwidth, it is typically desirable to utilize the widest-band signals possible. However, as signal bandwidth increases, so do environmental effects that tend to decrease correlation between the received echo and the transmitted waveform. This is especially true for ultra-wideband signals, where the bandwidth exceeds an octave, or approximately 70% fractional bandwidth. This loss of coherence often results in processing gains and range resolution much lower than theoretically predicted. Wiener filtering, commonly used in image processing to improve distorted and noisy photos, is investigated in this dissertation as an approach to correct for these environmental effects. This improved processing scheme, the Environmentally Corrected Matched Filter (ECMF), first uses a Wiener filter to estimate the environmental transfer function and then uses a second Wiener filter to correct the received signal with this estimate. The process can be viewed as a smarter inverse or whitening filter that adjusts its behavior according to the signal-to-noise ratio across the spectrum. Though the ECMF is independent of bandwidth, it is expected that ultra-wideband signals will see the largest improvement, since they tend to be more affected by environmental effects. The development of the ECMF and the demonstration of improved parameter estimation with its use are the primary emphases of this dissertation. Additionally, several new contributions to the field of sonar signal processing, made in conjunction with the development of the ECMF, are described. A new, nondimensional wideband
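
    The correction idea, estimating the environmental transfer function with a Wiener filter and then whitening the received signal before matched filtering, can be sketched generically as follows (this is not the author's ECMF code; the noise-level constant and the use of a separate reference echo are assumptions):

      import numpy as np

      def ecmf_like(received, transmitted, reference_echo, noise_power=1e-2):
          """Generic Wiener-corrected matched filter (a sketch of the ECMF idea).

          1. Estimate the environmental transfer function H from a reference echo
             with a Wiener-style spectral division.
          2. Apply a second Wiener step to deconvolve (whiten) the received signal.
          3. Apply the conventional matched filter for the transmitted waveform.
          """
          received = np.asarray(received, dtype=float)
          n = received.size
          S = np.fft.fft(transmitted, n)
          R = np.fft.fft(received, n)
          E = np.fft.fft(reference_echo, n)

          H = E * np.conj(S) / (np.abs(S) ** 2 + noise_power)          # channel estimate
          corrected = R * np.conj(H) / (np.abs(H) ** 2 + noise_power)  # Wiener correction
          return np.fft.ifft(corrected * np.conj(S))                   # matched filter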

  9. Implantable biomedical devices on bioresorbable substrates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogers, John A; Kim, Dae-Hyeong; Omenetto, Fiorenzo

    Provided herein are implantable biomedical devices, methods of administering implantable biomedical devices, methods of making implantable biomedical devices, and methods of using implantable biomedical devices to actuate a target tissue or sense a parameter associated with the target tissue in a biological environment. Each implantable biomedical device comprises a bioresorbable substrate, an electronic device having a plurality of inorganic semiconductor components supported by the bioresorbable substrate, and a barrier layer encapsulating at least a portion of the inorganic semiconductor components. Upon contact with a biological environment the bioresorbable substrate is at least partially resorbed, thereby establishing conformal contact between the implantable biomedical device and the target tissue in the biological environment.

  10. Isospin equilibration processes and dipolar signals: Coherent cluster production

    NASA Astrophysics Data System (ADS)

    Papa, M.; Berceanu, I.; Acosta, L.; Agodi, C.; Auditore, L.; Cardella, G.; Chatterjee, M. B.; Dell'Aquila, D.; De Filippo, E.; Francalanza, L.; Lanzalone, G.; Lombardo, I.; Maiolino, C.; Martorana, N.; Pagano, A.; Pagano, E. V.; Pirrone, S.; Politi, G.; Quattrocchi, L.; Rizzo, F.; Russotto, P.; Trifiró, A.; Trimarchi, M.; Verde, G.; Vigilante, M.

    2017-11-01

    The total dipolar signal related to multi-break-up processes induced in the system ^{48}Ca + ^{27}Al at 40 MeV/nucleon has been investigated with the CHIMERA multi-detector. Experimental data related to semi-peripheral collisions are shown and compared with CoMD-III calculations. The strong connection between the dipolar signal, as obtained from the detected fragments, and the dynamics of the isospin equilibration processes is also briefly discussed.

  11. Genetically engineered livestock for biomedical models.

    PubMed

    Rogers, Christopher S

    2016-06-01

    To commemorate Transgenic Animal Research Conference X, this review summarizes the recent progress in developing genetically engineered livestock species as biomedical models. The first of these conferences was held in 1997, which turned out to be a watershed year for the field, with two significant events occurring. One was the publication of the first transgenic livestock animal disease model, a pig with retinitis pigmentosa. Before that, the use of livestock species in biomedical research had been limited to wild-type animals or disease models that had been induced or were naturally occurring. The second event was the report of Dolly, a cloned sheep produced by somatic cell nuclear transfer. Cloning subsequently became an essential part of the process for most of the models developed in the last 18 years and is still used prominently today. This review is intended to highlight the biomedical modeling achievements that followed those key events, many of which were first reported at one of the previous nine Transgenic Animal Research Conferences. Also discussed are the practical challenges of utilizing livestock disease models now that the technical hurdles of model development have been largely overcome.

  12. A data analysis framework for biomedical big data: Application on mesoderm differentiation of human pluripotent stem cells

    PubMed Central

    Karlsson, Alexander; Riveiro, Maria; Améen, Caroline; Åkesson, Karolina; Andersson, Christian X.; Sartipy, Peter; Synnergren, Jane

    2017-01-01

    The development of high-throughput biomolecular technologies has resulted in generation of vast omics data at an unprecedented rate. This is transforming biomedical research into a big data discipline, where the main challenges relate to the analysis and interpretation of data into new biological knowledge. The aim of this study was to develop a framework for biomedical big data analytics, and apply it for analyzing transcriptomics time series data from early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. To this end, transcriptome profiling by microarray was performed on differentiating human pluripotent stem cells sampled at eleven consecutive days. The gene expression data was analyzed using the five-stage analysis framework proposed in this study, including data preparation, exploratory data analysis, confirmatory analysis, biological knowledge discovery, and visualization of the results. Clustering analysis revealed several distinct expression profiles during differentiation. Genes with an early transient response were strongly related to embryonic- and mesendoderm development, for example CER1 and NODAL. Pluripotency genes, such as NANOG and SOX2, exhibited substantial downregulation shortly after onset of differentiation. Rapid induction of genes related to metal ion response, cardiac tissue development, and muscle contraction were observed around day five and six. Several transcription factors were identified as potential regulators of these processes, e.g. POU1F1, TCF4 and TBP for muscle contraction genes. Pathway analysis revealed temporal activity of several signaling pathways, for example the inhibition of WNT signaling on day 2 and its reactivation on day 4. This study provides a comprehensive characterization of biological events and key regulators of the early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. The proposed analysis framework can be used to structure

  13. A data analysis framework for biomedical big data: Application on mesoderm differentiation of human pluripotent stem cells.

    PubMed

    Ulfenborg, Benjamin; Karlsson, Alexander; Riveiro, Maria; Améen, Caroline; Åkesson, Karolina; Andersson, Christian X; Sartipy, Peter; Synnergren, Jane

    2017-01-01

    The development of high-throughput biomolecular technologies has resulted in generation of vast omics data at an unprecedented rate. This is transforming biomedical research into a big data discipline, where the main challenges relate to the analysis and interpretation of data into new biological knowledge. The aim of this study was to develop a framework for biomedical big data analytics, and apply it for analyzing transcriptomics time series data from early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. To this end, transcriptome profiling by microarray was performed on differentiating human pluripotent stem cells sampled at eleven consecutive days. The gene expression data was analyzed using the five-stage analysis framework proposed in this study, including data preparation, exploratory data analysis, confirmatory analysis, biological knowledge discovery, and visualization of the results. Clustering analysis revealed several distinct expression profiles during differentiation. Genes with an early transient response were strongly related to embryonic- and mesendoderm development, for example CER1 and NODAL. Pluripotency genes, such as NANOG and SOX2, exhibited substantial downregulation shortly after onset of differentiation. Rapid induction of genes related to metal ion response, cardiac tissue development, and muscle contraction were observed around day five and six. Several transcription factors were identified as potential regulators of these processes, e.g. POU1F1, TCF4 and TBP for muscle contraction genes. Pathway analysis revealed temporal activity of several signaling pathways, for example the inhibition of WNT signaling on day 2 and its reactivation on day 4. This study provides a comprehensive characterization of biological events and key regulators of the early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. The proposed analysis framework can be used to structure

  14. Secure Management of Biomedical Data With Cryptographic Hardware

    PubMed Central

    Canim, Mustafa; Kantarcioglu, Murat; Malin, Bradley

    2014-01-01

    The biomedical community is increasingly migrating toward research endeavors that are dependent on large quantities of genomic and clinical data. At the same time, various regulations require that such data be shared beyond the initial collecting organization (e.g., an academic medical center). It is of critical importance to ensure that when such data are shared, as well as managed, it is done so in a manner that upholds the privacy of the corresponding individuals and the overall security of the system. In general, organizations have attempted to achieve these goals through deidentification methods that remove explicitly, and potentially, identifying features (e.g., names, dates, and geocodes). However, a growing number of studies demonstrate that deidentified data can be reidentified to named individuals using simple automated methods. As an alternative, it was shown that biomedical data could be shared, managed, and analyzed through practical cryptographic protocols without revealing the contents of any particular record. Yet, such protocols required the inclusion of multiple third parties, which may not always be feasible in the context of trust or bandwidth constraints. Thus, in this paper, we introduce a framework that removes the need for multiple third parties by collocating services to store and to process sensitive biomedical data through the integration of cryptographic hardware. Within this framework, we define a secure protocol to process genomic data and perform a series of experiments to demonstrate that such an approach can be run in an efficient manner for typical biomedical investigations. PMID:22010157

  15. Maglev Train Signal Processing Architecture Based on Nonlinear Discrete Tracking Differentiator.

    PubMed

    Wang, Zhiqiang; Li, Xiaolong; Xie, Yunde; Long, Zhiqiang

    2018-05-24

    In a maglev train levitation system, signal processing plays an important role because some sensor signals are prone to corruption by noise, owing to the harsh installation and operating environment of the sensors, and because some signals cannot be acquired directly via sensors. To address these concerns, an architecture based on a new type of nonlinear second-order discrete tracking differentiator is proposed. This signal processing architecture filters signal noise and acquires the signals needed for levitation purposes. The proposed tracking differentiator possesses the advantages of quick convergence, no fluttering, and simple calculation. The tracking differentiator's frequency characteristics at different parameter values are studied in this paper. The performance of this new type of tracking differentiator is tested in a MATLAB simulation, and the tracking differentiator is implemented in the Very-High-Speed Integrated Circuit Hardware Description Language (VHDL). Finally, experiments are conducted separately on a test board and a maglev train model. Simulation and experimental results show that the performance of this novel signal processing architecture fulfills the requirements of the real system.
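
    A generic second-order discrete tracking differentiator in Han's form gives the flavour of such a filter (a sketch only; the paper's specific nonlinear law, parameter values, and VHDL implementation are not reproduced here):

      import numpy as np

      def fhan(e, de, r, h):
          """Time-optimal control function of Han's discrete tracking differentiator."""
          d = r * h
          d0 = d * h
          y = e + h * de
          a0 = np.sqrt(d * d + 8.0 * r * abs(y))
          a = de + np.sign(y) * (a0 - d) / 2.0 if abs(y) > d0 else de + y / h
          return -r * np.sign(a) if abs(a) > d else -r * a / d

      def tracking_differentiator(v, r=100.0, h=0.01):
          """Return the tracked signal and its estimated derivative for samples v."""
          x1, x2 = float(v[0]), 0.0
          x1_out, x2_out = [], []
          for vk in v:
              u = fhan(x1 - vk, x2, r, h)
              x1, x2 = x1 + h * x2, x2 + h * u
              x1_out.append(x1)
              x2_out.append(x2)
          return np.array(x1_out), np.array(x2_out)

      # Toy usage: track a noisy sine wave and estimate its derivative.
      t = np.arange(0.0, 2.0, 0.01)
      noisy = np.sin(2 * np.pi * t) + 0.02 * np.random.randn(t.size)
      tracked, derivative = tracking_differentiator(noisy)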

  16. Digital signal processing for velocity measurements in dynamical material's behaviour studies.

    PubMed

    Devlaminck, Julien; Luc, Jérôme; Chanal, Pierre-Yves

    2014-03-01

    In this work, we describe different configurations of optical fiber interferometers (Michelson and Mach-Zehnder types) used to measure velocities in studies of the dynamic behaviour of materials. We detail the processing algorithms developed and optimized to improve the performance of these interferometers, especially in terms of time and frequency resolution. Three methods for analyzing interferometric signals were studied. For Michelson interferometers, time-frequency analysis of the signals by the Short-Time Fourier Transform (STFT) is compared with time-frequency analysis by the Continuous Wavelet Transform (CWT). The results showed that the CWT was more suitable than the STFT for signals with a low signal-to-noise ratio and for regions of low velocity and high acceleration. For Mach-Zehnder interferometers, the measurement is carried out by analyzing the phase shift between three interferometric signals (triature processing). These three digital signal processing methods were evaluated, their measurement uncertainties estimated, and their restrictions or operational limitations specified on the basis of experimental results obtained on a pulsed-power machine.
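
    For the STFT branch of such processing, the beat frequency of the interferometric signal can be tracked frame by frame and converted to velocity; a minimal SciPy sketch, with an assumed probe wavelength and sampling rate, is:

      import numpy as np
      from scipy.signal import stft

      WAVELENGTH = 1550e-9        # assumed probe wavelength [m]
      FS = 100e6                  # assumed sampling rate [Hz]

      def velocity_from_stft(signal, fs=FS, nperseg=1024):
          """Estimate a velocity history from a velocity-interferometer signal.

          The dominant STFT frequency in each frame is taken as the Doppler beat
          frequency f_b, and velocity follows from v = f_b * wavelength / 2.
          """
          f, t_frames, Z = stft(signal, fs=fs, nperseg=nperseg)
          beat = f[np.argmax(np.abs(Z), axis=0)]     # dominant frequency per frame
          return t_frames, beat * WAVELENGTH / 2.0

      # Toy usage: a constant 10 m/s velocity gives a ~12.9 MHz beat frequency.
      t = np.arange(0.0, 1e-4, 1 / FS)
      beat_signal = np.cos(2 * np.pi * (2 * 10.0 / WAVELENGTH) * t)
      t_frames, v_est = velocity_from_stft(beat_signal)
      print(round(float(np.median(v_est)), 2))       # close to 10.0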

  17. Inertial processing of vestibulo-ocular signals

    NASA Technical Reports Server (NTRS)

    Hess, B. J.; Angelaki, D. E.

    1999-01-01

    New evidence for a central resolution of gravito-inertial signals has been recently obtained by analyzing the properties of the vestibulo-ocular reflex (VOR) in response to combined lateral translations and roll tilts of the head. It is found that the VOR generates robust compensatory horizontal eye movements independent of whether or not the interaural translatory acceleration component is canceled out by a gravitational acceleration component due to simultaneous roll-tilt. This response property of the VOR depends on functional semicircular canals, suggesting that the brain uses both otolith and semicircular canal signals to estimate head motion relative to inertial space. Vestibular information about dynamic head attitude relative to gravity is the basis for computing head (and body) angular velocity relative to inertial space. Available evidence suggests that the inertial vestibular system controls both head attitude and velocity with respect to a gravity-centered reference frame. The basic computational principles underlying the inertial processing of otolith and semicircular canal afferent signals are outlined.

  18. Directional dual-tree complex wavelet packet transforms for processing quadrature signals.

    PubMed

    Serbes, Gorkem; Gulcur, Halil Ozcan; Aydin, Nizamettin

    2016-03-01

    Quadrature signals containing in-phase and quadrature-phase components are used in many signal processing applications in every field of science and engineering. Specifically, Doppler ultrasound systems used to evaluate cardiovascular disorders noninvasively also produce quadrature-format signals. In order to obtain directional blood flow information, the quadrature outputs have to be preprocessed using methods such as asymmetrical and symmetrical phasing filter techniques. These resulting directional signals can be employed to detect asymptomatic embolic signals in the cerebral circulation caused by small emboli, which are indicators of a possible future stroke. Various transform-based methods, such as Fourier and wavelet transforms, have frequently been used in processing embolic signals. However, most of the time the Fourier and discrete wavelet transforms are not appropriate for the analysis of embolic signals due to their non-stationary time-frequency behavior. Alternatively, the discrete wavelet packet transform can perform an adaptive decomposition of the time-frequency axis. In this study, directional discrete wavelet packet transforms, which have the ability to map directional information while processing quadrature signals and have less computational complexity than existing wavelet packet-based methods, are introduced. The performance of the proposed methods is examined in detail using single-frequency, synthetic narrow-band, and embolic quadrature signals.
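
    The underlying directional idea for quadrature Doppler signals can be shown with a simple frequency-domain sketch (plain complex-FFT separation, not the dual-tree wavelet packet method introduced in the paper):

      import numpy as np

      def separate_directions(i_sig, q_sig):
          """Split a quadrature Doppler signal into forward and reverse flow.

          The in-phase and quadrature channels are treated as the real and imaginary
          parts of one complex signal; positive frequencies then correspond to one
          flow direction and negative frequencies to the other.
          """
          z = np.asarray(i_sig) + 1j * np.asarray(q_sig)
          Z = np.fft.fft(z)
          n = Z.size
          forward = np.zeros_like(Z)
          reverse = np.zeros_like(Z)
          forward[1:n // 2] = Z[1:n // 2]          # positive-frequency half
          reverse[n // 2 + 1:] = Z[n // 2 + 1:]    # negative-frequency half
          return np.fft.ifft(forward), np.fft.ifft(reverse)

      # Toy usage: a +1 kHz (forward) and a -300 Hz (reverse) component.
      fs = 10e3
      t = np.arange(0.0, 0.1, 1 / fs)
      z_true = np.exp(2j * np.pi * 1000 * t) + 0.5 * np.exp(-2j * np.pi * 300 * t)
      forward_flow, reverse_flow = separate_directions(z_true.real, z_true.imag)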

  19. Semantic Similarity in Biomedical Ontologies

    PubMed Central

    Pesquita, Catia; Faria, Daniel; Falcão, André O.; Lord, Phillip; Couto, Francisco M.

    2009-01-01

    In recent years, ontologies have become a mainstream topic in biomedical research. When biological entities are described using a common schema, such as an ontology, they can be compared by means of their annotations. This type of comparison is called semantic similarity, since it assesses the degree of relatedness between two entities by the similarity in meaning of their annotations. The application of semantic similarity to biomedical ontologies is recent; nevertheless, several studies have been published in the last few years describing and evaluating diverse approaches. Semantic similarity has become a valuable tool for validating the results drawn from biomedical studies such as gene clustering, gene expression data analysis, prediction and validation of molecular interactions, and disease gene prioritization. We review semantic similarity measures applied to biomedical ontologies and propose their classification according to the strategies they employ: node-based versus edge-based and pairwise versus groupwise. We also present comparative assessment studies and discuss the implications of their results. We survey the existing implementations of semantic similarity measures, and we describe examples of applications to biomedical research. This will clarify how biomedical researchers can benefit from semantic similarity measures and help them choose the approach most suitable for their studies. Biomedical ontologies are evolving toward increased coverage, formality, and integration, and their use for annotation is increasingly becoming a focus of both effort by biomedical experts and application of automated annotation procedures to create corpora of higher quality and completeness than are currently available. Given that semantic similarity measures are directly dependent on these evolutions, we can expect to see them gaining more relevance and even becoming as essential as sequence similarity is today in biomedical research. PMID:19649320

  20. Keeping Signals Straight: How Cells Process Information and Make Decisions

    PubMed Central

    Laub, Michael T.

    2016-01-01

    As we become increasingly dependent on electronic information-processing systems at home and work, it’s easy to lose sight of the fact that our very survival depends on highly complex biological information-processing systems. Each of the trillions of cells that form the human body has the ability to detect and respond to a wide range of stimuli and inputs, using an extraordinary set of signaling proteins to process this information and make decisions accordingly. Indeed, cells in all organisms rely on these signaling proteins to survive and proliferate in unpredictable and sometimes rapidly changing environments. But how exactly do these proteins relay information within cells, and how do they keep a multitude of incoming signals straight? Here, I describe recent efforts to understand the fidelity of information flow inside cells. This work is providing fundamental insight into how cells function. Additionally, it may lead to the design of novel antibiotics that disrupt the signaling of pathogenic bacteria or it could help to guide the treatment of cancer, which often involves information-processing gone awry inside human cells. PMID:27427909

  1. Biomedical and Behavioral Research Scientists: Their Training and Supply. Volume 1: Findings.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC. Office of Scientific and Engineering Personnel.

    This is the first of three volumes which presents the Committee on Biomedical and Behavioral Research Personnel's examination of the educational process that leads to doctoral degrees in biomedical and behavioral science (and to postdoctoral study in some cases) and the role of the National Research Service Awards (NRSA) training programs in it.…

  2. Biomedically relevant chemical and physical properties of coal combustion products.

    PubMed Central

    Fisher, G L

    1983-01-01

    The evaluation of the potential public and occupational health hazards of developing and existing combustion processes requires a detailed understanding of the physical and chemical properties of effluents available for human and environmental exposures. These processes produce complex mixtures of gases and aerosols which may interact synergistically or antagonistically with biological systems. Because of the physicochemical complexity of the effluents, the biomedically relevant properties of these materials must be carefully assessed. Subsequent to release from combustion sources, environmental interactions further complicate assessment of the toxicity of combustion products. This report provides an overview of the biomedically relevant physical and chemical properties of coal fly ash. Coal fly ash is presented as a model complex mixture for health and safety evaluation of combustion processes. PMID:6337824

  3. Text mining patents for biomedical knowledge.

    PubMed

    Rodriguez-Esteban, Raul; Bundschus, Markus

    2016-06-01

    Biomedical text mining of scientific knowledge bases, such as Medline, has received much attention in recent years. Given that text mining is able to automatically extract biomedical facts that revolve around entities such as genes, proteins, and drugs, from unstructured text sources, it is seen as a major enabler to foster biomedical research and drug discovery. In contrast to the biomedical literature, research into the mining of biomedical patents has not reached the same level of maturity. Here, we review existing work and highlight the associated technical challenges that emerge from automatically extracting facts from patents. We conclude by outlining potential future directions in this domain that could help drive biomedical research and drug discovery. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Acoustic emission signal processing technique to characterize reactor in-pile phenomena

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Vivek, E-mail: vivek.agarwal@inl.gov; Tawfik, Magdy S., E-mail: magdy.tawfik@inl.gov; Smith, James A., E-mail: james.smith@inl.gov

    2015-03-31

    Existing and developing advanced sensor technologies and instrumentation will allow non-intrusive in-pile measurement of temperature, extension, and fission gases when coupled with advanced signal processing algorithms. The measured sensor signals transmitted from inside to outside the containment structure are corrupted by noise and attenuated, thereby reducing the signal strength and the signal-to-noise ratio. Identification and extraction of the actual signal (representative of an in-pile phenomenon) is a challenging and complicated process. In this paper, the empirical mode decomposition technique is utilized to reconstruct the actual sensor signal by partially combining intrinsic mode functions. The reconstructed signal corresponds to phenomena and/or failure modes occurring inside the reactor. In addition, it allows accurate non-intrusive monitoring and trending of in-pile phenomena.
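
    A small sketch of the partial-reconstruction step, again assuming the third-party PyEMD package; which intrinsic mode functions to keep is an assumption that depends on the noise characteristics of the actual sensor signal:

      import numpy as np
      from PyEMD import EMD        # assumed EMD implementation

      def partial_reconstruction(signal, keep=slice(2, None)):
          """Decompose a sensor signal into IMFs and rebuild it from a subset.

          Dropping the first (highest-frequency) IMFs is one common way to suppress
          broadband noise while retaining the slower trend of interest.
          """
          imfs = EMD().emd(np.asarray(signal, dtype=float))
          return imfs[keep].sum(axis=0)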

  5. Hand-in-hand advances in biomedical engineering and sensorimotor restoration.

    PubMed

    Pisotta, Iolanda; Perruchoud, David; Ionta, Silvio

    2015-05-15

    Living in a multisensory world entails the continuous sensory processing of environmental information in order to enact appropriate motor routines. The interaction between our body and our brain is the crucial factor for achieving such sensorimotor integration ability. Several clinical conditions dramatically affect the constant body-brain exchange, but the latest developments in biomedical engineering provide promising solutions for overcoming this communication breakdown. Recent technological developments have succeeded in transforming neuronal electrical activity into computational input for robotic devices, giving birth to the era of the so-called brain-machine interfaces. Combining rehabilitation robotics and experimental neuroscience, the introduction of brain-machine interfaces into clinical protocols has provided the technological solution for bypassing the neural disconnection and restoring sensorimotor function. Based on these advances, the recovery of sensorimotor functionality is progressively becoming a concrete reality. However, despite the success of several recent techniques, some open issues still need to be addressed. Typical interventions for sensorimotor deficits include pharmaceutical treatments and manual/robotic assistance in passive movements. These procedures achieve symptom relief, but their applicability to more severe disconnection pathologies (e.g. spinal cord injury or amputation) is limited. Here we review how state-of-the-art solutions in biomedical engineering are continuously raising expectations in sensorimotor rehabilitation, as well as the current challenges, especially with regard to the translation of signals from brain-machine interfaces into sensory feedback and the incorporation of brain-machine interfaces into daily activities. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. The National Center for Biomedical Ontology

    PubMed Central

    Noy, Natalya F; Shah, Nigam H; Whetzel, Patricia L; Chute, Christopher G; Story, Margaret-Anne; Smith, Barry

    2011-01-01

    The National Center for Biomedical Ontology is now in its seventh year. The goals of this National Center for Biomedical Computing are to: create and maintain a repository of biomedical ontologies and terminologies; build tools and web services to enable the use of ontologies and terminologies in clinical and translational research; educate their trainees and the scientific community broadly about biomedical ontology and ontology-based technology and best practices; and collaborate with a variety of groups who develop and use ontologies and terminologies in biomedicine. The centerpiece of the National Center for Biomedical Ontology is a web-based resource known as BioPortal. BioPortal makes available for research in computationally useful forms more than 270 of the world's biomedical ontologies and terminologies, and supports a wide range of web services that enable investigators to use the ontologies to annotate and retrieve data, to generate value sets and special-purpose lexicons, and to perform advanced analytics on a wide range of biomedical data. PMID:22081220

  7. Discovering and visualizing indirect associations between biomedical concepts

    PubMed Central

    Tsuruoka, Yoshimasa; Miwa, Makoto; Hamamoto, Kaisei; Tsujii, Jun'ichi; Ananiadou, Sophia

    2011-01-01

    Motivation: Discovering useful associations between biomedical concepts has been one of the main goals in biomedical text-mining, and understanding their biomedical contexts is crucial in the discovery process. Hence, we need a text-mining system that helps users explore various types of (possibly hidden) associations in an easy and comprehensible manner. Results: This article describes FACTA+, a real-time text-mining system for finding and visualizing indirect associations between biomedical concepts from MEDLINE abstracts. The system can be used as a text search engine like PubMed with additional features to help users discover and visualize indirect associations between important biomedical concepts such as genes, diseases and chemical compounds. FACTA+ inherits all functionality from its predecessor, FACTA, and extends it by incorporating three new features: (i) detecting biomolecular events in text using a machine learning model, (ii) discovering hidden associations using co-occurrence statistics between concepts, and (iii) visualizing associations to improve the interpretability of the output. To the best of our knowledge, FACTA+ is the first real-time web application that offers the functionality of finding concepts involving biomolecular events and visualizing indirect associations of concepts with both their categories and importance. Availability: FACTA+ is available as a web application at http://refine1-nactem.mc.man.ac.uk/facta/, and its visualizer is available at http://refine1-nactem.mc.man.ac.uk/facta-visualizer/. Contact: tsuruoka@jaist.ac.jp PMID:21685059

  8. Signal Processing in Periodically Forced Gradient Frequency Neural Networks

    PubMed Central

    Kim, Ji Chul; Large, Edward W.

    2015-01-01

    Oscillatory instability at the Hopf bifurcation is a dynamical phenomenon that has been suggested to characterize active non-linear processes observed in the auditory system. Networks of oscillators poised near Hopf bifurcation points and tuned to tonotopically distributed frequencies have been used as models of auditory processing at various levels, but systematic investigation of the dynamical properties of such oscillatory networks is still lacking. Here we provide a dynamical systems analysis of a canonical model for gradient frequency neural networks driven by a periodic signal. We use linear stability analysis to identify various driven behaviors of canonical oscillators for all possible ranges of model and forcing parameters. The analysis shows that canonical oscillators exhibit qualitatively different sets of driven states and transitions for different regimes of model parameters. We classify the parameter regimes into four main categories based on their distinct signal processing capabilities. This analysis will lead to deeper understanding of the diverse behaviors of neural systems under periodic forcing and can inform the design of oscillatory network models of auditory signal processing. PMID:26733858
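
    A minimal numerical sketch of a single periodically forced Hopf-type oscillator illustrates the kind of driven behaviour analysed in such models; the truncated normal-form equation and the parameter values below are illustrative assumptions, not the full canonical gradient-frequency network model:

      import numpy as np

      def driven_hopf(alpha=-0.1, beta=-1.0, omega=2 * np.pi * 1.0,
                      force=0.2, omega0=2 * np.pi * 1.1, dt=1e-3, duration=60.0):
          """Euler-integrate dz/dt = z*(alpha + i*omega + beta*|z|^2) + force*exp(i*omega0*t)."""
          n = int(duration / dt)
          t = np.arange(n) * dt
          z = np.zeros(n, dtype=complex)
          z[0] = 0.01
          for k in range(n - 1):
              dz = z[k] * (alpha + 1j * omega + beta * abs(z[k]) ** 2) \
                   + force * np.exp(1j * omega0 * t[k])
              z[k + 1] = z[k] + dt * dz
          return t, z

      t, z = driven_hopf()
      # The steady-state amplitude and the phase relative to the forcing indicate
      # whether the oscillator is entrained to the stimulus frequency.
      print(round(float(abs(z[-1])), 3))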

  9. Rapid prototyping of multi-scale biomedical microdevices by combining additive manufacturing technologies.

    PubMed

    Hengsbach, Stefan; Lantada, Andrés Díaz

    2014-08-01

    The possibility of designing and manufacturing biomedical microdevices with multiple length-scale geometries can help to promote special interactions both with their environment and with surrounding biological systems. These interactions aim to enhance biocompatibility and overall performance by using biomimetic approaches. In this paper, we present a design and manufacturing procedure for obtaining multi-scale biomedical microsystems based on the combination of two additive manufacturing processes: a conventional laser writer to manufacture the overall device structure, and a direct-laser writer based on two-photon polymerization to yield finer details. The process excels for its versatility, accuracy and manufacturing speed and allows for the manufacture of microsystems and implants with overall sizes up to several millimeters and with details down to sub-micrometric structures. As an application example we have focused on manufacturing a biomedical microsystem to analyze the impact of microtextured surfaces on cell motility. This process yielded a relevant increase in precision and manufacturing speed when compared with more conventional rapid prototyping procedures.

  10. Biomedical Terminology Mapper for UML projects.

    PubMed

    Thibault, Julien C; Frey, Lewis

    2013-01-01

    As the biomedical community collects and generates more and more data, the need to describe these datasets for exchange and interoperability becomes crucial. This paper presents a mapping algorithm that can help developers expose local implementations described with UML through standard terminologies. The input UML class or attribute name is first normalized and tokenized, then lookups in a UMLS-based dictionary are performed. For the evaluation of the algorithm 142 UML projects were extracted from caGrid and automatically mapped to National Cancer Institute (NCI) terminology concepts. Resulting mappings at the UML class and attribute levels were compared to the manually curated annotations provided in caGrid. Results are promising and show that this type of algorithm could speed up the tedious process of mapping local implementations to standard biomedical terminologies.
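
    The normalize-tokenize-lookup step can be illustrated with a toy sketch; the dictionary and concept identifiers below are made up for illustration, standing in for the UMLS-based dictionary used by the actual system:

      import re

      # Made-up dictionary and concept identifiers, standing in for a UMLS-based one.
      TOY_DICTIONARY = {
          "patient identifier": "DEMO:0001",
          "gene symbol": "DEMO:0002",
      }

      def normalize_and_tokenize(uml_name):
          """Split a CamelCase or underscored UML name into lowercase tokens."""
          spaced = re.sub(r"(?<=[a-z0-9])(?=[A-Z])", " ", uml_name.replace("_", " "))
          return [token.lower() for token in spaced.split()]

      def map_to_terminology(uml_name, dictionary=TOY_DICTIONARY):
          """Look the normalized UML class or attribute name up in the dictionary."""
          key = " ".join(normalize_and_tokenize(uml_name))
          return dictionary.get(key)

      print(map_to_terminology("PatientIdentifier"))   # -> 'DEMO:0001'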

  11. Biomedical Terminology Mapper for UML projects

    PubMed Central

    Thibault, Julien C.; Frey, Lewis

    As the biomedical community collects and generates more and more data, the need to describe these datasets for exchange and interoperability becomes crucial. This paper presents a mapping algorithm that can help developers expose local implementations described with UML through standard terminologies. The input UML class or attribute name is first normalized and tokenized, then lookups in a UMLS-based dictionary are performed. For the evaluation of the algorithm 142 UML projects were extracted from caGrid and automatically mapped to National Cancer Institute (NCI) terminology concepts. Resulting mappings at the UML class and attribute levels were compared to the manually curated annotations provided in caGrid. Results are promising and show that this type of algorithm could speed up the tedious process of mapping local implementations to standard biomedical terminologies. PMID:24303278

  12. Harnessing Biomedical Natural Language Processing Tools to Identify Medicinal Plant Knowledge from Historical Texts.

    PubMed

    Sharma, Vivekanand; Law, Wayne; Balick, Michael J; Sarkar, Indra Neil

    2017-01-01

    The growing amount of data describing historical medicinal uses of plants from digitization efforts provides the opportunity to develop systematic approaches for identifying potential plant-based therapies. However, cataloguing plant use information from natural language text is a challenging task for ethnobotanists. To date, there has been only limited adoption of informatics approaches for supporting the identification of ethnobotanical information associated with medicinal uses. This study explored the feasibility of using biomedical terminologies and natural language processing approaches for extracting relevant plant-associated therapeutic use information from the historical biodiversity literature collection available from the Biodiversity Heritage Library. The results from this preliminary study suggest that there is potential utility in informatics methods to identify medicinal plant knowledge from digitized resources, and they also highlight opportunities for improvement.

  13. Harnessing Biomedical Natural Language Processing Tools to Identify Medicinal Plant Knowledge from Historical Texts

    PubMed Central

    Sharma, Vivekanand; Law, Wayne; Balick, Michael J.; Sarkar, Indra Neil

    2017-01-01

    The growing amount of data describing historical medicinal uses of plants from digitization efforts provides the opportunity to develop systematic approaches for identifying potential plant-based therapies. However, cataloguing plant use information from natural language text is a challenging task for ethnobotanists. To date, there has been only limited adoption of informatics approaches for supporting the identification of ethnobotanical information associated with medicinal uses. This study explored the feasibility of using biomedical terminologies and natural language processing approaches for extracting relevant plant-associated therapeutic use information from the historical biodiversity literature collection available from the Biodiversity Heritage Library. The results from this preliminary study suggest that there is potential utility in informatics methods to identify medicinal plant knowledge from digitized resources, and they also highlight opportunities for improvement. PMID:29854223

  14. Modern Techniques in Acoustical Signal and Image Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Candy, J V

    2002-04-04

    Acoustical signal processing problems can lead to some complex and intricate techniques to extract the desired information from noisy, sometimes inadequate, measurements. The challenge is to formulate a meaningful strategy that is aimed at performing the required processing even in the face of uncertainties. This strategy can be as simple as a transformation of the measured data to another domain for analysis or as complex as embedding a full-scale propagation model into the processor. The aims of both approaches are the same: to extract the desired information and reject the extraneous, that is, to develop a signal processing scheme that achieves this goal. In this paper, we briefly discuss this underlying philosophy from a ''bottom-up'' approach, enabling the problem to dictate the solution rather than vice versa.

  15. Compound image segmentation of published biomedical figures.

    PubMed

    Li, Pengyuan; Jiang, Xiangying; Kambhamettu, Chandra; Shatkay, Hagit

    2018-04-01

    Images convey essential information in biomedical publications. As such, there is a growing interest within the bio-curation and bio-database communities to store images from publications as evidence for biomedical processes and for experimental results. However, many of the images in biomedical publications are compound images consisting of multiple panels, where each individual panel potentially conveys a different type of information. Segmenting such images into constituent panels is an essential first step toward utilizing them. In this article, we develop a new compound image segmentation system, FigSplit, which is based on Connected Component Analysis. To overcome shortcomings typically manifested by existing methods, we develop a quality assessment step for evaluating and modifying segmentations. Two methods are proposed to re-segment the images if the initial segmentation is inaccurate. Experimental results show the effectiveness of our method compared with other methods. The system is publicly available for use at https://www.eecis.udel.edu/~compbio/FigSplit; the code is available upon request (shatkay@udel.edu). Supplementary data are available at Bioinformatics online.
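
    The abstract names Connected Component Analysis as the core of the panel-splitting step. As a rough illustration only (not the FigSplit implementation), the sketch below labels non-background regions of a grayscale page with scipy.ndimage and returns their bounding boxes; the whiteness threshold and minimum panel area are assumed parameters.

      # Minimal sketch of connected-component panel detection for a compound figure.
      # Not the FigSplit code; the threshold and minimum area are illustrative assumptions.
      import numpy as np
      from scipy import ndimage

      def split_panels(gray, white_level=0.9, min_area=2500):
          """Return bounding boxes (row0, row1, col0, col1) of candidate panels.

          gray: 2-D array scaled to [0, 1]; near-white pixels count as background.
          """
          foreground = gray < white_level            # panels are darker than the page
          labels, _ = ndimage.label(foreground)      # default 4-connectivity
          boxes = []
          for sl in ndimage.find_objects(labels):
              if sl is None:
                  continue
              rows, cols = sl
              area = (rows.stop - rows.start) * (cols.stop - cols.start)
              if area >= min_area:                   # drop specks and caption fragments
                  boxes.append((rows.start, rows.stop, cols.start, cols.stop))
          return boxes

      # Example: four synthetic panels on a white page give four bounding boxes.
      page = np.ones((200, 200))
      page[10:90, 10:90] = 0.2
      page[10:90, 110:190] = 0.3
      page[110:190, 10:90] = 0.4
      page[110:190, 110:190] = 0.5
      print(split_panels(page))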

  16. Branding the bio/biomedical engineering degree.

    PubMed

    Voigt, Herbert F

    2011-01-01

    The future challenges to medical and biological engineering, sometimes referred to as biomedical engineering or simply bioengineering, are many. Some of these are identifiable now and others will emerge from time to time as new technologies are introduced and harnessed. There is a fundamental issue regarding "Branding the bio/biomedical engineering degree" that requires a common understanding of what is meant by a B.S. degree in Biomedical Engineering, Bioengineering, or Biological Engineering. In this paper we address some of the issues involved in branding the Bio/Biomedical Engineering degree, with the aim of clarifying the Bio/Biomedical Engineering brand.

  17. Radar signal pre-processing to suppress surface bounce and multipath

    DOEpatents

    Paglieroni, David W; Mast, Jeffrey E; Beer, N. Reginald

    2013-12-31

    A method and system for detecting the presence of subsurface objects within a medium is provided. In some embodiments, the imaging and detection system operates in a multistatic mode to collect radar return signals generated by an array of transceiver antenna pairs that is positioned across the surface and that travels down the surface. The imaging and detection system pre-processes the return signal to suppress certain undesirable effects. The imaging and detection system then generates synthetic aperture radar images from real aperture radar images generated from the pre-processed return signal. The imaging and detection system then post-processes the synthetic aperture radar images to improve detection of subsurface objects. The imaging and detection system identifies peaks in the energy levels of the post-processed image frame, which indicate the presence of a subsurface object.

  18. Biomedical research

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Biomedical problems encountered by man in space which have been identified as a result of previous experience in simulated or actual spaceflight include cardiovascular deconditioning, motion sickness, bone loss, muscle atrophy, red cell alterations, fluid and electrolyte loss, radiation effects, radiation protection, behavior, and performance. The investigations and the findings in each of these areas were reviewed. A description of how biomedical research is organized within NASA, how it is funded, and how it is being reoriented to meet the needs of future manned space missions is also provided.

  19. A new algorithm for epilepsy seizure onset detection and spread estimation from EEG signals

    NASA Astrophysics Data System (ADS)

    Quintero-Rincón, Antonio; Pereyra, Marcelo; D'Giano, Carlos; Batatia, Hadj; Risk, Marcelo

    2016-04-01

    Appropriate diagnosis and treatment of epilepsy is a major public health issue. Patients suffering from this disease often exhibit different physical manifestations, which result from the synchronous and excessive discharge of a group of neurons in the cerebral cortex. Extracting this information from EEG signals is an important problem in biomedical signal processing. In this work we propose a new algorithm for seizure onset detection and spread estimation in epilepsy patients. The algorithm is based on a multilevel 1-D wavelet decomposition that captures the physiological brain frequency bands, coupled with a generalized Gaussian model. Preliminary experiments with signals from 30 epileptic seizures in 11 subjects suggest that the proposed methodology is a powerful tool for detecting the onset of epileptic seizures and estimating their spread across the brain.
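
    The two ingredients named in the abstract, a multilevel 1-D wavelet decomposition and a generalized Gaussian model of the sub-band coefficients, can be sketched as follows. This is only a sketch under assumed settings (db4 wavelet, five levels, an arbitrary scale-jump rule); it is not the authors' detector.

      # Sketch only: multilevel wavelet decomposition of one EEG channel followed by a
      # generalized Gaussian fit per detail sub-band. Wavelet, level and threshold are
      # illustrative assumptions, not the published algorithm.
      import numpy as np
      import pywt
      from scipy.stats import gennorm

      def subband_features(eeg, wavelet="db4", level=5):
          coeffs = pywt.wavedec(eeg, wavelet, level=level)   # [cA5, cD5, ..., cD1]
          feats = []
          for c in coeffs[1:]:                               # detail sub-bands, coarse to fine
              beta, loc, scale = gennorm.fit(c)              # shape, location, scale
              feats.append({"beta": beta, "scale": scale})
          return feats

      # Toy usage: flag a window whose sub-band scale jumps well above a baseline window.
      fs = 256
      t = np.arange(0, 4, 1 / fs)
      baseline = np.random.randn(t.size)
      window = baseline + 3 * np.sin(2 * np.pi * 7 * t)      # crude stand-in for a discharge
      ref = subband_features(baseline)
      cur = subband_features(window)
      print([c["scale"] > 2 * r["scale"] for c, r in zip(cur, ref)])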

  20. Calcium Signals: The Lead Currency of Plant Information Processing

    PubMed Central

    Kudla, Jörg; Batistič, Oliver; Hashimoto, Kenji

    2010-01-01

    Ca2+ signals are core transducers and regulators in many adaptation and developmental processes of plants. Ca2+ signals are represented by stimulus-specific signatures that result from the concerted action of channels, pumps, and carriers that shape temporally and spatially defined Ca2+ elevations. Cellular Ca2+ signals are decoded and transmitted by a toolkit of Ca2+ binding proteins that relay this information into downstream responses. Major transduction routes of Ca2+ signaling involve Ca2+-regulated kinases mediating phosphorylation events that orchestrate downstream responses or comprise regulation of gene expression via Ca2+-regulated transcription factors and Ca2+-responsive promoter elements. Here, we review some of the remarkable progress that has been made in recent years, especially in identifying critical components functioning in Ca2+ signal transduction, both at the single-cell and multicellular level. Despite impressive progress in our understanding of the processing of Ca2+ signals during the past years, the exact mechanistic principles that underlie the specific recognition and conversion of the cellular Ca2+ currency into defined changes in protein–protein interaction, protein phosphorylation, and gene expression, and that thereby establish the specificity in stimulus-response coupling, remain to be elucidated. PMID:20354197

  1. Unsupervised discovery of information structure in biomedical documents.

    PubMed

    Kiela, Douwe; Guo, Yufan; Stenius, Ulla; Korhonen, Anna

    2015-04-01

    Information structure (IS) analysis is a text mining technique, which classifies text in biomedical articles into categories that capture different types of information, such as objectives, methods, results and conclusions of research. It is a highly useful technique that can support a range of Biomedical Text Mining tasks and can help readers of biomedical literature find information of interest faster, accelerating the highly time-consuming process of literature review. Several approaches to IS analysis have been presented in the past, with promising results in real-world biomedical tasks. However, all existing approaches, even weakly supervised ones, require several hundred hand-annotated training sentences specific to the domain in question. Because biomedicine is subject to considerable domain variation, such annotations are expensive to obtain. This makes the application of IS analysis across biomedical domains difficult. In this article, we investigate an unsupervised approach to IS analysis and evaluate the performance of several unsupervised methods on a large corpus of biomedical abstracts collected from PubMed. Our best unsupervised algorithm (a multilevel weighted graph clustering algorithm) performs very well on the task, obtaining F scores of over 0.70 for most IS categories when applied to well-known IS schemes. This level of performance is close to that of lightly supervised IS methods and has proven sufficient to aid a range of practical tasks. Thus, using an unsupervised approach, IS could be applied to support a wide range of tasks across sub-domains of biomedicine. We also demonstrate that unsupervised learning brings novel insights into IS of biomedical literature and discovers information categories that are not present in any of the existing IS schemes. The annotated corpus and software are available at http://www.cl.cam.ac.uk/∼dk427/bio14info.html. © The Author 2014. Published by Oxford University Press. All rights reserved.

  2. A self-regulating biomolecular comparator for processing oscillatory signals

    PubMed Central

    Agrawal, Deepak K.; Franco, Elisa; Schulman, Rebecca

    2015-01-01

    While many cellular processes are driven by biomolecular oscillators, precise control of a downstream on/off process by a biochemical oscillator signal can be difficult: over an oscillator's period, its output signal varies continuously between its amplitude limits and spends a significant fraction of the time at intermediate values between these limits. Further, the oscillator's output is often noisy, with particularly large variations in the amplitude. In electronic systems, an oscillating signal is generally processed by a downstream device such as a comparator that converts a potentially noisy oscillatory input into a square wave output that is predominantly in one of two well-defined on and off states. The comparator's output then controls downstream processes. We describe a method for constructing a synthetic biochemical device that likewise produces a square-wave-type biomolecular output for a variety of oscillatory inputs. The method relies on a separation of time scales between the slow rate of production of an oscillatory signal molecule and the fast rates of intermolecular binding and conformational changes. We show how to control the characteristics of the output by varying the concentrations of the species and the reaction rates. We then use this control to show how our approach could be applied to process different in vitro and in vivo biomolecular oscillators, including the p53-Mdm2 transcriptional oscillator and two types of in vitro transcriptional oscillators. These results demonstrate how modular biomolecular circuits could, in principle, be combined to build complex dynamical systems. The simplicity of our approach also suggests that natural molecular circuits may process some biomolecular oscillator outputs before they are applied downstream. PMID:26378119
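
    The comparator analogy invites a small numerical illustration. The sketch below is not the authors' biochemical network; it is a generic hysteresis comparator with assumed thresholds, applied to a noisy oscillation, showing the kind of two-state square-wave output the molecular device is designed to produce.

      # Toy hysteresis comparator: converts a noisy oscillatory input into a two-state
      # square-wave output. Thresholds are arbitrary; this is an electronics analogy,
      # not the biomolecular reaction network described in the paper.
      import numpy as np

      def comparator(signal, on_threshold=0.6, off_threshold=0.4):
          out = np.zeros_like(signal)
          state = 0.0
          for i, x in enumerate(signal):
              if state == 0.0 and x > on_threshold:
                  state = 1.0
              elif state == 1.0 and x < off_threshold:
                  state = 0.0
              out[i] = state
          return out

      t = np.linspace(0, 6 * np.pi, 2000)
      noisy_oscillation = 0.5 + 0.5 * np.sin(t) + 0.05 * np.random.randn(t.size)
      square = comparator(noisy_oscillation)
      print(square.min(), square.max())   # output stays pinned at 0 or 1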

  3. Data processing method for a weak, moving telemetry signal

    NASA Technical Reports Server (NTRS)

    Kendall, W. B.; Levy, G. S.; Nixon, D. L.; Panson, P. L.

    1969-01-01

    Method of processing data from a spacecraft, where the carrier has a low signal-to-noise ratio and wide unpredictable frequency shifts, consists of analogue recording of the noisy signal along with a high-frequency tone that is used as a clock to trigger a digitizer.

  4. Liquid argon TPC signal formation, signal processing and reconstruction techniques

    NASA Astrophysics Data System (ADS)

    Baller, B.

    2017-07-01

    This document describes a reconstruction chain that was developed for the ArgoNeuT and MicroBooNE experiments at Fermilab. These experiments study accelerator neutrino interactions that occur in a Liquid Argon Time Projection Chamber. Reconstructing the properties of particles produced in these interactions benefits from the knowledge of the micro-physics processes that affect the creation and transport of ionization electrons to the readout system. A wire signal deconvolution technique was developed to convert wire signals to a standard form for hit reconstruction, to remove artifacts in the electronics chain and to remove coherent noise. A unique clustering algorithm reconstructs line-like trajectories and vertices in two dimensions which are then matched to create 3D objects. These techniques and algorithms are available to all experiments that use the LArSoft suite of software.
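
    The wire deconvolution step can be illustrated generically. The sketch below is not the LArSoft implementation; it is a frequency-domain deconvolution with an assumed Gaussian electronics response and a simple noise regularization standing in for the experiments' filter.

      # Generic frequency-domain deconvolution sketch (not the LArSoft code): recover a
      # standard-shaped pulse from a wire signal given an assumed electronics response,
      # with a small regularization term standing in for the experiment's filter.
      import numpy as np

      def deconvolve(wire_signal, response, reg=1e-2):
          n = len(wire_signal)
          S = np.fft.rfft(wire_signal, n)
          R = np.fft.rfft(response, n)
          # Wiener-like filter: divide by the response where it is strong, damp elsewhere.
          H = np.conj(R) / (np.abs(R) ** 2 + reg)
          return np.fft.irfft(S * H, n)

      # Toy example: a delta-like ionization deposit smeared by a Gaussian response.
      n = 512
      true_hit = np.zeros(n); true_hit[200] = 1.0
      k = np.arange(-25, 26)
      response = np.exp(-0.5 * (k / 5.0) ** 2)
      measured = np.convolve(true_hit, response, mode="same") + 0.01 * np.random.randn(n)
      kernel = np.roll(np.pad(response, (0, n - response.size)), -25)   # response centered at 0
      recovered = deconvolve(measured, kernel)
      print(int(np.argmax(recovered)))   # peak should sit near sample 200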

  5. Applicability Analysis of Validation Evidence for Biomedical Computational Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.

    Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model, and its validation evidence to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.

  6. Applicability Analysis of Validation Evidence for Biomedical Computational Models

    DOE PAGES

    Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.; ...

    2017-09-07

    Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model, and its validation evidence to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.

  7. Timeseries Signal Processing for Enhancing Mobile Surveys: Learning from Field Studies

    NASA Astrophysics Data System (ADS)

    Risk, D. A.; Lavoie, M.; Marshall, A. D.; Baillie, J.; Atherton, E. E.; Laybolt, W. D.

    2015-12-01

    Vehicle-based surveys using laser and other analyzers are now commonplace in research and industry. In many cases when these studies target biologically-relevant gases like methane and carbon dioxide, the minimum detection limits are often coarse (ppm) relative to the analyzer's capabilities (ppb), because of the inherent variability in the ambient background concentrations across the landscape that creates noise and uncertainty. This variation arises from localized biological sinks and sources, but also atmospheric turbulence, air pooling, and other factors. Computational processing routines are widely used in many fields to increase resolution of a target signal in temporally dense data, and offer promise for enhancing mobile surveying techniques. Signal processing routines can help identify anomalies at very low levels, or can be used inversely to remove localized industrially-emitted anomalies from ecological data. This presentation integrates learnings from various studies in which simple signal processing routines were used successfully to isolate different temporally-varying components of 1 Hz timeseries measured with laser- and UV fluorescence-based analyzers. As illustrative datasets, we present results from industrial fugitive emission studies from across Canada's western provinces and other locations, and also an ecological study that aimed to model near-surface concentration variability across different biomes within eastern Canada. In these cases, signal processing algorithms contributed significantly to the clarity of both industrial and ecological processes. In some instances, signal processing was too computationally intensive for real-time in-vehicle processing, but we identified workarounds for analyzer-embedded software that contributed to an improvement in real-time resolution of small anomalies. Signal processing is a natural accompaniment to these datasets, and many avenues are open to researchers who wish to enhance existing, and future
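
    A minimal example of the kind of routine described here, assuming a 1 Hz concentration trace and a rolling-median baseline (window length, background shape and anomaly shape are invented for illustration), separates a short-lived anomaly from the slowly varying ambient background:

      # Simple background-removal sketch for a 1 Hz mobile-survey trace: a rolling
      # median tracks the slowly varying ambient background so that short-lived
      # anomalies stand out. Window length and synthetic data are assumptions.
      import numpy as np
      from scipy.ndimage import median_filter

      rng = np.random.default_rng(3)
      n = 3600                                                 # one hour at 1 Hz
      background = (1.9 + 0.05 * np.sin(2 * np.pi * np.arange(n) / 1800)
                    + 0.01 * rng.standard_normal(n))
      trace = background.copy()
      trace[1200:1215] += 0.4 * np.exp(-0.3 * np.arange(15))   # brief methane-like anomaly (ppm)
      baseline = median_filter(trace, size=301)                # ~5 min rolling median
      residual = trace - baseline
      print(int(np.argmax(residual)))                          # near sample 1200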

  8. Efficient audio signal processing for embedded systems

    NASA Astrophysics Data System (ADS)

    Chiu, Leung Kin

    As mobile platforms continue to pack on more computational power, electronics manufacturers start to differentiate their products by enhancing the audio features. However, consumers also demand smaller devices that can operate for a longer time, hence imposing design constraints. In this research, we investigate two design strategies that would allow us to efficiently process audio signals on embedded systems such as mobile phones and portable electronics. In the first strategy, we exploit properties of the human auditory system to process audio signals. We designed a sound enhancement algorithm to make piezoelectric loudspeakers sound "richer" and "fuller." Piezoelectric speakers have a small form factor but exhibit poor response in the low-frequency region. In the algorithm, we combine psychoacoustic bass extension and dynamic range compression to improve the perceived bass coming out from the tiny speakers. We also developed an audio energy reduction algorithm for loudspeaker power management. The perceptually transparent algorithm extends the battery life of mobile devices and prevents thermal damage in speakers. This method is similar to audio compression algorithms, which encode audio signals in such a way that the compression artifacts are not easily perceivable. Instead of reducing the storage space, however, we suppress the audio contents that are below the hearing threshold, therefore reducing the signal energy. In the second strategy, we use low-power analog circuits to process the signal before digitizing it. We designed an analog front-end for sound detection and implemented it on a field programmable analog array (FPAA). The system is an example of an analog-to-information converter. The sound classifier front-end can be used in a wide range of applications because programmable floating-gate transistors are employed to store classifier weights. Moreover, we incorporated a feature selection algorithm to simplify the analog front-end. A machine
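
    Dynamic range compression, one of the building blocks mentioned above, can be sketched as a crude feed-forward compressor. The threshold, ratio and time constants below are assumed values, and this is a generic textbook-style compressor rather than the algorithm developed in the thesis.

      # Minimal feed-forward dynamic range compressor (assumed parameters); a crude
      # stand-in for the kind of perceptual processing discussed above.
      import numpy as np

      def compress(x, fs, threshold_db=-20.0, ratio=4.0, attack_ms=5.0, release_ms=50.0):
          eps = 1e-12
          level_db = 20 * np.log10(np.abs(x) + eps)
          # Static gain curve: above threshold, reduce the level by the compression ratio.
          over = np.maximum(level_db - threshold_db, 0.0)
          gain_db = -over * (1.0 - 1.0 / ratio)
          # Smooth the gain with simple attack/release one-pole filters.
          a_att = np.exp(-1.0 / (fs * attack_ms / 1000.0))
          a_rel = np.exp(-1.0 / (fs * release_ms / 1000.0))
          smoothed = np.empty_like(gain_db)
          g = 0.0
          for i, target in enumerate(gain_db):
              a = a_att if target < g else a_rel   # fall fast (attack), recover slowly (release)
              g = a * g + (1.0 - a) * target
              smoothed[i] = g
          return x * 10 ** (smoothed / 20.0)

      fs = 16000
      t = np.arange(0, 0.5, 1 / fs)
      tone = np.concatenate([0.05 * np.sin(2 * np.pi * 440 * t),
                             0.8 * np.sin(2 * np.pi * 440 * t)])
      out = compress(tone, fs)
      # Quiet half is roughly unchanged, loud half is attenuated.
      print(round(float(np.max(np.abs(out[:8000]))), 3),
            round(float(np.max(np.abs(out[8000:]))), 3))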

  9. Carbon nanotubes: engineering biomedical applications.

    PubMed

    Gomez-Gualdrón, Diego A; Burgos, Juan C; Yu, Jiamei; Balbuena, Perla B

    2011-01-01

    Carbon nanotubes (CNTs) are cylinder-shaped allotropic forms of carbon, most widely produced under chemical vapor deposition. They possess astounding chemical, electronic, mechanical, and optical properties. Being among the most promising materials in nanotechnology, they are also likely to revolutionize medicine. Among other biomedical applications, after proper functionalization carbon nanotubes can be transformed into sophisticated biosensing and biocompatible drug-delivery systems, for specific targeting and elimination of tumor cells. This chapter provides an introduction to the chemical and electronic structure and properties of single-walled carbon nanotubes, followed by a description of the main synthesis and post-synthesis methods. These sections allow the reader to become familiar with the specific characteristics of these materials and the manner in which these properties may be dependent on the specific synthesis and post-synthesis processes. The chapter ends with a review of the current biomedical applications of carbon nanotubes, highlighting successes and challenges. Copyright © 2011 Elsevier Inc. All rights reserved.

  10. Biomedical information retrieval across languages.

    PubMed

    Daumke, Philipp; Markó, Kornél; Poprat, Michael; Schulz, Stefan; Klar, Rüdiger

    2007-06-01

    This work presents a new dictionary-based approach to biomedical cross-language information retrieval (CLIR) that addresses many of the general and domain-specific challenges in current CLIR research. Our method is based on a multilingual lexicon that was generated partly manually and partly automatically, and currently covers six European languages. It contains morphologically meaningful word fragments, termed subwords. Using subwords instead of entire words significantly reduces the number of lexical entries necessary to sufficiently cover a specific language and domain. Mediation between queries and documents is based on these subwords as well as on lists of word-n-grams that are generated from large monolingual corpora and constitute possible translation units. The translations are then sent to a standard Internet search engine. This process makes our approach an effective tool for searching the biomedical content of the World Wide Web in different languages. We evaluate this approach using the OHSUMED corpus, a large medical document collection, within a cross-language retrieval setting.

  11. A neural joint model for entity and relation extraction from biomedical text.

    PubMed

    Li, Fei; Zhang, Meishan; Fu, Guohong; Ji, Donghong

    2017-03-31

    Extracting biomedical entities and their relations from text has important applications in biomedical research. Previous work primarily utilized feature-based pipeline models for this task. Considerable effort must be devoted to feature engineering when feature-based models are employed. Moreover, pipeline models may suffer from error propagation and are not able to utilize the interactions between subtasks. Therefore, we propose a neural joint model to extract biomedical entities as well as their relations simultaneously, which can alleviate the problems above. Our model was evaluated on two tasks, i.e., the task of extracting adverse drug events between drug and disease entities, and the task of extracting resident relations between bacteria and location entities. Compared with the state-of-the-art systems in these tasks, our model improved the F1 scores of the first task by 5.1% in entity recognition and 8.0% in relation extraction, and that of the second task by 9.2% in relation extraction. The proposed model achieves competitive performance with less work on feature engineering. We demonstrate that the model based on neural networks is effective for biomedical entity and relation extraction. In addition, parameter sharing is an alternative method for neural models to jointly process this task. Our work can facilitate research on biomedical text mining.

  12. Application of homomorphic signal processing to stress wave factor analysis

    NASA Technical Reports Server (NTRS)

    Karagulle, H.; Williams, J. H., Jr.; Lee, S. S.

    1985-01-01

    The stress wave factor (SWF) signal, which is the output of an ultrasonic testing system where the transmitting and receiving transducers are coupled to the same face of the test structure, is analyzed in the frequency domain. The SWF signal generated in an isotropic elastic plate is modelled as the superposition of successive reflections. The reflection which is generated by the stress waves which travel p times as a longitudinal (P) wave and s times as a shear (S) wave through the plate while reflecting back and forth between the bottom and top faces of the plate is designated as the reflection with p, s. Short-time portions of the SWF signal are considered for obtaining spectral information on individual reflections. If the significant reflections are not overlapped, the short-time Fourier analysis is used. A summary of the relevant points of homomorphic signal processing, which is also called cepstrum analysis, is given. Homomorphic signal processing is applied to short-time SWF signals to obtain estimates of the log spectra of individual reflections for cases in which the reflections are overlapped. Two typical SWF signals generated in aluminum plates (overlapping and non-overlapping reflections) are analyzed.
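
    The homomorphic (cepstrum) idea can be illustrated with a toy signal: an overlapped reflection appears as a peak at its delay in the real cepstrum. The broadband stand-in signal, echo delay and echo weight below are assumptions, not measured SWF data or the paper's processing chain.

      # Real-cepstrum sketch: an overlapping reflection shows up as a peak at its delay
      # (quefrency) in the cepstrum, which is the basic idea behind homomorphic
      # processing of SWF signals. The toy signal and echo parameters are assumptions.
      import numpy as np

      def real_cepstrum(x):
          log_mag = np.log(np.abs(np.fft.fft(x)) + 1e-12)
          return np.real(np.fft.ifft(log_mag))

      rng = np.random.default_rng(0)
      n, delay, weight = 8192, 40, 0.6
      incident = rng.standard_normal(n)              # broadband stand-in for the incident wave
      signal = incident.copy()
      signal[delay:] += weight * incident[:-delay]   # overlapped first reflection

      ceps = real_cepstrum(signal)
      q = np.argmax(ceps[10:200]) + 10               # skip the low-quefrency region
      print(q)                                       # expected: the echo delay, 40 samples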

  13. Implantable Biomedical Microsystems: A New Graduate Course in Biomedical Circuits and Systems

    ERIC Educational Resources Information Center

    Sodagar, Amir M.

    2014-01-01

    After more than two decades of research on the design and development of implantable biomedical microsystems, it is time now to organize research achievements in this area in a consolidated and pedagogical form. This paper introduces a new graduate course in advanced biomedical circuits and systems. Designed for graduate students with electrical…

  14. Physics-based signal processing algorithms for micromachined cantilever arrays

    DOEpatents

    Candy, James V; Clague, David S; Lee, Christopher L; Rudd, Robert E; Burnham, Alan K; Tringe, Joseph W

    2013-11-19

    A method of using physics-based signal processing algorithms for micromachined cantilever arrays. The methods utilize deflection of a micromachined cantilever that represents the chemical, biological, or physical element being detected. One embodiment of the method comprises the steps of modeling the deflection of the micromachined cantilever producing a deflection model, sensing the deflection of the micromachined cantilever and producing a signal representing the deflection, and comparing the signal representing the deflection with the deflection model.

  15. Electrophysiology for biomedical engineering students: a practical and theoretical course in animal electrocorticography.

    PubMed

    Albarracín, Ana L; Farfán, Fernando D; Coletti, Marcos A; Teruya, Pablo Y; Felice, Carmelo J

    2016-09-01

    The major challenge in laboratory teaching is the application of abstract concepts in simple and direct practical lessons. However, students rarely have the opportunity to participate in a laboratory that combines practical learning with a realistic research experience. In the Biomedical Engineering program, we offer short optional courses to complement students' studies as they begin their graduation projects. The objective of these theoretical and practical courses is to introduce students to the topics of their projects. The present work describes an experience in electrophysiology to teach undergraduate students how to extract cortical information using electrocorticographic techniques. Students actively participate in some parts of the experience and then process and analyze the data obtained with different signal processing tools. In post-laboratory evaluations, students described the course as an exceptional opportunity for those interested in following a postgraduate science program and fully appreciated its contents. Copyright © 2016 The American Physiological Society.

  16. Low-pass parabolic FFT filter for airborne and satellite lidar signal processing.

    PubMed

    Jiao, Zhongke; Liu, Bo; Liu, Enhai; Yue, Yongjian

    2015-10-14

    In order to reduce random errors in lidar signal inversion, a low-pass parabolic fast Fourier transform filter (PFFTF) was introduced for noise elimination. A compact airborne Raman lidar system that applies the PFFTF to process lidar signals was studied. The mathematics and simulations of the PFFTF were compared with those of other low-pass methods, namely the sliding mean filter (SMF), median filter (MF), empirical mode decomposition (EMD) and wavelet transform (WT), and the practical engineering value of the PFFTF for lidar signal processing was verified. The method was also tested on real lidar signals from the Wyoming Cloud Lidar (WCL). Results show that the PFFTF has advantages over the other methods: it preserves the high-frequency signal components well while simultaneously reducing much of the random noise in lidar signal processing.
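
    As a generic illustration of FFT-domain low-pass filtering (not the parabolic filter designed in the paper), the sketch below attenuates frequencies above an assumed cutoff with a raised-cosine taper and applies it to a noisy, idealized lidar-like return:

      # Generic FFT-domain low-pass filtering of a noisy lidar-like return; the cutoff
      # and the raised-cosine taper are assumptions, not the PFFTF design in the paper.
      import numpy as np

      def fft_lowpass(x, fs, cutoff_hz, transition_hz=5.0):
          X = np.fft.rfft(x)
          f = np.fft.rfftfreq(len(x), d=1.0 / fs)
          gain = np.ones_like(f)
          gain[f >= cutoff_hz + transition_hz] = 0.0
          band = (f > cutoff_hz) & (f < cutoff_hz + transition_hz)
          gain[band] = 0.5 * (1 + np.cos(np.pi * (f[band] - cutoff_hz) / transition_hz))
          return np.fft.irfft(X * gain, len(x))

      fs = 1000.0
      t = np.arange(0, 2, 1 / fs)
      clean = np.exp(-t / 0.5)                        # idealized decaying return profile
      noisy = clean + 0.05 * np.random.randn(t.size)
      smoothed = fft_lowpass(noisy, fs, cutoff_hz=20.0)
      # Residual error drops after filtering.
      print(round(float(np.std(smoothed - clean)), 4), round(float(np.std(noisy - clean)), 4))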

  17. Cloud Based Metalearning System for Predictive Modeling of Biomedical Data

    PubMed Central

    Vukićević, Milan

    2014-01-01

    Rapid growth and storage of biomedical data enabled many opportunities for predictive modeling and improvement of healthcare processes. On the other hand, the analysis of such large amounts of data is a difficult and computationally intensive task for most existing data mining algorithms. This problem is addressed by proposing a cloud-based system that integrates a metalearning framework for ranking and selecting the best predictive algorithms for the data at hand with open-source big data technologies for the analysis of biomedical data. PMID:24892101

  18. Signal processing of bedload transport impact amplitudes on accelerometer instrumented plates

    USDA-ARS?s Scientific Manuscript database

    This work was performed to help establish a data processing methodology for relating accelerometer signals caused by impacts of gravel on steel plates to the mass and size of the transported material. Signal processing was performed on impact plate data collected in flume experiments at the Nationa...

  19. [Biomedical waste management in five hospitals in Dakar, Senegal].

    PubMed

    Ndiaye, M; El Metghari, L; Soumah, M M; Sow, M L

    2012-10-01

    Biomedical waste is currently a real health and environmental concern. In this regard, a study was conducted in 5 hospitals in Dakar to review their management of biomedical waste and to formulate recommendations. This is a descriptive cross-sectional study conducted from 1 April to 31 July 2010 in five major hospitals of Dakar. A questionnaire administered to hospital managers, heads of departments, residents and heads of hospital hygiene departments, as well as interviews conducted with healthcare personnel and operators of waste incinerators, made it possible to assess mechanisms and knowledge of biomedical waste management. Content analysis of the interviews, observations and a data sheet allowed the data thus gathered to be processed. Of the 150 questionnaires distributed, 98 responses were obtained, representing a response rate of 65.3%. An interview was conducted with 75 employees directly involved in the management of biomedical waste and observations were made on biomedical waste management in 86 hospital services. Sharps as well as blood and liquid waste were found in all services except in pharmacies, pharmaceutical waste in 66 services, infectious waste in 49 services and anatomical waste in 11 services. Sorting of biomedical waste was ill-adapted in 53.5% (N = 46) of services and the colour-coding system was used effectively in 31.4% (N = 27) of services. Containers for the safe disposal of sharps were available in 82.5% (N = 71) of services and were effectively utilized in 51.1% (N = 44) of these services. In most services, ill-adapted packaging was observed, with plastic bottles and bins used for waste collection and containers overfilled. With the exception of Hôpital Principal, the main storage area was in open air, unsecured, with biomedical waste littered on the floor and often mixed with waste similar to household refuse. The transfer of biomedical waste to the main storage area was done using trolleys or carts in 67.4% (N = 58) of services and

  20. The Ensemble Kalman filter: a signal processing perspective

    NASA Astrophysics Data System (ADS)

    Roth, Michael; Hendeby, Gustaf; Fritsche, Carsten; Gustafsson, Fredrik

    2017-12-01

    The ensemble Kalman filter (EnKF) is a Monte Carlo-based implementation of the Kalman filter (KF) for extremely high-dimensional, possibly nonlinear, and non-Gaussian state estimation problems. Its ability to handle state dimensions in the order of millions has made the EnKF a popular algorithm in different geoscientific disciplines. Despite a similarly vital need for scalable algorithms in signal processing, e.g., to make sense of the ever increasing amount of sensor data, the EnKF is hardly discussed in our field. This self-contained review is aimed at signal processing researchers and provides all the knowledge to get started with the EnKF. The algorithm is derived in a KF framework, without the often encountered geoscientific terminology. Algorithmic challenges and required extensions of the EnKF are provided, as well as relations to sigma point KF and particle filters. The relevant EnKF literature is summarized in an extensive survey and unique simulation examples, including popular benchmark problems, complement the theory with practical insights. The signal processing perspective highlights new directions of research and facilitates the exchange of potentially beneficial ideas, both for the EnKF and high-dimensional nonlinear and non-Gaussian filtering in general.
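
    The measurement-update (analysis) step that the review derives in a KF framework can be written compactly for the stochastic EnKF with perturbed observations. The sketch below uses a linear observation model and invented dimensions, ensemble size and noise levels; it illustrates the textbook update rather than any specific application in the paper.

      # Minimal stochastic-EnKF analysis step for a linear measurement y = H x + noise.
      # Ensemble size, model and noise levels are illustrative assumptions.
      import numpy as np

      def enkf_update(ensemble, y, H, R, rng):
          """ensemble: (n_state, n_members); y: (n_obs,); H: (n_obs, n_state); R: (n_obs, n_obs)."""
          n_state, n_members = ensemble.shape
          x_mean = ensemble.mean(axis=1, keepdims=True)
          A = ensemble - x_mean                                  # state anomalies
          P = A @ A.T / (n_members - 1)                          # sample covariance
          K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)           # Kalman gain
          # Perturbed observations: one noisy copy of y per ensemble member.
          Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_members).T
          return ensemble + K @ (Y - H @ ensemble)

      rng = np.random.default_rng(0)
      truth = np.array([1.0, -2.0])
      H = np.array([[1.0, 0.0]])                                 # observe the first component only
      R = np.array([[0.1]])
      prior = rng.normal(0.0, 1.0, size=(2, 100)) + truth[:, None] + 0.5   # biased prior ensemble
      y = H @ truth + rng.multivariate_normal([0.0], R)
      posterior = enkf_update(prior, y, H, R, rng)
      print(prior.mean(axis=1).round(2), posterior.mean(axis=1).round(2))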

  1. Emerging applications of nanoparticles: Biomedical and environmental

    NASA Astrophysics Data System (ADS)

    Gulati, Shivani; Sachdeva, M.; Bhasin, K. K.

    2018-05-01

    Nanotechnology finds a wide range of applications, from energy production to industrial fabrication processes to biomedicine. Nanoparticles (NPs) can be engineered to possess unique compositions and functionalities that enable novel tools and techniques not previously available in biomedical research. Their unique size- and shape-dependent physicochemical properties, along with their distinctive spectral and optical properties, have prompted the development of a wide variety of potential applications in the field of diagnostics and medicine. Across many scientific and technological fields, environmental safety is also a major concern. For this purpose, nanomaterials have been functionalized to cope with existing pollution, to improve manufacturing methods so that less new pollution is generated, and to provide alternative, more cost-effective energy sources.

  2. Digital signal processing algorithms for automatic voice recognition

    NASA Technical Reports Server (NTRS)

    Botros, Nazeih M.

    1987-01-01

    The current digital signal analysis algorithms that are implemented in automatic voice recognition systems are investigated. Automatic voice recognition means the capability of a computer to recognize and interact with verbal commands. The focus is on the digital signal analysis, rather than the linguistic analysis, of the speech signal. Several digital signal processing algorithms are available for voice recognition. Some of these algorithms are Linear Predictive Coding (LPC), short-time Fourier analysis, and cepstrum analysis. Among these algorithms, LPC is the most widely used. This algorithm has a short execution time and does not require large memory storage. However, it has several limitations due to the assumptions used to develop it. The other two algorithms are frequency-domain algorithms that make fewer assumptions, but they are not widely implemented or investigated. However, with the recent advances in digital technology, namely signal processors, these two frequency-domain algorithms may be investigated in order to implement them in voice recognition. This research is concerned with real-time, microprocessor-based recognition algorithms.
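
    LPC, the algorithm singled out above, can be sketched with the textbook autocorrelation method and Levinson-Durbin recursion. The frame length, model order and test signal are assumptions for illustration; this is the generic technique, not the report's implementation.

      # Textbook autocorrelation-method LPC with Levinson-Durbin recursion; frame length
      # and model order are assumed, purely to illustrate the technique named above.
      import numpy as np

      def lpc(frame, order=10):
          r = np.correlate(frame, frame, mode="full")[len(frame) - 1:][: order + 1]
          a = np.zeros(order + 1); a[0] = 1.0
          err = r[0]
          for i in range(1, order + 1):
              k = -(r[i] + np.dot(a[1:i], r[i - 1:0:-1])) / err   # reflection coefficient
              a[1:i] = a[1:i] + k * a[i - 1:0:-1]
              a[i] = k
              err *= (1.0 - k * k)
          return a, err

      fs = 8000
      t = np.arange(0, 0.03, 1 / fs)                              # one 30 ms analysis frame
      frame = np.sin(2 * np.pi * 300 * t) + 0.4 * np.sin(2 * np.pi * 1200 * t)
      a, err = lpc(frame * np.hamming(frame.size))
      print(a.round(3), round(float(err), 5))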

  3. Tunable signal processing in synthetic MAP kinase cascades.

    PubMed

    O'Shaughnessy, Ellen C; Palani, Santhosh; Collins, James J; Sarkar, Casim A

    2011-01-07

    The flexibility of MAPK cascade responses enables regulation of a vast array of cell fate decisions, but elucidating the mechanisms underlying this plasticity is difficult in endogenous signaling networks. We constructed insulated mammalian MAPK cascades in yeast to explore how intrinsic and extrinsic perturbations affect the flexibility of these synthetic signaling modules. Contrary to biphasic dependence on scaffold concentration, we observe monotonic decreases in signal strength as scaffold concentration increases. We find that augmenting the concentration of sequential kinases can enhance ultrasensitivity and lower the activation threshold. Further, integrating negative regulation and concentration variation can decouple ultrasensitivity and threshold from the strength of the response. Computational analyses show that cascading can generate ultrasensitivity and that natural cascades with different kinase concentrations are innately biased toward their distinct activation profiles. This work demonstrates that tunable signal processing is inherent to minimal MAPK modules and elucidates principles for rational design of synthetic signaling systems. Copyright © 2011 Elsevier Inc. All rights reserved.

  4. Missile signal processing common computer architecture for rapid technology upgrade

    NASA Astrophysics Data System (ADS)

    Rabinkin, Daniel V.; Rutledge, Edward; Monticciolo, Paul

    2004-10-01

    Interceptor missiles process IR images to locate an intended target and guide the interceptor towards it. Signal processing requirements have increased as the sensor bandwidth increases and interceptors operate against more sophisticated targets. A typical interceptor signal processing chain is comprised of two parts. Front-end video processing operates on all pixels of the image and performs such operations as non-uniformity correction (NUC), image stabilization, frame integration and detection. Back-end target processing, which tracks and classifies targets detected in the image, performs such algorithms as Kalman tracking, spectral feature extraction and target discrimination. In the past, video processing was implemented using ASIC components or FPGAs because computation requirements exceeded the throughput of general-purpose processors. Target processing was performed using hybrid architectures that included ASICs, DSPs and general-purpose processors. The resulting systems tended to be function-specific, and required custom software development. They were developed using non-integrated toolsets and test equipment was developed along with the processor platform. The lifespan of a system utilizing the signal processing platform often spans decades, while the specialized nature of processor hardware and software makes it difficult and costly to upgrade. As a result, the signal processing systems often run on outdated technology, algorithms are difficult to update, and system effectiveness is impaired by the inability to rapidly respond to new threats. A new design approach is made possible by three developments: Moore's Law-driven improvement in computational throughput; a newly introduced vector computing capability in general-purpose processors; and a modern set of open interface software standards. Today's multiprocessor commercial-off-the-shelf (COTS) platforms have sufficient throughput to support interceptor signal processing requirements. This application

  5. Evaluation of research in biomedical ontologies

    PubMed Central

    Dumontier, Michel; Gkoutos, Georgios V.

    2013-01-01

    Ontologies are now pervasive in biomedicine, where they serve as a means to standardize terminology, to enable access to domain knowledge, to verify data consistency and to facilitate integrative analyses over heterogeneous biomedical data. For this purpose, research on biomedical ontologies applies theories and methods from diverse disciplines such as information management, knowledge representation, cognitive science, linguistics and philosophy. Depending on the desired applications in which ontologies are being applied, the evaluation of research in biomedical ontologies must follow different strategies. Here, we provide a classification of research problems in which ontologies are being applied, focusing on the use of ontologies in basic and translational research, and we demonstrate how research results in biomedical ontologies can be evaluated. The evaluation strategies depend on the desired application and measure the success of using an ontology for a particular biomedical problem. For many applications, the success can be quantified, thereby facilitating the objective evaluation and comparison of research in biomedical ontology. The objective, quantifiable comparison of research results based on scientific applications opens up the possibility for systematically improving the utility of ontologies in biomedical research. PMID:22962340

  6. National Space Biomedical Research Institute

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The National Space Biomedical Research Institute (NSBRI) sponsors and performs fundamental and applied space biomedical research with the mission of leading a world-class, national effort in integrated, critical path space biomedical research that supports NASA's Human Exploration and Development of Space (HEDS) Strategic Plan. It focuses on the enabling of long-term human presence in, development of, and exploration of space. This will be accomplished by: designing, implementing, and validating effective countermeasures to address the biological and environmental impediments to long-term human space flight; defining the molecular, cellular, organ-level, integrated responses and mechanistic relationships that ultimately determine these impediments, where such activity fosters the development of novel countermeasures; establishing biomedical support technologies to maximize human performance in space, reduce biomedical hazards to an acceptable level, and deliver quality medical care; transferring and disseminating the biomedical advances in knowledge and technology acquired through living and working in space to the benefit of mankind in space and on Earth, including the treatment of patients suffering from gravity- and radiation-related conditions on Earth; and ensuring open involvement of the scientific community, industry, and the public at large in the Institute's activities and fostering a robust collaboration with NASA, particularly through Johnson Space Center.

  7. BIOMedical Search Engine Framework: Lightweight and customized implementation of domain-specific biomedical search engines.

    PubMed

    Jácome, Alberto G; Fdez-Riverola, Florentino; Lourenço, Anália

    2016-07-01

    Text mining and semantic analysis approaches can be applied to the construction of biomedical domain-specific search engines and provide an attractive alternative to create personalized and enhanced search experiences. Therefore, this work introduces the new open-source BIOMedical Search Engine Framework for the fast and lightweight development of domain-specific search engines. The rationale behind this framework is to incorporate core features typically available in search engine frameworks with flexible and extensible technologies to retrieve biomedical documents, annotate meaningful domain concepts, and develop highly customized Web search interfaces. The BIOMedical Search Engine Framework integrates taggers for major biomedical concepts, such as diseases, drugs, genes, proteins, compounds and organisms, and enables the use of domain-specific controlled vocabulary. Technologies from the Typesafe Reactive Platform, the AngularJS JavaScript framework and the Bootstrap HTML/CSS framework support the customization of the domain-oriented search application. Moreover, the RESTful API of the BIOMedical Search Engine Framework allows the integration of the search engine into existing systems or a complete web interface personalization. The construction of the Smart Drug Search is described as a proof-of-concept of the BIOMedical Search Engine Framework. This public search engine catalogs scientific literature about antimicrobial resistance, microbial virulence and related topics. The keyword-based queries of the users are transformed into concepts and search results are presented and ranked accordingly. The semantic graph view portrays all the concepts found in the results, and the researcher may look into the relevance of different concepts, the strength of direct relations, and non-trivial, indirect relations. The number of occurrences of the concept shows its importance to the query, and the frequency of concept co-occurrence is indicative of biological relations

  8. Transient high frequency signal estimation: A model-based processing approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, F.L.

    1985-03-22

    By utilizing the superposition property of linear systems, a method of estimating the incident signal from reflective nondispersive data is developed. One of the basic merits of this approach is that the reflections were removed by direct application of a Wiener-type estimation algorithm after the appropriate input was synthesized. The structure of the nondispersive signal model is well documented, and thus its credence is established. The model is stated and more effort is devoted to practical methods of estimating the model parameters. Though a general approach was developed for obtaining the reflection weights, a simpler approach was employed here, since a fairly good reflection model is available. The technique essentially consists of calculating ratios of the autocorrelation function at lag zero and at the lag where the incident and first reflection coincide. We initially performed our processing procedure on a measurement of a single signal. Multiple application of the processing procedure was required when we applied the reflection removal technique to a measurement containing information from the interaction of two physical phenomena. All processing was performed using SIG, an interactive signal processing package. One of the many consequences of using SIG was that repetitive operations were, for the most part, automated. A custom menu was designed to perform the deconvolution process.
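
    The autocorrelation-ratio idea can be illustrated for the simplest case of a single reflection riding on a roughly white incident signal. Under that assumption (an assumed reading of the description above, not the report's actual processing chain), R_x(D)/R_x(0) = a/(1 + a^2) for an echo of weight a at lag D, so the weight can be recovered from two autocorrelation values:

      # Illustration under an assumed model: for x(t) = s(t) + a*s(t-D) with s roughly
      # white, R_x(D)/R_x(0) = a/(1 + a^2), so the reflection weight a follows from
      # two autocorrelation values.
      import numpy as np

      rng = np.random.default_rng(1)
      n, D, a_true = 20000, 37, 0.5
      s = rng.standard_normal(n)
      x = s.copy()
      x[D:] += a_true * s[:-D]                       # incident signal plus one reflection

      r = np.correlate(x, x, mode="full")[n - 1:]    # one-sided autocorrelation
      rho = r[D] / r[0]                              # = a / (1 + a^2) in the ideal case
      a_est = (1 - np.sqrt(1 - 4 * rho ** 2)) / (2 * rho)
      print(round(float(a_est), 3))                  # close to the true weight 0.5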

  9. Preface to the special issue on "Integrated Microwave Photonic Signal Processing"

    NASA Astrophysics Data System (ADS)

    Azaña, José; Yao, Jianping

    2016-08-01

    As Guest Editors, we are pleased to introduce this special issue on "Integrated Microwave Photonic Signal Processing" published by the Elsevier journal Optics Communications. Microwave photonics is a field of growing importance from both scientific and practical application perspectives. The field of microwave photonics is devoted to the study, development and application of optics-based techniques and technologies aimed at the generation, processing, control, characterization and/or distribution of microwave signals, including signals well into the millimeter-wave frequency range. The use of photonic technologies for these microwave applications translates into a number of key advantages, such as the possibility of dealing with high-frequency, wide-bandwidth signals with minimal losses and reduced electromagnetic interference, and the potential for enhanced reconfigurability. The central purpose of this special issue is to provide an overview of the state of the art of generation, processing and characterization technologies for high-frequency microwave signals. It is now widely accepted that the practical success of microwave photonics at a large scale will essentially depend on the realization of high-performance microwave-photonic signal-processing engines in compact and integrated formats, preferably on a chip. Thus, the focus of the issue is on techniques implemented using integrated photonic technologies, with the goal of providing an update of the most recent advances toward realization of this vision.

  10. Signal processing for the profoundly deaf.

    PubMed

    Boothroyd, A

    1990-01-01

    Profound deafness, defined here as a hearing loss in excess of 90 dB, is characterized by high thresholds, reduced hearing range in the intensity and frequency domains, and poor resolution in the frequency and time domains. The high thresholds call for hearing aids with unusually high gains or remote microphones that can be placed close to the signal source. The former option creates acoustic feedback problems for which digital signal processing may yet offer solutions. The latter option calls for carrier wave technology that is already available. The reduced frequency and intensity ranges would appear to call for frequency and/or amplitude compression. It might also be argued, however, that any attempts to compress the acoustic signal into the limited hearing range of the profoundly deaf will be counterproductive because of poor frequency and time resolution, especially when the signal is present in noise. In experiments with a 2-channel compression system, only 1 of 9 subjects showed an improvement of perception with the introduction of fast-release (20 ms) compression. The other 8 experienced no benefit or a slight deterioration of performance. These results support the concept of providing the profoundly deaf with simpler, rather than more complex, patterns, perhaps through the use of feature extraction hearing aids. Data from users of cochlear implants already employing feature extraction techniques also support this concept.

  11. Optimal and adaptive methods of processing hydroacoustic signals (review)

    NASA Astrophysics Data System (ADS)

    Malyshkin, G. S.; Sidel'nikov, G. B.

    2014-09-01

    Different methods of optimal and adaptive processing of hydroacoustic signals for multipath propagation and scattering are considered. Advantages and drawbacks of the classical adaptive (Capon, MUSIC, and Johnson) algorithms and "fast" projection algorithms are analyzed for the case of multipath propagation and scattering of strong signals. The classical optimal approaches to detecting multipath signals are presented. A mechanism of controlled normalization of strong signals is proposed to automatically detect weak signals. The results of simulating the operation of different detection algorithms for a linear equidistant array under multipath propagation and scattering are presented. An automatic detector based on classical or fast projection algorithms is analyzed, which estimates the background using median filtering or the method of bilateral spatial contrast.
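
    One of the classical adaptive algorithms named above, the Capon (minimum-variance) beamformer, is easy to sketch for a uniform linear array. The array size, source bearings, SNR and diagonal loading below are assumed values chosen only to illustrate the class of methods reviewed here.

      # Standard Capon (MVDR) spatial spectrum for a uniform linear array with
      # half-wavelength spacing. All scenario parameters are illustrative assumptions.
      import numpy as np

      def steering(theta_deg, n_sensors):
          k = np.arange(n_sensors)
          return np.exp(1j * np.pi * k * np.sin(np.radians(theta_deg)))

      def capon_spectrum(snapshots, angles_deg):
          n_sensors, n_snap = snapshots.shape
          R = snapshots @ snapshots.conj().T / n_snap
          R_inv = np.linalg.inv(R + 1e-3 * np.eye(n_sensors))   # light diagonal loading
          power = []
          for th in angles_deg:
              a = steering(th, n_sensors)[:, None]
              power.append(1.0 / np.real(a.conj().T @ R_inv @ a)[0, 0])
          return np.array(power)

      rng = np.random.default_rng(2)
      n_sensors, n_snap = 16, 400
      bearings = [-20.0, 15.0]                                  # true source directions
      X = sum(steering(th, n_sensors)[:, None] *
              (rng.standard_normal((1, n_snap)) + 1j * rng.standard_normal((1, n_snap)))
              for th in bearings)
      X = X + 0.3 * (rng.standard_normal((n_sensors, n_snap))
                     + 1j * rng.standard_normal((n_sensors, n_snap)))
      angles = np.arange(-90.0, 90.5, 1.0)
      spec = capon_spectrum(X, angles)
      peaks = [i for i in range(1, len(angles) - 1)
               if spec[i] >= spec[i - 1] and spec[i] >= spec[i + 1]]
      top2 = sorted(peaks, key=lambda i: spec[i])[-2:]
      print(sorted(angles[i] for i in top2))                    # near -20 and +15 degrees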

  12. Digital Signal Processing and Control for the Study of Gene Networks

    NASA Astrophysics Data System (ADS)

    Shin, Yong-Jun

    2016-04-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  13. Digital Signal Processing and Control for the Study of Gene Networks.

    PubMed

    Shin, Yong-Jun

    2016-04-22

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  14. Computer Aided Teaching of Digital Signal Processing.

    ERIC Educational Resources Information Center

    Castro, Ian P.

    1990-01-01

    Describes a microcomputer-based software package developed at the University of Surrey for teaching digital signal processing to undergraduate science and engineering students. Menu-driven software capabilities are explained, including demonstration of qualitative concepts and experimentation with quantitative data, and examples are given of…

  15. The physics of bat echolocation: Signal processing techniques

    NASA Astrophysics Data System (ADS)

    Denny, Mark

    2004-12-01

    The physical principles and signal processing techniques underlying bat echolocation are investigated. It is shown, by calculation and simulation, how the measured echolocation performance of bats can be achieved.
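
    A standard way to make this concrete is pulse compression: correlating a received echo with the emitted FM sweep (a matched filter) recovers the echo delay, and hence the range, even in noise. The chirp parameters, echo delay and noise level below are assumptions, not the paper's simulation.

      # Pulse-compression sketch: correlate a received echo with the emitted FM chirp
      # (matched filter). Chirp parameters and echo delay are assumed values used only
      # to illustrate the signal processing idea discussed in the paper.
      import numpy as np

      fs = 250_000                                   # 250 kHz sampling rate
      t = np.arange(0, 0.003, 1 / fs)                # 3 ms call
      f0, f1 = 80_000, 40_000                        # downward FM sweep, 80 -> 40 kHz
      phase = 2 * np.pi * (f0 * t + 0.5 * (f1 - f0) / t[-1] * t ** 2)
      call = np.sin(phase)

      delay_s = 0.010                                # echo from a target roughly 1.7 m away
      echo = np.zeros(int(0.02 * fs))
      start = int(delay_s * fs)
      echo[start:start + call.size] = 0.2 * call
      echo += 0.2 * np.random.randn(echo.size)       # receiver noise

      matched = np.correlate(echo, call, mode="valid")
      est_delay = np.argmax(np.abs(matched)) / fs
      print(round(est_delay * 1000, 2), "ms")        # close to the 10 ms round-trip delay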

  16. Interoceptive signals impact visual processing: Cardiac modulation of visual body perception.

    PubMed

    Ronchi, Roberta; Bernasconi, Fosco; Pfeiffer, Christian; Bello-Ruiz, Javier; Kaliuzhna, Mariia; Blanke, Olaf

    2017-09-01

    Multisensory perception research has largely focused on exteroceptive signals, but recent evidence has revealed the integration of interoceptive signals with exteroceptive information. Such research revealed that heartbeat signals affect sensory (e.g., visual) processing; however, it is unknown how they impact the perception of body images. Here we linked our participants' heartbeat to visual stimuli and investigated the spatio-temporal brain dynamics of cardio-visual stimulation on the processing of human body images. We recorded visual evoked potentials with 64-channel electroencephalography while showing a body or a scrambled-body (control) image that appeared either at the frequency of the participants' on-line recorded heartbeat or not (non-synchronous, control). Extending earlier studies, we found a body-independent effect, with cardiac signals enhancing visual processing during two time periods (77-130 ms and 145-246 ms). Within the second (later) time window we detected a second effect characterised by enhanced activity in parietal, temporo-occipital, inferior frontal, and right basal ganglia-insula regions, but only when non-scrambled body images were flashed synchronously with the heartbeat (208-224 ms). In conclusion, our results highlight the role of interoceptive information in the visual processing of human body pictures within a network integrating cardio-visual signals of relevance for perceptual and cognitive aspects of visual body processing. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Processing the Interspecies Quorum-sensing Signal Autoinducer-2 (AI-2)

    PubMed Central

    Marques, João C.; Lamosa, Pedro; Russell, Caitlin; Ventura, Rita; Maycock, Christopher; Semmelhack, Martin F.; Miller, Stephen T.; Xavier, Karina B.

    2011-01-01

    The molecule (S)-4,5-dihydroxy-2,3-pentanedione (DPD) is produced by many different species of bacteria and is the precursor of the signal molecule autoinducer-2 (AI-2). AI-2 mediates interspecies communication and facilitates regulation of bacterial behaviors such as biofilm formation and virulence. A variety of bacterial species have the ability to sequester and process the AI-2 present in their environment, thereby interfering with the cell-cell communication of other bacteria. This process involves the AI-2-regulated lsr operon, comprised of the Lsr transport system that facilitates uptake of the signal, a kinase that phosphorylates the signal to phospho-DPD (P-DPD), and enzymes (like LsrG) that are responsible for processing the phosphorylated signal. Because P-DPD is the intracellular inducer of the lsr operon, enzymes involved in P-DPD processing impact the levels of Lsr expression. Here we show that LsrG catalyzes isomerization of P-DPD into 3,4,4-trihydroxy-2-pentanone-5-phosphate. We present the crystal structure of LsrG, identify potential catalytic residues, and determine which of these residues affects P-DPD processing in vivo and in vitro. We also show that an lsrG deletion mutant accumulates at least 10 times more P-DPD than wild type cells. Consistent with this result, we find that the lsrG mutant has increased expression of the lsr operon and an altered profile of AI-2 accumulation and removal. Understanding of the biochemical mechanisms employed by bacteria to quench signaling of other species can be of great utility in the development of therapies to control bacterial behavior. PMID:21454635

  18. Processing the Interspecies Quorum-sensing Signal Autoinducer-2 (AI-2)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J Marques; P Lamosa; C Russell

    The molecule (S)-4,5-dihydroxy-2,3-pentanedione (DPD) is produced by many different species of bacteria and is the precursor of the signal molecule autoinducer-2 (AI-2). AI-2 mediates interspecies communication and facilitates regulation of bacterial behaviors such as biofilm formation and virulence. A variety of bacterial species have the ability to sequester and process the AI-2 present in their environment, thereby interfering with the cell-cell communication of other bacteria. This process involves the AI-2-regulated lsr operon, comprised of the Lsr transport system that facilitates uptake of the signal, a kinase that phosphorylates the signal to phospho-DPD (P-DPD), and enzymes (like LsrG) that are responsible for processing the phosphorylated signal. Because P-DPD is the intracellular inducer of the lsr operon, enzymes involved in P-DPD processing impact the levels of Lsr expression. Here we show that LsrG catalyzes isomerization of P-DPD into 3,4,4-trihydroxy-2-pentanone-5-phosphate. We present the crystal structure of LsrG, identify potential catalytic residues, and determine which of these residues affects P-DPD processing in vivo and in vitro. We also show that an lsrG deletion mutant accumulates at least 10 times more P-DPD than wild type cells. Consistent with this result, we find that the lsrG mutant has increased expression of the lsr operon and an altered profile of AI-2 accumulation and removal. Understanding of the biochemical mechanisms employed by bacteria to quench signaling of other species can be of great utility in the development of therapies to control bacterial behavior.

  19. Coactivation of response initiation processes with redundant signals.

    PubMed

    Maslovat, Dana; Hajj, Joëlle; Carlsen, Anthony N

    2018-05-14

    During reaction time (RT) tasks, participants respond faster to multiple stimuli from different modalities as compared to a single stimulus, a phenomenon known as the redundant signal effect (RSE). Explanations for this effect typically include coactivation arising from the multiple stimuli, which results in enhanced processing of one or more response production stages. The current study compared empirical RT data with the predictions of a model in which initiation-related activation arising from each stimulus is additive. Participants performed a simple wrist extension RT task following either a visual go-signal, an auditory go-signal, or both stimuli with the auditory stimulus delayed between 0 and 125 ms relative to the visual stimulus. Results showed statistical equivalence between the predictions of an additive initiation model and the observed RT data, providing novel evidence that the RSE can be explained via a coactivation of initiation-related processes. It is speculated that activation summation occurs at the thalamus, leading to the observed facilitation of response initiation. Copyright © 2018 Elsevier B.V. All rights reserved.
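
    The additive-initiation account can be illustrated with a toy simulation (a sketch under assumed parameters, not the authors' model or data): activation ramps triggered by each stimulus are summed, and a response is initiated once the combined activation crosses a fixed threshold, so redundant stimulation reaches threshold earlier on average.

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.arange(0.0, 500.0, 1.0)  # time base in ms

        def ramp(onset_ms, gain):
            """Initiation-related activation rising linearly after stimulus onset."""
            return np.clip((t - onset_ms) * gain, 0.0, None)

        def mean_rt(redundant, soa_ms=0.0, n_trials=5000, threshold=1.0):
            rts = []
            for _ in range(n_trials):
                act = ramp(0.0, rng.normal(0.005, 0.001))               # visual channel
                if redundant:
                    act = act + ramp(soa_ms, rng.normal(0.006, 0.001))  # auditory channel
                hit = np.nonzero(act >= threshold)[0]
                if hit.size:
                    rts.append(t[hit[0]])
            return np.mean(rts)

        print("visual only          :", mean_rt(False))
        print("redundant, SOA 0 ms  :", mean_rt(True, 0.0))
        print("redundant, SOA 125 ms:", mean_rt(True, 125.0))

    Because the two activations add before the threshold comparison, the redundant conditions yield shorter mean RTs, reproducing the qualitative pattern of the redundant signal effect.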

  20. Design of a dataway processor for a parallel image signal processing system

    NASA Astrophysics Data System (ADS)

    Nomura, Mitsuru; Fujii, Tetsuro; Ono, Sadayasu

    1995-04-01

    Recently, demands for high-speed signal processing have been increasing, especially in the fields of image data compression, computer graphics, and medical imaging. To achieve sufficient power for real-time image processing, we have been developing parallel signal-processing systems. This paper describes a communication processor called 'dataway processor' designed for a new scalable parallel signal-processing system. The processor has six high-speed communication links (Dataways), a data-packet routing controller, a RISC CORE, and a DMA controller. Each communication link operates at 8-bit parallel in a full duplex mode at 50 MHz. Moreover, data routing, DMA, and CORE operations are processed in parallel. Therefore, sufficient throughput is available for high-speed digital video signals. The processor is designed in a top-down fashion using a CAD system called 'PARTHENON.' The hardware is fabricated using 0.5-micrometer CMOS technology and comprises about 200 K gates.

  1. A survey of quality assurance practices in biomedical open source software projects.

    PubMed

    Koru, Günes; El Emam, Khaled; Neisa, Angelica; Umarji, Medha

    2007-05-07

    Open source (OS) software is continuously gaining recognition and use in the biomedical domain, for example, in health informatics and bioinformatics. Given the mission critical nature of applications in this domain and their potential impact on patient safety, it is important to understand to what degree and how effectively biomedical OS developers perform standard quality assurance (QA) activities such as peer reviews and testing. This would allow the users of biomedical OS software to better understand the quality risks, if any, and the developers to identify process improvement opportunities to produce higher quality software. A survey of developers working on biomedical OS projects was conducted to examine the QA activities that are performed. We took a descriptive approach to summarize the implementation of QA activities and then examined some of the factors that may be related to the implementation of such practices. Our descriptive results show that 63% (95% CI, 54-72) of projects did not include peer reviews in their development process, while 82% (95% CI, 75-89) did include testing. Approximately 74% (95% CI, 67-81) of developers did not have a background in computing, 80% (95% CI, 74-87) were paid for their contributions to the project, and 52% (95% CI, 43-60) had PhDs. A multivariate logistic regression model to predict the implementation of peer reviews was not significant (likelihood ratio test = 16.86, 9 df, P = .051) and neither was a model to predict the implementation of testing (likelihood ratio test = 3.34, 9 df, P = .95). Less attention is paid to peer review than testing. However, the former is a complementary, and necessary, QA practice rather than an alternative. Therefore, one can argue that there are quality risks, at least at this point in time, in transitioning biomedical OS software into any critical settings that may have operational, financial, or safety implications. Developers of biomedical OS applications should invest more effort

  2. Trends in Biomedical Education.

    ERIC Educational Resources Information Center

    Peppas, Nicholas A.; Mallinson, Richard G.

    1982-01-01

    An analysis of trends in biomedical education within chemical education is presented. Data used for the analysis included: type/level of course, subjects taught, and textbook preferences. Among other results, the 1980 survey indicates that 28 of the 79 responding schools offer at least one course in biomedical engineering. (JN)

  3. Biomedical applications engineering tasks

    NASA Technical Reports Server (NTRS)

    Laenger, C. J., Sr.

    1976-01-01

    The engineering tasks performed in response to needs articulated by clinicians are described. Initial contacts were made with these clinician-technology requestors by the Southwest Research Institute NASA Biomedical Applications Team. The basic purpose of the program was to effectively transfer aerospace technology into functional hardware to solve real biomedical problems.

  4. Real-time digital signal processing for live electro-optic imaging.

    PubMed

    Sasagawa, Kiyotaka; Kanno, Atsushi; Tsuchiya, Masahiro

    2009-08-31

    We present an imaging system that enables real-time magnitude and phase detection of modulated signals and its application to a Live Electro-optic Imaging (LEI) system, which realizes instantaneous visualization of RF electric fields. The real-time acquisition of magnitude and phase images of a modulated optical signal at 5 kHz is demonstrated by imaging with a Si-based high-speed CMOS image sensor and real-time signal processing with a digital signal processor. In the LEI system, RF electric fields are probed with light via an electro-optic crystal plate and downconverted to an intermediate frequency by parallel optical heterodyning, which can be detected with the image sensor. The artifacts caused by the optics and the image sensor characteristics are corrected by image processing. As examples, we demonstrate real-time visualization of electric fields from RF circuits.
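
    The central operation described above, real-time magnitude and phase detection of a signal at a known intermediate frequency, can be sketched as a digital I/Q demodulation. Sampling rate, IF, signal parameters and noise level below are assumed for illustration; the LEI system performs the equivalent computation per pixel on a DSP.

        import numpy as np

        fs = 100_000.0                  # sampling rate, Hz (assumed)
        f_if = 5_000.0                  # intermediate frequency, Hz
        t = np.arange(0.0, 0.02, 1.0 / fs)

        # Test signal with known magnitude and phase, plus additive noise.
        mag_true, phase_true = 0.7, 0.6
        x = mag_true * np.cos(2 * np.pi * f_if * t + phase_true) \
            + 0.05 * np.random.default_rng(1).standard_normal(t.size)

        # Mix down with quadrature references and low-pass by averaging over whole cycles.
        i = 2.0 * np.mean(x * np.cos(2 * np.pi * f_if * t))
        q = 2.0 * np.mean(-x * np.sin(2 * np.pi * f_if * t))

        print("magnitude:", np.hypot(i, q))     # close to 0.7
        print("phase    :", np.arctan2(q, i))   # close to 0.6 rad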

  5. UCMS - A new signal parameter measurement system using digital signal processing techniques. [User Constraint Measurement System

    NASA Technical Reports Server (NTRS)

    Choi, H. J.; Su, Y. T.

    1986-01-01

    The User Constraint Measurement System (UCMS) is a hardware/software package developed by NASA Goddard to measure the signal parameter constraints of the user transponder in the TDRSS environment by means of an all-digital signal sampling technique. An account is presently given of the features of UCMS design and of its performance capabilities and applications; attention is given to such important aspects of the system as RF interface parameter definitions, hardware minimization, the emphasis on offline software signal processing, and end-to-end link performance. Applications to the measurement of other signal parameters are also discussed.

  6. Generation and coherent detection of QPSK signal using a novel method of digital signal processing

    NASA Astrophysics Data System (ADS)

    Zhao, Yuan; Hu, Bingliang; He, Zhen-An; Xie, Wenjia; Gao, Xiaohui

    2018-02-01

    We demonstrate an optical quadrature phase-shift keying (QPSK) transmitter and an optical receiver that demodulates the QPSK signal with homodyne detection and digital signal processing (DSP). DSP is applied to the homodyne detection scheme without locking the phase of the local oscillator (LO). In this paper, we present a down-sampling method that extracts a one-dimensional array of samples, reducing unwanted samples in the constellation-diagram measurement. This scheme offers the following major advantages over conventional optical QPSK detection methods. First, the homodyne detection scheme does not impose strict requirements on the LO, unlike linear optical sampling, which requires a flat spectral density and phase over the spectral support of the source under test. Second, LabVIEW software is used directly to recover the QPSK signal constellation without a complex DSP circuit. Third, with minor changes the scheme is applicable to multilevel modulation formats such as M-ary PSK and quadrature amplitude modulation (QAM), and to higher-speed signals.
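
    The receiver-side idea, recovering a QPSK constellation from homodyne samples with simple digital processing, can be sketched as below. The noise level and the short pilot block used for phase alignment are assumptions of this sketch; the paper instead recovers the constellation in LabVIEW without locking the LO phase.

        import numpy as np

        rng = np.random.default_rng(2)

        # Random bit pairs mapped to QPSK symbols on the unit circle.
        bits = rng.integers(0, 2, size=(2000, 2))
        symbols = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

        # Simulated homodyne reception: unknown constant phase offset plus additive noise.
        offset = 0.3
        noise = 0.05 * (rng.standard_normal(2000) + 1j * rng.standard_normal(2000))
        received = symbols * np.exp(1j * offset) + noise

        # Data-aided phase estimate from a short pilot block (an assumption of this sketch).
        est = np.angle(np.mean(received[:16] * np.conj(symbols[:16])))
        corrected = received * np.exp(-1j * est)

        # Quadrant decisions recover the transmitted bits.
        bits_hat = np.column_stack([(corrected.real > 0).astype(int),
                                    (corrected.imag > 0).astype(int)])
        print("bit errors:", int(np.count_nonzero(bits_hat != bits)))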

  7. SPECIAL ISSUE ON OPTICAL PROCESSING OF INFORMATION: Optical signal-processing systems based on anisotropic media

    NASA Astrophysics Data System (ADS)

    Kiyashko, B. V.

    1995-10-01

    Partially coherent optical systems for signal processing are considered. The transfer functions are formed in these systems by interference of polarised light transmitted by an anisotropic medium. It is shown that such systems can perform various integral transformations of both optical and electric signals, in particular, two-dimensional Fourier and Fresnel transformations, as well as spectral analysis of weak light sources. It is demonstrated that such systems have the highest luminosity and vibration immunity among the systems with interference formation of transfer functions. An experimental investigation is reported of the application of these systems in the processing of signals from a linear hydroacoustic antenna array, and in measurements of the optical spectrum and of the intrinsic noise.

  8. Frontiers in biomedical engineering and biotechnology.

    PubMed

    Liu, Feng; Goodarzi, Ali; Wang, Haifeng; Stasiak, Joanna; Sun, Jianbo; Zhou, Yu

    2014-01-01

    The 2nd International Conference on Biomedical Engineering and Biotechnology (iCBEB 2013), held in Wuhan on 11–13 October 2013, is an annual conference that aims at providing an opportunity for international and national researchers and practitioners to present the most recent advances and future challenges in the fields of Biomedical Information, Biomedical Engineering and Biotechnology. The papers published in this issue were selected from this conference; they reflect the frontier of the field of Biomedical Engineering and Biotechnology and, in particular, work that has helped improve the quality of clinical diagnosis.

  9. [Master course in biomedical engineering].

    PubMed

    Jobbágy, Akos; Benyó, Zoltán; Monos, Emil

    2009-11-22

    The Bologna Declaration aims at harmonizing the European higher education structure. In accordance with the Declaration, biomedical engineering will be offered as a master (MSc) course also in Hungary, from year 2009. Since 1995, a biomedical engineering course has been offered in cooperation by three universities: Semmelweis University, Budapest Veterinary University, and Budapest University of Technology and Economics. One of the latter's faculties, the Faculty of Electrical Engineering and Informatics, has been responsible for the course. Students could start their biomedical engineering studies - usually in parallel with their first degree course - after they had collected at least 180 ECTS credits. Consequently, the biomedical engineering course could have been considered a master course even before the Bologna Declaration. Students had to collect 130 ECTS credits during the six-semester course. This is equivalent to four semesters of full-time study, because during the first three semesters the curriculum required students to earn only one third of the usual ECTS credits. The paper gives a survey of the new biomedical engineering master course, briefly summing up the subjects in the curriculum.

  10. Bio-functionalization of biomedical metals.

    PubMed

    Xiao, M; Chen, Y M; Biao, M N; Zhang, X D; Yang, B C

    2017-01-01

    Bio-functionalization means endowing biomaterials with biological functions so as to make the materials or devices more suitable for biomedical applications. Traditionally, because of their excellent mechanical properties, biomedical metals have been widely used in the clinic. However, the functions exploited have basically been support or fixation, especially for implantable devices. Nowadays, new functions, including bioactivity, anti-tumor and anti-microbial properties, and so on, are being introduced to biomedical metals. To realize those bio-functions on metallic biomedical materials, surface modification is the most commonly used method. Surface modification, including physical and chemical methods, is an effective way to alter the surface morphology and composition of biomaterials. It can endow biomedical metals with new surface properties while retaining the good mechanical properties of the bulk material. Having analyzed the ways of realizing bio-functionalization, this article briefly summarizes the bio-functionalization concepts of six hot spots in this field: bioactivity, bony tissue induction, anti-microbial, anti-tumor, anticoagulation, and drug loading functions. Copyright © 2016. Published by Elsevier B.V.

  11. Towards a Standard Mixed-Signal Parallel Processing Architecture for Miniature and Microrobotics.

    PubMed

    Sadler, Brian M; Hoyos, Sebastian

    2014-01-01

    The conventional analog-to-digital conversion (ADC) and digital signal processing (DSP) architecture has led to major advances in miniature and micro-systems technology over the past several decades. The outlook for these systems is significantly enhanced by advances in sensing, signal processing, communications and control, and the combination of these technologies enables autonomous robotics on the miniature to micro scales. In this article we look at trends in the combination of analog and digital (mixed-signal) processing, and consider a generalized sampling architecture. Employing a parallel analog basis expansion of the input signal, this scalable approach is adaptable and reconfigurable, and is suitable for a large variety of current and future applications in networking, perception, cognition, and control.

  12. Education of biomedical engineering in Taiwan.

    PubMed

    Lin, Kang-Ping; Kao, Tsair; Wang, Jia-Jung; Chen, Mei-Jung; Su, Fong-Chin

    2014-01-01

    Biomedical engineers (BMEs) play an important role in the medical and healthcare community. Well-designed educational programs are important to support healthcare systems, including hospitals, long-term care organizations, manufacturers of medical devices, instruments and systems, and the companies that sell and service them. Over the past 30 years, the biomedical engineering community in Taiwan has accumulated thousands of people who hold a biomedical engineering degree and work as biomedical engineers. Most BME students are trained in biomedical engineering departments in at least one of the specialties of bioelectronics, bioinformatics, biomaterials or biomechanics. Students are required to complete 320 hours of internship training in related institutions off campus before graduating. Almost all the biomedical engineering departments are certified by IEET (Institute of Engineering Education Taiwan) and meet the IEET requirements for mathematics and fundamental engineering courses. For BMEs after graduation, the Taiwanese Society of Biomedical Engineering (TSBME) provides many continuing-education programs and certificates for members who wish to hold the certification as a professional credential in the workplace. Currently, many engineering departments in universities are being asked to provide joint programs with BME departments to train better-qualified students. BME is one of the growing fields in Taiwan.

  13. Advanced Biomedical Computing Center (ABCC) | DSITP

    Cancer.gov

    The Advanced Biomedical Computing Center (ABCC), located in Frederick Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to provide collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  14. Biomedical Ontologies in Action: Role in Knowledge Management, Data Integration and Decision Support

    PubMed Central

    Bodenreider, O.

    2008-01-01

    Summary Objectives To provide typical examples of biomedical ontologies in action, emphasizing the role played by biomedical ontologies in knowledge management, data integration and decision support. Methods Biomedical ontologies selected for their practical impact are examined from a functional perspective. Examples of applications are taken from operational systems and the biomedical literature, with a bias towards recent journal articles. Results The ontologies under investigation in this survey include SNOMED CT, the Logical Observation Identifiers, Names, and Codes (LOINC), the Foundational Model of Anatomy, the Gene Ontology, RxNorm, the National Cancer Institute Thesaurus, the International Classification of Diseases, the Medical Subject Headings (MeSH) and the Unified Medical Language System (UMLS). The roles played by biomedical ontologies are classified into three major categories: knowledge management (indexing and retrieval of data and information, access to information, mapping among ontologies); data integration, exchange and semantic interoperability; and decision support and reasoning (data selection and aggregation, decision support, natural language processing applications, knowledge discovery). Conclusions Ontologies play an important role in biomedical research through a variety of applications. While ontologies are used primarily as a source of vocabulary for standardization and integration purposes, many applications also use them as a source of computable knowledge. Barriers to the use of ontologies in biomedical applications are discussed. PMID:18660879

  15. ISLE (Image and Signal LISP Environment): A functional language interface for signal and image processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Azevedo, S.G.; Fitch, J.P.

    1987-10-21

    Conventional software interfaces that use imperative computer commands or menu interactions are often restrictive environments when used for researching new algorithms or analyzing processed experimental data. We found this to be true with current signal-processing software (SIG). As an alternative, ''functional language'' interfaces provide features such as command nesting for a more natural interaction with the data. The Image and Signal LISP Environment (ISLE) is an example of an interpreted functional language interface based on Common LISP. Advantages of ISLE include multidimensional and multiple data-type independence through dispatching functions, dynamic loading of new functions, and connections to artificial intelligence (AI) software. 10 refs.

  16. ISLE (Image and Signal Lisp Environment): A functional language interface for signal and image processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Azevedo, S.G.; Fitch, J.P.

    1987-05-01

    Conventional software interfaces which utilize imperative computer commands or menu interactions are often restrictive environments when used for researching new algorithms or analyzing processed experimental data. We found this to be true with current signal processing software (SIG). Existing ''functional language'' interfaces provide features such as command nesting for a more natural interaction with the data. The Image and Signal Lisp Environment (ISLE) will be discussed as an example of an interpreted functional language interface based on Common LISP. Additional benefits include multidimensional and multiple data-type independence through dispatching functions, dynamic loading of new functions, and connections to artificial intelligence software.

  17. Digital Signal Processing and Control for the Study of Gene Networks

    PubMed Central

    Shin, Yong-Jun

    2016-01-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems that can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks. PMID:27102828
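
    As a toy illustration of this viewpoint (assumed first-order dynamics, sampling and gains, not a model taken from the article), a single gene-expression process can be written as a discrete-time system and driven to a target expression level with a simple digital proportional controller:

        # Toy discrete-time gene-expression model (assumed parameters):
        #   x[k+1] = a * x[k] + b * u[k]
        # where x is the protein level, a models degradation/dilution per sampling
        # interval, and u is an externally applied induction input.
        a, b = 0.9, 0.5
        setpoint = 10.0
        kp = 0.8                       # gain of the digital proportional controller

        x = 0.0
        for k in range(31):
            error = setpoint - x
            u = max(kp * error, 0.0)   # inducer concentration cannot be negative
            x = a * x + b * u
            if k % 5 == 0:
                print(f"step {k:2d}: x = {x:5.2f}, u = {u:5.2f}")

    Proportional control alone leaves a steady-state offset (here the level settles near 8 rather than 10); standard digital control design would add integral action to remove it.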

  18. GLAST Burst Monitor Signal Processing System

    NASA Astrophysics Data System (ADS)

    Bhat, P. Narayana; Briggs, Michael; Connaughton, Valerie; Diehl, Roland; Fishman, Gerald; Greiner, Jochen; Kippen, R. Marc; von Kienlin, Andreas; Kouveliotou, Chryssa; Lichti, Giselher; Meegan, Charles; Paciesas, William; Persyn, Steven; Preece, Robert; Steinle, Helmut; Wilson-Hodge, Colleen

    2007-07-01

    The onboard Data Processing Unit (DPU), designed and built by Southwest Research Institute, performs the high-speed data acquisition for GBM. The analog signals from each of the 14 detectors are digitized by high-speed multichannel analog data acquisition architecture. The streaming digital values resulting from a periodic (period of 104.2 ns) sampling of the analog signal by the individual ADCs are fed to a Field-Programmable Gate Array (FPGA). Real-time Digital Signal Processing (DSP) algorithms within the FPGA implement functions like filtering, thresholding, time delay and pulse height measurement. The spectral data with a 12-bit resolution are formatted according to the commandable look-up-table (LUT) and then sent to the High-Speed Science-Data Bus (HSSDB, speed=1.5 MB/s) to be telemetered to ground. The DSP offers a novel feature of a commandable & constant event deadtime. The ADC non-linearities have been calibrated so that the spectral data can be corrected during analysis. The best temporal resolution is 2 μs for the pre-burst & post-trigger time-tagged events (TTE) data. The time resolution of the binned data types is commandable from 64 msec to 1.024 s for the CTIME data (8 channel spectral resolution) and 1.024 to 32.768 s for the CSPEC data (128 channel spectral resolution). The pulse pile-up effects have been studied by Monte Carlo simulations. For a typical GRB, the possible shift in the Epeak value at high-count rates (~100 kHz) is ~1% while the change in the single power-law index could be up to 5%.

  19. GLAST Burst Monitor Signal Processing System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhat, P. Narayana; Briggs, Michael; Connaughton, Valerie

    The onboard Data Processing Unit (DPU), designed and built by Southwest Research Institute, performs the high-speed data acquisition for GBM. The analog signals from each of the 14 detectors are digitized by high-speed multichannel analog data acquisition architecture. The streaming digital values resulting from a periodic (period of 104.2 ns) sampling of the analog signal by the individual ADCs are fed to a Field-Programmable Gate Array (FPGA). Real-time Digital Signal Processing (DSP) algorithms within the FPGA implement functions like filtering, thresholding, time delay and pulse height measurement. The spectral data with a 12-bit resolution are formatted according to the commandable look-up-table (LUT) and then sent to the High-Speed Science-Data Bus (HSSDB, speed=1.5 MB/s) to be telemetered to ground. The DSP offers a novel feature of a commandable and constant event deadtime. The ADC non-linearities have been calibrated so that the spectral data can be corrected during analysis. The best temporal resolution is 2 μs for the pre-burst and post-trigger time-tagged events (TTE) data. The time resolution of the binned data types is commandable from 64 msec to 1.024 s for the CTIME data (8 channel spectral resolution) and 1.024 to 32.768 s for the CSPEC data (128 channel spectral resolution). The pulse pile-up effects have been studied by Monte Carlo simulations. For a typical GRB, the possible shift in the Epeak value at high-count rates (~100 kHz) is ~1% while the change in the single power-law index could be up to 5%.

  20. A Versatile Multichannel Digital Signal Processing Module for Microcalorimeter Arrays

    NASA Astrophysics Data System (ADS)

    Tan, H.; Collins, J. W.; Walby, M.; Hennig, W.; Warburton, W. K.; Grudberg, P.

    2012-06-01

    Different techniques have been developed for reading out microcalorimeter sensor arrays: individual outputs for small arrays, and time-division or frequency-division or code-division multiplexing for large arrays. Typically, raw waveform data are first read out from the arrays using one of these techniques and then stored on computer hard drives for offline optimum filtering, leading not only to requirements for large storage space but also limitations on achievable count rate. Thus, a read-out module that is capable of processing microcalorimeter signals in real time will be highly desirable. We have developed multichannel digital signal processing electronics that are capable of on-board, real time processing of microcalorimeter sensor signals from multiplexed or individual pixel arrays. It is a 3U PXI module consisting of a standardized core processor board and a set of daughter boards. Each daughter board is designed to interface a specific type of microcalorimeter array to the core processor. The combination of the standardized core plus this set of easily designed and modified daughter boards results in a versatile data acquisition module that not only can easily expand to future detector systems, but is also low cost. In this paper, we first present the core processor/daughter board architecture, and then report the performance of an 8-channel daughter board, which digitizes individual pixel outputs at 1 MSPS with 16-bit precision. We will also introduce a time-division multiplexing type daughter board, which takes in time-division multiplexing signals through fiber-optic cables and then processes the digital signals to generate energy spectra in real time.
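
    In the white-noise limit, the on-board "optimum filtering" reduces to a matched filter: each record is correlated with a pulse template and normalized to give the pulse height (and hence the energy). A minimal sketch with an assumed template shape and noise level, not the module's firmware:

        import numpy as np

        rng = np.random.default_rng(3)

        # Assumed pulse template: fast rise, slow exponential decay, unit amplitude.
        n = 1024
        k = np.arange(n)
        template = (1.0 - np.exp(-k / 5.0)) * np.exp(-k / 200.0)
        template /= template.max()

        # One simulated record: a pulse of known height buried in white noise.
        true_height = 3.2
        record = true_height * template + 0.3 * rng.standard_normal(n)

        # For white noise the matched filter is the template itself; the least-squares
        # amplitude estimate is the normalized inner product with the template.
        height_est = np.dot(record, template) / np.dot(template, template)
        print("estimated pulse height:", round(height_est, 3))   # close to 3.2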

  1. A Virtual Laboratory for Digital Signal Processing

    ERIC Educational Resources Information Center

    Dow, Chyi-Ren; Li, Yi-Hsung; Bai, Jin-Yu

    2006-01-01

    This work designs and implements a virtual digital signal processing laboratory, VDSPL. VDSPL consists of four parts: mobile agent execution environments, mobile agents, DSP development software, and DSP experimental platforms. The network capability of VDSPL is created by using mobile agent and wrapper techniques without modifying the source code…

  2. Integrating a Hypernymic Proposition Interpreter into a Semantic Processor for Biomedical Texts

    PubMed Central

    Fiszman, Marcelo; Rindflesch, Thomas C.; Kilicoglu, Halil

    2003-01-01

    Semantic processing provides the potential for producing high quality results in natural language processing (NLP) applications in the biomedical domain. In this paper, we address a specific semantic phenomenon, the hypernymic proposition, and concentrate on integrating the interpretation of such predications into a more general semantic processor in order to improve overall accuracy. A preliminary evaluation assesses the contribution of hypernymic propositions in providing more specific semantic predications and thus improving effectiveness in retrieving treatment propositions in MEDLINE abstracts. Finally, we discuss the generalization of this methodology to additional semantic propositions as well as other types of biomedical texts. PMID:14728170

  3. Journal selection decisions: a biomedical library operations research model. I. The framework.

    PubMed Central

    Kraft, D H; Polacsek, R A; Soergel, L; Burns, K; Klair, A

    1976-01-01

    The problem of deciding which journal titles to select for acquisition in a biomedical library is modeled. The approach taken is based on cost/benefit ratios. Measures of journal worth, methods of data collection, and journal cost data are considered. The emphasis is on the development of a practical process for selecting journal titles, based on the objectivity and rationality of the model; and on the collection of the appropriate data and library statistics in a reasonable manner. The implications of this process towards an overall management information system (MIS) for biomedical serials handling are discussed. PMID:820391

  4. A Study on Signal Group Processing of AUTOSAR COM Module

    NASA Astrophysics Data System (ADS)

    Lee, Jeong-Hwan; Hwang, Hyun Yong; Han, Tae Man; Ahn, Yong Hak

    2013-06-01

    A vehicle contains many electronic control units (ECUs), which are connected to networks such as CAN, LIN, FlexRay, and so on. AUTOSAR COM (Communication), a software module of AUTOSAR (AUTomotive Open System ARchitecture), the international industry standard for automotive electronic software, processes signals and signal groups for data communication between ECUs. Real-time behaviour and reliability are very important for data communication in the vehicle. Therefore, in this paper, we analyze the signal and signal-group functions used in COM and show that signal-group functions are more efficient than individual signals with respect to real-time data synchronization and network resource usage between sender and receiver.

  5. SOA-based digital library services and composition in biomedical applications.

    PubMed

    Zhao, Xia; Liu, Enjie; Clapworthy, Gordon J; Viceconti, Marco; Testi, Debora

    2012-06-01

    Carefully collected, high-quality data are crucial in biomedical visualization, and it is important that the user community has ready access to both this data and the high-performance computing resources needed by the complex, computational algorithms that will process it. Biological researchers generally require data, tools and algorithms from multiple providers to achieve their goals. This paper illustrates our response to the problems that result from this. The Living Human Digital Library (LHDL) project presented in this paper has taken advantage of Web Services to build a biomedical digital library infrastructure that allows clinicians and researchers not only to preserve, trace and share data resources, but also to collaborate at the data-processing level. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  6. Optimal causal filtering for 1/fα-type noise in single-electrode EEG signals.

    PubMed

    Paris, Alan; Atia, George; Vosoughi, Azadeh; Berman, Stephen A

    2016-08-01

    Understanding the mode of generation and the statistical structure of neurological noise is one of the central problems of biomedical signal processing. We have developed a broad class of abstract biological noise sources we call hidden simplicial tissues. In the simplest cases, such tissue emits what we have named generalized van der Ziel-McWhorter (GVZM) noise which has a roughly 1/fα spectral roll-off. Our previous work focused on the statistical structure of GVZM frequency spectra. However, causality of processing operations (i.e., dependence only on the past) is an essential requirement for real-time applications to seizure detection and brain-computer interfacing. In this paper we outline the theoretical background for optimal causal time-domain filtering of deterministic signals embedded in GVZM noise. We present some of our early findings concerning the optimal filtering of EEG signals for the detection of steady-state visual evoked potential (SSVEP) responses and indicate the next steps in our ongoing research.
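
    The 1/fα spectral roll-off referred to above can be reproduced by frequency-domain shaping of white noise; the sketch below is an illustrative surrogate for such noise (it is not the GVZM model itself, and the record length and α are arbitrary):

        import numpy as np

        rng = np.random.default_rng(4)

        def one_over_f_noise(n, alpha):
            """White Gaussian noise shaped so its power spectrum falls off as 1/f**alpha."""
            spectrum = np.fft.rfft(rng.standard_normal(n))
            f = np.fft.rfftfreq(n, d=1.0)
            f[0] = f[1]                          # avoid division by zero at DC
            spectrum *= f ** (-alpha / 2.0)      # amplitude ~ f^(-alpha/2)  =>  power ~ f^(-alpha)
            return np.fft.irfft(spectrum, n)

        x = one_over_f_noise(2 ** 14, alpha=1.0)

        # Check the roll-off: the log-log slope of the periodogram should be close to -alpha.
        psd = np.abs(np.fft.rfft(x)) ** 2
        f = np.fft.rfftfreq(x.size, d=1.0)
        slope, _ = np.polyfit(np.log(f[1:]), np.log(psd[1:]), 1)
        print("fitted spectral slope:", round(slope, 2))   # roughly -1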

  7. Investigation of optical current transformer signal processing method based on an improved Kalman algorithm

    NASA Astrophysics Data System (ADS)

    Shen, Yan; Ge, Jin-ming; Zhang, Guo-qing; Yu, Wen-bin; Liu, Rui-tong; Fan, Wei; Yang, Ying-xuan

    2018-01-01

    This paper explores the problem of signal processing in optical current transformers (OCTs). Based on the noise characteristics of OCTs, such as overlapping signals, noise frequency bands, low signal-to-noise ratios, and difficulties in acquiring statistical features of noise power, an improved standard Kalman filtering algorithm was proposed for direct current (DC) signal processing. The state-space model of the OCT DC measurement system is first established, and then mixed noise can be processed by adding mixed noise into measurement and state parameters. According to the minimum mean squared error criterion, state predictions and update equations of the improved Kalman algorithm could be deduced based on the established model. An improved central difference Kalman filter was proposed for alternating current (AC) signal processing, which improved the sampling strategy and noise processing of colored noise. Real-time estimation and correction of noise were achieved by designing AC and DC noise recursive filters. Experimental results show that the improved signal processing algorithms had a good filtering effect on the AC and DC signals with mixed noise of OCT. Furthermore, the proposed algorithm was able to achieve real-time correction of noise during the OCT filtering process.
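
    For the DC channel, the underlying recursion, estimating a nearly constant current from noisy samples, can be sketched with a standard scalar Kalman filter. The random-walk state model and the noise variances below are assumptions of the sketch; the paper's improved algorithm additionally models the mixed and colored noise of the OCT.

        import numpy as np

        rng = np.random.default_rng(5)

        # Simulated OCT DC measurement: a constant current plus white measurement noise.
        true_current = 1.5
        z = true_current + 0.2 * rng.standard_normal(500)

        q = 1e-6            # process-noise variance (allows slow drift of the state)
        r = 0.2 ** 2        # measurement-noise variance
        x_hat, p = 0.0, 1.0

        for zk in z:
            p = p + q                         # predict (random-walk state model)
            g = p / (p + r)                   # Kalman gain
            x_hat = x_hat + g * (zk - x_hat)  # update with the new sample
            p = (1.0 - g) * p

        print("filtered estimate:", round(x_hat, 3))   # close to 1.5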

  8. The mathematical theory of signal processing and compression-designs

    NASA Astrophysics Data System (ADS)

    Feria, Erlan H.

    2006-05-01

    The mathematical theory of signal processing, named processor coding, will be shown to inherently arise as the computational time dual of Shannon's mathematical theory of communication which is also known as source coding. Source coding is concerned with signal source memory space compression while processor coding deals with signal processor computational time compression. Their combination is named compression-designs and referred as Conde in short. A compelling and pedagogically appealing diagram will be discussed highlighting Conde's remarkable successful application to real-world knowledge-aided (KA) airborne moving target indicator (AMTI) radar.

  9. Using Seismic Signals to Forecast Volcanic Processes

    NASA Astrophysics Data System (ADS)

    Salvage, R.; Neuberg, J. W.

    2012-04-01

    Understanding the seismic signals generated during volcanic unrest allows scientists to predict and understand active volcanoes more accurately, since these signals are intrinsically linked to rock failure at depth (Voight, 1988). In particular, low-frequency long-period signals (LP events) have been related to the movement of fluid and the brittle failure of magma at depth due to high strain rates (Hammer and Neuberg, 2009). This fundamentally relates to surface processes. However, there is currently no physical quantitative model for determining the likelihood of an eruption following precursory seismic signals, or the timing or type of eruption that will ensue (Benson et al., 2010). Since the beginning of its current eruptive phase, accelerating LP swarms (< 10 events per hour) have been a common feature at Soufriere Hills volcano, Montserrat, prior to surface expressions such as dome collapse or eruptions (Miller et al., 1998). The dynamical behaviour of such swarms can be related to accelerated magma ascent rates, since the seismicity is thought to be a consequence of magma deformation as it rises to the surface. In particular, acceleration rates can be used successfully, in combination with the inverse material failure law, a linear relationship against time (Voight, 1988), to predict the timing of volcanic eruptions accurately. Currently, this has only been investigated for retrospective events (Hammer and Neuberg, 2009). The identification of LP swarms on Montserrat and the analysis of their dynamical characteristics allow a better understanding of the nature of the seismic signals themselves, as well as of their relationship to surface processes such as magma extrusion rates. Acceleration and deceleration rates of seismic swarms provide insights into the plumbing system of the volcano at depth. The application of the material failure law to multiple LP swarms of data allows a critical evaluation of the accuracy of the method which further refines current
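
    The forecasting step based on the inverse material failure law can be sketched as follows: for an accelerating swarm the inverse event rate falls roughly linearly with time, so a straight line fitted to it and extrapolated to zero gives the predicted failure (eruption or collapse) time. The event rates below are synthetic and chosen only to illustrate the calculation.

        import numpy as np

        # Synthetic hourly LP event rates for a swarm accelerating towards t = 10 h.
        t = np.arange(0.0, 9.0)          # hours
        rate = 5.0 / (10.0 - t)          # events per hour

        # Voight's inverse-rate method: fit 1/rate against time and find the zero crossing.
        inv_rate = 1.0 / rate
        slope, intercept = np.polyfit(t, inv_rate, 1)
        t_failure = -intercept / slope
        print("forecast failure time: %.1f hours" % t_failure)   # 10.0 for these data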

  10. Current Status of Biomedical Book Reviewing: Part IV. Major American and British Biomedical Book Publishers

    PubMed Central

    Chen, Ching-chih

    1974-01-01

    This is the fourth part of a comprehensive, quantitative study of biomedical book reviews. The data base of the total project was built from statistics of 3,347 reviews of 2,067 biomedical books taken from all 1970 issues of fifty-four reviewing journals. This part of the study identifies the major American and British biomedical book publishers in terms of their quantitative production of book titles reviewed, and determines the relationships among these publishers. It is found that Williams & Wilkins, Charles C Thomas, Academic Press, and Springer Verlag are the most productive biomedical book publishers in terms of books reviewed in 1970. These four publishers accounted for 32% of the 1,674 books available in the United States and reviewed in the reviewing media in 1970. Williams & Wilkins is especially significant by virtue of reprint activity. The present study also explores the price trend of biomedical books. It is found that the mean price for 1,077 books studied was $16.20 per volume, with a standard deviation of $9.42. PMID:4466508

  11. What is biomedical informatics?

    PubMed Central

    Bernstam, Elmer V.; Smith, Jack W.; Johnson, Todd R.

    2009-01-01

    Biomedical informatics lacks a clear and theoretically grounded definition. Many proposed definitions focus on data, information, and knowledge, but do not provide an adequate definition of these terms. Leveraging insights from the philosophy of information, we define informatics as the science of information, where information is data plus meaning. Biomedical informatics is the science of information as applied to or studied in the context of biomedicine. Defining the object of study of informatics as data plus meaning clearly distinguishes the field from related fields, such as computer science, statistics and biomedicine, which have different objects of study. The emphasis on data plus meaning also suggests that biomedical informatics problems tend to be difficult when they deal with concepts that are hard to capture using formal, computational definitions. In other words, problems where meaning must be considered are more difficult than problems where manipulating data without regard for meaning is sufficient. Furthermore, the definition implies that informatics research, teaching, and service should focus on biomedical information as data plus meaning rather than only computer applications in biomedicine. PMID:19683067

  12. Design and Performance of the Astro-E/XRS Signal Processing System

    NASA Technical Reports Server (NTRS)

    Boyce, Kevin R.; Audley, M. D.; Baker, R. G.; Dumonthier, J. J.; Fujimoto, R.; Gendreau, K. C.; Ishisaki, Y.; Kelley, R. L.; Stahle, C. K.; Szymkowiak, A. E.

    1999-01-01

    We describe the signal processing system of the Astro-E XRS instrument. The Calorimeter Analog Processor (CAP) provides bias and power for the detectors and amplifies the detector signals by a factor of 20,000. The Calorimeter Digital Processor (CDP) performs the digital processing of the calorimeter signals, detecting X-ray pulses and analyzing them by optimal filtering. We describe the operation of pulse detection, pulse height analysis, and risetime determination. We also discuss performance, including the three event grades (hi-res, mid-res, and low-res), anticoincidence detection, counting rate dependence, and noise rejection.

  13. Functional Electrospun Nanofibrous Scaffolds for Biomedical Applications

    PubMed Central

    Liang, Dehai; Hsiao, Benjamin S.; Chu, Benjamin

    2009-01-01

    Functional nanofibrous scaffolds produced by electrospinning have great potential in many biomedical applications, such as tissue engineering, wound dressing, enzyme immobilization and drug (gene) delivery. For a specific successful application, the chemical, physical and biological properties of electrospun scaffolds should be adjusted to match the environment by using a combination of multi-component compositions and fabrication techniques where electrospinning has often become a pivotal tool. The property of the nanofibrous scaffold can be further improved with innovative development in electrospinning processes, such as two-component electrospinning and in-situ mixing electrospinning. Post modifications of electrospun membranes also provide effective means to render the electrospun scaffolds with controlled anisotropy and porosity. In this review, we review the materials, techniques and post modification methods to functionalize electrospun nanofibrous scaffolds suitable for biomedical applications. PMID:17884240

  14. Maximizing the return on taxpayers' investments in fundamental biomedical research.

    PubMed

    Lorsch, Jon R

    2015-05-01

    The National Institute of General Medical Sciences (NIGMS) at the U.S. National Institutes of Health has an annual budget of more than $2.3 billion. The institute uses these funds to support fundamental biomedical research and training at universities, medical schools, and other institutions across the country. My job as director of NIGMS is to work to maximize the scientific returns on the taxpayers' investments. I describe how we are optimizing our investment strategies and funding mechanisms, and how, in the process, we hope to create a more efficient and sustainable biomedical research enterprise.

  15. Biomedical informatics and the convergence of Nano-Bio-Info-Cogno (NBIC) technologies.

    PubMed

    Martin-Sanchez, F; Maojo, V

    2009-01-01

    To analyze the role that biomedical informatics could play in the application of the NBIC Converging Technologies in the medical field, and to raise awareness of these new areas throughout the Biomedical Informatics community. Review of the literature and analysis of the reference documents in this domain from the biomedical informatics perspective, detailing existing developments which show that partial convergence of technologies has already yielded relevant results in biomedicine (such as bioinformatics or biochips). Input from current projects in which the authors are involved is also used. Information processing is a key issue in enabling the convergence of NBIC technologies. Researchers in biomedical informatics are in a privileged position to participate in and actively develop this new scientific direction. The experience of biomedical informaticians gained over five decades of research in the medical area, and their involvement in the completion of the Human and other genome projects, will help them play a similar role in the development of applications of converging technologies, particularly in nanomedicine. The proposed convergence will build bridges between traditional disciplines. Particular attention should be paid to the ethical, legal, and social issues raised by the NBIC convergence. These technologies provide new directions for research and education in Biomedical Informatics, placing a greater emphasis on multidisciplinary approaches.

  16. The next generation of similarity measures that fully explore the semantics in biomedical ontologies.

    PubMed

    Couto, Francisco M; Pinto, H Sofia

    2013-10-01

    There is a prominent trend to augment and improve the formality of biomedical ontologies. For example, this is shown by the current effort on adding description logic axioms, such as disjointness. One of the key ontology applications that can take advantage of this effort is conceptual (functional) similarity measurement. The presence of description logic axioms in biomedical ontologies makes the current structural or extensional approaches weaker and further away from providing sound semantics-based similarity measures. Although beneficial in small ontologies, the exploitation of description logic axioms by semantics-based similarity measures is computationally expensive. This limitation is critical for biomedical ontologies, which normally contain thousands of concepts. Thus, to gain their rightful place, biomedical functional similarity measures must find out how this rich and powerful knowledge can be fully exploited while keeping computational costs feasible. This manuscript aims at promoting and guiding the development of compelling tools that deliver what the biomedical community will require in the near future: a next generation of biomedical similarity measures that efficiently and fully explore the semantics present in biomedical ontologies.
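
    For contrast with the semantics-based measures argued for above, the kind of structural, information-content similarity currently in wide use can be sketched on a hypothetical mini-ontology (the terms, hierarchy and annotation counts below are all invented for illustration):

        import math

        # Hypothetical mini-ontology: each term mapped to its set of parents.
        parents = {
            "cysteine-type peptidase": {"peptidase"},
            "serine-type peptidase": {"peptidase"},
            "peptidase": {"hydrolase"},
            "hydrolase": {"catalytic activity"},
            "catalytic activity": set(),
        }

        # Hypothetical annotation counts used to derive information content (IC).
        counts = {"catalytic activity": 100, "hydrolase": 40, "peptidase": 15,
                  "serine-type peptidase": 6, "cysteine-type peptidase": 4}
        ic = {term: -math.log(n / 100.0) for term, n in counts.items()}

        def ancestors(term):
            """The term itself plus all of its ancestors in the hierarchy."""
            out = {term}
            for p in parents[term]:
                out |= ancestors(p)
            return out

        def resnik(t1, t2):
            """Resnik-style similarity: IC of the most informative common ancestor."""
            return max(ic[t] for t in ancestors(t1) & ancestors(t2))

        print(resnik("serine-type peptidase", "cysteine-type peptidase"))   # IC of 'peptidase'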

  17. A novel biomedical image indexing and retrieval system via deep preference learning.

    PubMed

    Pang, Shuchao; Orgun, Mehmet A; Yu, Zhezhou

    2018-05-01

    Traditional biomedical image retrieval methods, as well as content-based image retrieval (CBIR) methods originally designed for non-biomedical images, either consider only pixel-level and other low-level features to describe an image or use deep features, but still leave considerable room for improving both accuracy and efficiency. In this work, we propose a new approach, which exploits deep learning technology to extract high-level and compact features from biomedical images. The deep feature extraction process leverages multiple hidden layers to capture substantial feature structures of high-resolution images and represent them at different levels of abstraction, leading to improved performance for indexing and retrieval of biomedical images. We exploit the current popular multi-layered deep neural networks, namely stacked denoising autoencoders (SDAE) and convolutional neural networks (CNN), to represent the discriminative features of biomedical images by transferring the feature representations and parameters of pre-trained deep neural networks from another domain. Moreover, in order to index all the images for finding similarly referenced images, we also introduce preference learning technology to train and learn a preference model for the query image, which can output a similarity ranking list of images from a biomedical image database. To the best of our knowledge, this paper introduces preference learning technology into biomedical image retrieval for the first time. We evaluate the performance of two powerful algorithms based on our proposed system and compare them with those of popular biomedical image indexing approaches and existing regular image retrieval methods in detailed experiments over several well-known public biomedical image databases. Based on different criteria for the evaluation of retrieval performance, experimental results demonstrate that our proposed algorithms outperform the state
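
    Independently of the particular deep network, the indexing-and-ranking stage can be sketched as follows: once feature vectors have been extracted (random stand-ins below take the place of SDAE/CNN features), database images are ranked by similarity to the query's feature vector. The learned preference model described in the paper would replace the plain cosine similarity used in this sketch.

        import numpy as np

        rng = np.random.default_rng(6)

        # Stand-in deep features: 1000 database images and one query, 128-D each.
        db_features = rng.standard_normal((1000, 128))
        query_feature = rng.standard_normal(128)

        # Cosine similarity between the query and every database image.
        db_unit = db_features / np.linalg.norm(db_features, axis=1, keepdims=True)
        q_unit = query_feature / np.linalg.norm(query_feature)
        scores = db_unit @ q_unit

        # Similarity ranking list: indices of the ten most similar images.
        top10 = np.argsort(scores)[::-1][:10]
        print(top10)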

  18. Signal processing for ION mobility spectrometers

    NASA Technical Reports Server (NTRS)

    Taylor, S.; Hinton, M.; Turner, R.

    1995-01-01

    Signal processing techniques for systems based upon Ion Mobility Spectrometry will be discussed in the light of 10 years of experience in the design of real-time IMS. Among the topics to be covered are compensation techniques for variations in the number density of the gas - the use of an internal standard (a reference peak) or pressure and temperature sensors. Sources of noise and methods for noise reduction will be discussed together with resolution limitations and the ability of deconvolution techniques to improve resolving power. The use of neural networks (either by themselves or as a component part of a processing system) will be reviewed.

  19. Advances in Photopletysmography Signal Analysis for Biomedical Applications.

    PubMed

    Moraes, Jermana L; Rocha, Matheus X; Vasconcelos, Glauber G; Vasconcelos Filho, José E; de Albuquerque, Victor Hugo C; Alexandria, Auzuir R

    2018-06-09

    Heart Rate Variability (HRV) is an important tool for the analysis of a patient's physiological condition, as well as a method aiding the diagnosis of cardiopathies. Photoplethysmography (PPG) is an optical technique applied in the monitoring of HRV, and its adoption has been growing significantly compared to the most commonly used method in medicine, Electrocardiography (ECG). In this survey, definitions of these techniques are presented, the different types of sensors used are explained, and the methods for the study and analysis of the PPG signal (linear and nonlinear methods) are described. Moreover, the progress, and the clinical and practical applicability, of the PPG technique in the diagnosis of cardiovascular diseases are evaluated. In addition, the latest technologies utilized in the development of new tools for medical diagnosis are presented, such as the Internet of Things, the Internet of Health Things, genetic algorithms, artificial intelligence and biosensors, which result in personalized advances in e-health and health care. After the study of these technologies, it can be noted that PPG associated with them is an important tool for the diagnosis of some diseases, due to its simplicity, its cost-benefit ratio, the ease of signal acquisition, and especially because it is a non-invasive technique.
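
    In practice, HRV analysis of a PPG trace reduces to detecting the systolic peaks and computing statistics of the inter-beat intervals. The sketch below runs on a synthetic pulse train (sampling rate, pulse shape and detection thresholds are assumed) and reports two common time-domain indices.

        import numpy as np
        from scipy.signal import find_peaks

        fs = 100.0                                   # sampling rate, Hz (assumed)
        rng = np.random.default_rng(7)

        # Synthetic PPG-like signal: pulses at a slightly variable heart rate plus noise.
        beat_times = np.cumsum(0.8 + 0.05 * rng.standard_normal(120))   # ~75 bpm
        t = np.arange(0.0, beat_times[-1], 1.0 / fs)
        ppg = sum(np.exp(-((t - bt) ** 2) / (2 * 0.05 ** 2)) for bt in beat_times)
        ppg = ppg + 0.05 * rng.standard_normal(t.size)

        # Detect systolic peaks and derive inter-beat intervals in milliseconds.
        peaks, _ = find_peaks(ppg, height=0.5, distance=int(0.4 * fs))
        ibi = np.diff(t[peaks]) * 1000.0

        sdnn = np.std(ibi, ddof=1)                   # overall variability
        rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2))  # short-term variability
        print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")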

  20. Livestock in biomedical research: history, current status and future prospective.

    PubMed

    Polejaeva, Irina A; Rutigliano, Heloisa M; Wells, Kevin D

    2016-01-01

    Livestock models have contributed significantly to biomedical and surgical advances. Their contribution is particularly prominent in the areas of physiology and assisted reproductive technologies, including understanding developmental processes and disorders, from ancient to modern times. Over the past 25 years, biomedical research that traditionally embraced a diverse species approach shifted to a small number of model species (e.g. mice and rats). The initial reasons for focusing the main efforts on the mouse were the availability of murine embryonic stem cells (ESCs) and genome sequence data. This powerful combination allowed for precise manipulation of the mouse genome (knockouts, knockins, transcriptional switches etc.) leading to ground-breaking discoveries on gene functions and regulation, and their role in health and disease. Despite the enormous contribution to biomedical research, mouse models have some major limitations. Their substantial differences compared with humans in body and organ size, lifespan and inbreeding result in pronounced metabolic, physiological and behavioural differences. Comparative studies of strategically chosen domestic species can complement mouse research and yield more rigorous findings. Because genome sequence and gene manipulation tools are now available for farm animals (cattle, pigs, sheep and goats), a larger number of livestock genetically engineered (GE) models will be accessible for biomedical research. This paper discusses the use of cattle, goats, sheep and pigs in biomedical research, provides an overview of transgenic technology in farm animals and highlights some of the beneficial characteristics of large animal models of human disease compared with the mouse. In addition, status and origin of current regulation of GE biomedical models is also reviewed.

  1. Biomedical Engineering in Modern Society

    ERIC Educational Resources Information Center

    Attinger, E. O.

    1971-01-01

    Considers definition of biomedical engineering (BME) and how biomedical engineers should be trained. State of the art descriptions of BME and BME education are followed by a brief look at the future of BME. (TS)

  2. Myoelectric signal processing for control of powered limb prostheses.

    PubMed

    Parker, P; Englehart, K; Hudgins, B

    2006-12-01

    Progress in myoelectric control technology has over the years been incremental, due in part to the alternating focus of the R&D between control methodology and device hardware. The technology has over the past 50 years or so moved from single muscle control of a single prosthesis function to muscle group activity control of multifunction prostheses. Central to these changes have been developments in the means of extracting information from the myoelectric signal. This paper gives an overview of the myoelectric signal processing challenge, a brief look at the challenge from an historical perspective, the state-of-the-art in myoelectric signal processing for prosthesis control, and an indication of where this field is heading. The paper demonstrates that considerable progress has been made in providing clients with useful and reliable myoelectric communication channels, and that exciting work and developments are on the horizon.
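
    One widely used way of extracting information from the myoelectric signal is to compute simple time-domain features over short analysis windows. The set below (mean absolute value, waveform length, zero crossings, slope-sign changes) is a common illustrative choice; the dead-zone threshold and the stand-in data are assumptions of this sketch rather than a specific published pipeline.

        import numpy as np

        def td_features(window, dead_zone=0.01):
            """Common time-domain EMG features for one analysis window."""
            mav = np.mean(np.abs(window))                      # mean absolute value
            wl = np.sum(np.abs(np.diff(window)))               # waveform length
            # Zero crossings, ignoring fluctuations smaller than the dead zone.
            zc = np.sum((window[:-1] * window[1:] < 0) &
                        (np.abs(window[:-1] - window[1:]) > dead_zone))
            # Slope-sign changes in the first difference.
            d = np.diff(window)
            ssc = np.sum((d[:-1] * d[1:] < 0) &
                         ((np.abs(d[:-1]) > dead_zone) | (np.abs(d[1:]) > dead_zone)))
            return mav, wl, int(zc), int(ssc)

        rng = np.random.default_rng(8)
        emg_window = 0.2 * rng.standard_normal(200)   # stand-in 200-sample EMG window
        print(td_features(emg_window))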

  3. Signal processing for non-destructive testing of railway tracks

    NASA Astrophysics Data System (ADS)

    Heckel, Thomas; Casperson, Ralf; Rühe, Sven; Mook, Gerhard

    2018-04-01

    Increased speed, heavier loads, altered material and modern drive systems result in an increasing number of rail flaws. The appearance of these flaws also changes continually due to the rapid change in damage mechanisms of modern rolling stock. Hence, interpretation has become difficult when evaluating non-destructive rail testing results. Due to the changed interplay between detection methods and flaws, the recorded signals may result in unclassified types of rail flaws. Methods for automatic rail inspection (according to defect detection and classification) undergo continual development. Signal processing is a key technology to master the challenge of classification and maintain resolution and detection quality, independent of operation speed. The basic ideas of signal processing, based on the Glassy-Rail-Diagram for classification purposes, are presented herein. Examples for the detection of damages caused by rolling contact fatigue also are given, and synergetic effects of combined evaluation of diverse inspection methods are shown.

  4. Clique-based data mining for related genes in a biomedical database.

    PubMed

    Matsunaga, Tsutomu; Yonemori, Chikara; Tomita, Etsuji; Muramatsu, Masaaki

    2009-07-01

    Progress in the life sciences cannot be made without integrating biomedical knowledge on numerous genes in order to help formulate hypotheses on the genetic mechanisms behind various biological phenomena, including diseases. There is thus a strong need for a way to automatically and comprehensively search from biomedical databases for related genes, such as genes in the same families and genes encoding components of the same pathways. Here we address the extraction of related genes by searching for densely-connected subgraphs, which are modeled as cliques, in a biomedical relational graph. We constructed a graph whose nodes were gene or disease pages, and edges were the hyperlink connections between those pages in the Online Mendelian Inheritance in Man (OMIM) database. We obtained over 20,000 sets of related genes (called 'gene modules') by enumerating cliques computationally. The modules included genes in the same family, genes for proteins that form a complex, and genes for components of the same signaling pathway. The results of experiments using 'metabolic syndrome'-related gene modules show that the gene modules can be used to get a coherent holistic picture helpful for interpreting relations among genes. We presented a data mining approach extracting related genes by enumerating cliques. The extracted gene sets provide a holistic picture useful for comprehending complex disease mechanisms.
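
    The clique-enumeration step described above maps directly onto standard graph libraries. The sketch below is a minimal illustration rather than the authors' OMIM pipeline: it assumes the hyperlink graph has already been built (here with the third-party networkx package and hypothetical gene/disease node names) and simply reports maximal cliques as candidate 'gene modules'.

```python
# Minimal sketch: maximal cliques of a gene/disease hyperlink graph as candidate modules.
# Assumes networkx is installed; the node names are hypothetical placeholders.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("GENE_A", "GENE_B"), ("GENE_B", "GENE_C"), ("GENE_A", "GENE_C"),  # a 3-clique
    ("GENE_C", "DISEASE_X"), ("GENE_D", "DISEASE_X"),
])

# Keep maximal cliques with at least three members as candidate gene modules.
modules = [clique for clique in nx.find_cliques(G) if len(clique) >= 3]
print(modules)  # e.g. [['GENE_A', 'GENE_B', 'GENE_C']]
```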

  5. A signal processing based analysis and prediction of seizure onset in patients with epilepsy

    PubMed Central

    Namazi, Hamidreza; Kulish, Vladimir V.

    2016-01-01

    One of the main areas of behavioural neuroscience is forecasting human behaviour. Epilepsy is a central nervous system disorder in which nerve cell activity in the brain becomes disrupted, causing seizures or periods of unusual behaviour, sensations and sometimes loss of consciousness. An estimated 5% of the world population experiences epileptic seizures, yet there is no method to cure the disorder, and more than 30% of people with epilepsy cannot control their seizures. Epileptic seizure prediction, which refers to forecasting the occurrence of epileptic seizures, is one of the most important but challenging problems in the biomedical sciences worldwide. In this research we propose a new methodology based on studying EEG signals using two measures, the Hurst exponent and the fractal dimension. In order to validate the proposed method, it is applied to the epileptic EEG signals of patients by computing the Hurst exponent and fractal dimension, and the results are then validated against the reference data. The results of these analyses show that we are able to forecast the onset of a seizure on average 25.76 seconds before the time of occurrence. PMID:26586477
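
    As a rough illustration of the first of the two measures, the sketch below estimates the Hurst exponent of a one-dimensional signal by classical rescaled-range (R/S) analysis in NumPy. It is a generic estimator under simple assumptions (log-spaced window sizes, a reasonably long record), not the authors' implementation; for self-affine signals the fractal dimension is often approximated as 2 - H.

```python
import numpy as np

def hurst_rs(x, min_window=8):
    """Estimate the Hurst exponent of a 1-D signal by rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Log-spaced window sizes between min_window and half the record length.
    sizes = np.unique(np.logspace(np.log10(min_window), np.log10(n // 2), 10).astype(int))
    rs = []
    for s in sizes:
        vals = []
        for start in range(0, n - s + 1, s):
            seg = x[start:start + s]
            dev = np.cumsum(seg - seg.mean())   # cumulative deviation from the segment mean
            r = dev.max() - dev.min()           # range of the cumulative deviation
            sd = seg.std()
            if sd > 0:
                vals.append(r / sd)             # rescaled range for this segment
        rs.append(np.mean(vals))
    # The Hurst exponent is the slope of log(R/S) against log(window size).
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope
```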

  6. Surface energy modification for biomedical material by corona streamer plasma processing to mitigate bacterial adhesion

    NASA Astrophysics Data System (ADS)

    Alhamarneh, Ibrahim; Pedrow, Patrick

    2011-10-01

    Bacterial adhesion initiates biofouling of biomedical materials, but the process can be reduced by adjusting the material's surface energy. The surface of surgical-grade 316L stainless steel (316L SS) had its hydrophilic property enhanced by processing in a corona streamer plasma reactor using atmospheric pressure Ar mixed with O2. Reactor excitation was 60 Hz ac high-voltage (<= 10 kV RMS) applied to a multi-needle-to-grounded-torus electrode configuration. Applied voltage and streamer current pulses were monitored with a broadband sensor system. When Ar/O2 plasma was used, the surface energy was enhanced more than with Ar plasma alone. Composition of the surface before and after plasma treatment was characterized by XPS. As the hydrophilicity of the treated surface increased, so did the percentage of oxygen on the surface; we therefore concluded that the reduction in contact angle was mainly due to new oxygen-containing functionalities. FTIR was used to identify oxygen-containing groups on the surface. The aging effect that accompanies surface free energy adjustments was also observed.

  7. Digital Signal Processing in Acoustics--Part 2.

    ERIC Educational Resources Information Center

    Davies, H.; McNeill, D. J.

    1986-01-01

    Reviews the potential of a data acquisition system for illustrating the nature and significance of ideas in digital signal processing. Focuses on the fast Fourier transform and the utility of its two-channel format, emphasizing cross-correlation and its two-microphone technique of acoustic intensity measurement. Includes programing format. (ML)
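
    The cross-correlation underlying the two-microphone intensity technique can be computed efficiently with the FFT. The sketch below is a generic NumPy implementation, assuming two channels sampled at the same rate; the lag of the correlation peak gives the relative delay between the microphones.

```python
import numpy as np

def xcorr_fft(a, b, fs=1.0):
    """Full linear cross-correlation of two channels via the FFT.

    Equivalent to np.correlate(a, b, mode='full') but O(N log N).
    Returns lags in seconds (given sampling rate fs) and correlation values.
    """
    n = len(a) + len(b) - 1
    nfft = 1 << (n - 1).bit_length()               # zero-pad to avoid circular wrap-around
    A = np.fft.rfft(a, nfft)
    B = np.fft.rfft(b, nfft)
    cc = np.fft.irfft(A * np.conj(B), nfft)
    cc = np.concatenate((cc[nfft - (len(b) - 1):], cc[:len(a)]))  # lags -(len(b)-1) ... len(a)-1
    lags = np.arange(-(len(b) - 1), len(a)) / fs
    return lags, cc

# The delay between the two microphone channels is the lag of the correlation peak:
# lags, cc = xcorr_fft(mic1, mic2, fs=48_000); delay = lags[np.argmax(cc)]
```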

  8. Displays, memories, and signal processing: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Articles on electronics systems and techniques were presented. The first section is on displays and other electro-optical systems; the second section is devoted to signal processing. The third section presents several new memory devices for digital equipment, including articles on holographic memories. The latest patent information available is also given.

  9. Unsupervised Structure Detection in Biomedical Data.

    PubMed

    Vogt, Julia E

    2015-01-01

    A major challenge in computational biology is to find simple representations of high-dimensional data that best reveal the underlying structure. In this work, we present an intuitive and easy-to-implement method based on ranked neighborhood comparisons that detects structure in unsupervised data. The method is based on ordering objects in terms of similarity and on the mutual overlap of nearest neighbors. This basic framework was originally introduced in the field of social network analysis to detect actor communities. We demonstrate that the same ideas can successfully be applied to biomedical data sets in order to reveal complex underlying structure. The algorithm is very efficient and works on distance data directly without requiring a vectorial embedding of data. Comprehensive experiments demonstrate the validity of this approach. Comparisons with state-of-the-art clustering methods show that the presented method outperforms hierarchical methods as well as density based clustering methods and model-based clustering. A further advantage of the method is that it simultaneously provides a visualization of the data. Especially in biomedical applications, the visualization of data can be used as a first pre-processing step when analyzing real world data sets to get an intuition of the underlying data structure. We apply this model to synthetic data as well as to various biomedical data sets which demonstrate the high quality and usefulness of the inferred structure.
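
    To make the neighbourhood-overlap idea concrete, the sketch below computes, from a distance matrix alone, how many of the k nearest neighbours each pair of objects has in common. It is a simplified illustration of ranked neighbourhood comparison (not the published algorithm or its community-detection step) and assumes a symmetric distance matrix with zeros on the diagonal.

```python
import numpy as np

def knn_overlap(D, k=10):
    """Mutual k-nearest-neighbour overlap computed directly from a distance matrix D.

    Works on distances only (no vectorial embedding needed); overlap[i, j] counts
    how many of the k nearest neighbours of objects i and j coincide.
    """
    n = D.shape[0]
    # Sort each row by distance; skip column 0, which is the object itself (distance 0).
    nbrs = np.argsort(D, axis=1)[:, 1:k + 1]
    sets = [set(row) for row in nbrs]
    overlap = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            overlap[i, j] = overlap[j, i] = len(sets[i] & sets[j])
    return overlap
```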

  10. Towards a Standard Mixed-Signal Parallel Processing Architecture for Miniature and Microrobotics

    PubMed Central

    Sadler, Brian M; Hoyos, Sebastian

    2014-01-01

    The conventional analog-to-digital conversion (ADC) and digital signal processing (DSP) architecture has led to major advances in miniature and micro-systems technology over the past several decades. The outlook for these systems is significantly enhanced by advances in sensing, signal processing, communications and control, and the combination of these technologies enables autonomous robotics on the miniature to micro scales. In this article we look at trends in the combination of analog and digital (mixed-signal) processing, and consider a generalized sampling architecture. Employing a parallel analog basis expansion of the input signal, this scalable approach is adaptable and reconfigurable, and is suitable for a large variety of current and future applications in networking, perception, cognition, and control. PMID:26601042

  11. Signal and noise modeling in confocal laser scanning fluorescence microscopy.

    PubMed

    Herberich, Gerlind; Windoffer, Reinhard; Leube, Rudolf E; Aach, Til

    2012-01-01

    Fluorescence confocal laser scanning microscopy (CLSM) has revolutionized imaging of subcellular structures in biomedical research by enabling the acquisition of 3D time-series of fluorescently-tagged proteins in living cells, hence forming the basis for an automated quantification of their morphological and dynamic characteristics. Due to the inherently weak fluorescence, CLSM images exhibit a low SNR. We present a novel model for the transfer of signal and noise in CLSM that is both theoretically sound as well as corroborated by a rigorous analysis of the pixel intensity statistics via measurement of the 3D noise power spectra, signal-dependence and distribution. Our model provides a better fit to the data than previously proposed models. Further, it forms the basis for (i) the simulation of the CLSM imaging process indispensable for the quantitative evaluation of CLSM image analysis algorithms, (ii) the application of Poisson denoising algorithms and (iii) the reconstruction of the fluorescence signal.
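
    For readers who want to experiment with point (i), a toy forward simulation of CLSM pixel intensities takes only a few lines. The sketch below uses a generic Poisson-plus-Gaussian detector model with hypothetical gain, offset and read-noise values; it captures the signal-dependent photon noise but does not reproduce the specific transfer model fitted in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_clsm(signal, gain=2.0, offset=10.0, read_sigma=1.5):
    """Toy CLSM pixel-intensity simulator (parameters are illustrative assumptions).

    `signal` is the noise-free expected photon count per pixel (any ndarray); the
    output is Poisson photon noise scaled by a detector gain, plus an offset and
    additive Gaussian readout noise.
    """
    photons = rng.poisson(np.clip(signal, 0, None))
    return gain * photons + offset + rng.normal(0.0, read_sigma, size=np.shape(signal))
```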

  12. Crosstalk between Wnt Signaling and RNA Processing in Colorectal Cancer.

    PubMed

    Bordonaro, Michael

    2013-01-01

    RNA processing involves a variety of processes affecting gene expression, including the removal of introns through RNA splicing, as well as 3' end processing (cleavage and polyadenylation). Alternative RNA processing is fundamentally important for gene regulation, and aberrant processing is associated with the initiation and progression of cancer. Deregulated Wnt signaling, which is the initiating event in the development of most cases of human colorectal cancer (CRC), has been linked to modified RNA processing, which may contribute to Wnt-mediated colonic carcinogenesis. Crosstalk between Wnt signaling and alternative RNA splicing with relevance to CRC includes effects on the expression of Rac1b, an alternatively spliced gene associated with tumorigenesis, which exhibits alternative RNA splicing that is influenced by Wnt activity. In addition, Tcf4, a crucial component of Wnt signaling, also exhibits alternative splicing, which is likely involved in colonic tumorigenesis. Modulation of 3' end formation, including of the Wnt target gene COX-2, also can influence the neoplastic process, with implications for CRC. While many human genes are dependent on introns and splicing for normal levels of gene expression, naturally intronless genes exist with a unique metabolism that allows for intron-independent gene expression. Effects of Wnt activity on the RNA metabolism of the intronless Wnt-target gene c-jun is a likely contributor to cancer development. Further, butyrate, a breakdown product of dietary fiber and a histone deacetylase inhibitor, upregulates Wnt activity in CRC cells, and also modulates RNA processing; therefore, the interplay between Wnt activity, the modulation of this activity by butyrate, and differential RNA metabolism in colonic cells can significantly influence tumorigenesis. Determining the role played by altered RNA processing in Wnt-mediated neoplasia may lead to novel interventions aimed at restoring normal RNA metabolism for therapeutic benefit

  13. Digital seismo-acoustic signal processing aboard a wireless sensor platform

    NASA Astrophysics Data System (ADS)

    Marcillo, O.; Johnson, J. B.; Lorincz, K.; Werner-Allen, G.; Welsh, M.

    2006-12-01

    We are developing a low power, low-cost wireless sensor array to conduct real-time signal processing of earthquakes at active volcanoes. The sensor array, which integrates data from both seismic and acoustic sensors, is based on Moteiv TMote Sky wireless sensor nodes (www.moteiv.com). The nodes feature a Texas Instruments MSP430 microcontroller, 48 Kbytes of program memory, 10 Kbytes of static RAM, 1 Mbyte of external flash memory, and a 2.4-GHz Chipcon CC2420 IEEE 802.15.4 radio. The TMote Sky is programmed in TinyOS. Basic signal processing occurs on an array of three peripheral sensor nodes. These nodes are tied into a dedicated GPS receiver node, which is focused on time synchronization, and a central communications node, which handles data integration and additional processing. The sensor nodes incorporate dual 12-bit digitizers sampling a seismic sensor and a pressure transducer at 100 samples per second. The wireless capabilities of the system allow flexible array geometry, with a maximum aperture of 200m. We have already developed the digital signal processing routines on board the Moteiv Tmote sensor nodes. The developed routines accomplish Real-time Seismic-Amplitude Measurement (RSAM), Seismic Spectral- Amplitude Measurement (SSAM), and a user-configured Short Term Averaging / Long Term Averaging (STA LTA ratio), which is used to calculate first arrivals. The processed data from individual nodes are transmitted back to a central node, where additional processing may be performed. Such processing will include back azimuth determination and other wave field analyses. Future on-board signal processing will focus on event characterization utilizing pattern recognition and spectral characterization. The processed data is intended as low bandwidth information which can be transmitted periodically and at low cost through satellite telemetry to a web server. The processing is limited by the computational capabilities (RAM, ROM) of the nodes. Nevertheless, we
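
    Of the on-board routines listed above, the STA/LTA ratio is the easiest to reproduce off-line. The sketch below is a generic moving-average version in NumPy (window lengths and the trigger threshold are arbitrary assumptions, and centred windows are used for simplicity); RSAM is essentially the long-window mean amplitude. It is not the TinyOS code running on the nodes.

```python
import numpy as np

def sta_lta(x, fs, sta_win=1.0, lta_win=30.0):
    """Short-term / long-term average ratio on the rectified signal.

    A first arrival is typically declared where the ratio exceeds a
    user-chosen threshold (e.g. 3-5).
    """
    env = np.abs(x)
    ns, nl = int(sta_win * fs), int(lta_win * fs)
    sta = np.convolve(env, np.ones(ns) / ns, mode='same')   # short-term mean amplitude
    lta = np.convolve(env, np.ones(nl) / nl, mode='same')   # long-term mean amplitude (RSAM-like)
    return sta / np.maximum(lta, 1e-12)
```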

  14. Recent advances in bulk metallic glasses for biomedical applications.

    PubMed

    Li, H F; Zheng, Y F

    2016-05-01

    With a continuously increasing aging population and the improvement of living standards, large demands of biomaterials are expected for a long time to come. Further development of novel biomaterials, that are much safer and of much higher quality, in terms of both biomedical and mechanical properties, are therefore of great interest for both the research scientists and clinical surgeons. Compared with the conventional crystalline metallic counterparts, bulk metallic glasses have unique amorphous structures, and thus exhibit higher strength, lower Young's modulus, improved wear resistance, good fatigue endurance, and excellent corrosion resistance. For this purpose, bulk metallic glasses (BMGs) have recently attracted much attention for biomedical applications. This review discusses and summarizes the recent developments and advances of bulk metallic glasses, including Ti-based, Zr-based, Fe-based, Mg-based, Zn-based, Ca-based and Sr-based alloying systems for biomedical applications. Future research directions will move towards overcoming the brittleness, increasing the glass forming ability (GFA) thus obtaining corresponding bulk metallic glasses with larger sizes, removing/reducing toxic elements, and surface modifications. Bulk metallic glasses (BMGs), also known as amorphous alloys or liquid metals, are relative newcomers in the field of biomaterials. They have gained increasing attention during the past decades, as they exhibit an excellent combination of properties and processing capabilities desired for versatile biomedical implant applications. The present work reviewed the recent developments and advances of biomedical BMGs, including Ti-based, Zr-based, Fe-based, Mg-based, Zn-based, Ca-based and Sr-based BMG alloying systems. Besides, the critical analysis and in-depth discussion on the current status, challenge and future development of biomedical BMGs are included. The possible solution to the BMG size limitation, the brittleness of BMGs has been

  15. A user's guide for the signal processing software for image and speech compression developed in the Communications and Signal Processing Laboratory (CSPL), version 1

    NASA Technical Reports Server (NTRS)

    Kumar, P.; Lin, F. Y.; Vaishampayan, V.; Farvardin, N.

    1986-01-01

    A complete documentation of the software developed in the Communication and Signal Processing Laboratory (CSPL) during the period of July 1985 to March 1986 is provided. Utility programs and subroutines that were developed for a user-friendly image and speech processing environment are described. Additional programs for data compression of image and speech type signals are included. Also, programs for the zero-memory and block transform quantization in the presence of channel noise are described. Finally, several routines for simulating the performance of image compression algorithms are included.

  16. Nonlinear Blind Compensation for Array Signal Processing Application

    PubMed Central

    Ma, Hong; Jin, Jiang; Zhang, Hua

    2018-01-01

    Recently, nonlinear blind compensation techniques have attracted growing attention in array signal processing applications. However, due to the nonlinear distortion stemming from the array receiver, which consists of multi-channel radio frequency (RF) front-ends, it is difficult to estimate the parameters of the array signal accurately. A novel nonlinear blind compensation algorithm is proposed to mitigate the nonlinearity of the array receiver and improve its spurious-free dynamic range (SFDR), enabling more precise estimation of the parameters of target signals such as their two-dimensional directions of arrival (2-D DOAs). The suggested method is designed as follows: the nonlinear model parameters of any channel of the RF front-end are extracted and used to synchronously compensate the nonlinear distortion of the entire receiver. Furthermore, a verification experiment on the array signal from a uniform circular array (UCA) is used to test the validity of our approach. The real-world experimental results show that the SFDR of the receiver is enhanced, leading to a significant improvement in 2-D DOA estimation performance for weak target signals. These results demonstrate that our nonlinear blind compensation algorithm is effective for estimating the parameters of weak array signals in the presence of strong jammers. PMID:29690571

  17. Biophoton signal transmission and processing in the brain.

    PubMed

    Tang, Rendong; Dai, Jiapei

    2014-10-05

    The transmission and processing of neural information in the nervous system plays a key role in neural functions. It is well accepted that neural communication is mediated by bioelectricity and chemical molecules via the processes called bioelectrical and chemical transmission, respectively. Indeed, the traditional theories seem to give valuable explanations for the basic functions of the nervous system, but it remains difficult to construct generally accepted concepts or principles that provide reasonable explanations of higher brain functions and mental activities, such as perception, learning and memory, emotion and consciousness. Therefore, many unanswered questions and debates over the neural encoding and mechanisms of neuronal networks remain. Cell to cell communication by biophotons, also called ultra-weak photon emissions, has been demonstrated in several plants, bacteria and certain animal cells. Recently, both experimental evidence and theoretical speculation have suggested that biophotons may play a potential role in neural signal transmission and processing, contributing to the understanding of the higher functions of the nervous system. In this paper, we review the relevant experimental findings and discuss the possible underlying mechanisms of biophoton signal transmission and processing in the nervous system. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Electrical Characterization of Signal Processing Microcircuit

    DTIC Science & Technology

    1989-04-01

    Only OCR fragments of this report's abstract survive. They indicate work by an assurance branch of the Rome Air Development Center pertaining to the electrical characterization and MIL-M-38510 specification of analog signal processing microcircuits, with sections covering analog multipliers (MIL-M-38510/139) and device keywords including analog multiplexers, analog switches, a transistor array, and device characterization.

  19. Optical signal processing of spatially distributed sensor data in smart structures

    NASA Technical Reports Server (NTRS)

    Bennett, K. D.; Claus, R. O.; Murphy, K. A.; Goette, A. M.

    1989-01-01

    Smart structures which contain dense two- or three-dimensional arrays of attached or embedded sensor elements inherently require signal multiplexing and processing capabilities to permit good spatial data resolution as well as the adequately short calculation times demanded by real time active feedback actuator drive circuitry. This paper reports the implementation of an in-line optical signal processor and its application in a structural sensing system which incorporates multiple discrete optical fiber sensor elements. The signal processor consists of an array of optical fiber couplers having tailored s-parameters and arranged to allow gray code amplitude scaling of sensor inputs. The use of this signal processor in systems designed to indicate the location of distributed strain and damage in composite materials, as well as to quantitatively characterize that damage, is described. Extension of similar signal processing methods to more complicated smart materials and structures applications is discussed.

  20. Engineering Cell-Cell Signaling

    PubMed Central

    Milano, Daniel F.; Natividad, Robert J.; Asthagiri, Anand R.

    2014-01-01

    Juxtacrine cell-cell signaling mediated by the direct interaction of adjoining mammalian cells is arguably the mode of cell communication that is most recalcitrant to engineering. Overcoming this challenge is crucial for progress in biomedical applications, such as tissue engineering, regenerative medicine, immune system engineering and therapeutic design. Here, we describe the significant advances that have been made in developing synthetic platforms (materials and devices) and synthetic cells (cell surface engineering and synthetic gene circuits) to modulate juxtacrine cell-cell signaling. In addition, significant progress has been made in elucidating design rules and strategies to modulate juxtacrine signaling based on quantitative, engineering analysis of the mechanical and regulatory role of juxtacrine signals in the context of other cues and physical constraints in the microenvironment. These advances in engineering juxtacrine signaling lay a strong foundation for an integrative approach to utilizing synthetic cells, advanced ‘chassis’ and predictive modeling to engineer the form and function of living tissues. PMID:23856592

  1. Advanced detectors and signal processing

    NASA Technical Reports Server (NTRS)

    Greve, D. W.; Rasky, P. H. L.; Kryder, M. H.

    1986-01-01

    Continued progress is reported toward development of a silicon on garnet technology which would allow fabrication of advanced detection and signal processing circuits on bubble memories. The first integrated detectors and propagation patterns have been designed and incorporated on a new mask set. In addition, annealing studies on spacer layers are performed. Based on those studies, a new double layer spacer is proposed which should reduce contamination of the silicon originating in the substrate. Finally, the magnetic sensitivity of uncontaminated detectors from the last lot of wafers is measured. The measured sensitivity is lower than anticipated but still higher than present magnetoresistive detectors.

  2. Signal Processing Expert Code (SPEC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ames, H.S.

    1985-12-01

    The purpose of this paper is to describe a prototype expert system called SPEC which was developed to demonstrate the utility of providing an intelligent interface for users of SIG, a general purpose signal processing code. The expert system is written in NIL, runs on a VAX 11/750 and consists of a backward chaining inference engine and an English-like parser. The inference engine uses knowledge encoded as rules about the formats of SIG commands and about how to perform frequency analyses using SIG. The system demonstrated that expert systems can be used to control existing codes.
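
    The backward-chaining idea is easy to illustrate outside of NIL. The toy resolver below treats rules as (premises, conclusion) pairs and proves a goal recursively; the rule and fact names are hypothetical stand-ins, not SPEC's actual knowledge base about SIG command formats.

```python
def backward_chain(goal, rules, facts):
    """Tiny backward-chaining resolver: a goal holds if it is a known fact, or if some
    rule concluding the goal has all of its premises provable recursively."""
    if goal in facts:
        return True
    for premises, conclusion in rules:
        if conclusion == goal and all(backward_chain(p, rules, facts) for p in premises):
            return True
    return False

# Hypothetical toy knowledge base, loosely in the spirit of rules about command formats.
rules = [
    (("signal loaded", "window chosen"), "spectrum can be computed"),
    (("file exists",), "signal loaded"),
]
facts = {"file exists", "window chosen"}
print(backward_chain("spectrum can be computed", rules, facts))  # True
```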

  3. The diversity of experimental organisms in biomedical research may be influenced by biomedical funding.

    PubMed

    Erick Peirson, B R; Kropp, Heather; Damerow, Julia; Laubichler, Manfred D

    2017-05-01

    Contrary to concerns of some critics, we present evidence that biomedical research is not dominated by a small handful of model organisms. An exhaustive analysis of research literature suggests that the diversity of experimental organisms in biomedical research has increased substantially since 1975. There has been a longstanding worry that organism-centric funding policies can lead to biases in experimental organism choice, and thus negatively impact the direction of research and the interpretation of results. Critics have argued that a focus on model organisms has unduly constrained the diversity of experimental organisms. The availability of large electronic databases of scientific literature, combined with interest in quantitative methods among philosophers of science, presents new opportunities for data-driven investigations into organism choice in biomedical research. The diversity of organisms used in NIH-funded research may be considerably lower than in the broader biomedical sciences, and may be subject to greater constraints on organism choice. © 2017 WILEY Periodicals, Inc.

  4. Rising Expectations: Access to Biomedical Information

    PubMed Central

    Lindberg, D. A. B.; Humphreys, B. L.

    2008-01-01

    Summary Objective To provide an overview of the expansion in public access to electronic biomedical information over the past two decades, with an emphasis on developments to which the U.S. National Library of Medicine contributed. Methods Review of the increasingly broad spectrum of web-accessible genomic data, biomedical literature, consumer health information, clinical trials data, and images. Results The amount of publicly available electronic biomedical information has increased dramatically over the past twenty years. Rising expectations regarding access to biomedical information were stimulated by the spread of the Internet, the World Wide Web, and advanced searching and linking techniques. These informatics advances simplified and improved access to electronic information and reduced costs, which enabled inter-organizational collaborations to build and maintain large international information resources and also aided outreach and education efforts. The demonstrated benefits of free access to electronic biomedical information encouraged the development of public policies that further increase the amount of information available. Conclusions Continuing rapid growth of publicly accessible electronic biomedical information presents tremendous opportunities and challenges, including the need to ensure uninterrupted access during disasters or emergencies and to manage digital resources so they remain available for future generations. PMID:18587496

  5. Total focusing method with correlation processing of antenna array signals

    NASA Astrophysics Data System (ADS)

    Kozhemyak, O. A.; Bortalevich, S. I.; Loginov, E. L.; Shinyakov, Y. A.; Sukhorukov, M. P.

    2018-03-01

    The article proposes a method of preliminary correlation processing of the complete set of antenna array signals used in the image reconstruction algorithm. The results of experimental studies of 3D reconstruction of various reflectors, with and without correlation processing, are presented. The software ‘IDealSystem3D’ by IDeal-Technologies was used for the experiments, with copper wires of different diameters located in a water bath used as reflectors. The use of correlation processing makes it possible to obtain a more accurate reconstruction of the reflector image and to increase the signal-to-noise ratio. The experimental results were processed using an original program that allows the parameters of the antenna array and the sampling frequency to be varied.
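
    The image reconstruction stage that the proposed correlation preprocessing feeds into is, in its basic form, a delay-and-sum total focusing method over full-matrix-capture data. The sketch below shows only that baseline step (the correlation preprocessing itself is not reproduced), and the array geometry, wave speed and sampling rate are caller-supplied assumptions.

```python
import numpy as np

def tfm_image(fmc, coords, grid_x, grid_z, c, fs):
    """Baseline delay-and-sum total focusing method (TFM).

    fmc[t, r, :] -- A-scan recorded with transmitter t and receiver r
    coords[k]    -- x position of array element k (elements assumed on the line z = 0)
    c, fs        -- wave speed in the medium and sampling frequency
    """
    coords = np.asarray(coords, dtype=float)
    n_el, _, n_samp = fmc.shape
    img = np.zeros((len(grid_z), len(grid_x)))
    for iz, z in enumerate(grid_z):
        for ix, x in enumerate(grid_x):
            tof = np.sqrt((coords - x) ** 2 + z ** 2) / c     # one-way times to the focal point
            for t in range(n_el):
                idx = np.round((tof[t] + tof) * fs).astype(int)   # transmit + receive delays
                valid = idx < n_samp
                img[iz, ix] += fmc[t, valid, idx[valid]].sum()
    return np.abs(img)
```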

  6. Publishing priorities of biomedical research funders

    PubMed Central

    Collins, Ellen

    2013-01-01

    Objectives To understand the publishing priorities, especially in relation to open access, of 10 UK biomedical research funders. Design Semistructured interviews. Setting 10 UK biomedical research funders. Participants 12 employees with responsibility for research management at 10 UK biomedical research funders; a purposive sample to represent a range of backgrounds and organisation types. Conclusions Publicly funded and large biomedical research funders are committed to open access publishing and are pleased with recent developments which have stimulated growth in this area. Smaller charitable funders are supportive of the aims of open access, but are concerned about the practical implications for their budgets and their funded researchers. Across the board, biomedical research funders are turning their attention to other priorities for sharing research outputs, including data, protocols and negative results. Further work is required to understand how smaller funders, including charitable funders, can support open access. PMID:24154520

  7. Biological Signal Processing with a Genetic Toggle Switch

    PubMed Central

    Hillenbrand, Patrick; Fritz, Georg; Gerland, Ulrich

    2013-01-01

    Complex gene regulation requires responses that depend not only on the current levels of input signals but also on signals received in the past. In digital electronics, logic circuits with this property are referred to as sequential logic, in contrast to the simpler combinatorial logic without such internal memory. In molecular biology, memory is implemented in various forms such as biochemical modification of proteins or multistable gene circuits, but the design of the regulatory interface, which processes the input signals and the memory content, is often not well understood. Here, we explore design constraints for such regulatory interfaces using coarse-grained nonlinear models and stochastic simulations of detailed biochemical reaction networks. We test different designs for biological analogs of the most versatile memory element in digital electronics, the JK-latch. Our analysis shows that simple protein-protein interactions and protein-DNA binding are sufficient, in principle, to implement genetic circuits with the capabilities of a JK-latch. However, it also exposes fundamental limitations to its reliability, due to the fact that biological signal processing is asynchronous, in contrast to most digital electronics systems that feature a central clock to orchestrate the timing of all operations. We describe a seemingly natural way to improve the reliability by invoking the master-slave concept from digital electronics design. This concept could be useful to interpret the design of natural regulatory circuits, and for the design of synthetic biological systems. PMID:23874595
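
    As a point of reference for the biological designs discussed above, the ideal digital behaviour of a JK-latch reduces to four truth-table cases. The sketch below encodes that electronics-textbook abstraction (hold, reset, set, toggle); it is not the stochastic biochemical circuit model analysed in the paper.

```python
def jk_latch(j, k, q):
    """Next state of an ideal (clocked) JK latch: hold, reset, set, or toggle."""
    return {(0, 0): q, (0, 1): 0, (1, 0): 1, (1, 1): 1 - q}[(j, k)]

# Truth-table walk-through starting from Q = 0.
q = 0
for j, k in [(1, 0), (0, 0), (1, 1), (0, 1)]:
    q = jk_latch(j, k, q)
    print(f"J={j} K={k} -> Q={q}")   # set, hold, toggle, reset
```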

  8. Biomedical Applications of Nanodiamonds: An Overview.

    PubMed

    Passeri, D; Rinaldi, F; Ingallina, C; Carafa, M; Rossi, M; Terranova, M L; Marianecci, C

    2015-02-01

    Nanodiamonds are a novel class of nanomaterials which have attracted much attention for applications in the biomedical field, as they combine the possibility of being produced on a large scale using relatively inexpensive synthetic processes, of being fluorescent as a consequence of the presence of nitrogen vacancies, of having their surfaces functionalized, and of having good biocompatibility. Among other applications, we mainly focus on drug delivery, including cell interaction, targeting, cancer therapy, gene and protein delivery. In addition, the use of nanodiamonds for bone and dental implants and for antibacterial purposes is discussed. Techniques for detection and imaging of nanodiamonds in biological tissues are also reviewed, including electron microscopy, fluorescence microscopy, Raman mapping, atomic force microscopy, thermal imaging, magnetic resonance imaging, and positron emission tomography, either in vitro, in vivo, or ex vivo. Toxicological aspects related to the use of nanodiamonds are also discussed. Finally, patents, preclinical and clinical trials based on the use of nanodiamonds for biomedical applications are reviewed.

  9. Signal processing in an acousto-optical spectral colorimeter

    NASA Astrophysics Data System (ADS)

    Emeljanov, Sergey P.; Kludzin, Victor V.; Kochin, Leonid B.; Medvedev, Sergey V.; Polosin, Lev L.; Sokolov, Vladimir K.

    2002-02-01

    The algorithms for processing spectrometer signals in the acousto-optical spectral colorimeter proposed earlier are discussed. This processing is aimed at eliminating distortions introduced by the spectral characteristics of the optical system and by the photoelectric transformations, and at calculating the tristimulus coefficients X, Y, Z in the international CIE 1931 colorimetric system and transforming them into coordinates of the recommended CIE uniform contrast systems LUV and LAB.
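
    The tristimulus step amounts to integrating the measured spectrum against the CIE 1931 colour-matching functions, followed by a standard transform to a uniform colour space. The sketch below assumes the colour-matching functions and a reference white are supplied on the same wavelength grid; normalisation conventions vary by application, so the absolute scale is left to the caller.

```python
import numpy as np

def tristimulus(wavelengths, spectrum, xbar, ybar, zbar):
    """Tristimulus integrals of a spectrum against colour-matching functions
    sampled on the same wavelength grid (absolute scaling is application-dependent)."""
    X = np.trapz(spectrum * xbar, wavelengths)
    Y = np.trapz(spectrum * ybar, wavelengths)
    Z = np.trapz(spectrum * zbar, wavelengths)
    return X, Y, Z

def xyz_to_cielab(X, Y, Z, Xn, Yn, Zn):
    """Standard CIE 1976 L*a*b* transform relative to a reference white (Xn, Yn, Zn);
    scalar inputs assumed."""
    def f(t):
        d = 6.0 / 29.0
        return np.cbrt(t) if t > d ** 3 else t / (3 * d ** 2) + 4.0 / 29.0
    fx, fy, fz = f(X / Xn), f(Y / Yn), f(Z / Zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)
```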

  10. Differences in signal peptide processing between GP3 glycoproteins of Arteriviridae.

    PubMed

    Zhang, Minze; Veit, Michael

    2018-04-01

    We reported previously that carbohydrate attachment to an overlapping glycosylation site adjacent to the signal peptide of GP3 from equine arteritis virus (EAV) prevents cleavage. Here we investigated whether this unusual processing scheme is a feature of GP3s of other Arteriviridae, which all contain a glycosylation site at a similar position. Expression of GP3 from type-1 and type-2 porcine reproductive and respiratory syndrome virus (PRRSV) and from lactate dehydrogenase-elevating virus (LDV) revealed that the first glycosylation site is used, but has no effect on signal peptide cleavage. Comparison of the SDS-PAGE mobility of deglycosylated GP3 from PRRSV and LDV with mutants having or not having a signal peptide showed that GP3's signal peptide is cleaved. Swapping the signal peptides between GP3 of EAV and PRRSV revealed that the information for co-translational processing is not encoded in the signal peptide, but in the remaining part of GP3. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. BioStar models of clinical and genomic data for biomedical data warehouse design

    PubMed Central

    Wang, Liangjiang; Ramanathan, Murali

    2008-01-01

    Biomedical research is now generating large amounts of data, ranging from clinical test results to microarray gene expression profiles. The scale and complexity of these datasets give rise to substantial challenges in data management and analysis. It is highly desirable that data warehousing and online analytical processing technologies can be applied to biomedical data integration and mining. The major difficulty probably lies in the task of capturing and modelling diverse biological objects and their complex relationships. This paper describes multidimensional data modelling for biomedical data warehouse design. Since the conventional models such as star schema appear to be insufficient for modelling clinical and genomic data, we develop a new model called BioStar schema. The new model can capture the rich semantics of biomedical data and provide greater extensibility for the fast evolution of biological research methodologies. PMID:18048122

  12. Adaptive signal processing at NOSC

    NASA Astrophysics Data System (ADS)

    Albert, T. R.

    1992-03-01

    Adaptive signal processing work at the Naval Ocean Systems Center (NOSC) dates back to the late 1960s. It began as an IR/IED project by John McCool, who made use of an adaptive algorithm that had been developed by Professor Bernard Widrow of Stanford University. In 1972, a team led by McCool built the first hardware implementation of the algorithm that could process in real-time at acoustic bandwidths. Early tests with the two units that were built were extremely successful, and attracted much attention. Sponsors from different commands provided funding to develop hardware for submarine, surface ship, airborne, and other systems. In addition, an effort was initiated to analyze performance and behavior of the algorithm. Most of the hardware development and analysis efforts were active through the 1970s, and a few into the 1980s. One of the original programs continues to this date.
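
    The Widrow LMS algorithm referred to above fits in a dozen lines of NumPy. The sketch below is the textbook form (tap count and step size are arbitrary assumptions): it adapts FIR weights so the filter output tracks a desired signal, with the error signal serving as the useful output in noise-cancelling configurations.

```python
import numpy as np

def lms_filter(x, d, n_taps=16, mu=0.01):
    """Widrow-Hoff LMS adaptive filter.

    Adapts FIR weights w so that w @ x_window tracks the desired signal d
    (x and d assumed to have equal length); returns the error signal and final weights.
    """
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for n in range(n_taps, len(x)):
        xw = x[n - n_taps:n][::-1]      # most recent samples first
        y = w @ xw                      # filter output
        e[n] = d[n] - y                 # estimation error
        w += 2 * mu * e[n] * xw         # steepest-descent weight update
    return e, w
```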

  13. [Ethics and biomedical research].

    PubMed

    Goussard, Christophe

    2007-01-01

    Ethics in biomedical research took off from the 1947 Nuremberg Code to its own right in the wake of the Declaration of Helsinki in 1964. Since then, (inter)national regulations and guidelines providing a framework for clinical studies and protection for study participants have been drafted and implemented, while ethics committees and drug evaluation agencies have sprung up throughout the world. These two developments were crucial in bringing about the protection of rights and safety of the participants and harmonization of the conduct of biomedical research. Ethics committees and drug evaluation agencies deliver ethical and scientific assessments on the quality and safety of the projects submitted to them and issue respectively approvals and authorizations to carry out clinical trials, while ensuring that they comply with regulatory requirements, ethical principles, and scientific guidelines. The advent of biomedical ethics, together with the responsible commitment of clinical investigators and of the pharmaceutical industry, has guaranteed respect for the patient, for whom and with whom research is conducted. Just as importantly, it has also ensured that patients reap the benefit of what is the primary objective of biomedical research: greater life expectancy, well-being, and quality of life.

  14. Environmental/Biomedical Terminology Index

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huffstetler, J.K.; Dailey, N.S.; Rickert, L.W.

    1976-12-01

    The Information Center Complex (ICC), a centrally administered group of information centers, provides information support to environmental and biomedical research groups and others within and outside Oak Ridge National Laboratory. In-house data base building and development of specialized document collections are important elements of the ongoing activities of these centers. ICC groups must be concerned with language which will adequately classify and insure retrievability of document records. Language control problems are compounded when the complexity of modern scientific problem solving demands an interdisciplinary approach. Although there are several word lists, indexes, and thesauri specific to various scientific disciplines usually grouped as Environmental Sciences, no single generally recognized authority can be used as a guide to the terminology of all environmental science. If biomedical terminology for the description of research on environmental effects is also needed, the problem becomes even more complex. The building of a word list which can be used as a general guide to the environmental/biomedical sciences has been a continuing activity of the Information Center Complex. This activity resulted in the publication of the Environmental Biomedical Terminology Index (EBTI).

  15. Fourier analysis and signal processing by use of the Moebius inversion formula

    NASA Technical Reports Server (NTRS)

    Reed, Irving S.; Yu, Xiaoli; Shih, Ming-Tang; Tufts, Donald W.; Truong, T. K.

    1990-01-01

    A novel Fourier technique for digital signal processing is developed. This approach to Fourier analysis is based on the number-theoretic method of the Moebius inversion of series. The Fourier transform method developed is shown also to yield the convolution of two signals. A computer simulation shows that this method for finding Fourier coefficients is quite suitable for digital signal processing. It competes with the classical FFT (fast Fourier transform) approach in terms of accuracy, complexity, and speed.
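
    The number-theoretic tool underlying the method is the Moebius inversion of divisor sums. The sketch below demonstrates the inversion itself on an arbitrary test sequence, not the paper's full Fourier-coefficient algorithm: if g(n) is the sum of f(d) over the divisors d of n, then f is recovered from g by weighting with the Moebius function.

```python
def mobius(n):
    """Moebius function mu(n): 0 if n has a squared prime factor,
    otherwise (-1) raised to the number of distinct prime factors."""
    if n == 1:
        return 1
    result, p, m = 1, 2, n
    while p * p <= m:
        if m % p == 0:
            m //= p
            if m % p == 0:
                return 0          # squared prime factor
            result = -result
        p += 1
    return -result if m > 1 else result

def divisors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

# Moebius inversion: if g(n) = sum_{d | n} f(d), then f(n) = sum_{d | n} mu(n/d) g(d).
f = {n: n * n for n in range(1, 13)}                              # arbitrary test sequence
g = {n: sum(f[d] for d in divisors(n)) for n in f}                # divisor sums
f_rec = {n: sum(mobius(n // d) * g[d] for d in divisors(n)) for n in f}
assert f_rec == f                                                 # the original sequence is recovered
```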

  16. LWT Based Sensor Node Signal Processing in Vehicle Surveillance Distributed Sensor Network

    NASA Astrophysics Data System (ADS)

    Cha, Daehyun; Hwang, Chansik

    Previous vehicle surveillance research on distributed sensor networks focused on overcoming the power limitations and communication bandwidth constraints of sensor nodes. In spite of these constraints, a vehicle surveillance sensor node must provide signal compression, feature extraction, target localization, noise cancellation and collaborative signal processing with low computation and communication energy dissipation. In this paper, we introduce an algorithm for light-weight wireless sensor node signal processing based on lifting-scheme wavelet analysis feature extraction in a distributed sensor network.
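
    A minimal example of the lifting scheme referred to above is the Haar wavelet written as split/predict/update steps, which needs only additions, subtractions and a halving and is therefore attractive on constrained sensor nodes. The sketch below is the generic one-level transform and its inverse, not the sensor-node implementation from the paper.

```python
import numpy as np

def haar_lifting_forward(x):
    """One level of the Haar wavelet transform as lifting steps
    (split -> predict -> update); x must have even length."""
    x = np.asarray(x, dtype=float)
    even, odd = x[0::2], x[1::2]
    detail = odd - even               # predict: each odd sample predicted by its even neighbour
    approx = even + 0.5 * detail      # update: preserve the running mean
    return approx, detail

def haar_lifting_inverse(approx, detail):
    """Exact inverse of haar_lifting_forward."""
    even = approx - 0.5 * detail
    odd = detail + even
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd
    return x
```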

  17. New Software Developments for Quality Mesh Generation and Optimization from Biomedical Imaging Data

    PubMed Central

    Yu, Zeyun; Wang, Jun; Gao, Zhanheng; Xu, Ming; Hoshijima, Masahiko

    2013-01-01

    In this paper we present a new software toolkit for generating and optimizing surface and volumetric meshes from three-dimensional (3D) biomedical imaging data, targeted at image-based finite element analysis of some biomedical activities in a single material domain. Our toolkit includes a series of geometric processing algorithms including surface re-meshing and quality-guaranteed tetrahedral mesh generation and optimization. All methods described have been encapsulated into a user-friendly graphical interface for easy manipulation and informative visualization of biomedical images and mesh models. Numerous examples are presented to demonstrate the effectiveness and efficiency of the described methods and toolkit. PMID:24252469

  18. New software developments for quality mesh generation and optimization from biomedical imaging data.

    PubMed

    Yu, Zeyun; Wang, Jun; Gao, Zhanheng; Xu, Ming; Hoshijima, Masahiko

    2014-01-01

    In this paper we present a new software toolkit for generating and optimizing surface and volumetric meshes from three-dimensional (3D) biomedical imaging data, targeted at image-based finite element analysis of some biomedical activities in a single material domain. Our toolkit includes a series of geometric processing algorithms including surface re-meshing and quality-guaranteed tetrahedral mesh generation and optimization. All methods described have been encapsulated into a user-friendly graphical interface for easy manipulation and informative visualization of biomedical images and mesh models. Numerous examples are presented to demonstrate the effectiveness and efficiency of the described methods and toolkit. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  19. Bridging medicine and biomedical technology: enhance translation of fundamental research to patient care

    PubMed Central

    Raff, Adam B.; Seiler, Theo G.; Apiou-Sbirlea, Gabriela

    2017-01-01

    The ‘Bridging medicine and biomedical technology’ special all-congress session took place for the first time at the OSA Biophotonics Congress: Optics in Life Sciences in 2017 (http://www.osa.org/enus/meetings/osa_meetings/optics_in_the_life_sciences/bridging_medicine_and_biomedical_technology_specia/). The purpose was to identify key challenges the biomedical scientists in academia have to overcome to translate their discoveries into clinical practice through robust collaborations with industry and discuss best practices to facilitate and accelerate the process. Our paper is intended to complement the session by providing a deeper insight into the concept behind the structure and the content we developed. PMID:29296473

  20. Functionalized carbon nanotubes: biomedical applications

    PubMed Central

    Vardharajula, Sandhya; Ali, Sk Z; Tiwari, Pooja M; Eroğlu, Erdal; Vig, Komal; Dennis, Vida A; Singh, Shree R

    2012-01-01

    Carbon nanotubes (CNTs) are emerging as novel nanomaterials for various biomedical applications. CNTs can be used to deliver a variety of therapeutic agents, including biomolecules, to the target disease sites. In addition, their unparalleled optical and electrical properties make them excellent candidates for bioimaging and other biomedical applications. However, the high cytotoxicity of CNTs limits their use in humans and many biological systems. The biocompatibility and low cytotoxicity of CNTs are attributed to size, dose, duration, testing systems, and surface functionalization. The functionalization of CNTs improves their solubility and biocompatibility and alters their cellular interaction pathways, resulting in much-reduced cytotoxic effects. Functionalized CNTs are promising novel materials for a variety of biomedical applications. These potential applications are particularly enhanced by their ability to penetrate biological membranes with relatively low cytotoxicity. This review is directed towards the overview of CNTs and their functionalization for biomedical applications with minimal cytotoxicity. PMID:23091380

  1. Functionalized carbon nanotubes: biomedical applications.

    PubMed

    Vardharajula, Sandhya; Ali, Sk Z; Tiwari, Pooja M; Eroğlu, Erdal; Vig, Komal; Dennis, Vida A; Singh, Shree R

    2012-01-01

    Carbon nanotubes (CNTs) are emerging as novel nanomaterials for various biomedical applications. CNTs can be used to deliver a variety of therapeutic agents, including biomolecules, to the target disease sites. In addition, their unparalleled optical and electrical properties make them excellent candidates for bioimaging and other biomedical applications. However, the high cytotoxicity of CNTs limits their use in humans and many biological systems. The biocompatibility and low cytotoxicity of CNTs are attributed to size, dose, duration, testing systems, and surface functionalization. The functionalization of CNTs improves their solubility and biocompatibility and alters their cellular interaction pathways, resulting in much-reduced cytotoxic effects. Functionalized CNTs are promising novel materials for a variety of biomedical applications. These potential applications are particularly enhanced by their ability to penetrate biological membranes with relatively low cytotoxicity. This review is directed towards the overview of CNTs and their functionalization for biomedical applications with minimal cytotoxicity.

  2. Conformation-based signal transfer and processing at the single-molecule level

    NASA Astrophysics Data System (ADS)

    Li, Chao; Wang, Zhongping; Lu, Yan; Liu, Xiaoqing; Wang, Li

    2017-11-01

    Building electronic components made of individual molecules is a promising strategy for the miniaturization and integration of electronic devices. However, the practical realization of molecular devices and circuits for signal transmission and processing at room temperature has proven challenging. Here, we present room-temperature intermolecular signal transfer and processing using SnCl2Pc molecules on a Cu(100) surface. The in-plane orientations of the molecules are effectively coupled via intermolecular interaction and serve as the information carrier. In the coupled molecular arrays, the signal can be transferred from one molecule to another in the in-plane direction along predesigned routes and processed to realize logical operations. These phenomena enable the use of molecules displaying intrinsic bistable states as complex molecular devices and circuits with novel functions.

  3. Benefits of Software GPS Receivers for Enhanced Signal Processing

    DTIC Science & Technology

    2000-01-01

    Published in GPS Solutions 4(1), Summer 2000, pages 56-66, by Alison Brown. In this paper the architecture of a software GPS receiver is described, and an analysis is included of the performance of a software GPS receiver when tracking GPS signals in challenging environments.

  4. Analog integrated circuits design for processing physiological signals.

    PubMed

    Li, Yan; Poon, Carmen C Y; Zhang, Yuan-Ting

    2010-01-01

    Analog integrated circuits (ICs) designed for processing physiological signals are important building blocks of wearable and implantable medical devices used for health monitoring or restoring lost body functions. Due to the nature of physiological signals and the corresponding application scenarios, the ICs designed for these applications should have low power consumption, low cutoff frequency, and low input-referred noise. In this paper, techniques for designing the analog front-end circuits with these three characteristics will be reviewed, including subthreshold circuits, bulk-driven MOSFETs, floating gate MOSFETs, and log-domain circuits to reduce power consumption; methods for designing fully integrated low cutoff frequency circuits; as well as chopper stabilization (CHS) and other techniques that can be used to achieve a high signal-to-noise performance. Novel applications using these techniques will also be discussed.

  5. Biomedical application of MALDI mass spectrometry for small-molecule analysis.

    PubMed

    van Kampen, Jeroen J A; Burgers, Peter C; de Groot, Ronald; Gruters, Rob A; Luider, Theo M

    2011-01-01

    Matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS) is an emerging analytical tool for the analysis of molecules with molar masses below 1,000 Da; that is, small molecules. This technique offers rapid analysis, high sensitivity, low sample consumption, a relative high tolerance towards salts and buffers, and the possibility to store sample on the target plate. The successful application of the technique is, however, hampered by low molecular weight (LMW) matrix-derived interference signals and by poor reproducibility of signal intensities during quantitative analyses. In this review, we focus on the biomedical application of MALDI-MS for the analysis of small molecules and discuss its favorable properties and its challenges as well as strategies to improve the performance of the technique. Furthermore, practical aspects and applications are presented. © 2010 Wiley Periodicals, Inc.

  6. Nonlinear Real-Time Optical Signal Processing.

    DTIC Science & Technology

    1988-07-01

    Principal Investigator: B. K. Jenkins, Signal and Image Processing Institute, University of Southern California, Los Angeles, California. Only report-form fragments of the abstract survive; the summary covers work performed during the period 1 July 1987 - 30 June 1988.

  7. Resolving Complex Research Data Management Issues in Biomedical Laboratories: Qualitative Study of an Industry-Academia Collaboration

    PubMed Central

    Myneni, Sahiti; Patel, Vimla L.; Bova, G. Steven; Wang, Jian; Ackerman, Christopher F.; Berlinicke, Cynthia A.; Chen, Steve H.; Lindvall, Mikael; Zack, Donald J.

    2016-01-01

    This paper describes a distributed collaborative effort between industry and academia to systematize data management in an academic biomedical laboratory. The heterogeneous and voluminous nature of research data created in biomedical laboratories makes information management difficult and research unproductive. One such collaborative effort was evaluated over a period of four years using data collection methods including ethnographic observations, semi-structured interviews, web-based surveys, progress reports, conference call summaries, and face-to-face group discussions. Data were analyzed using qualitative methods of data analysis to 1) characterize specific problems faced by biomedical researchers with traditional information management practices, 2) identify intervention areas to introduce a new research information management system called Labmatrix, and finally to 3) evaluate and delineate important general collaboration (intervention) characteristics that can optimize outcomes of an implementation process in biomedical laboratories. Results emphasize the importance of end user perseverance, human-centric interoperability evaluation, and demonstration of return on investment of effort and time of laboratory members and industry personnel for the success of the implementation process. In addition, there is an intrinsic learning component associated with the implementation process of an information management system. Technology transfer experience in a complex environment such as the biomedical laboratory can be eased with the use of information systems that support human and cognitive interoperability. Such informatics features can also contribute to successful collaboration and hopefully to scientific productivity. PMID:26652980

  8. Acoustic emission signal processing for rolling bearing running state assessment using compressive sensing

    NASA Astrophysics Data System (ADS)

    Liu, Chang; Wu, Xing; Mao, Jianlin; Liu, Xiaoqin

    2017-07-01

    In the signal processing domain, there has been growing interest in using acoustic emission (AE) signals instead of vibration signals for fault diagnosis and condition assessment; AE has been advocated as an effective technique for identifying fracture, crack or damage. The AE signal contains frequencies up to several MHz, which helps avoid interference from the bearing parts themselves (rolling elements, rings and so on) and from other rotating parts of the machine. However, acoustic emission signals demand advanced sampling capabilities and the ability to handle large amounts of sampled data. In this paper, compressive sensing (CS) is introduced as a processing framework and a compressive feature extraction method is proposed. We use it to extract compressive features from the compressively-sensed data directly and also prove its energy preservation properties. First, we study AE signals under the CS framework: the sparsity of the rolling-bearing AE signal is checked, and the observation and reconstruction of the signal are studied. Second, we present a method for extracting an AE compressive feature (AECF) directly from compressively-sensed data, demonstrate its energy preservation properties and the processing of the extracted feature, and assess the running state of the bearing using the AECF trend. The AECF trend of the running state of rolling bearings is consistent with the trend of traditional features, so the method is an effective way to evaluate the running trend of rolling bearings. The experimental results verify that signal processing and condition assessment based on AECF are simpler, require a smaller amount of data and greatly reduce the amount of computation.
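
    The energy-preservation idea can be illustrated with a random Gaussian sensing matrix: scaling its rows by 1/sqrt(m) makes the expected energy of the compressed measurements equal to the energy of the original waveform, so an energy-type feature can be read directly from the compressed data. The sketch below shows that generic idea only; the paper's actual AECF definition and sensing matrix are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

def cs_measure(x, m):
    """Random Gaussian measurement y = Phi x, with entries of Phi drawn from
    N(0, 1/m) so that E[||y||^2] = ||x||^2 (energy preservation in expectation)."""
    n = len(x)
    Phi = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))
    return Phi @ x

def compressed_energy_feature(x, m=256):
    """Energy-like feature computed from the compressed measurements directly,
    without reconstructing the original AE waveform."""
    y = cs_measure(np.asarray(x, dtype=float), m)
    return float(np.sum(y ** 2))
```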

  9. Maximizing the return on taxpayers' investments in fundamental biomedical research

    PubMed Central

    Lorsch, Jon R.

    2015-01-01

    The National Institute of General Medical Sciences (NIGMS) at the U.S. National Institutes of Health has an annual budget of more than $2.3 billion. The institute uses these funds to support fundamental biomedical research and training at universities, medical schools, and other institutions across the country. My job as director of NIGMS is to work to maximize the scientific returns on the taxpayers' investments. I describe how we are optimizing our investment strategies and funding mechanisms, and how, in the process, we hope to create a more efficient and sustainable biomedical research enterprise. PMID:25926703

  10. Biomedical Informatics for Computer-Aided Decision Support Systems: A Survey

    PubMed Central

    Belle, Ashwin; Kon, Mark A.; Najarian, Kayvan

    2013-01-01

    The volumes of current patient data as well as their complexity make clinical decision making more challenging than ever for physicians and other care givers. This situation calls for the use of biomedical informatics methods to process data and form recommendations and/or predictions to assist such decision makers. The design, implementation, and use of biomedical informatics systems in the form of computer-aided decision support have become essential and widely used over the last two decades. This paper provides a brief review of such systems, their application protocols and methodologies, and the future challenges and directions they suggest. PMID:23431259

  11. Professional Identification for Biomedical Engineers

    ERIC Educational Resources Information Center

    Long, Francis M.

    1973-01-01

    Discusses four methods of professional identification in biomedical engineering including registration, certification, accreditation, and possible membership qualification of the societies. Indicates that the destiny of the biomedical engineer may be under the control of a new profession, neither the medical nor the engineering. (CC)

  12. Dysphagia Screening: Contributions of Cervical Auscultation Signals and Modern Signal-Processing Techniques

    PubMed Central

    Dudik, Joshua M.; Coyle, James L.

    2015-01-01

    Cervical auscultation is the recording of sounds and vibrations caused by the human body from the throat during swallowing. While traditionally done by a trained clinician with a stethoscope, much work has been put towards developing more sensitive and clinically useful methods to characterize the data obtained with this technique. The eventual goal of the field is to improve the effectiveness of screening algorithms designed to predict the risk that swallowing disorders pose to individual patients’ health and safety. This paper provides an overview of these signal processing techniques and summarizes recent advances made with digital transducers in hopes of organizing the highly varied research on cervical auscultation. It investigates where on the body these transducers are placed in order to record a signal as well as the collection of analog and digital filtering techniques used to further improve the signal quality. It also presents the wide array of methods and features used to characterize these signals, ranging from simply counting the number of swallows that occur over a period of time to calculating various descriptive features in the time, frequency, and phase space domains. Finally, this paper presents the algorithms that have been used to classify this data into ‘normal’ and ‘abnormal’ categories. Both linear as well as non-linear techniques are presented in this regard. PMID:26213659
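
    To make the notion of time- and frequency-domain descriptive features concrete, the sketch below computes a few generic features (duration, RMS, peak amplitude, spectral centroid, and spectral bandwidth) from a band-pass-filtered signal segment; the filter band, sampling rate, and feature set are illustrative assumptions, not those of any particular study summarized above.

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch

def swallow_features(x, fs):
    """A few illustrative descriptive features of a cervical-auscultation segment
    (hypothetical feature set, not the specific one used in any cited study)."""
    # simple band-pass "digital filtering" stage to suppress drift and very high-frequency noise
    b, a = butter(4, [0.01, 0.45], btype="band")   # band edges normalized to Nyquist (assumed)
    xf = filtfilt(b, a, x)

    f, pxx = welch(xf, fs=fs, nperseg=min(256, len(xf)))
    centroid = np.sum(f * pxx) / np.sum(pxx)          # spectral centroid (Hz)
    bandwidth = np.sqrt(np.sum(((f - centroid) ** 2) * pxx) / np.sum(pxx))

    return {
        "duration_s": len(x) / fs,                    # time-domain features
        "peak_amplitude": float(np.max(np.abs(xf))),
        "rms": float(np.sqrt(np.mean(xf ** 2))),
        "spectral_centroid_hz": float(centroid),      # frequency-domain features
        "spectral_bandwidth_hz": float(bandwidth),
    }

# usage with a synthetic two-second segment
fs = 4000
rng = np.random.default_rng(1)
segment = rng.standard_normal(2 * fs)
print(swallow_features(segment, fs))
```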

  13. An exploration of the biomedical optics course construction of undergraduate biomedical engineering program in medical colleges

    NASA Astrophysics Data System (ADS)

    Guo, Shijun; Lyu, Jie; Zhang, Peiming

    2017-08-01

    In this paper, the teaching goals, teaching contents and teaching methods in biomedical optics course construction are discussed. From the dimension of teaching goals, students should master the principle of optical inspection on the human body, diagnosis and treatment of methodology and instruments, through the study of the theory and practice of this course, and can utilize biomedical optics methods to solve practical problems in the clinical medical engineering practice. From the dimension of teaching contents, based on the characteristics of biomedical engineering in medical colleges, the organic integration of engineering aspects, medical optical instruments, and biomedical aspects dispersed in human anatomy, human physiology, clinical medicine fundamental related to the biomedical optics is build. Noninvasive measurement of the human body composition and noninvasive optical imaging of the human body were taken as actual problems in biomedical optics fields. Typical medical applications such as eye optics and laser medicine were also integrated into the theory and practice teaching. From the dimension of teaching methods, referencing to organ-system based medical teaching mode, optical principle and instrument principle were taught by teachers from school of medical instruments, and the histological characteristics and clinical actual need in areas such as digestive diseases and urinary surgery were taught by teachers from school of basic medicine or clinical medicine of medical colleges. Furthermore, clinical application guidance would be provided by physician and surgeons in hospitals.

  14. Instantaneous and Frequency-Warped Signal Processing Techniques for Auditory Source Separation.

    NASA Astrophysics Data System (ADS)

    Wang, Avery Li-Chun

    This thesis summarizes several contributions to the areas of signal processing and auditory source separation. The philosophy of Frequency-Warped Signal Processing is introduced as a means for separating the AM and FM contributions to the bandwidth of a complex-valued, frequency-varying sinusoid p(n), transforming it into a signal with slowly-varying parameters. This transformation facilitates the removal of p(n) from an additive mixture while minimizing the amount of damage done to other signal components. The average winding rate of a complex-valued phasor is explored as an estimate of the instantaneous frequency. Theorems are provided showing the robustness of this measure. To implement frequency tracking, a Frequency-Locked Loop algorithm is introduced which uses the complex winding error to update its frequency estimate. The input signal is dynamically demodulated and filtered to extract the envelope. This envelope may then be remodulated to reconstruct the target partial, which may be subtracted from the original signal mixture to yield a new, quickly-adapting form of notch filtering. Enhancements to the basic tracker are made which, under certain conditions, attain the Cramer-Rao bound for the instantaneous frequency estimate. To improve tracking, the novel idea of Harmonic-Locked Loop tracking, using N harmonically constrained trackers, is introduced for tracking signals, such as voices and certain musical instruments. The estimated fundamental frequency is computed from a maximum-likelihood weighting of the N tracking estimates, making it highly robust. The result is that harmonic signals, such as voices, can be isolated from complex mixtures in the presence of other spectrally overlapping signals. Additionally, since phase information is preserved, the resynthesized harmonic signals may be removed from the original mixtures with relatively little damage to the residual signal. Finally, a new methodology is given for designing linear-phase FIR filters
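
    The "average winding rate" idea can be illustrated with a short Python sketch: the instantaneous frequency of a frequency-varying sinusoid is estimated from the phase advance of its analytic-signal phasor between samples and then averaged over a short window; the test signal, noise level, and window length are assumptions for illustration rather than the thesis's exact algorithms.

```python
import numpy as np
from scipy.signal import hilbert

fs = 8000.0
t = np.arange(0, 1.0, 1.0 / fs)

# A frequency-varying sinusoid (vibrato around 440 Hz) buried in light noise.
f_inst_true = 440.0 + 15.0 * np.sin(2 * np.pi * 5.0 * t)
phase = 2 * np.pi * np.cumsum(f_inst_true) / fs
x = np.sin(phase) + 0.05 * np.random.default_rng(2).standard_normal(len(t))

# Complex-valued phasor via the analytic signal, then the "winding rate":
# the instantaneous frequency is the phase increment per sample, scaled by fs.
z = hilbert(x)
dphi = np.angle(z[1:] * np.conj(z[:-1]))      # phase advance between consecutive samples
f_inst_est = dphi * fs / (2 * np.pi)

# Average the winding rate over short windows to reduce the variance of the estimate.
win = 80                                       # 10 ms averaging window (assumed)
f_smoothed = np.convolve(f_inst_est, np.ones(win) / win, mode="same")
print("mean estimated frequency:",
      round(float(f_smoothed[win:-win].mean()), 1), "Hz (true mean 440 Hz)")
```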

  15. Integrated Kerr comb-based reconfigurable transversal differentiator for microwave photonic signal processing

    NASA Astrophysics Data System (ADS)

    Xu, Xingyuan; Wu, Jiayang; Shoeiby, Mehrdad; Nguyen, Thach G.; Chu, Sai T.; Little, Brent E.; Morandotti, Roberto; Mitchell, Arnan; Moss, David J.

    2018-01-01

    An arbitrary-order intensity differentiator for high-order microwave signal differentiation is proposed and experimentally demonstrated on a versatile transversal microwave photonic signal processing platform based on integrated Kerr combs. With a CMOS-compatible nonlinear micro-ring resonator, high quality Kerr combs with broad bandwidth and large frequency spacings are generated, enabling a larger number of taps and an increased Nyquist zone. By programming and shaping individual comb lines' power, calculated tap weights are realized, thus achieving a versatile microwave photonic signal processing platform. Arbitrary-order intensity differentiation is demonstrated on the platform. The RF responses are experimentally characterized, and systems demonstrations for Gaussian input signals are also performed.
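
    As a simplified stand-in for the comb-line tap weighting described above, the sketch below uses the N-fold cascade of the first-difference kernel as transversal tap weights and checks that the resulting magnitude response follows w^N near DC; the tap design is a generic FIR differentiator, not the specific comb-line weighting used in the paper.

```python
import numpy as np
from scipy.signal import freqz
from scipy.special import comb

def transversal_differentiator_taps(order):
    """Tap weights of a simple N-th order transversal (FIR) differentiator:
    the N-fold cascade of the first difference [1, -1], i.e. (-1)^k * C(N, k)."""
    k = np.arange(order + 1)
    return (-1.0) ** k * comb(order, k)

for order in (1, 2, 3):
    taps = transversal_differentiator_taps(order)
    w, h = freqz(taps, worN=1024)
    # Near DC, |H| should be proportional to w**order, i.e. an N-th order
    # differentiation of the envelope implemented purely by tap weighting and delay.
    low = slice(1, 60)
    ratio = np.abs(h[low]) / (w[low] ** order)
    print(f"order {order}: taps={taps}, |H|/w^N spread near DC = "
          f"{ratio.max() / ratio.min():.3f} (values near 1 indicate w^N behaviour)")
```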

  16. A Low-Power and Portable Biomedical Device for Respiratory Monitoring with a Stable Power Source

    PubMed Central

    Yang, Jiachen; Chen, Bobo; Zhou, Jianxiong; Lv, Zhihan

    2015-01-01

    Continuous respiratory monitoring is an important tool for clinical monitoring. Associated with the development of biomedical technology, it has become more and more important, especially in the measuring of gas flow and CO2 concentration, which can reflect the status of the patient. In this paper, a new type of biomedical device is presented, which uses low-power sensors with a piezoresistive silicon differential pressure sensor to measure gas flow and with a pyroelectric sensor to measure CO2 concentration simultaneously. For the portability of the biomedical device, the sensors and low-power measurement circuits are integrated together, and the airway tube also needs to be miniaturized. Circuits are designed to ensure the stability of the power source and to filter out the existing noise. Modulation technology is used to eliminate the fluctuations at the trough of the waveform of the CO2 concentration signal. Statistical analysis with the coefficient of variation was performed to find out the optimal driving voltage of the pressure transducer. Through targeted experiments, the biomedical device showed a high accuracy, with a measuring precision of 0.23 mmHg, and it worked continuously and stably, thus realizing the real-time monitoring of the status of patients. PMID:26270665
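
    The coefficient-of-variation selection step mentioned above can be sketched in a few lines: for each candidate driving voltage, repeated sensor readings are summarized by their CV (sample standard deviation divided by the mean), and the voltage with the lowest CV is chosen; the readings and voltages below are made up for illustration.

```python
import numpy as np

# Hypothetical repeated differential-pressure readings (arbitrary units) recorded at
# several candidate driving voltages of the piezoresistive bridge.
readings = {
    2.5: [101.2, 100.8, 101.5, 100.9, 101.1],
    3.3: [100.4, 100.5, 100.3, 100.6, 100.4],
    5.0: [ 99.8, 100.9, 100.1,  99.5, 100.7],
}

def coefficient_of_variation(samples):
    samples = np.asarray(samples, dtype=float)
    return samples.std(ddof=1) / samples.mean()

# The driving voltage with the lowest CV gives the most repeatable output,
# which is the selection criterion described in the abstract.
best = min(readings, key=lambda v: coefficient_of_variation(readings[v]))
for v, s in readings.items():
    print(f"{v:.1f} V: CV = {coefficient_of_variation(s):.4%}")
print("selected driving voltage:", best, "V")
```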

  17. Luminescent nanodiamonds for biomedical applications.

    PubMed

    Say, Jana M; van Vreden, Caryn; Reilly, David J; Brown, Louise J; Rabeau, James R; King, Nicholas J C

    2011-12-01

    In recent years, nanodiamonds have emerged from a primarily industrial and mechanical applications base to potentially underpinning sophisticated new technologies in biomedical and quantum science. Nanodiamonds are relatively inexpensive, biocompatible, easy to surface functionalise and optically stable. This combination of physical properties is ideally suited to biological applications, including intracellular labelling and tracking, extracellular drug delivery and adsorptive detection of bioactive molecules. Here we describe some of the methods and challenges for processing nanodiamond materials, detection schemes and some of the leading applications currently under investigation.

  18. Real-time radar signal processing using GPGPU (general-purpose graphic processing unit)

    NASA Astrophysics Data System (ADS)

    Kong, Fanxing; Zhang, Yan Rockee; Cai, Jingxiao; Palmer, Robert D.

    2016-05-01

    This study introduces a practical approach to developing a real-time signal processing chain for a general phased array radar on NVIDIA GPUs (graphics processing units) using CUDA (Compute Unified Device Architecture) libraries such as cuBLAS and cuFFT, which are adopted from open-source libraries and optimized for NVIDIA GPUs. The processed results are rigorously verified against those from the CPUs. Performance, benchmarked by computation time for various input data cube sizes, is compared across GPUs and CPUs. The analysis demonstrates that GPGPU (general-purpose GPU) real-time processing of the array radar data is possible with relatively low-cost commercial GPUs.

  19. CNN-based ranking for biomedical entity normalization.

    PubMed

    Li, Haodi; Chen, Qingcai; Tang, Buzhou; Wang, Xiaolong; Xu, Hua; Wang, Baohua; Huang, Dong

    2017-10-03

    Most state-of-the-art biomedical entity normalization systems, such as rule-based systems, merely rely on morphological information of entity mentions, but rarely consider their semantic information. In this paper, we introduce a novel convolutional neural network (CNN) architecture that regards biomedical entity normalization as a ranking problem and benefits from the semantic information of biomedical entities. The CNN-based ranking method first generates candidates using handcrafted rules, and then ranks the candidates according to their semantic information modeled by the CNN as well as their morphological information. Experiments on two benchmark datasets for biomedical entity normalization show that our proposed CNN-based ranking method outperforms the traditional rule-based method, achieving state-of-the-art performance. We propose a CNN architecture that regards biomedical entity normalization as a ranking problem. Comparison results show that semantic information is beneficial to biomedical entity normalization and can be well combined with morphological information in our CNN architecture for further improvement.

  20. Signal processing in urodynamics: towards high definition urethral pressure profilometry.

    PubMed

    Klünder, Mario; Sawodny, Oliver; Amend, Bastian; Ederer, Michael; Kelp, Alexandra; Sievert, Karl-Dietrich; Stenzl, Arnulf; Feuer, Ronny

    2016-03-22

    Urethral pressure profilometry (UPP) is used in the diagnosis of stress urinary incontinence (SUI) which is a significant medical, social, and economic problem. Low spatial pressure resolution, common occurrence of artifacts, and uncertainties in data location limit the diagnostic value of UPP. To overcome these limitations, high definition urethral pressure profilometry (HD-UPP) combining enhanced UPP hardware and signal processing algorithms has been developed. In this work, we present the different signal processing steps in HD-UPP and show experimental results from female minipigs. We use a special microtip catheter with high angular pressure resolution and an integrated inclination sensor. Signals from the catheter are filtered and time-correlated artifacts removed. A signal reconstruction algorithm processes pressure data into a detailed pressure image on the urethra's inside. Finally, the pressure distribution on the urethra's outside is calculated through deconvolution. A mathematical model of the urethra is contained in a point-spread-function (PSF) which is identified depending on geometric and material properties of the urethra. We additionally investigate the PSF's frequency response to determine the relevant frequency band for pressure information on the urinary sphincter. Experimental pressure data are spatially located and processed into high resolution pressure images. Artifacts are successfully removed from data without blurring other details. The pressure distribution on the urethra's outside is reconstructed and compared to the one on the inside. Finally, the pressure images are mapped onto the urethral geometry calculated from inclination and position data to provide an integrated image of pressure distribution, anatomical shape, and location. With its advanced sensing capabilities, the novel microtip catheter collects an unprecedented amount of urethral pressure data. Through sequential signal processing steps, physicians are provided with
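
    To illustrate the deconvolution step in a hedged way, the sketch below recovers a 1-D "outside" pressure profile from a blurred, noisy "inside" measurement using Wiener deconvolution with an assumed Gaussian PSF; the cited work instead identifies its PSF from geometric and material properties of the urethra, so this is only a generic analogue of that processing stage.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 1-D example: the measured inside pressure is modelled as the outside
# pressure distribution blurred by a point-spread function (PSF) plus noise.
n = 256
x_true = np.zeros(n)
x_true[100:140] = 1.0                        # idealized sphincter pressure "plateau"

psf = np.exp(-0.5 * (np.arange(-20, 21) / 6.0) ** 2)
psf /= psf.sum()                             # normalized Gaussian PSF (assumed shape)

measured = np.convolve(x_true, psf, mode="same") + 0.01 * rng.standard_normal(n)

# Wiener deconvolution in the frequency domain (one common, regularized choice).
H = np.fft.rfft(np.roll(np.pad(psf, (0, n - len(psf))), -(len(psf) // 2)))
Y = np.fft.rfft(measured)
snr = 100.0                                  # assumed signal-to-noise power ratio
G = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
x_est = np.fft.irfft(G * Y, n)

rms = lambda e: float(np.sqrt(np.mean(e ** 2)))
print("RMS error, blurred measurement :", round(rms(measured - x_true), 4))
print("RMS error, after deconvolution :", round(rms(x_est - x_true), 4))
```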

  1. Optical fibres in pre-detector signal processing

    NASA Astrophysics Data System (ADS)

    Flinn, A. R.

    The basic form of conventional electro-optic sensors is described. The main drawback of these sensors is their inability to deal with the background radiation which usually accompanies the signal. This 'clutter' limits the sensor's performance long before other noise such as 'shot' noise. Pre-detector signal processing using the complex amplitude of the light is introduced as a means to discriminate between the signal and 'clutter'. Further improvements to pre-detector signal processors can be made by the inclusion of optical fibres, allowing radiation to be used with greater efficiency and enabling certain signal processing tasks to be carried out with an ease unequalled by any other method. The theory of optical waveguides and their application in sensors, interferometers, and signal processors is reviewed. Geometrical aspects of the formation of linear and circular interference fringes are described along with temporal and spatial coherence theory and their relationship to Michelson's visibility function. The requirements for efficient coupling of a source into singlemode and multimode fibres are given. We describe interference experiments between beams of light emitted from a few metres of two or more, singlemode or multimode, optical fibres. Fresnel's equation is used to obtain expressions for Fresnel and Fraunhofer diffraction patterns which enable electro-optic (E-O) sensors to be analysed by Fourier optics. Image formation is considered when the aperture plane of an E-O sensor is illuminated with partially coherent light. This allows sensors to be designed using optical transfer functions which are sensitive to the spatial coherence of the illuminating light. Spatial coherence sensors which use gratings as aperture plane reticles are discussed. By using fibre arrays, spatial coherence processing enables E-O sensors to discriminate between a spatially coherent source and an incoherent background. The sensors enable the position and wavelength of the source to
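
    Since the abstract refers to Michelson's visibility function, a minimal numerical example may help: the fringe visibility of a recorded interference pattern is estimated as (Imax - Imin)/(Imax + Imin); the fringe model and coherence value below are assumptions for illustration.

```python
import numpy as np

# Synthetic interference fringe pattern with partial coherence (visibility < 1).
x = np.linspace(0, 10, 2000)                 # position across the detector (arbitrary units)
true_visibility = 0.6                        # assumed degree of coherence for this example
intensity = 1.0 + true_visibility * np.cos(2 * np.pi * 1.5 * x)

# Michelson's visibility function estimated from the recorded fringes.
i_max, i_min = intensity.max(), intensity.min()
visibility = (i_max - i_min) / (i_max + i_min)
print(f"estimated fringe visibility: {visibility:.3f} (ground truth {true_visibility})")
```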

  2. The Applications of Gold Nanoparticle-Initialed Chemiluminescence in Biomedical Detection

    NASA Astrophysics Data System (ADS)

    Liu, Zezhong; Zhao, Furong; Gao, Shandian; Shao, Junjun; Chang, Huiyun

    2016-10-01

    The chemiluminescence technique, as a novel detection method, has gained much attention in recent years owing to its merits of high sensitivity, wide linear range, and low background signal. Similarly, nanotechnology, especially gold nanoparticles, has emerged as a detection tool owing to their unique physical and chemical properties. Recently, it has become increasingly popular to couple gold nanoparticles with the chemiluminescence technique in the detection of biological agents. In this review, we describe the superiority of both chemiluminescence and gold nanoparticles and summarize the different applications of gold nanoparticle-initialed chemiluminescence in biomedical detection.

  3. Exploiting semantic patterns over biomedical knowledge graphs for predicting treatment and causative relations.

    PubMed

    Bakal, Gokhan; Talari, Preetham; Kakani, Elijah V; Kavuluru, Ramakanth

    2018-06-01

    Identifying new potential treatment options for medical conditions that cause human disease burden is a central task of biomedical research. Since all candidate drugs cannot be tested with animal and clinical trials, in vitro approaches are first attempted to identify promising candidates. Likewise, identifying different causal relations between biomedical entities is also critical to understand biomedical processes. Generally, natural language processing (NLP) and machine learning are used to predict specific relations between any given pair of entities using the distant supervision approach. To build high accuracy supervised predictive models to predict previously unknown treatment and causative relations between biomedical entities based only on semantic graph pattern features extracted from biomedical knowledge graphs. We used 7000 treats and 2918 causes hand-curated relations from the UMLS Metathesaurus to train and test our models. Our graph pattern features are extracted from simple paths connecting biomedical entities in the SemMedDB graph (based on the well-known SemMedDB database made available by the U.S. National Library of Medicine). Using these graph patterns connecting biomedical entities as features of logistic regression and decision tree models, we computed mean performance measures (precision, recall, F-score) over 100 distinct 80-20% train-test splits of the datasets. For all experiments, we used a positive:negative class imbalance of 1:10 in the test set to model relatively more realistic scenarios. Our models predict treats and causes relations with high F-scores of 99% and 90% respectively. Logistic regression model coefficients also help us identify highly discriminative patterns that have an intuitive interpretation. We are also able to predict some new plausible relations based on false positives that our models scored highly based on our collaborations with two physician co-authors. Finally, our decision tree models are able to retrieve
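
    The graph-pattern feature idea can be sketched with a toy graph: predicate sequences along simple paths between an entity pair are used as bag-of-pattern features for a logistic regression classifier; the entities, relation labels, and class labels below are invented, and the real system operates on SemMedDB-scale data with hand-curated UMLS relations.

```python
import networkx as nx
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

# Toy knowledge graph with typed (predicate-labelled) edges; the relation labels are
# invented for illustration only.
G = nx.DiGraph()
for s, o, p in [
    ("aspirin", "inflammation", "INHIBITS"),
    ("inflammation", "arthritis", "ASSOCIATED_WITH"),
    ("ibuprofen", "inflammation", "INHIBITS"),
    ("smoking", "tar_exposure", "CAUSES"),
    ("tar_exposure", "lung_cancer", "CAUSES"),
]:
    G.add_edge(s, o, predicate=p)

def path_pattern_features(graph, source, target, cutoff=3):
    """Bag of predicate-sequence patterns over simple paths from source to target
    (a simplified stand-in for the semantic graph patterns used as features)."""
    feats = {}
    if source in graph and target in graph:
        for path in nx.all_simple_paths(graph, source, target, cutoff=cutoff):
            pattern = "->".join(graph[u][v]["predicate"] for u, v in zip(path, path[1:]))
            feats[pattern] = feats.get(pattern, 0) + 1
    return feats

# Tiny hand-labelled training set: does the pair stand in a "treats" relation?
pairs = [("aspirin", "arthritis"), ("ibuprofen", "arthritis"),
         ("smoking", "lung_cancer"), ("tar_exposure", "arthritis")]
labels = [1, 1, 0, 0]

vec = DictVectorizer()
X = vec.fit_transform([path_pattern_features(G, s, o) for s, o in pairs])
clf = LogisticRegression().fit(X, labels)

# The learned coefficients indicate which path patterns are discriminative.
for name, w in zip(vec.get_feature_names_out(), clf.coef_[0]):
    print(f"{name:35s} weight={w:+.2f}")
```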

  4. Can gamma irradiation during radiotherapy influence the metal release process for biomedical CoCrMo and 316L alloys?

    PubMed

    Wei, Zheng; Edin, Jonathan; Karlsson, Anna Emelie; Petrovic, Katarina; Soroka, Inna L; Odnevall Wallinder, Inger; Hedberg, Yolanda

    2018-02-09

    The extent of metal release from implant materials that are irradiated during radiotherapy may be influenced by irradiation-formed radicals. The influence of gamma irradiation, with a total dose of relevance for radiotherapy (e.g., for cancer treatments), on the extent of metal release from biomedical stainless steel AISI 316L and a cobalt-chromium alloy (CoCrMo) was investigated in physiologically relevant solutions (phosphate buffered saline with and without 10 g/L bovine serum albumin) at pH 7.3. Directly after irradiation, the released amounts of metals were significantly higher for irradiated CoCrMo as compared to nonirradiated CoCrMo, resulting in an increased surface passivation (enhanced passive conditions) that hindered further release. A similar effect was observed for 316L, which showed lower nickel release after 1 h for initially irradiated samples as compared to nonirradiated samples. However, the effect of irradiation (total dose of 16.5 Gy) on metal release and surface oxide composition and thickness was generally small. Most metals were released initially (within seconds) upon immersion from CoCrMo but not from 316L. Albumin induced an increased amount of released metals from AISI 316L but not from CoCrMo. Albumin was not found to aggregate to any greater extent either upon gamma irradiation or in the presence of trace metal ions, as determined using different light scattering techniques. Further studies should elucidate the effect of repeated friction and fractionated low irradiation doses on the short- and long-term metal release process of biomedical materials. © 2018 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 2018. © 2018 The Authors Journal of Biomedical Materials Research Part B: Applied Biomaterials Published by Wiley Periodicals, Inc.

  5. Current status of biomedical book reviewing. I. Key biomedical reviewing journals with quantitative significance.

    PubMed

    Chen, C C; Wright, A M

    1974-04-01

    This is the first part of a comprehensive, quantitative study of biomedical book reviewing. The data base of the total project was built from statistics taken from all 1970 issues of biomedical journals held in the Science Library of the Massachusetts Institute of Technology. Of 285 so-called "life sciences" journals held by that library, fifty-four English journals (excluding Science and Nature) were found to contain bona fide book reviews (as contrasted with mere author-title lists) and were therefore selected for close study. The statistical results reveal that there were 3,347 reviews of 2,067 biomedical books in these fifty-four selected journals in 1970. Part I of the study identifies key biomedical reviewing journals of quantitative significance. The top ten journals, British Medical Journal, Lancet, Annals of Internal Medicine, Journal of the American Medical Association, Archives of Internal Medicine, New England Journal of Medicine, Quarterly Review of Biology, Bioscience, Canadian Medical Association Journal, and American Journal of the Medical Sciences, accounted for 63.03% of the total number of reviews in 1970.

  6. Implementation and optimization of ultrasound signal processing algorithms on mobile GPU

    NASA Astrophysics Data System (ADS)

    Kong, Woo Kyu; Lee, Wooyoul; Kim, Kyu Cheol; Yoo, Yangmo; Song, Tai-Kyong

    2014-03-01

    A general-purpose graphics processing unit (GPGPU) has been used for improving computing power in medical ultrasound imaging systems. Recently, mobile GPUs have become powerful enough to handle 3D games and videos at high frame rates on Full HD or HD resolution displays. This paper proposes a method to implement ultrasound signal processing on a mobile GPU available in a high-end smartphone (Galaxy S4, Samsung Electronics, Seoul, Korea) with programmable shaders on the OpenGL ES 2.0 platform. To maximize the performance of the mobile GPU, the shader design was optimized and the load was shared between the vertex and fragment shaders. The beamformed data were captured from a tissue-mimicking phantom (Model 539 Multipurpose Phantom, ATS Laboratories, Inc., Bridgeport, CT, USA) by using a commercial ultrasound imaging system equipped with a research package (Ultrasonix Touch, Ultrasonix, Richmond, BC, Canada). The real-time performance was evaluated by frame rates while varying the range of signal processing blocks. The implementation of ultrasound signal processing on OpenGL ES 2.0 was verified by analyzing the PSNR against a MATLAB gold standard with the same signal path; the CNR was also analyzed to verify the method. From the evaluations, the proposed mobile GPU-based processing method showed no significant difference from the MATLAB processing (i.e., PSNR<52.51 dB), and comparable CNR results (i.e., 11.31) were obtained from both processing methods. With the mobile GPU implementation, frame rates of 57.6 Hz were achieved, and the total execution time was 17.4 ms, which was faster than the acquisition time (i.e., 34.4 ms). These results indicate that the mobile GPU-based processing method can support real-time ultrasound B-mode processing on a smartphone.
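
    For reference, the two verification metrics named above can be computed as follows (PSNR against a gold-standard output, and one common definition of CNR between two regions of interest); the synthetic image, ROI placement, and noise levels are assumptions for illustration.

```python
import numpy as np

def psnr(reference, test):
    """Peak signal-to-noise ratio in dB between a reference image (e.g. a gold-standard
    output) and a test image (e.g. the GPU pipeline output)."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    peak = np.max(np.abs(reference))
    return np.inf if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def cnr(image, roi_target, roi_background):
    """Contrast-to-noise ratio between two regions of interest given as boolean masks."""
    t, b = image[roi_target], image[roi_background]
    return np.abs(t.mean() - b.mean()) / np.sqrt(t.var() + b.var())

# usage on synthetic B-mode-like data
rng = np.random.default_rng(4)
img = 40 + 5 * rng.standard_normal((200, 200))
img[80:120, 80:120] -= 20                                # hypoechoic "cyst" region
gpu_img = img + 0.1 * rng.standard_normal(img.shape)     # slightly different pipeline output

target = np.zeros(img.shape, bool)
target[85:115, 85:115] = True
backgr = np.zeros(img.shape, bool)
backgr[20:50, 20:50] = True
print(f"PSNR = {psnr(img, gpu_img):.2f} dB, CNR = {cnr(img, target, backgr):.2f}")
```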

  7. [Biomedical engineering today : An overview from the viewpoint of the German Biomedical Engineering Society].

    PubMed

    Schlötelburg, C; Becks, T; Stieglitz, T

    2010-08-01

    Biomedical engineering is characterized by the interdisciplinary co-operation of technology, science, and ways of thinking, probably more than any other technological area. The close interaction of engineering and information sciences with medicine and biology results in innovative products and methods, but also requires high standards for the interdisciplinary transfer of ideas into products for patients' benefits. This article describes the situation of biomedical engineering in Germany. It displays characteristics of the medical device industry and ranks it with respect to the international market. The research landscape is described as well as up-to-date research topics and trends. The national funding situation of research in biomedical engineering is reviewed and existing innovation barriers are discussed.

  8. Signal processing for passive detection and classification of underwater acoustic signals

    NASA Astrophysics Data System (ADS)

    Chung, Kil Woo

    2011-12-01

    This dissertation examines signal processing for passive detection, classification and tracking of underwater acoustic signals for improving port security and the security of coastal and offshore operations. First, we consider the problem of passive acoustic detection of a diver in a shallow water environment. A frequency-domain multi-band matched-filter approach to swimmer detection is presented. The idea is to break the frequency contents of the hydrophone signals into multiple narrow frequency bands, followed by time averaged (about half of a second) energy calculation over each band. Then, spectra composed of such energy samples over the chosen frequency bands are correlated to form a decision variable. The frequency bands with highest Signal/Noise ratio are used for detection. The performance of the proposed approach is demonstrated for experimental data collected for a diver in the Hudson River. We also propose a new referenceless frequency-domain multi-band detector which, unlike other reference-based detectors, does not require a diver specific signature. Instead, our detector matches to a general feature of the diver spectrum in the high frequency range: the spectrum is roughly periodic in time and approximately flat when the diver exhales. The performance of the proposed approach is demonstrated by using experimental data collected from the Hudson River. Moreover, we present detection, classification and tracking of small vessel signals. Hydroacoustic sensors can be applied for the detection of noise generated by vessels, and this noise can be used for vessel detection, classification and tracking. This dissertation presents recent improvements aimed at the measurement and separation of ship DEMON (Detection of Envelope Modulation on Noise) acoustic signatures in busy harbor conditions. Ship signature measurements were conducted in the Hudson River and NY Harbor. The DEMON spectra demonstrated much better temporal stability compared with the full ship
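
    A minimal sketch of DEMON-style processing, under assumed band edges and a synthetic amplitude-modulated noise source, is shown below: the hydrophone signal is band-passed, its envelope is extracted and decimated, and the envelope spectrum reveals the modulation line. Real DEMON processing adds anti-alias filtering, normalization, and harmonic analysis that are omitted here for brevity.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, welch

def demon_spectrum(x, fs, band=(2000.0, 8000.0), env_fs=1000.0):
    """Basic DEMON analysis sketch (band edges and envelope rate are illustrative)."""
    # 1) isolate a high-frequency band carrying the cavitation noise
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    xb = filtfilt(b, a, x)

    # 2) envelope via the analytic signal (demodulation of the noise envelope)
    env = np.abs(hilbert(xb))
    env -= env.mean()

    # 3) decimate the envelope (no anti-alias filter, for simplicity) and estimate its spectrum
    step = int(fs // env_fs)
    env_d = env[::step]
    f, pxx = welch(env_d, fs=fs / step, nperseg=1024)
    return f, pxx

# usage with a synthetic "vessel": broadband noise amplitude-modulated at 6 Hz (shaft rate)
fs = 32000
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(5)
x = (1 + 0.5 * np.sin(2 * np.pi * 6.0 * t)) * rng.standard_normal(len(t))
f, pxx = demon_spectrum(x, fs)
print("strongest envelope line at", f[np.argmax(pxx[1:]) + 1], "Hz (expected about 6 Hz)")
```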

  9. Parametric Techniques for Multichannel Signal Processing.

    DTIC Science & Technology

    1985-10-01

    [OCR-garbled report cover; recoverable details: report AD-A165 649, "Parametric Techniques for Multichannel Signal Processing," B. Friedlander, Systems Control Technology, Inc., 1801 Page Mill Road, Palo Alto, CA 94304, October 1985, Contract DAAG29-83-C-0027.]

  10. Nonlinear Real-Time Optical Signal Processing.

    DTIC Science & Technology

    1984-10-01

    [OCR-garbled report cover; recoverable details: USCIPI Report 1130, "Nonlinear Real-Time Optical Signal Processing," final technical report for April 15, 1981 - June 30, 1984, A. A. Sawchuk (principal investigator), T. C. Strand, and A. R. Tanguay, Jr., University of Southern California, October 1, 1984. A computer-generated hologram fabricated on an e-beam system serves as a beamsteering interconnection element.]

  11. A digital signal processing system for coherent laser radar

    NASA Technical Reports Server (NTRS)

    Hampton, Diana M.; Jones, William D.; Rothermel, Jeffry

    1991-01-01

    A data processing system for use with continuous-wave lidar is described in terms of its configuration and performance during the second survey mission of NASA's Global Backscatter Experiment. The system is designed to estimate a complete lidar spectrum in real time, record the data from two lidars, and monitor variables related to the lidar operating environment. The PC-based system includes a transient capture board, a digital-signal processing (DSP) board, and a low-speed data-acquisition board. Both unprocessed and processed lidar spectrum data are monitored in real time, and the results are compared to those of a previous non-DSP-based system. Because the DSP-based system is digital, it is slower than the surface-acoustic-wave signal processor, collecting 2500 spectra/s. However, the DSP-based system provides complete data sets at two wavelengths from the continuous-wave lidars.

  12. DataMed - an open source discovery index for finding biomedical datasets.

    PubMed

    Chen, Xiaoling; Gururaj, Anupama E; Ozyurt, Burak; Liu, Ruiling; Soysal, Ergin; Cohen, Trevor; Tiryaki, Firat; Li, Yueling; Zong, Nansu; Jiang, Min; Rogith, Deevakar; Salimi, Mandana; Kim, Hyeon-Eui; Rocca-Serra, Philippe; Gonzalez-Beltran, Alejandra; Farcas, Claudiu; Johnson, Todd; Margolis, Ron; Alter, George; Sansone, Susanna-Assunta; Fore, Ian M; Ohno-Machado, Lucila; Grethe, Jeffrey S; Xu, Hua

    2018-01-13

    Finding relevant datasets is important for promoting data reuse in the biomedical domain, but it is challenging given the volume and complexity of biomedical data. Here we describe the development of an open source biomedical data discovery system called DataMed, with the goal of promoting the building of additional data indexes in the biomedical domain. DataMed, which can efficiently index and search diverse types of biomedical datasets across repositories, is developed through the National Institutes of Health-funded biomedical and healthCAre Data Discovery Index Ecosystem (bioCADDIE) consortium. It consists of 2 main components: (1) a data ingestion pipeline that collects and transforms original metadata information to a unified metadata model, called DatA Tag Suite (DATS), and (2) a search engine that finds relevant datasets based on user-entered queries. In addition to describing its architecture and techniques, we evaluated individual components within DataMed, including the accuracy of the ingestion pipeline, the prevalence of the DATS model across repositories, and the overall performance of the dataset retrieval engine. Our manual review shows that the ingestion pipeline could achieve an accuracy of 90% and core elements of DATS had varied frequency across repositories. On a manually curated benchmark dataset, the DataMed search engine achieved an inferred average precision of 0.2033 and a precision at 10 (P@10, the proportion of relevant results in the top 10 search results) of 0.6022, by implementing advanced natural language processing and terminology services. Currently, we have made the DataMed system publicly available as an open source package for the biomedical community. © The Author 2018. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
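
    For readers unfamiliar with the retrieval metrics quoted above, the sketch below computes precision at 10 and standard (non-inferred) average precision for one ranked result list; the relevance judgements are hypothetical, and DataMed's reported figure uses inferred average precision, which differs from the plain version shown here.

```python
def precision_at_k(relevance, k=10):
    """Fraction of relevant results among the top-k retrieved (relevance: list of 0/1)."""
    return sum(relevance[:k]) / k

def average_precision(relevance):
    """Average precision for one ranked result list (relevance: list of 0/1)."""
    hits, cum = 0, 0.0
    for i, rel in enumerate(relevance, start=1):
        if rel:
            hits += 1
            cum += hits / i
    return cum / hits if hits else 0.0

# usage: hypothetical relevance judgements for the top results of one query
ranked = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 1]
print("P@10 =", precision_at_k(ranked, 10))
print("AP   =", round(average_precision(ranked), 4))
```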

  13. BIOSSES: a semantic sentence similarity estimation system for the biomedical domain.

    PubMed

    Sogancioglu, Gizem; Öztürk, Hakime; Özgür, Arzucan

    2017-07-15

    The amount of information available in textual format is rapidly increasing in the biomedical domain. Therefore, natural language processing (NLP) applications are becoming increasingly important to facilitate the retrieval and analysis of these data. Computing the semantic similarity between sentences is an important component in many NLP tasks including text retrieval and summarization. A number of approaches have been proposed for semantic sentence similarity estimation for generic English. However, our experiments showed that such approaches do not effectively cover biomedical knowledge and produce poor results for biomedical text. We propose several approaches for sentence-level semantic similarity computation in the biomedical domain, including string similarity measures and measures based on the distributed vector representations of sentences learned in an unsupervised manner from a large biomedical corpus. In addition, ontology-based approaches are presented that utilize general and domain-specific ontologies. Finally, a supervised regression based model is developed that effectively combines the different similarity computation metrics. A benchmark data set consisting of 100 sentence pairs from the biomedical literature is manually annotated by five human experts and used for evaluating the proposed methods. The experiments showed that the supervised semantic sentence similarity computation approach obtained the best performance (0.836 correlation with gold standard human annotations) and improved over the state-of-the-art domain-independent systems up to 42.6% in terms of the Pearson correlation metric. A web-based system for biomedical semantic sentence similarity computation, the source code, and the annotated benchmark data set are available at: http://tabilab.cmpe.boun.edu.tr/BIOSSES/ . gizemsogancioglu@gmail.com or arzucan.ozgur@boun.edu.tr. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e
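
    As a hedged illustration of the evaluation protocol, the sketch below scores a few made-up sentence pairs with a simple TF-IDF cosine-similarity baseline and reports the Pearson correlation against hypothetical gold scores; the actual benchmark contains 100 expert-annotated pairs, and the paper's best system is a supervised combination of several similarity measures.

```python
from scipy.stats import pearsonr
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical sentence pairs with gold similarity scores (both invented for illustration).
pairs = [
    ("The drug inhibits tumour growth in mice.",
     "Tumour growth in mice is suppressed by the compound."),
    ("BRCA1 mutations increase breast cancer risk.",
     "The protein localizes to the mitochondrial membrane."),
    ("Aspirin reduces the risk of myocardial infarction.",
     "Low-dose aspirin lowers heart attack risk."),
]
gold = [3.6, 0.4, 3.9]     # e.g. on a 0-4 similarity scale

# One simple unsupervised baseline: TF-IDF bag-of-words cosine similarity.
vec = TfidfVectorizer().fit([s for p in pairs for s in p])
pred = [float(cosine_similarity(vec.transform([a]), vec.transform([b]))[0, 0])
        for a, b in pairs]

# Evaluation follows the paper's protocol: Pearson correlation with gold annotations.
r, _ = pearsonr(pred, gold)
print("predicted similarities:", [round(p, 3) for p in pred])
print("Pearson r vs gold     :", round(r, 3))
```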

  14. Special Issue: 3D Printing for Biomedical Engineering.

    PubMed

    Chua, Chee Kai; Yeong, Wai Yee; An, Jia

    2017-02-28

    Three-dimensional (3D) printing has a long history of applications in biomedical engineering. The development and expansion of traditional biomedical applications are being advanced and enriched by new printing technologies. New biomedical applications such as bioprinting are highly attractive and trendy. This Special Issue aims to provide readers with a glimpse of the recent profile of 3D printing in biomedical research.

  15. Character-level neural network for biomedical named entity recognition.

    PubMed

    Gridach, Mourad

    2017-06-01

    Biomedical named entity recognition (BNER), which extracts important named entities such as genes and proteins, is a challenging task in automated systems that mine knowledge in biomedical texts. The previous state-of-the-art systems required large amounts of task-specific knowledge in the form of feature engineering, lexicons and data pre-processing to achieve high performance. In this paper, we introduce a novel neural network architecture that benefits from both word- and character-level representations automatically, by using a combination of bidirectional long short-term memory (LSTM) and conditional random field (CRF) eliminating the need for most feature engineering tasks. We evaluate our system on two datasets: JNLPBA corpus and the BioCreAtIvE II Gene Mention (GM) corpus. We obtained state-of-the-art performance by outperforming the previous systems. To the best of our knowledge, we are the first to investigate the combination of deep neural networks, CRF, word embeddings and character-level representation in recognizing biomedical named entities. Copyright © 2017 Elsevier Inc. All rights reserved.
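
    A condensed PyTorch sketch of the word-plus-character representation described above is given below; for brevity the CRF output layer is replaced by per-token tag scores, and all vocabulary sizes, dimensions, and inputs are placeholders rather than the paper's actual configuration.

```python
import torch
import torch.nn as nn

class CharWordBiLSTMTagger(nn.Module):
    """Sketch of a word + character-level BiLSTM tagger (the CRF layer used in the
    paper is replaced here by a simple per-token linear scorer for brevity)."""

    def __init__(self, n_words, n_chars, n_tags, w_dim=100, c_dim=25, c_hid=25, hid=100):
        super().__init__()
        self.word_emb = nn.Embedding(n_words, w_dim, padding_idx=0)
        self.char_emb = nn.Embedding(n_chars, c_dim, padding_idx=0)
        self.char_lstm = nn.LSTM(c_dim, c_hid, batch_first=True, bidirectional=True)
        self.word_lstm = nn.LSTM(w_dim + 2 * c_hid, hid, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hid, n_tags)

    def forward(self, words, chars):
        # words: (batch, seq)   chars: (batch, seq, max_word_len)
        b, s, L = chars.shape
        # character-level representation of each word = final states of a char BiLSTM
        c = self.char_emb(chars.view(b * s, L))              # (b*s, L, c_dim)
        _, (h, _) = self.char_lstm(c)                        # h: (2, b*s, c_hid)
        char_repr = torch.cat([h[0], h[1]], dim=-1).view(b, s, -1)
        x = torch.cat([self.word_emb(words), char_repr], dim=-1)
        y, _ = self.word_lstm(x)                             # (b, s, 2*hid)
        return self.out(y)                                   # per-token tag scores

# usage with random indices (vocabulary sizes and tag set are placeholders)
model = CharWordBiLSTMTagger(n_words=5000, n_chars=80, n_tags=5)
words = torch.randint(1, 5000, (2, 7))          # batch of 2 sentences, 7 tokens each
chars = torch.randint(1, 80, (2, 7, 12))        # each token padded to 12 characters
print(model(words, chars).shape)                 # torch.Size([2, 7, 5])
```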

  16. Current trends for customized biomedical software tools.

    PubMed

    Khan, Haseeb Ahmad

    2017-01-01

    In the past, biomedical scientists were solely dependent on expensive commercial software packages for various applications. However, the advent of user-friendly programming languages and open source platforms has revolutionized the development of simple and efficient customized software tools for solving specific biomedical problems. Many of these tools are designed and developed by biomedical scientists independently or with the support of computer experts and often made freely available for the benefit of scientific community. The current trends for customized biomedical software tools are highlighted in this short review.

  17. Commercial Biomedical Experiments

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Experiments to seek solutions for a range of biomedical issues are at the heart of several investigations that will be hosted by the Commercial Instrumentation Technology Associates (ITA), Inc. Biomedical Experiments (CIBX-2) payload. CIBX-2 is unique, encompassing more than 20 separate experiments including cancer research, commercial experiments, and student hands-on experiments from 10 schools as part of ITA's ongoing University Among the Stars program. Valerie Cassanto of ITA checks the Canadian Protein Crystallization Experiment (CAPE) carried by STS-86 to Mir in 1997. The experiments are sponsored by NASA's Space Product Development Program (SPD).

  18. Recent development and biomedical applications of self-healing hydrogels.

    PubMed

    Wang, Yinan; Adokoh, Christian K; Narain, Ravin

    2018-01-01

    Hydrogels are of special importance owing to their high water content and their many applications in biomedical and bioengineering research. Self-healing is a common phenomenon in living organisms, and a material's ability to repair itself after physical, chemical, or mechanical damage and to recover its original properties fully or partially points to prospective therapeutic applications. Because of complicated preparation and the need to select suitable materials, the application of many host-guest supramolecular polymeric hydrogels is limited. Thus, the design and construction of self-repairing materials are highly desirable for effectively increasing the lifetime of a functional material. Recent advances in materials science, bioengineering, and nanotechnology have led to the design of biologically relevant self-healing hydrogels for therapeutic applications. This review focuses on the recent development of self-healing hydrogels for biomedical application. Areas covered: The strategies for making self-healing hydrogels and their healing mechanisms are discussed. The significance of self-healing hydrogels for biomedical application is also highlighted in areas such as 3D/4D printing, cell/drug delivery, and soft actuators. Expert opinion: Materials that can self-repair damage and regain the desired mechanical properties are excellent candidates for a range of biomedical uses, especially if their characteristics are similar to those of soft tissues. Self-healing hydrogels have been synthesized and shown to exhibit characteristics similar to those of human tissues; however, significant improvement is required in fabricating them from inexpensive and nontoxic, non-hazardous materials and techniques, and further fine-tuning of the self-healing properties is needed for specific biomedical uses.

  19. Jitter model and signal processing techniques for pulse width modulation optical recording

    NASA Technical Reports Server (NTRS)

    Liu, Max M.-K.

    1991-01-01

    A jitter model and signal processing techniques are discussed for data recovery in Pulse Width Modulation (PWM) optical recording. In PWM, information is stored by modulating the sizes of sequential marks alternating in magnetic polarization or in material structure. Jitter, defined as the deviation from the original mark size in the time domain, will result in detection errors if it is excessively large. A new approach is taken to data recovery by first using a high-speed counter clock to convert time marks to amplitude marks, and then applying signal processing techniques to minimize jitter according to the jitter model. The signal processing techniques include motor speed and intersymbol interference equalization, differential and additive detection, and differential and additive modulation.
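
    A simplified numeric sketch of the time-to-amplitude conversion and motor-speed equalization steps is given below; the mark lengths, counter rate, drift, and jitter magnitudes are assumptions for illustration, and the smoothing-based equalizer is only a stand-in for the detection and equalization techniques listed in the abstract.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d

rng = np.random.default_rng(6)

# Hypothetical PWM channel: data are carried by nominal mark lengths of 100/150/200 ns.
nominal_ns = rng.choice([100.0, 150.0, 200.0], size=400)

# Impairments per the jitter model: a slow motor-speed variation that stretches every
# mark, plus random edge jitter (all magnitudes here are assumptions for illustration).
speed = 1.0 + 0.05 * np.sin(2 * np.pi * np.arange(len(nominal_ns)) / 60.0)
mark_ns = nominal_ns * speed + 3.0 * rng.standard_normal(len(nominal_ns))

# Step 1: "time marks to amplitude marks" -- quantize each mark with a fast counter clock.
counter_hz = 400e6                                     # assumed 400 MHz counter (2.5 ns/tick)
counts = np.round(mark_ns * 1e-9 * counter_hz)
levels = np.round(np.array([100.0, 150.0, 200.0]) * 1e-9 * counter_hz)

def snap(c):
    """Decision: map each count to the nearest nominal mark length (in ticks)."""
    return levels[np.argmin(np.abs(c[:, None] - levels[None, :]), axis=1)]

# Step 2: simple motor-speed equalization -- estimate the slow drift from the ratio of
# observed counts to first-pass decisions, smooth it, and divide it out.
drift = uniform_filter1d(counts / snap(counts), size=15)
equalized = counts / drift

truth = np.round(nominal_ns * 1e-9 * counter_hz)
rms = lambda e: float(np.sqrt(np.mean(e ** 2)))
print("RMS timing error before equalization:", round(rms(counts - truth), 2), "ticks")
print("RMS timing error after equalization :", round(rms(equalized - truth), 2), "ticks")
```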

  20. P-Code-Enhanced Encryption-Mode Processing of GPS Signals

    NASA Technical Reports Server (NTRS)

    Young, Lawrence; Meehan, Thomas; Thomas, Jess B.

    2003-01-01

    A method of processing signals in a Global Positioning System (GPS) receiver has been invented to enable the receiver to recover some of the information that is otherwise lost when GPS signals are encrypted at the transmitters. The need for this method arises because, at the option of the military, precision GPS code (P-code) is sometimes encrypted by a secret binary code, denoted the A-code. Authorized users can recover the full signal with knowledge of the A-code. However, even in the absence of knowledge of the A-code, one can track the encrypted signal by use of an estimate of the A-code. The present invention is a method of making and using such an estimate. In comparison with prior such methods, this method makes it possible to recover more of the lost information and obtain greater accuracy.