Sample records for signal processing tool

  1. pySPACE—a signal processing and classification environment in Python

    PubMed Central

    Krell, Mario M.; Straube, Sirko; Seeland, Anett; Wöhrle, Hendrik; Teiwes, Johannes; Metzen, Jan H.; Kirchner, Elsa A.; Kirchner, Frank

    2013-01-01

    In neuroscience, large amounts of data are recorded to provide insights into cerebral information processing and function. The successful extraction of the relevant signals becomes more and more challenging due to increasing complexities in acquisition techniques and in the questions addressed. Here, automated signal processing and machine learning tools can help to process the data, e.g., to separate signal and noise. With the presented software pySPACE (http://pyspace.github.io/pyspace), signal processing algorithms can be compared and applied automatically to time series data, either with the aim of finding a suitable preprocessing, or of training supervised algorithms to classify the data. pySPACE was originally built to process multi-sensor windowed time series data, such as event-related potentials from the electroencephalogram (EEG). The software provides automated data handling, distributed processing, modular build-up of signal processing chains, and tools for visualization and performance evaluation. Included in the software are various algorithms such as temporal and spatial filters, feature generation and selection, classification algorithms, and evaluation schemes. Further, interfaces to other signal processing tools are provided and, since pySPACE is a modular framework, it can be extended with new algorithms according to individual needs. In the presented work, the structural hierarchies are described, and it is illustrated how users and developers can interface the software and execute it in offline and online modes. Configuration of pySPACE uses the YAML format, so that programming skills are not mandatory for usage. The concept of pySPACE is to have one comprehensive tool that can be used to perform complete signal processing and classification tasks. It further allows users to define their own algorithms, or to integrate and use existing libraries. PMID:24399965

  2. pySPACE-a signal processing and classification environment in Python.

    PubMed

    Krell, Mario M; Straube, Sirko; Seeland, Anett; Wöhrle, Hendrik; Teiwes, Johannes; Metzen, Jan H; Kirchner, Elsa A; Kirchner, Frank

    2013-01-01

    In neuroscience, large amounts of data are recorded to provide insights into cerebral information processing and function. The successful extraction of the relevant signals becomes more and more challenging due to increasing complexities in acquisition techniques and in the questions addressed. Here, automated signal processing and machine learning tools can help to process the data, e.g., to separate signal and noise. With the presented software pySPACE (http://pyspace.github.io/pyspace), signal processing algorithms can be compared and applied automatically to time series data, either with the aim of finding a suitable preprocessing, or of training supervised algorithms to classify the data. pySPACE was originally built to process multi-sensor windowed time series data, such as event-related potentials from the electroencephalogram (EEG). The software provides automated data handling, distributed processing, modular build-up of signal processing chains, and tools for visualization and performance evaluation. Included in the software are various algorithms such as temporal and spatial filters, feature generation and selection, classification algorithms, and evaluation schemes. Further, interfaces to other signal processing tools are provided and, since pySPACE is a modular framework, it can be extended with new algorithms according to individual needs. In the presented work, the structural hierarchies are described, and it is illustrated how users and developers can interface the software and execute it in offline and online modes. Configuration of pySPACE uses the YAML format, so that programming skills are not mandatory for usage. The concept of pySPACE is to have one comprehensive tool that can be used to perform complete signal processing and classification tasks. It further allows users to define their own algorithms, or to integrate and use existing libraries.
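
    The core idea of pySPACE — composable processing nodes applied to windowed multi-sensor data — can be sketched in a few lines. The classes below are illustrative stand-ins written for this note, not the actual pySPACE API:

```python
import numpy as np

# Minimal sketch of a modular signal-processing chain in the spirit of
# pySPACE's node concept (illustrative only -- not the actual pySPACE API).
# Each node transforms windowed multi-sensor time series and can be chained.

class DetrendNode:
    """Remove the per-channel mean from each window."""
    def execute(self, windows):
        return windows - windows.mean(axis=-1, keepdims=True)

class VarianceFeatureNode:
    """Reduce each (channels x samples) window to per-channel variances."""
    def execute(self, windows):
        return windows.var(axis=-1)

class NodeChain:
    """Apply a list of nodes in sequence, like a processing pipeline."""
    def __init__(self, nodes):
        self.nodes = nodes
    def execute(self, data):
        for node in self.nodes:
            data = node.execute(data)
        return data

# 10 windows, 4 channels, 128 samples each (synthetic data)
windows = np.random.default_rng(0).normal(size=(10, 4, 128))
features = NodeChain([DetrendNode(), VarianceFeatureNode()]).execute(windows)
```

    In pySPACE itself such a chain would be declared in a YAML file rather than in code, which is what makes the tool usable without programming skills.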

  3. Audio signal analysis for tool wear monitoring in sheet metal stamping

    NASA Astrophysics Data System (ADS)

    Ubhayaratne, Indivarie; Pereira, Michael P.; Xiang, Yong; Rolfe, Bernard F.

    2017-02-01

    Stamping tool wear can significantly degrade product quality, and hence, online tool condition monitoring is a timely need in many manufacturing industries. Even though a large amount of research has been conducted employing different sensor signals, there is still an unmet demand for a low-cost, easy-to-set-up condition monitoring system. Audio signal analysis is a simple method that has the potential to meet this demand, but it has not previously been used for stamping process monitoring. Hence, this paper studies the existence and the significance of the correlation between emitted sound signals and the wear state of sheet metal stamping tools. The corrupting sources generated by the tooling of the stamping press and surrounding machinery have higher amplitudes than the sound emitted by the stamping operation itself. Therefore, a newly developed semi-blind signal extraction technique was employed as a pre-processing step to mitigate the contribution of these corrupting sources. The spectral analysis results of the raw and extracted signals demonstrate a significant qualitative relationship between wear progression and the emitted sound signature. This study lays the basis for employing low-cost audio signal analysis in the development of a real-time industrial tool condition monitoring system.
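
    The spectral-comparison idea in this abstract can be illustrated on synthetic data. The signals, frequencies and band below are invented for the demonstration, and the paper's semi-blind source-extraction step is not reproduced:

```python
import numpy as np

# Illustrative comparison of emitted-sound spectra for a "fresh" and a "worn"
# condition, loosely following the paper's approach of looking for
# wear-related changes in band energy (synthetic signals, arbitrary band).

fs = 44100
t = np.arange(fs) / fs
rng = np.random.default_rng(1)
fresh = np.sin(2 * np.pi * 500 * t) + 0.1 * rng.normal(size=fs)
# the "worn" recording gains an extra high-frequency component
worn = fresh + 0.5 * np.sin(2 * np.pi * 6000 * t)

def band_energy(x, fs, lo, hi):
    """Energy of the magnitude spectrum inside a frequency band [lo, hi)."""
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    mask = (freqs >= lo) & (freqs < hi)
    return float(np.sum(spec[mask] ** 2))

e_fresh = band_energy(fresh, fs, 5000, 7000)
e_worn = band_energy(worn, fs, 5000, 7000)   # much larger for the worn case
```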

  4. Open source cardiology electronic health record development for DIGICARDIAC implementation

    NASA Astrophysics Data System (ADS)

    Dugarte, Nelson; Medina, Rubén; Huiracocha, Lourdes; Rojas, Rubén

    2015-12-01

    This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of a structured algorithm designed under the Health Level-7 (HL7) international standard. The novelty of the system is the integration of high-resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools and telecardiology tools. The acquisition tools manage and control the DIGICARDIAC electrocardiograph functions. The processing tools support analysis of the HRECG signal in search of patterns indicative of cardiovascular pathologies. The incorporation of telecardiology tools allows the system to communicate with other health care centers, decreasing access time to patient information. The CEHR system was developed entirely with open source software. Preliminary results of process validation showed the system's efficiency.

  5. Estimation of the influence of tool wear on force signals: A finite element approach in AISI 1045 orthogonal cutting

    NASA Astrophysics Data System (ADS)

    Equeter, Lucas; Ducobu, François; Rivière-Lorphèvre, Edouard; Abouridouane, Mustapha; Klocke, Fritz; Dehombreux, Pierre

    2018-05-01

    Industrial concerns arise regarding the significant cost of cutting tools in the machining process. In particular, an improper replacement policy can lead either to scraps, or to early tool replacements that discard still-serviceable tools. ISO 3685 provides the flank-wear end-of-life criterion. Flank wear is also the nominal type of wear for the longest tool lifetimes under optimal cutting conditions. Its consequences include poor surface roughness and dimensional discrepancies. In order to aid the replacement decision process, several tool condition monitoring techniques have been suggested. Force signals were shown in the literature to be strongly linked with tool flank wear. It can therefore be assumed that force signals are highly relevant for monitoring the condition of cutting tools and for providing decision-aid information in the framework of their maintenance and replacement. The objective of this work is to correlate tool flank wear with numerically computed force signals. The present work uses a Finite Element Model with a Coupled Eulerian-Lagrangian approach. The geometry of the tool is changed for different runs of the model, in order to obtain results that are specific to a certain level of wear. The model is assessed by comparison with experimental data gathered earlier on fresh tools. Using the model at constant cutting parameters, force signals under different tool wear states are computed for each studied tool geometry. These signals are qualitatively compared with relevant data from the literature. At this point, no quantitative comparison could be performed on worn tools because the reviewed literature does not provide similar studies in this material, either numerical or experimental. Therefore, further development of this work should include experimental campaigns aimed at collecting cutting force signals and assessing the numerical results achieved through this work.

  6. Digital Signal Processing and Control for the Study of Gene Networks

    NASA Astrophysics Data System (ADS)

    Shin, Yong-Jun

    2016-04-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  7. Digital Signal Processing and Control for the Study of Gene Networks.

    PubMed

    Shin, Yong-Jun

    2016-04-22

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  8. Digital Signal Processing and Control for the Study of Gene Networks

    PubMed Central

    Shin, Yong-Jun

    2016-01-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks. PMID:27102828
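
    The article's central point — that a gene network can be treated as a discrete-time system and steered with a digital control law — can be illustrated with a first-order difference equation. All parameters below are hypothetical:

```python
# A single gene-expression node modeled as a first-order discrete-time system
# y[k+1] = a*y[k] + b*u[k], driven toward a setpoint by a digital proportional
# controller. Parameters are invented for illustration, not taken from data.

a, b = 0.9, 0.5          # decay and input gain of the expression dynamics
setpoint = 2.0           # desired expression level
kp = 1.0                 # proportional controller gain

y = 0.0
trajectory = [y]
for _ in range(50):
    u = kp * (setpoint - y)          # digital proportional control law
    y = a * y + b * u                # one step of the difference equation
    trajectory.append(y)

# closed-loop fixed point: y* = b*kp*setpoint / (1 - a + b*kp) = 5/3 here,
# reached because the closed-loop pole |a - b*kp| = 0.4 lies inside the
# unit circle
steady = b * kp * setpoint / (1 - a + b * kp)
```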

  9. Tool Condition Monitoring in Micro-End Milling using wavelets

    NASA Astrophysics Data System (ADS)

    Dubey, N. K.; Roushan, A.; Rao, U. S.; Sandeep, K.; Patra, K.

    2018-04-01

    In this work, a Tool Condition Monitoring (TCM) strategy is developed for micro-end milling of titanium alloy and mild steel work-pieces. Full-immersion slot milling experiments are conducted using a solid tungsten carbide end mill for more than 1900 s to accumulate a reasonable amount of tool wear. During the micro-end milling process, cutting force and vibration signals are acquired using a Kistler piezo-electric 3-component force dynamometer (9256C2) and an accelerometer (NI cDAQ-9188), respectively. The force components and the vibration signals are processed using the Discrete Wavelet Transform (DWT) in both the time and frequency domains. A 5-level wavelet packet decomposition using the Db8 wavelet is carried out, and the detail coefficients D1 to D5 for each of the signals are obtained. The results of the wavelet transformation are correlated with the tool wear. In the case of the vibration signals, de-noising is done for the higher frequency components (D1), while the force signals are de-noised for the lower frequency components (D5). An increasing value of the MAD (mean absolute deviation) of the detail coefficients for successive channels indicated tool wear. The predictions of tool wear are confirmed by the actual wear observed in SEM images of the worn tool.
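
    The decomposition-plus-MAD indicator described above can be sketched as follows. The study used a Db8 wavelet packet decomposition (available, e.g., in PyWavelets); a hand-rolled Haar DWT is substituted here so the sketch needs nothing beyond NumPy:

```python
import numpy as np

# Illustration of multi-level wavelet decomposition and the MAD statistic used
# as a wear indicator in the abstract. A Haar wavelet stands in for the Db8
# wavelet packets of the study, purely to keep the example self-contained.

def haar_dwt(x):
    """One Haar DWT level: (approximation, detail) coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def wavedec(x, levels):
    """Multi-level decomposition: returns the detail bands [D1, ..., Dn]."""
    details, a = [], x
    for _ in range(levels):
        a, d = haar_dwt(a)
        details.append(d)
    return details

def mad(c):
    """Mean absolute deviation of a coefficient band."""
    c = np.asarray(c, dtype=float)
    return float(np.mean(np.abs(c - c.mean())))

rng = np.random.default_rng(2)
signal = np.sin(np.linspace(0, 40 * np.pi, 1024)) + 0.2 * rng.normal(size=1024)
details = wavedec(signal, 5)          # D1 (finest) ... D5 (coarsest)
mads = [mad(d) for d in details]      # per-band wear indicator candidates
```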

  10. EpiTools, A software suite for presurgical brain mapping in epilepsy: Intracerebral EEG.

    PubMed

    Medina Villalon, S; Paz, R; Roehri, N; Lagarde, S; Pizzo, F; Colombet, B; Bartolomei, F; Carron, R; Bénar, C-G

    2018-06-01

    In pharmacoresistant epilepsy, exploration with depth electrodes can be needed to precisely define the epileptogenic zone. Accurate location of these electrodes is thus essential for the interpretation of Stereotaxic EEG (SEEG) signals. As SEEG analysis increasingly relies on signal processing, it is crucial to link these results to the patient's anatomy. Our aims were thus to develop a suite of software tools, called "EpiTools", able to i) precisely and automatically localize the position of each SEEG contact and ii) display the results of signal analysis within each patient's anatomy. The first tool, GARDEL (GUI for Automatic Registration and Depth Electrode Localization), is able to automatically localize SEEG contacts and to label each contact according to a pre-specified nomenclature (for instance that of FreeSurfer or MarsAtlas). The second tool, 3Dviewer, enables visualization, within the 3D anatomy of the patient, of the origin of signal processing results such as rates of biomarkers, connectivity graphs or the Epileptogenicity Index. GARDEL was validated in 30 patients by clinicians and proved to be highly reliable in determining, within the patient's individual anatomy, the actual location of contacts. GARDEL is a fully automatic electrode localization tool needing limited user interaction (only for electrode naming or contact correction). The 3Dviewer is able to read signal processing results and to display them in relation to the patient's anatomy. EpiTools can help speed up the interpretation of SEEG data and improve its precision. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Lateral position detection and control for friction stir systems

    DOEpatents

    Fleming, Paul; Lammlein, David; Cook, George E.; Wilkes, Don Mitchell; Strauss, Alvin M.; Delapp, David; Hartman, Daniel A.

    2010-12-14

    A friction stir system for processing at least a first workpiece includes a spindle actuator coupled to a rotary tool comprising a rotating member for contacting and processing the first workpiece. A detection system is provided for obtaining information related to a lateral alignment of the rotating member. The detection system comprises at least one sensor for measuring a force experienced by the rotary tool or a parameter related to the force experienced by the rotary tool during processing, wherein the sensor provides sensor signals. A signal processing system is coupled to receive and analyze the sensor signals and determine a lateral alignment of the rotating member relative to a selected lateral position, a selected path, or a direction to decrease a lateral distance relative to the selected lateral position or selected path. In one embodiment, the friction stir system can be embodied as a closed loop tracking system, such as a robot-based tracked friction stir welding (FSW) or friction stir processing (FSP) system.

  12. On-line Tool Wear Detection on DCMT070204 Carbide Tool Tip Based on Noise Cutting Audio Signal using Artificial Neural Network

    NASA Astrophysics Data System (ADS)

    Prasetyo, T.; Amar, S.; Arendra, A.; Zam Zami, M. K.

    2018-01-01

    This study develops an on-line detection system to predict the wear of the DCMT070204 tool tip during the cutting process of a workpiece. The machine used in this research is a CNC ProTurn 9000 cutting ST42 steel cylinders. The audio signal was captured using a microphone placed on the tool post and recorded in Matlab, at a sampling rate of 44.1 kHz and a sampling size of 1024. The recorded data comprise 110 records derived from the audio signal while cutting with a normal tool tip and with a worn one. Signal features were then extracted in the frequency domain using the Fast Fourier Transform, and features were selected based on correlation analysis. Tool wear classification was performed using an artificial neural network with the 33 selected input features, trained with the backpropagation method. Classification performance testing yielded an accuracy of 74%.
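
    The FFT-feature extraction and correlation-based selection steps can be sketched on synthetic data (the neural network itself is omitted, and all signal parameters below are invented):

```python
import numpy as np

# Sketch of the feature pipeline in the abstract: take FFT magnitudes of
# audio frames, then rank spectral features by their correlation with the
# normal/worn label. Synthetic frames; one bin is made informative on purpose.

rng = np.random.default_rng(3)
n_frames, frame_len = 110, 1024

labels = rng.integers(0, 2, size=n_frames)        # 0 = normal, 1 = worn
t = np.arange(frame_len)
frames = 0.3 * rng.normal(size=(n_frames, frame_len))
# worn frames carry an extra tone, so one FFT bin correlates with the label
frames[labels == 1] += np.sin(2 * np.pi * 64 * t / frame_len)

features = np.abs(np.fft.rfft(frames, axis=1))    # magnitude spectra

def label_correlation(feats, y):
    """Absolute Pearson correlation of each feature column with the label."""
    f = (feats - feats.mean(axis=0)) / (feats.std(axis=0) + 1e-12)
    z = (y - y.mean()) / (y.std() + 1e-12)
    return np.abs(f.T @ z) / len(y)

scores = label_correlation(features, labels)
best_bin = int(np.argmax(scores))                 # recovers the injected tone
```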

  13. The technique of entropy optimization in motor current signature analysis and its application in the fault diagnosis of gear transmission

    NASA Astrophysics Data System (ADS)

    Chen, Xiaoguang; Liang, Lin; Liu, Fei; Xu, Guanghua; Luo, Ailing; Zhang, Sicong

    2012-05-01

    Nowadays, Motor Current Signature Analysis (MCSA) is widely used in the fault diagnosis and condition monitoring of machine tools. However, because the current signal has a low SNR (signal-to-noise ratio), it is difficult to identify the feature frequencies of machine tools from the complex current spectrum, in which the feature frequencies are often dense and overlapping, using traditional signal processing methods such as the FFT. In studying Motor Current Signature Analysis (MCSA), it was found that entropy, which is associated with the probability distribution of any random variable, is important for frequency identification; it therefore plays an important role in signal processing. In order to solve the problem that the feature frequencies are difficult to identify, an entropy optimization technique based on the motor current signal is presented in this paper for extracting the typical feature frequencies of machine tools while effectively suppressing disturbances. Simulated current signals were generated in MATLAB, and a current signal was obtained from a complex gearbox at an iron works in Luxembourg. In diagnosis, MCSA is combined with entropy optimization. Both simulated and experimental results show that this technique is efficient, accurate and reliable enough to extract the feature frequencies of the current signal, providing a new strategy for the fault diagnosis and condition monitoring of machine tools.

  14. SCOUT: A Fast Monte-Carlo Modeling Tool of Scintillation Camera Output

    PubMed Central

    Hunter, William C. J.; Barrett, Harrison H.; Lewellen, Thomas K.; Miyaoka, Robert S.; Muzi, John P.; Li, Xiaoli; McDougald, Wendy; MacDonald, Lawrence R.

    2011-01-01

    We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:22072297

  15. SCOUT: a fast Monte-Carlo modeling tool of scintillation camera output†

    PubMed Central

    Hunter, William C J; Barrett, Harrison H.; Muzi, John P.; McDougald, Wendy; MacDonald, Lawrence R.; Miyaoka, Robert S.; Lewellen, Thomas K.

    2013-01-01

    We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout of a scintillation camera. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:23640136

  16. Novel sonar signal processing tool using Shannon entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quazi, A.H.

    1996-06-01

    Traditionally, conventional signal processing extracts information from sonar signals using amplitude, signal energy or frequency-domain quantities obtained using spectral analysis techniques. The object is to investigate an alternate approach which is entirely different from that of traditional signal processing. This alternate approach is to utilize the Shannon entropy as a tool for the processing of sonar signals with emphasis on detection, classification, and localization, leading to superior sonar system performance. Traditionally, sonar signals are processed coherently, semi-coherently, and incoherently, depending upon the a priori knowledge of the signals and noise. Here, the detection, classification, and localization technique will be based on the concept of the entropy of the random process. Under a constant energy constraint, the entropy of a received process bearing a finite number of sample points is maximum when hypothesis H_0 (that the received process consists of noise alone) is true, and decreases when a correlated signal is present (H_1). Therefore, the strategy used for detection is: (I) calculate the entropy of the received data; then, (II) compare the entropy with the maximum value; and, finally, (III) make the decision: H_1 is assumed if the difference is large compared to a pre-assigned threshold, and H_0 is assumed otherwise. The test statistics will differ between the entropies under H_0 and H_1. Here, we shall show the simulated results for detecting stationary and non-stationary signals in noise, and results on detection of defects in a Plexiglas bar using an ultrasonic experiment conducted by Hughes. © 1996 American Institute of Physics.
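
    A minimal numerical sketch of this detection strategy, assuming a histogram-based entropy estimate and an arbitrary illustrative threshold (both are choices made for this note, not taken from the paper):

```python
import numpy as np

# Entropy-based detection sketch: under a fixed-energy constraint, a
# noise-only record (H0) has near-maximal entropy, while a correlated signal
# (H1, here a sinusoid) lowers it. Threshold and signals are illustrative.

def entropy_estimate(x, bins=64):
    """Shannon entropy (bits) of a fixed-range histogram estimate of x."""
    counts, _ = np.histogram(x, bins=bins, range=(-5.0, 5.0))
    p = counts[counts > 0] / len(x)
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(4)
n = 100_000
noise = rng.normal(size=n)                                   # H0: noise alone
tone = np.sqrt(2) * np.sin(2 * np.pi * 0.01 * np.arange(n))  # unit-power tone

def normalize_energy(x):
    return x / np.std(x)                 # enforce the constant-energy constraint

h0 = entropy_estimate(normalize_energy(noise))
h1 = entropy_estimate(normalize_energy(tone + 0.3 * rng.normal(size=n)))
decision = "H1" if (h0 - h1) > 0.1 else "H0"   # entropy dropped => signal present
```

    The Gaussian record attains the largest entropy for a given variance, so the drop h0 − h1 acts as the detection statistic.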

  17. BioSigPlot: an opensource tool for the visualization of multi-channel biomedical signals with Matlab.

    PubMed

    Boudet, Samuel; Peyrodie, Laurent; Gallois, Philippe; de l'Aulnoit, Denis Houzé; Cao, Hua; Forzy, Gérard

    2013-01-01

    This paper presents Matlab-based software (MathWorks Inc.) called BioSigPlot for the visualization of multi-channel biomedical signals, particularly the EEG. This tool is designed for researchers in both engineering and medicine who have to collaborate to visualize and analyze signals. It aims to provide a highly customizable interface for signal processing experimentation in order to plot several kinds of signals while integrating the common tools for physicians. The main advantages compared to other existing programs are multi-dataset display, synchronization with video and online processing. On top of that, this program uses object-oriented programming, so that the interface can be controlled by both graphic controls and command lines. It can be used as an EEGLAB plug-in but, since it is not limited to EEG, it is distributed separately. BioSigPlot is distributed free of charge (http://biosigplot.sourceforge.net), under the terms of the GNU Public License for non-commercial use and open source development.

  18. Acoustic emission detection of macro-cracks on engraving tool steel inserts during the injection molding cycle using PZT sensors.

    PubMed

    Svečko, Rajko; Kusić, Dragan; Kek, Tomaž; Sarjaš, Andrej; Hančič, Aleš; Grum, Janez

    2013-05-14

    This paper presents an improved monitoring system for the failure detection of engraving tool steel inserts during the injection molding cycle. This system uses acoustic emission PZT sensors mounted through acoustic waveguides on the engraving insert. We were thus able to clearly distinguish the defect through measured AE signals. Two engraving tool steel inserts were tested during the production of standard test specimens, each under the same processing conditions. By closely comparing the captured AE signals on both engraving inserts during the filling and packing stages, we were able to detect the presence of macro-cracks on one engraving insert. Gabor wavelet analysis was used for closer examination of the captured AE signals' peak amplitudes during the filling and packing stages. The obtained results revealed that such a system could be used successfully as an improved tool for monitoring the integrity of an injection molding process.

  19. Acoustic Emission Detection of Macro-Cracks on Engraving Tool Steel Inserts during the Injection Molding Cycle Using PZT Sensors

    PubMed Central

    Svečko, Rajko; Kusić, Dragan; Kek, Tomaž; Sarjaš, Andrej; Hančič, Aleš; Grum, Janez

    2013-01-01

    This paper presents an improved monitoring system for the failure detection of engraving tool steel inserts during the injection molding cycle. This system uses acoustic emission PZT sensors mounted through acoustic waveguides on the engraving insert. We were thus able to clearly distinguish the defect through measured AE signals. Two engraving tool steel inserts were tested during the production of standard test specimens, each under the same processing conditions. By closely comparing the captured AE signals on both engraving inserts during the filling and packing stages, we were able to detect the presence of macro-cracks on one engraving insert. Gabor wavelet analysis was used for closer examination of the captured AE signals' peak amplitudes during the filling and packing stages. The obtained results revealed that such a system could be used successfully as an improved tool for monitoring the integrity of an injection molding process. PMID:23673677

  20. Cyclostationarity approach for monitoring chatter and tool wear in high speed milling

    NASA Astrophysics Data System (ADS)

    Lamraoui, M.; Thomas, M.; El Badaoui, M.

    2014-02-01

    Detection of chatter and tool wear is crucial in the machining process, and their monitoring is a key issue for: (1) ensuring better surface quality, (2) increasing productivity and (3) protecting both the machines and the workpiece. This paper presents an investigation of chatter and tool wear using the cyclostationary method to process the vibration signals acquired from high speed milling. Experimental cutting tests were carried out on a slot milling operation of an aluminum alloy. The experimental set-up is designed for the acquisition of accelerometer signals and of encoding information picked up from an encoder. The encoder signal is used for re-sampling the accelerometer signals in the angular domain using a specific algorithm that was developed in the LASPI laboratory. Cyclostationary analysis of the accelerometer signals has been applied for monitoring chatter and tool wear in high speed milling. Cyclostationarity appears in the average properties (first order) of the signals and in their energetic properties (second order), and it generates spectral lines at cyclic frequencies in the spectral correlation. Angular power and kurtosis are used to analyze chatter phenomena. The formation of chatter is characterized by unstable, chaotic motion of the tool and strong anomalous fluctuations of cutting forces. Results show that stable machining generates only very few second-order cyclostationary components, while chatter is strongly correlated with second-order cyclostationary components. When machining in the unstable region, chatter results in flat angular kurtosis and flat angular power, like a pseudo-random (white) signal with a flat spectrum. Results reveal that the spectral correlation and the Wigner-Ville spectrum (or integrated Wigner-Ville) issued from second-order cyclostationarity are efficient parameters for the early diagnosis of faults in high speed machining, such as chatter, tool wear and bearing faults, compared to traditional stationary methods. The Wigner-Ville representation of the residual signal shows that the energy corresponding to the tooth passing decreases when the chatter phenomenon occurs. The effect of tool wear and of the number of broken teeth on the excitation of structure resonances also appears in the Wigner-Ville representation.
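
    The angular re-sampling step that precedes the cyclostationary analysis can be sketched with plain linear interpolation; the LASPI algorithm itself is not described in the abstract, and the speed profile below is invented:

```python
import numpy as np

# Angular re-sampling sketch: vibration samples taken uniformly in TIME are
# interpolated onto a grid uniform in spindle ANGLE, using the (simulated)
# encoder angle. A component locked to the rotation becomes exactly periodic
# in the angular domain even though the spindle accelerates.

fs = 10_000                      # accelerometer sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
# spindle speeding up: angle grows quadratically in time (rad)
angle = 2 * np.pi * (10 * t + 5 * t ** 2)
vibration = np.sin(4 * angle)    # a component locked to 4 events/revolution

# resample to a fixed number of points per revolution
n_per_rev = 256
revs = angle[-1] / (2 * np.pi)
uniform_angle = np.linspace(0, angle[-1], int(revs * n_per_rev))
vib_angular = np.interp(uniform_angle, angle, vibration)
# vib_angular is now (to interpolation accuracy) sin(4 * uniform_angle)
```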

  1. Bayesian or Laplacian inference, entropy, information theory and information geometry in data and signal processing

    NASA Astrophysics Data System (ADS)

    Mohammad-Djafari, Ali

    2015-01-01

    The main object of this tutorial article is first to review the main inference tools using Bayesian approach, Entropy, Information theory and their corresponding geometries. This review is focused mainly on the ways these tools have been used in data, signal and image processing. After a short introduction of the different quantities related to the Bayes rule, the entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, Fisher information, we will study their use in different fields of data and signal processing such as: entropy in source separation, Fisher information in model order selection, different Maximum Entropy based methods in time series spectral estimation and finally, general linear inverse problems.
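
    Two of the quantities reviewed — entropy and the Kullback-Leibler divergence — and their relation under the Maximum Entropy Principle can be checked numerically in a few lines:

```python
import numpy as np

# For a finite alphabet with no constraints, the uniform distribution is the
# Maximum Entropy solution, and H(p) = log(K) - KL(p || uniform), so any
# non-uniform p loses exactly KL(p || uniform) of entropy.

def entropy(p):
    """Shannon entropy in nats."""
    p = np.asarray(p, dtype=float)
    return float(-np.sum(p[p > 0] * np.log(p[p > 0])))

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = np.array([0.5, 0.25, 0.125, 0.125])
u = np.full(4, 0.25)                 # uniform: the MEP solution on 4 symbols

gap = entropy(u) - entropy(p)        # equals kl(p, u), here 0.25*ln(2)
```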

  2. On-line Monitoring for Cutting Tool Wear Condition Based on the Parameters

    NASA Astrophysics Data System (ADS)

    Han, Fenghua; Xie, Feng

    2017-07-01

    In the cutting process, it is very important to monitor the working state of the tools. On the basis of acceleration signals acquired at constant speed, time-domain and frequency-domain analysis of relevant indicators enables on-line monitoring of the tool wear condition. The analysis results show that the method can effectively judge the tool wear condition during machining, and it has practical application value.

  3. Processing and Analysis of Multichannel Extracellular Neuronal Signals: State-of-the-Art and Challenges

    PubMed Central

    Mahmud, Mufti; Vassanelli, Stefano

    2016-01-01

    In recent years, multichannel neuronal signal acquisition systems have allowed scientists to focus on research questions which were otherwise impossible. They act as a powerful means to study brain (dys)functions in in-vivo and in-vitro animal models. Typically, each session of electrophysiological experiments with multichannel data acquisition systems generates a large amount of raw data. For example, a 128-channel signal acquisition system with 16-bit A/D conversion and a 20 kHz sampling rate will generate approximately 17 GB of data per hour (uncompressed). This poses an important and challenging problem of inferring conclusions from the large amounts of acquired data. Thus, automated signal processing and analysis tools are becoming a key component in neuroscience research, facilitating extraction of relevant information from neuronal recordings in a reasonable time. The purpose of this review is to introduce the reader to the current state-of-the-art of open-source packages for (semi)automated processing and analysis of multichannel extracellular neuronal signals (i.e., neuronal spikes, local field potentials, electroencephalogram, etc.), and the existing Neuroinformatics infrastructure for tool and data sharing. The review is concluded by pinpointing some major challenges that are being faced, which include the development of novel benchmarking techniques, cloud-based distributed processing and analysis tools, as well as defining novel means to share and standardize data. PMID:27313507
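
    The quoted data rate is easy to verify (the "17 GB" in the abstract corresponds to binary gigabytes, i.e., GiB):

```python
# Back-of-the-envelope check of the data rate quoted in the abstract:
# 128 channels, 16-bit samples, 20 kHz sampling, one hour, uncompressed.

channels, bytes_per_sample, rate_hz = 128, 2, 20_000
bytes_per_hour = channels * bytes_per_sample * rate_hz * 3600
gib_per_hour = bytes_per_hour / 2 ** 30     # ~17.2 GiB, matching the abstract
```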

  4. Estimation of tool wear during CNC milling using neural network-based sensor fusion

    NASA Astrophysics Data System (ADS)

    Ghosh, N.; Ravi, Y. B.; Patra, A.; Mukhopadhyay, S.; Paul, S.; Mohanty, A. R.; Chattopadhyay, A. B.

    2007-01-01

    Cutting tool wear degrades the product quality in manufacturing processes. Monitoring the tool wear value online is therefore needed to prevent degradation in machining quality. Unfortunately, there is no direct way of measuring the tool wear online, so one has to adopt an indirect method wherein the tool wear is estimated from several sensors measuring related process variables. In this work, a neural network-based sensor fusion model has been developed for tool condition monitoring (TCM). Features extracted from a number of machining zone signals, namely cutting forces, spindle vibration, spindle current, and sound pressure level, have been fused to estimate the average flank wear of the main cutting edge. Novel strategies such as signal-level segmentation for temporal registration, feature-space filtering, outlier removal, and estimation-space filtering have been proposed. The proposed approach has been validated by both laboratory and industrial implementations.
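As a minimal illustration of the fusion idea (not the paper's network: the model, features, and data below are invented for the sketch), a linear fusion layer trained by gradient descent can map synthetic two-sensor features to a wear estimate:

```python
# Synthetic (force_rms, vibration_rms) feature pairs -> flank wear (arbitrary units).
data = [
    ((0.2, 0.1), 0.05), ((0.4, 0.3), 0.15), ((0.6, 0.4), 0.22),
    ((0.8, 0.7), 0.35), ((1.0, 0.9), 0.44),
]

# Linear fusion layer trained by stochastic gradient descent.
w = [0.0, 0.0]
b = 0.0
lr = 0.1
for _ in range(5000):
    for (x1, x2), y in data:
        err = w[0] * x1 + w[1] * x2 + b - y
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

pred = w[0] * 0.5 + w[1] * 0.4 + b   # wear estimate for an unseen feature pair
print(round(pred, 3))
```

A real TCM model would use a nonlinear network and many more features, but the training loop and fused prediction have the same shape.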

  5. Time-frequency analysis of pediatric murmurs

    NASA Astrophysics Data System (ADS)

    Lombardo, Joseph S.; Blodgett, Lisa A.; Rosen, Ron S.; Najmi, Amir-Homayoon; Thompson, W. Reid

    1998-05-01

    Technology has provided many new tools to assist in the diagnosis of pathologic conditions of the heart; echocardiography, ultrafast CT, and MRI are just a few. While these tools are a valuable resource, they are typically too expensive, large, and complex in operation for use in rural, homecare, and physician's office settings. Recent advances in computer performance, miniaturization, and acoustic signal processing have yielded new technologies that, when applied to heart sounds, can provide low-cost screening for pathologic conditions. The short duration and transient nature of these signals require processing techniques that provide high resolution in both time and frequency. Short-time Fourier transforms, Wigner distributions, and wavelet transforms have been applied to signals from hearts with various pathologic conditions. While no single technique provides the ideal solution, the combination of tools provides a good representation of the acoustic features of the pathologies selected.
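The first of those tools, the short-time Fourier transform, can be sketched in a few lines; the frame length, hop, and synthetic on/off tone below are illustrative choices, not values from the study:

```python
import cmath, math

def dft_mag(frame):
    """Magnitude spectrum of one windowed frame (naive DFT, fine for short frames)."""
    N = len(frame)
    return [abs(sum(frame[n] * cmath.exp(-2j * math.pi * k * n / N)
                    for n in range(N))) for k in range(N // 2)]

def stft(signal, frame_len=32, hop=16):
    """Short-time Fourier transform: one magnitude spectrum per Hann-windowed frame."""
    hann = [0.5 - 0.5 * math.cos(2 * math.pi * n / (frame_len - 1))
            for n in range(frame_len)]
    return [dft_mag([signal[s + n] * hann[n] for n in range(frame_len)])
            for s in range(0, len(signal) - frame_len + 1, hop)]

# Synthetic transient: a 0.25 cycles/sample tone that switches on halfway through.
sig = [0.0] * 128 + [math.sin(2 * math.pi * 0.25 * n) for n in range(128)]
spectra = stft(sig)
# Bin 8 (= 0.25 * frame_len) carries energy only in the later frames,
# localising the tone in both time and frequency.
print(spectra[0][8], spectra[-1][8])
```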

  6. Interactive Digital Signal Processor

    NASA Technical Reports Server (NTRS)

    Mish, W. H.

    1985-01-01

    The Interactive Digital Signal Processor (IDSP) consists of a set of time series analysis "operators" based on various algorithms commonly used for digital signal analysis. Processing of a digital signal time series to extract information is usually achieved by the application of a number of fairly standard operators. IDSP is also an excellent teaching tool for demonstrating the application of time series operators to artificially generated signals.

  7. Biomedical signal and image processing.

    PubMed

    Cerutti, Sergio; Baselli, Giuseppe; Bianchi, Anna; Caiani, Enrico; Contini, Davide; Cubeddu, Rinaldo; Dercole, Fabio; Rienzo, Luca; Liberati, Diego; Mainardi, Luca; Ravazzani, Paolo; Rinaldi, Sergio; Signorini, Maria; Torricelli, Alessandro

    2011-01-01

    Generally, physiological modeling and biomedical signal processing constitute two important paradigms of biomedical engineering (BME): their fundamental concepts are taught starting from undergraduate studies and are more completely dealt with in the last years of graduate curricula, as well as in Ph.D. courses. Traditionally, these two cultural aspects were separated, with the first one more oriented to physiological issues and how to model them and the second one more dedicated to the development of processing tools or algorithms to enhance useful information from clinical data. A practical consequence was that those who did models did not do signal processing and vice versa. However, in recent years, the need for closer integration between signal processing and modeling of the relevant biological systems emerged very clearly [1], [2]. This is not only true for training purposes (i.e., to properly prepare the new professional members of BME) but also for the development of newly conceived research projects in which the integration between biomedical signal and image processing (BSIP) and modeling plays a crucial role. Just to give simple examples, topics such as brain–computer interfaces, neuroengineering, nonlinear dynamical analysis of the cardiovascular (CV) system, integration of sensory-motor characteristics aimed at the building of advanced prostheses and rehabilitation tools, and wearable devices for vital sign monitoring do require an intelligent fusion of modeling and signal processing competencies that are certainly peculiar to our discipline of BME.

  8. Cancer systems biology: signal processing for cancer research

    PubMed Central

    Yli-Harja, Olli; Ylipää, Antti; Nykter, Matti; Zhang, Wei

    2011-01-01

    In this editorial we introduce the research paradigms of signal processing in the era of systems biology. Signal processing is a field of science traditionally focused on modeling electronic and communications systems, but recently it has turned to biological applications with astounding results. The essence of signal processing is to describe the natural world by mathematical models and then, based on these models, develop efficient computational tools for solving engineering problems. Here, we underline, with examples, the endless possibilities which arise when the battle-hardened tools of engineering are applied to solve the problems that have tormented cancer researchers. Based on this approach, a new field has emerged, called cancer systems biology. Despite its short history, cancer systems biology has already produced several success stories tackling previously impracticable problems. Perhaps most importantly, it has been accepted as an integral part of the major endeavors of cancer research, such as analyzing the genomic and epigenomic data produced by The Cancer Genome Atlas (TCGA) project. Finally, we show that signal processing and cancer research, two fields that are seemingly distant from each other, have merged into a field that is indeed more than the sum of its parts. PMID:21439242

  9. Graphical Environment Tools for Application to Gamma-Ray Energy Tracking Arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Todd, Richard A.; Radford, David C.

    2013-12-30

    Highly segmented, position-sensitive germanium detector systems are being developed for nuclear physics research where traditional electronic signal processing with mixed analog and digital function blocks would be enormously complex and costly. Future systems will be constructed using pipelined processing of high-speed digitized signals as is done in the telecommunications industry. Techniques which provide rapid algorithm and system development for future systems are desirable. This project has used digital signal processing concepts and existing graphical system design tools to develop a set of re-usable modular functions and libraries targeted for the nuclear physics community. Researchers working with complex nuclear detector arrays such as the Gamma-Ray Energy Tracking Array (GRETA) have been able to construct advanced data processing algorithms for implementation in field programmable gate arrays (FPGAs) through application of these library functions using intuitive graphical interfaces.

  10. Study of the time and effort signal in cutting operations

    NASA Astrophysics Data System (ADS)

    Grosset, E.; Maillard, A.; Bouhelier, C.; Gasnier, J.

    1990-02-01

    Perception and treatment of an effort signal by computer methods is discussed. An automatic control system used to measure the wear of machine tools and carry out quality control throughout the cutting process is described. The testing system is used to evaluate the performance of tools which have been vacuum plated. The system is used as part of the BRITE study, the goal of which is to develop an expert system for measuring the wear of tools used during drilling and perforation operations.

  11. Electrocardiogram signal denoising based on a new improved wavelet thresholding

    NASA Astrophysics Data System (ADS)

    Han, Guoqiang; Xu, Zhijun

    2016-08-01

    Good-quality electrocardiogram (ECG) recordings are utilized by physicians for the interpretation and identification of physiological and pathological phenomena. In general, ECG signals may be contaminated by various noises, such as baseline wander, power line interference, and electromagnetic interference, during gathering and recording. As ECG signals are non-stationary physiological signals, the wavelet transform has proved to be an effective tool for discarding noise from corrupted signals. A new compromising threshold function, a sigmoid-function-based thresholding scheme, is adopted for processing ECG signals. Compared with other methods such as hard/soft thresholding or other existing thresholding functions, the new algorithm has many advantages for noise reduction in ECG signals: it overcomes the discontinuity at ±T of hard thresholding and reduces the fixed deviation of soft thresholding. The improved wavelet thresholding is shown to be more efficient than existing algorithms for ECG signal denoising. The signal-to-noise ratio, mean square error, and percent root mean square difference are calculated as quantitative measures of denoising performance. The experimental results reveal that the P, Q, R, and S waves of the denoised ECG signals coincide with those of the original ECG signals when the proposed method is employed.
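The paper's exact sigmoid threshold function is not given here, but one plausible sigmoid-style compromise illustrates the claimed properties: continuity at ±T (unlike hard thresholding) and vanishing deviation for large coefficients (unlike soft thresholding):

```python
import math

T = 1.0   # threshold (illustrative value)

def hard(w):
    return w if abs(w) > T else 0.0

def soft(w):
    return math.copysign(abs(w) - T, w) if abs(w) > T else 0.0

def sigmoid_thresh(w, k=10.0):
    # Hypothetical compromise: weight the coefficient by a sigmoid centred at
    # |w| = T, so the map is continuous everywhere and approaches the identity
    # (no fixed deviation) for large coefficients.
    s = 1.0 / (1.0 + math.exp(-k * (abs(w) - T)))
    return w * s

# Hard thresholding jumps by ~T at |w| = T; the sigmoid variant does not,
# and unlike soft thresholding it leaves large coefficients almost untouched.
print(hard(0.999), hard(1.001))        # discontinuous at T
print(soft(5.0), sigmoid_thresh(5.0))  # soft keeps a fixed deviation of T
```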

  12. Application of simulation models for the optimization of business processes

    NASA Astrophysics Data System (ADS)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with the application of modeling and simulation tools to the optimization of business processes, especially to optimizing signal flow in a security company. Simul8 was selected as the modeling tool; it supports process modeling based on discrete event simulation and enables the creation of visual models of production and distribution processes.

  13. Automated infrasound signal detection algorithms implemented in MatSeis - Infra Tool.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Darren

    2004-07-01

    MatSeis's infrasound analysis tool, Infra Tool, uses frequency-slowness processing to deconstruct the array data into three outputs per processing step: correlation, azimuth, and slowness. Until now, infrasound signal detection was accomplished manually by an experienced analyst trained to recognize patterns in the signal processing outputs. Our goal was to automate the process of infrasound signal detection. The critical aspect of infrasound signal detection is to identify consecutive processing steps where the azimuth is constant (flat) while the time-lag correlation of the windowed waveform is above the background value. These two conditions describe the arrival of a correlated set of wavefronts at an array. The Hough Transform and Inverse Slope methods are used to determine the representative slope for a specified number of azimuth data points. The representative slope is then used in conjunction with the associated correlation value and azimuth data variance to determine if and when an infrasound signal was detected. A format for an infrasound signal detection output file is also proposed. The detection output file lists the processed array element names, followed by detection characteristics for each method. Each detection is supplied with a listing of frequency-slowness processing characteristics: human time (YYYY/MM/DD HH:MM:SS.SSS), epochal time, correlation, fstat, azimuth (deg), and trace velocity (km/s). As an example, a ground-truth event was processed using the four-element DLIAR infrasound array located in New Mexico. The event is known as the Watusi chemical explosion, which occurred on 2002/09/28 at 21:25:17 with an explosive yield of 38,000 lb TNT equivalent. Knowing the source and array locations, the array-to-event distance was computed to be approximately 890 km. This test determined the station-to-event azimuth (281.8 and 282.1 degrees) to within 1.6 and 1.4 degrees for the Inverse Slope and Hough Transform detection algorithms, respectively, and the detection window closely correlated with the theoretical stratospheric arrival time. Further testing will be required to tune the detection threshold parameters for different types of infrasound events.
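The detection criterion described above (flat azimuth plus above-background correlation) can be sketched as follows; a plain least-squares slope stands in for the paper's Inverse Slope and Hough Transform estimators, and the window length and thresholds are illustrative:

```python
def lstsq_slope(y):
    """Least-squares slope of y against sample index (a stand-in for the
    Inverse Slope / Hough Transform methods, which are not detailed here)."""
    n = len(y)
    xm = (n - 1) / 2
    ym = sum(y) / n
    num = sum((i - xm) * (v - ym) for i, v in enumerate(y))
    den = sum((i - xm) ** 2 for i in range(n))
    return num / den

def detect(azimuth, correlation, win=5, max_slope=0.5, min_corr=0.5):
    """Flag processing steps where azimuth is flat and correlation stays high."""
    hits = []
    for s in range(len(azimuth) - win + 1):
        flat = abs(lstsq_slope(azimuth[s:s + win])) <= max_slope
        corr_ok = min(correlation[s:s + win]) >= min_corr
        if flat and corr_ok:
            hits.append(s)
    return hits

# Background: wandering azimuth, low correlation; signal: flat ~282 deg, high correlation.
az = [10 * i % 360 for i in range(10)] + [282.0] * 8 + [15 * i % 360 for i in range(10)]
co = [0.1] * 10 + [0.9] * 8 + [0.1] * 10
print(detect(az, co))
```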

  14. Time series analysis of tool wear in sheet metal stamping using acoustic emission

    NASA Astrophysics Data System (ADS)

    Vignesh Shanbhag, V.; Pereira, P. Michael; Rolfe, F. Bernard; Arunachalam, N.

    2017-09-01

    Galling is an adhesive wear mode that often limits the lifespan of stamping tools. Since stamping tools represent a significant economic cost, even a slight improvement in maintenance cost is of high importance to the stamping industry. In other manufacturing industries, online tool condition monitoring has been used to prevent tool wear-related failure. However, monitoring the acoustic emission signal from a stamping process is a non-trivial task, since the acoustic emission signal is non-stationary and transient. There have been numerous studies examining acoustic emissions in sheet metal stamping, but very few have focused in detail on how the signals change as wear on the tool surface progresses prior to failure. In this study, time-domain analysis was applied to the acoustic emission signals to extract features related to tool wear. To understand the wear progression, accelerated stamping tests were performed using a semi-industrial stamping setup which performs clamping, piercing, and stamping in a single cycle. The time-domain features were computed for the acoustic emission signal of each part. The sidewalls of the stamped parts were scanned using an optical profilometer to obtain profiles of the worn part, and these were qualitatively correlated with the acoustic emission signal. Based on the wear behaviour, the wear data can be divided into three stages: in the first stage, no wear is observed; in the second stage, adhesive wear is likely to occur; and in the third stage, severe abrasive plus adhesive wear is likely to occur. Scanning electron microscopy showed the formation of lumps on the stamping tool, which represents galling behaviour. The correlation between the time-domain features of the acoustic emission signal and the wear progression identified in this study lays the basis for tool diagnostics in the stamping industry.
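A few of the standard time-domain features used in this kind of analysis can be computed directly; the set below (RMS, peak, crest factor, kurtosis) is a common choice in tool condition monitoring, not necessarily the paper's exact feature list:

```python
import math

def time_domain_features(x):
    """Standard time-domain descriptors of one signal segment."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    rms = math.sqrt(sum(v * v for v in x) / n)
    peak = max(abs(v) for v in x)
    return {
        "rms": rms,
        "peak": peak,
        "crest": peak / rms,                                   # burstiness
        "kurtosis": (sum((v - mean) ** 4 for v in x) / n) / var ** 2,
    }

# A burst-like segment has a much higher crest factor than a steady one,
# which is the kind of change that signals wear-related AE events.
steady = [math.sin(0.1 * i) for i in range(1000)]
burst = [0.01] * 990 + [1.0] * 10
f_steady = time_domain_features(steady)
f_burst = time_domain_features(burst)
print(f_steady["crest"], f_burst["crest"])
```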

  15. Punch stretching process monitoring using acoustic emission signal analysis. II - Application of frequency domain deconvolution

    NASA Technical Reports Server (NTRS)

    Liang, Steven Y.; Dornfeld, David A.; Nickerson, Jackson A.

    1987-01-01

    The coloring effect on the acoustic emission (AE) signal due to the frequency response of the data acquisition/processing instrumentation may bias the interpretation of AE signal characteristics. In this paper, a frequency domain deconvolution technique, which involves identifying the instrumentation transfer functions and multiplying the AE signal spectrum by the inverse of these system functions, has been carried out. In this way, a change in AE signal characteristics can be better interpreted as resulting from a change in the state of the process alone. The punch stretching process was used as an example to demonstrate the application of the technique. Results showed that, through deconvolution, the frequency characteristics of AE signals generated during stretching became more distinctive and can be used more effectively as tools for process monitoring.
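The deconvolution step, dividing the measured spectrum by the instrumentation transfer function, can be sketched with a toy circular-convolution example; the 3-tap "instrument" response and regularization constant are invented for illustration:

```python
import cmath, math

def dft(x, inverse=False):
    """Naive DFT / inverse DFT (fine for short illustrative records)."""
    N, s = len(x), (1 if inverse else -1)
    out = [sum(x[n] * cmath.exp(s * 2j * math.pi * k * n / N) for n in range(N))
           for k in range(N)]
    return [v / N for v in out] if inverse else out

def deconvolve(y, h, eps=1e-6):
    """Divide the measured spectrum by the instrumentation response H,
    regularised so near-zero |H| bins do not blow up."""
    Y, H = dft(y), dft(h)
    X = [Yk * Hk.conjugate() / (abs(Hk) ** 2 + eps) for Yk, Hk in zip(Y, H)]
    return [v.real for v in dft(X, inverse=True)]

# Toy example: a 3-tap instrument colouring a short source burst (circular conv).
N = 16
x = [0.0] * N; x[3] = 1.0; x[4] = -0.5            # "true" AE source
h = [0.0] * N; h[0], h[1], h[2] = 1.0, 0.6, 0.2   # instrumentation impulse response
y = [sum(h[k] * x[(n - k) % N] for k in range(N)) for n in range(N)]
x_hat = deconvolve(y, h)
err = max(abs(a - b) for a, b in zip(x, x_hat))
print(err)
```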

  16. Adaptive Signal Processing Testbed: VME-based DSP board market survey

    NASA Astrophysics Data System (ADS)

    Ingram, Rick E.

    1992-04-01

    The Adaptive Signal Processing Testbed (ASPT) is a real-time multiprocessor system utilizing digital signal processor technology on VMEbus-based printed circuit boards installed on a Sun workstation. The ASPT has specific requirements, particularly with regard to the signal excision application, with respect to interfacing with current and planned data generation equipment, processing of the data, storage to disk of final and intermediate results, and the development tools for applications development and integration into the overall EW/COM computing environment. A prototype ASPT was implemented using three VME-C-30 boards from Applied Silicon. Experience gained during the prototype development led to the conclusions that interprocessor communications capability is the most significant contributor to overall ASPT performance and that host involvement should be minimized. Boards using different processors were evaluated with respect to the ASPT system requirements, pricing, and availability. Specific recommendations based on various priorities are made, as well as recommendations concerning the integration and interaction of various tools developed during the prototype implementation.

  17. Evaluation of interaction dynamics of concurrent processes

    NASA Astrophysics Data System (ADS)

    Sobecki, Piotr; Białasiewicz, Jan T.; Gross, Nicholas

    2017-03-01

    The purpose of this paper is to present wavelet tools that enable the detection of temporal interactions between concurrent processes. In particular, the interaction coherence of time-varying signals is determined using a complex continuous wavelet transform. The paper uses an electrocardiogram (ECG) and seismocardiogram (SCG) data set to demonstrate multiple continuous wavelet analysis techniques based on the Morlet wavelet transform. A MATLAB Graphical User Interface (GUI), developed in the reported research to assist in quick and simple data analysis, is presented. These software tools can discover the interaction dynamics of time-varying signals, revealing their correlation in phase and amplitude as well as their non-linear interconnections. The user-friendly MATLAB GUI lets the user load the two processes under investigation, choose the required processing parameters, and then perform the analysis. The software is a useful tool for researchers who need to investigate the interaction dynamics of concurrent processes.
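The single-scale building block of such an analysis, a complex Morlet continuous wavelet transform, can be sketched as follows; the centre-frequency parameter w0 = 6 is a conventional choice, not a value taken from the paper:

```python
import cmath, math

def morlet_cwt(signal, freq, fs, w0=6.0):
    """CWT at one scale with a complex Morlet wavelet; the scale is chosen so
    the wavelet's centre frequency equals `freq` (Hz) at sampling rate fs."""
    scale = w0 * fs / (2 * math.pi * freq)
    half = int(4 * scale)               # truncate the Gaussian at ~4 sigma
    wav = [cmath.exp(1j * w0 * n / scale) * math.exp(-0.5 * (n / scale) ** 2)
           / math.sqrt(scale) for n in range(-half, half + 1)]
    out = []
    for c in range(len(signal)):
        acc = 0j
        for n in range(-half, half + 1):
            if 0 <= c + n < len(signal):
                acc += signal[c + n] * wav[n + half].conjugate()
        out.append(acc)
    return out

# A 5 Hz tone responds strongly at the matching scale and weakly elsewhere;
# complex coefficients like these also carry the phase used for coherence.
fs = 100.0
sig = [math.sin(2 * math.pi * 5.0 * n / fs) for n in range(300)]
at_5hz = max(abs(v) for v in morlet_cwt(sig, 5.0, fs))
at_15hz = max(abs(v) for v in morlet_cwt(sig, 15.0, fs))
print(at_5hz, at_15hz)
```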

  18. Man-Machine Interface System for Neuromuscular Training and Evaluation Based on EMG and MMG Signals

    PubMed Central

    de la Rosa, Ramon; Alonso, Alonso; Carrera, Albano; Durán, Ramon; Fernández, Patricia

    2010-01-01

    This paper presents the UVa-NTS (University of Valladolid Neuromuscular Training System), a multifunction and portable neuromuscular training system. The UVa-NTS is designed to analyze the voluntary control of patients with severe neuromotor handicaps, their interactive response, and their adaptation to neuromuscular interface systems, such as neural prostheses or domotic applications. Thus, it is an excellent tool to evaluate residual muscle capabilities in the handicapped. The UVa-NTS is composed of a custom signal conditioning front-end and a computer. The front-end electronics is described thoroughly, as well as the overall features of the custom software implementation. The software system is composed of a set of graphical training tools and a processing core. The UVa-NTS works with two classes of neuromuscular signals: the classic myoelectric signals (MES) and, as a novelty, the myomechanic signals (MMS). To evaluate the performance of the processing core, a complete analysis has been done to classify its efficiency and to check that it fulfils the real-time constraints. Tests were performed both with healthy and selected impaired subjects. Adaptation was achieved rapidly, applying a predefined protocol for the UVa-NTS set of training tools. Fine voluntary control was demonstrated with the myoelectric signals, and the UVa-NTS also provided satisfactory voluntary control with the myomechanic signals. PMID:22163515

  20. Analysis of acoustic emission signals and monitoring of machining processes

    PubMed

    Govekar; Gradisek; Grabec

    2000-03-01

    Monitoring of a machining process on the basis of sensor signals requires a selection of informative inputs in order to reliably characterize and model the process. In this article, a system for selecting informative characteristics from the signals of multiple sensors is presented. For signal analysis, methods of spectral analysis and nonlinear time series analysis are used. With the aim of modeling the relationships between signal characteristics and the corresponding process state, an adaptive empirical modeler is applied. The application of the system is demonstrated by characterizing different parameters defining the states of a turning process, such as chip form, tool wear, and the onset of chatter vibration. The results show that, in spite of the complexity of the turning process, the state of the process can be characterized well by just a few proper characteristics extracted from a representative sensor signal. The process characterization can be further improved by joining characteristics from multiple sensors and by applying chaotic characteristics.

  1. EARLINET Single Calculus Chain - technical - Part 1: Pre-processing of raw lidar data

    NASA Astrophysics Data System (ADS)

    D'Amico, Giuseppe; Amodeo, Aldo; Mattis, Ina; Freudenthaler, Volker; Pappalardo, Gelsomina

    2016-02-01

    In this paper we describe an automatic tool for the pre-processing of aerosol lidar data called ELPP (EARLINET Lidar Pre-Processor). It is one of two calculus modules of the EARLINET Single Calculus Chain (SCC), the automatic tool for the analysis of EARLINET data. ELPP is an open source module that executes instrumental corrections and data handling of the raw lidar signals, making the lidar data ready to be processed by the optical retrieval algorithms. According to the specific lidar configuration, ELPP automatically performs dead-time correction, atmospheric and electronic background subtraction, gluing of lidar signals, and trigger-delay correction. Moreover, the signal-to-noise ratio of the pre-processed signals can be improved by means of configurable time integration of the raw signals and/or spatial smoothing. ELPP delivers the statistical uncertainties of the final products by means of error propagation or Monte Carlo simulations. During the development of ELPP, particular attention has been paid to making the tool flexible enough to handle all lidar configurations currently used within the EARLINET community. Moreover, it has been designed in a modular way to allow an easy extension to lidar configurations not yet implemented. The primary goal of ELPP is to enable the application of quality-assured procedures in the lidar data analysis starting from the raw lidar data. This provides the added value of full traceability of each delivered lidar product. Several tests have been performed to check the proper functioning of ELPP. The whole SCC has been tested with the same synthetic data sets that were used for the EARLINET algorithm inter-comparison exercise. ELPP has been successfully employed for the automatic near-real-time pre-processing of the raw lidar data measured during several EARLINET inter-comparison campaigns as well as during intense field campaigns.
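Two of the corrections performed by a pre-processor of this kind, background subtraction and range correction, can be sketched with synthetic numbers; ELPP's actual implementation, bin size, and background region are not reproduced here:

```python
def preprocess(raw, dr, bg_bins=50):
    """Subtract the mean background estimated from the farthest range bins,
    then form the range-corrected signal P(r) * r^2 used in lidar retrievals."""
    background = sum(raw[-bg_bins:]) / bg_bins   # far bins ~ pure background
    out = []
    for i, v in enumerate(raw):
        r = (i + 1) * dr
        out.append((v - background) * r * r)
    return out

dr = 7.5   # metres per range bin (assumed)
# Synthetic profile: a 1/r^2 return of strength 5e4 plus a constant background of 3.0.
raw = [5e4 / ((i + 1) * dr) ** 2 + 3.0 for i in range(1000)]
rc = preprocess(raw, dr)
# After the two corrections the range-corrected profile is nearly flat at 5e4.
print(rc[10], rc[50])
```

Note how any residual error in the background estimate is amplified by r^2 at far range, which is why real pre-processors treat background estimation carefully.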

  2. miRnalyze: an interactive database linking tool to unlock intuitive microRNA regulation of cell signaling pathways

    PubMed Central

    Subhra Das, Sankha; James, Mithun; Paul, Sandip

    2017-01-01

    The various pathophysiological processes occurring in living systems are known to be orchestrated by delicate interplays and cross-talks between different genes and their regulators. Among the various regulators of genes is a class of small non-coding RNA molecules known as microRNAs. Although the relative simplicity of miRNAs and their ability to modulate cellular processes make them attractive therapeutic candidates, their presence in large numbers makes it challenging for experimental researchers to interpret the intricacies of the molecular processes they regulate. Most of the existing bioinformatic tools fail to address these challenges. Here, we present a new web resource, 'miRnalyze', that has been specifically designed to directly identify the putative regulation of cell signaling pathways by miRNAs. The tool integrates miRNA-target predictions with signaling cascade members by utilizing the TargetScanHuman 7.1 miRNA-target prediction tool and the KEGG pathway database, and thus provides researchers with in-depth insights into the modulation of signal transduction pathways by miRNAs. miRnalyze is capable of identifying common miRNAs targeting more than one gene in the same signaling pathway, a feature that further increases the probability of modulating the pathway and downstream reactions when using miRNA modulators. Additionally, miRnalyze can sort miRNAs according to seed-match type and TargetScan context++ score, thus providing a hierarchical list of the most valuable miRNAs. Furthermore, in order to provide users with comprehensive information regarding miRNAs, genes, and pathways, miRnalyze also links to expression data for miRNAs (miRmine) and genes (TiGER) and protein abundance data (PaxDb). To validate the capability of the tool, we have documented the correlation of miRnalyze's predictions with experimental confirmation studies. Database URL: http://www.mirnalyze.in PMID:28365733

  3. Non Destructive Analysis of Fsw Welds using Ultrasonic Signal Analysis

    NASA Astrophysics Data System (ADS)

    Pavan Kumar, T.; Prabhakar Reddy, P.

    2017-08-01

    Friction stir welding is an evolving metal joining technique, mostly used for joining materials which cannot easily be joined by other available welding techniques; it can also be used for welding dissimilar materials. The strength of the weld joint is determined by how the materials mix with each other; since no filler material is used in the welding process, this intermixing is of significant importance. The complication with the friction stir welding process is that many process parameters affect the intermixing, such as tool geometry, tool rotation speed, and traverse speed. In this study an attempt is made to compare the material flow and weld quality of various weldments by changing these parameters. Ultrasonic signal analysis is used to characterize the microstructure of the weldments; the use of ultrasonic waves is a non-destructive, accurate, and fast way of characterizing microstructure. In this method the relationships between the measured ultrasonic parameters and the microstructure are evaluated using background-echo and backscattered-signal processing techniques. The ultrasonic velocity and attenuation measurements depend on the elastic modulus, and any change in the microstructure is reflected in the ultrasonic velocity. An insight into material flow is essential to determine the quality of the weld. Hence experiments were conducted to weld dissimilar aluminum alloys, and the weldments were characterized using ultrasonic signal processing as well as scanning electron microscopy. A good correlation is observed between the ultrasonic signal processing results and scanning electron microscopy with respect to the observed precipitates. Tensile tests and hardness tests were also conducted on the weldments and compared to determine the weld quality.

  4. Genomic signal processing methods for computation of alignment-free distances from DNA sequences.

    PubMed

    Borrayo, Ernesto; Mendizabal-Ruiz, E Gerardo; Vélez-Pérez, Hugo; Romo-Vázquez, Rebeca; Mendizabal, Adriana P; Morales, J Alejandro

    2014-01-01

    Genomic signal processing (GSP) refers to the use of digital signal processing (DSP) tools for analyzing genomic data such as DNA sequences. A possible application of GSP that has not been fully explored is the computation of the distance between a pair of sequences. In this work we present GAFD, a novel GSP alignment-free distance computation method. We introduce a DNA sequence-to-signal mapping function based on the employment of doublet values, which increases the number of possible amplitude values for the generated signal. Additionally, we explore the use of three DSP distance metrics as descriptors for categorizing DNA signal fragments. Our results indicate the feasibility of employing GAFD for computing sequence distances and the use of descriptors for characterizing DNA fragments.
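The doublet idea can be sketched directly: each of the 16 dinucleotides receives a distinct amplitude, giving more amplitude levels than a single-base mapping. The mapping order and the Euclidean comparison below are illustrative, not GAFD's actual assignment or DSP metric:

```python
from itertools import product

# Hypothetical doublet mapping: each of the 16 dinucleotides gets a distinct
# integer amplitude (GAFD's actual amplitude assignment may differ).
DOUBLET = {a + b: i for i, (a, b) in enumerate(product("ACGT", repeat=2))}

def to_signal(seq):
    """Map a DNA sequence to a numeric signal over overlapping doublets."""
    return [DOUBLET[seq[i:i + 2]] for i in range(len(seq) - 1)]

def distance(s1, s2):
    """Toy alignment-free distance between two equal-length doublet signals."""
    return sum((u - v) ** 2 for u, v in zip(s1, s2)) ** 0.5

a, b, c = "ACGTACGTACGT", "ACGTACGTACGT", "TTTTAAAACCCC"
print(distance(to_signal(a), to_signal(b)), distance(to_signal(a), to_signal(c)))
```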

  6. Tool Wear Feature Extraction Based on Hilbert Marginal Spectrum

    NASA Astrophysics Data System (ADS)

    Guan, Shan; Song, Weijie; Pang, Hongyang

    2017-09-01

    In the metal cutting process, the signal contains a wealth of tool wear state information. A tool wear signal analysis and feature extraction method based on the Hilbert marginal spectrum is proposed. Firstly, the tool wear signal was decomposed by the empirical mode decomposition algorithm, and the intrinsic mode functions containing the main information were screened out by the correlation coefficient and the variance contribution rate. Secondly, the Hilbert transform was performed on the main intrinsic mode functions, yielding the Hilbert time-frequency spectrum and the Hilbert marginal spectrum. Finally, amplitude-domain indexes were extracted on the basis of the Hilbert marginal spectrum and used to construct the recognition feature vector of the tool wear state. The research results show that the extracted features can effectively characterize the different wear states of the tool, which provides a basis for monitoring tool wear condition.
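
    The Hilbert stage of such a pipeline can be sketched in a few lines. The following is an illustrative example, not the authors' code: it uses `scipy.signal.hilbert` to obtain the instantaneous amplitude and frequency of one intrinsic mode function, then accumulates amplitude over frequency bins to approximate a Hilbert marginal spectrum. The function name and binning scheme are assumptions made for illustration.

```python
import numpy as np
from scipy.signal import hilbert

def hilbert_marginal_spectrum(imf, fs, n_bins=64):
    """Approximate Hilbert marginal spectrum of one intrinsic mode function:
    instantaneous amplitude accumulated over instantaneous-frequency bins."""
    analytic = hilbert(imf)
    amp = np.abs(analytic)                         # instantaneous amplitude
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency, Hz
    bins = np.linspace(0, fs / 2, n_bins + 1)
    spectrum, _ = np.histogram(inst_freq, bins=bins, weights=amp[1:])
    return bins[:-1], spectrum / fs

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
imf = np.sin(2 * np.pi * 50 * t)                   # stand-in for one EMD mode
freqs, hms = hilbert_marginal_spectrum(imf, fs)
peak_hz = freqs[np.argmax(hms)]                    # energy concentrates near 50 Hz
```

    Amplitude-domain indexes (mean, RMS, peak value, and so on) computed on `hms` would then form the recognition feature vector.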

  7. Advanced Signal Processing for High Temperatures Health Monitoring of Condensed Water Height in Steam Pipes

    NASA Technical Reports Server (NTRS)

    Lih, Shyh-Shiuh; Bar-Cohen, Yoseph; Lee, Hyeong Jae; Takano, Nobuyuki; Bao, Xiaoqi

    2013-01-01

    An advanced signal processing methodology is being developed to monitor the height of condensed water through the wall of a steel pipe operating at temperatures as high as 250°. A previous study using existing techniques indicated that, when the water height is low or there is disturbance in the environment, the predicted water height may not be accurate. In recent years, autocorrelation and envelope techniques have been demonstrated to be very useful signal processing tools for practical applications. In this paper, various signal processing techniques, including autocorrelation, the Hilbert transform, and the Shannon energy envelope, were studied and implemented to determine the water height in the steam pipe. The results show that the developed method provides a good capability for monitoring the height under regular conditions. For shallow-water or no-water conditions, an alternative solution is suggested: a hybrid method based on the Hilbert transform (HT) with a high-pass filter and an optimized windowing technique. Further development of the reported methods would provide a powerful tool for identifying disturbances of the water height inside the pipe.
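
    The autocorrelation and Shannon-energy-envelope ingredients can be illustrated on a synthetic echo signal. This is a sketch under assumed parameters (pulse shape, sampling rate, suppression window), not the authors' implementation:

```python
import numpy as np

def shannon_energy_envelope(x, win=32):
    """Smoothed Shannon energy E = -x^2 log(x^2); it emphasizes
    medium-amplitude echoes over low-level noise."""
    xn = x / (np.max(np.abs(x)) + 1e-12)           # normalize to [-1, 1]
    e = -xn**2 * np.log(xn**2 + 1e-12)
    return np.convolve(e, np.ones(win) / win, mode="same")

def echo_delay(x, fs, min_lag=50):
    """Lag of the strongest autocorrelation peak beyond min_lag samples,
    a proxy for an echo round-trip time."""
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    ac[:min_lag] = 0                               # suppress the zero-lag lobe
    return np.argmax(ac) / fs

fs = 100_000.0
t = np.arange(0, 0.01, 1 / fs)
pulse = np.exp(-((t - 0.001) / 0.0001) ** 2) * np.sin(2 * np.pi * 40_000 * t)
sig = pulse + 0.5 * np.roll(pulse, 200)            # echo delayed by 200 samples
env = shannon_energy_envelope(sig)
delay_s = echo_delay(sig, fs)                      # close to 200 / fs = 2 ms
```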

  8. Software for biomedical engineering signal processing laboratory experiments.

    PubMed

    Tompkins, Willis J; Wilson, J

    2009-01-01

    In the early 1990s we developed a special computer program called UW DigiScope to provide a mechanism for anyone interested in biomedical digital signal processing to study the field without requiring any instrument other than a personal computer. There are many digital filtering and pattern recognition algorithms used in processing biomedical signals. In general, students have very limited opportunity for hands-on access to the mechanisms of digital signal processing. In a typical course, filters are designed non-interactively, which provides the student with neither a significant understanding of the design constraints of such filters nor their actual performance characteristics. UW DigiScope 3.0 is the first major update since version 2.0 was released in 1994. This paper provides details on how the new MATLAB-based version works with signals, including the filter design tool that is the programming interface between UW DigiScope and processing algorithms.
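
    The kind of interactive filter experiment such a course targets can be approximated with standard SciPy calls. As a sketch (not the UW DigiScope interface), a student might design an FIR lowpass and inspect its measured response; the sampling rate and band edges below are illustrative:

```python
import numpy as np
from scipy.signal import firwin, freqz

fs = 360.0                                       # illustrative ECG-like sampling rate
taps = firwin(numtaps=51, cutoff=40.0, fs=fs)    # 40 Hz lowpass FIR (Hamming window)
w, h = freqz(taps, worN=1024, fs=fs)             # measured frequency response
gain_db = 20 * np.log10(np.abs(h) + 1e-12)
passband = gain_db[w < 25]                       # well inside the passband: near 0 dB
stopband = gain_db[w > 80]                       # well inside the stopband: attenuated
```

    Changing `numtaps` or `cutoff` and re-plotting `gain_db` makes the trade-off between filter length and transition width directly visible, which is the point of an interactive design tool.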

  9. BioSig: The Free and Open Source Software Library for Biomedical Signal Processing

    PubMed Central

    Vidaurre, Carmen; Sander, Tilmann H.; Schlögl, Alois

    2011-01-01

    BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals. PMID:21437227

  10. BioSig: the free and open source software library for biomedical signal processing.

    PubMed

    Vidaurre, Carmen; Sander, Tilmann H; Schlögl, Alois

    2011-01-01

    BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals.

  11. Contextual signals in visual cortex.

    PubMed

    Khan, Adil G; Hofer, Sonja B

    2018-06-05

    Vision is an active process. What we perceive strongly depends on our actions, intentions and expectations. During visual processing, these internal signals therefore need to be integrated with the visual information from the retina. The mechanisms of how this is achieved by the visual system are still poorly understood. Advances in recording and manipulating neuronal activity in specific cell types and axonal projections together with tools for circuit tracing are beginning to shed light on the neuronal circuit mechanisms of how internal, contextual signals shape sensory representations. Here we review recent work, primarily in mice, that has advanced our understanding of these processes, focusing on contextual signals related to locomotion, behavioural relevance and predictions. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Smartphone Cortex Controlled Real-Time Image Processing and Reprocessing for Concentration Independent LED Induced Fluorescence Detection in Capillary Electrophoresis.

    PubMed

    Szarka, Mate; Guttman, Andras

    2017-10-17

    We present the application of a smartphone anatomy based technology in the field of liquid phase bioseparations, particularly in capillary electrophoresis. A simple capillary electrophoresis system was built with LED induced fluorescence detection and a credit card sized minicomputer to prove the concept of real time fluorescent imaging (zone adjustable time-lapse fluorescence image processor) and separation controller. The system was evaluated by analyzing under- and overloaded aminopyrenetrisulfonate (APTS)-labeled oligosaccharide samples. The open source software based image processing tool allowed undistorted signal modulation (reprocessing) if the signal was inappropriate for the actual detection system settings (too low or too high). The novel smart detection tool for fluorescently labeled biomolecules greatly expands dynamic range and enables retrospective correction for injections with unsuitable signal levels without the necessity to repeat the analysis.

  13. Condition monitoring of turning process using infrared thermography technique - An experimental approach

    NASA Astrophysics Data System (ADS)

    Prasad, Balla Srinivasa; Prabha, K. Aruna; Kumar, P. V. S. Ganesh

    2017-03-01

    In metal cutting, the major factors that affect cutting tool life are machine tool vibrations, tool tip/chip temperature and surface roughness, along with machining parameters such as cutting speed, feed rate, depth of cut and tool geometry, so it is important for the manufacturing industry to find suitable levels of process parameters for obtaining and maintaining tool life. Heat generation has always been a main topic of study in machining. Recent advancements in signal processing and information technology have resulted in the use of multiple sensors for the development of effective tool condition monitoring systems with improved accuracy. From a process improvement point of view, it is definitely more advantageous to proactively monitor quality directly in the process instead of the product, so that the consequences of a defective part can be minimized or even eliminated. In the present work, a real-time process monitoring method using multiple sensors is explored. It focuses on the development of a test bed for monitoring the tool condition in turning of AISI 316L steel using both coated and uncoated carbide inserts. The proposed tool condition monitoring (TCM) is evaluated in high speed turning using multiple sensors, namely a laser Doppler vibrometer and infrared thermography. The results indicate the feasibility of using the dominant frequency of the vibration signals, along with the temperature gradient, for monitoring high speed turning operations. A possible correlation is identified for both regular and irregular cutting tool wear. Cutting speed and feed rate proved to be influential parameters on the depicted temperatures, while depth of cut was less influential. Generally, lower heat and temperatures are generated when coated inserts are employed, and cutting temperatures gradually increase as edge wear and deformation develop.
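
    Extracting the dominant frequency of a vibration signal, the monitored feature mentioned above, reduces to locating the largest FFT magnitude. A minimal sketch with an invented signal and sampling rate, not the experimental data:

```python
import numpy as np

def dominant_frequency(x, fs):
    """Frequency of the largest-magnitude bin in the one-sided spectrum
    (mean removed so the DC bin cannot win)."""
    spec = np.abs(np.fft.rfft(x - np.mean(x)))
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    return freqs[np.argmax(spec[1:]) + 1]          # skip the DC bin

fs = 5000.0
t = np.arange(0, 1, 1 / fs)
vib = 0.2 * np.sin(2 * np.pi * 120 * t) + 1.0 * np.sin(2 * np.pi * 870 * t)
f_dom = dominant_frequency(vib, fs)                # the 870 Hz component dominates
```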

  14. G-CNV: A GPU-Based Tool for Preparing Data to Detect CNVs with Read-Depth Methods.

    PubMed

    Manconi, Andrea; Manca, Emanuele; Moscatelli, Marco; Gnocchi, Matteo; Orro, Alessandro; Armano, Giuliano; Milanesi, Luciano

    2015-01-01

    Copy number variations (CNVs) are the most prevalent types of structural variations (SVs) in the human genome and are involved in a wide range of common human diseases. Different computational methods have been devised to detect this type of SVs and to study how they are implicated in human diseases. Recently, computational methods based on high-throughput sequencing (HTS) are increasingly used. The majority of these methods focus on mapping short-read sequences generated from a donor against a reference genome to detect signatures distinctive of CNVs. In particular, read-depth based methods detect CNVs by analyzing genomic regions with significantly different read-depth from the other ones. The pipeline analysis of these methods consists of four main stages: (i) data preparation, (ii) data normalization, (iii) CNV regions identification, and (iv) copy number estimation. However, available tools do not support most of the operations required at the first two stages of this pipeline. Typically, they start the analysis by building the read-depth signal from pre-processed alignments. Therefore, third-party tools must be used to perform most of the preliminary operations required to build the read-depth signal. These data-intensive operations can be efficiently parallelized on graphics processing units (GPUs). In this article, we present G-CNV, a GPU-based tool devised to perform the common operations required at the first two stages of the analysis pipeline. G-CNV is able to filter low-quality read sequences, to mask low-quality nucleotides, to remove adapter sequences, to remove duplicated read sequences, to map the short-reads, to resolve multiple mapping ambiguities, to build the read-depth signal, and to normalize it. G-CNV can be efficiently used as a third-party tool able to prepare data for the subsequent read-depth signal generation and analysis. Moreover, it can also be integrated in CNV detection tools to generate read-depth signals.
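
    The core of the pipeline's second stage, building and normalizing a read-depth signal from mapped read positions, can be sketched with plain binning. The function name, bin size and simulated reads below are illustrative assumptions, not G-CNV's GPU implementation:

```python
import numpy as np

def read_depth_signal(read_starts, genome_len, bin_size=100):
    """Per-bin read counts (the read-depth signal), median-normalized so
    that 1.0 corresponds to normal copy number."""
    n_bins = genome_len // bin_size
    depth = np.bincount(np.asarray(read_starts) // bin_size,
                        minlength=n_bins)[:n_bins].astype(float)
    return depth / np.median(depth[depth > 0])

# Simulated reads: uniform coverage plus a duplicated 2 kb region
rng = np.random.default_rng(0)
starts = np.concatenate([rng.integers(0, 10_000, 5_000),      # background
                         rng.integers(4_000, 6_000, 1_000)])  # extra copies
signal = read_depth_signal(starts, genome_len=10_000)
# bins covering [4000, 6000) sit near 2.0; the rest near 1.0
```

    A downstream read-depth caller would flag the contiguous run of bins with normalized depth near 2.0 as a candidate duplication.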

  15. Evaluation of the efficiency of continuous wavelet transform as processing and preprocessing algorithm for resolution of overlapped signals in univariate and multivariate regression analyses; an application to ternary and quaternary mixtures

    NASA Astrophysics Data System (ADS)

    Hegazy, Maha A.; Lotfy, Hayam M.; Mowaka, Shereen; Mohamed, Ekram Hany

    2016-07-01

    Wavelets have been adapted for a vast number of signal-processing applications due to the amount of information that can be extracted from a signal. In this work, a comparative study was conducted on the efficiency of the continuous wavelet transform (CWT) as a signal-processing tool in univariate regression and as a pre-processing tool in multivariate analysis using partial least squares (CWT-PLS). These were applied to complex spectral signals of ternary and quaternary mixtures. The CWT-PLS method succeeded in the simultaneous determination of a quaternary mixture of drotaverine (DRO), caffeine (CAF), paracetamol (PAR) and p-aminophenol (PAP, the major impurity of paracetamol). The univariate CWT, in contrast, failed to simultaneously determine the quaternary mixture components; it was able to determine only PAR and PAP, the ternary mixture of DRO, CAF, and PAR, and that of CAF, PAR, and PAP. During the CWT calculations, different wavelet families were tested. The univariate CWT method was validated according to the ICH guidelines. For the development of the CWT-PLS model, a calibration set was prepared by means of an orthogonal experimental design, and the absorption spectra were recorded and processed by CWT. The CWT-PLS model was constructed by regression between the wavelet coefficients and the concentration matrices, and validation was performed with both cross-validation and external validation sets. Both methods were successfully applied to the determination of the studied drugs in pharmaceutical formulations.
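
    A minimal, self-contained sketch of the univariate CWT step: a direct-convolution Mexican-hat transform applied to a synthetic two-band spectrum. The wavelet choice, scales, and band parameters are illustrative, not the paper's optimized wavelet family:

```python
import numpy as np

def ricker(points, a):
    """Mexican-hat (Ricker) wavelet, proportional to the negative second
    derivative of a Gaussian of width a."""
    t = np.arange(points) - (points - 1) / 2
    return (1 - (t / a) ** 2) * np.exp(-t**2 / (2 * a**2))

def cwt_ricker(signal, scales):
    """Continuous wavelet transform by direct convolution, one row per scale."""
    out = np.empty((len(scales), len(signal)))
    for i, a in enumerate(scales):
        w = ricker(min(10 * int(a), len(signal)), a)
        out[i] = np.convolve(signal, w / np.sqrt(a), mode="same")
    return out

# Two partially overlapped Gaussian bands, as in an unresolved binary spectrum
x = np.arange(400)
spec = np.exp(-((x - 180) / 15.0) ** 2) + 0.6 * np.exp(-((x - 220) / 15.0) ** 2)
coeffs = cwt_ricker(spec, scales=[5, 10, 20])
# the coefficient rows show two distinct maxima at the band centers
```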

  16. Analysis of Utility and Use of a Web-Based Tool for Digital Signal Processing Teaching by Means of a Technological Acceptance Model

    ERIC Educational Resources Information Center

    Toral, S. L.; Barrero, F.; Martinez-Torres, M. R.

    2007-01-01

    This paper presents an exploratory study about the development of a structural and measurement model for the technological acceptance (TAM) of a web-based educational tool. The aim consists of measuring not only the use of this tool, but also the external variables with a significant influence in its use for planning future improvements. The tool,…

  17. The discrete prolate spheroidal filter as a digital signal processing tool

    NASA Technical Reports Server (NTRS)

    Mathews, J. D.; Breakall, J. K.; Karawas, G. K.

    1983-01-01

    The discrete prolate spheroidal (DPS) filter is one of the class of nonrecursive finite impulse response (FIR) filters. The DPS filter is superior to other filters in this class in that it has maximum energy concentration in the frequency passband and minimum ringing in the time domain. A mathematical development of the DPS filter properties is given, along with the information required to construct the filter. The properties of this filter were compared with those of the more commonly used filters of the same class. Use of the DPS filter allows for particularly meaningful statements of data time/frequency resolution cell values. The filter forms an especially useful tool for digital signal processing.
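
    The zeroth discrete prolate spheroidal sequence is available in SciPy as `scipy.signal.windows.dpss`, so the defining property (maximum energy concentration in the band |f| < NW/M cycles/sample) can be checked directly. A sketch with arbitrarily chosen M and NW:

```python
import numpy as np
from scipy.signal import freqz
from scipy.signal.windows import dpss

M, NW = 101, 3.0                    # illustrative filter length and time-bandwidth
taps = dpss(M, NW)                  # zeroth Slepian sequence as FIR taps
taps /= taps.sum()                  # normalize to unity gain at DC

w, h = freqz(taps, worN=4096)       # w in rad/sample
f = w / (2 * np.pi)                 # cycles/sample
gain_db = 20 * np.log10(np.abs(h) + 1e-12)
stop = gain_db[f > 2 * NW / M]      # outside the concentration band
```

    The very deep attenuation of `stop` relative to the 0 dB passband illustrates the energy-concentration optimality the abstract describes.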

  18. Epithelial Patterning, Morphogenesis, and Evolution: Drosophila Eggshell as a Model.

    PubMed

    Osterfield, Miriam; Berg, Celeste A; Shvartsman, Stanislav Y

    2017-05-22

    Understanding the mechanisms driving tissue and organ formation requires knowledge across scales. How do signaling pathways specify distinct tissue types? How does the patterning system control morphogenesis? How do these processes evolve? The Drosophila egg chamber, where EGF and BMP signaling intersect to specify unique cell types that construct epithelial tubes for specialized eggshell structures, has provided a tractable system to ask these questions. Work there has elucidated connections between scales of development, including across evolutionary scales, and fostered the development of quantitative modeling tools. These tools and general principles can be applied to the understanding of other developmental processes across organisms. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. EARLINET Single Calculus Chain - technical - Part 1: Pre-processing of raw lidar data

    NASA Astrophysics Data System (ADS)

    D'Amico, G.; Amodeo, A.; Mattis, I.; Freudenthaler, V.; Pappalardo, G.

    2015-10-01

    In this paper we describe an automatic tool for the pre-processing of lidar data called ELPP (EARLINET Lidar Pre-Processor). It is one of two calculus modules of the EARLINET Single Calculus Chain (SCC), the automatic tool for the analysis of EARLINET data. The ELPP is an open source module that executes instrumental corrections and data handling of the raw lidar signals, making the lidar data ready to be processed by the optical retrieval algorithms. According to the specific lidar configuration, the ELPP automatically performs dead-time correction, atmospheric and electronic background subtraction, gluing of lidar signals, and trigger-delay correction. Moreover, the signal-to-noise ratio of the pre-processed signals can be improved by means of configurable time integration of the raw signals and/or spatial smoothing. The ELPP delivers the statistical uncertainties of the final products by means of error propagation or Monte Carlo simulations. During the development of the ELPP module, particular attention has been paid to making the tool flexible enough to handle all lidar configurations currently used within the EARLINET community. Moreover, it has been designed in a modular way to allow an easy extension to lidar configurations not yet implemented. The primary goal of the ELPP module is to enable the application of quality-assured procedures in the lidar data analysis starting from the raw lidar data. This provides the added value of full traceability of each delivered lidar product. Several tests have been performed to check the proper functioning of the ELPP module. The whole SCC has been tested with the same synthetic data sets, which were used for the EARLINET algorithm inter-comparison exercise. The ELPP module has been successfully employed for the automatic near-real-time pre-processing of the raw lidar data measured during several EARLINET inter-comparison campaigns as well as during intense field campaigns.
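
    Two of the corrections ELPP performs, dead-time correction and background subtraction, follow standard formulas and can be sketched as below. The non-paralyzable detector model, parameter values and synthetic profile are illustrative assumptions, not ELPP's code:

```python
import numpy as np

def deadtime_correct(rate_mhz, tau_ns):
    """Non-paralyzable dead-time correction for photon counting:
    true rate = measured rate / (1 - measured rate * tau)."""
    m = rate_mhz * 1e6                       # to counts/s
    return m / (1.0 - m * tau_ns * 1e-9) / 1e6

def subtract_background(profile, bg_bins=500):
    """Estimate the background from the far-range tail of the profile,
    where the atmospheric return is small, and remove it."""
    return profile - np.mean(profile[-bg_bins:])

rng = np.random.default_rng(1)
r = np.arange(2000)                          # range bins
raw = 50.0 * np.exp(-r / 300.0) + 2.0 + rng.normal(0, 0.05, 2000)   # MHz
corrected = subtract_background(deadtime_correct(raw, tau_ns=5.0))
```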

  20. AnyWave: a cross-platform and modular software for visualizing and processing electrophysiological signals.

    PubMed

    Colombet, B; Woodman, M; Badier, J M; Bénar, C G

    2015-03-15

    The importance of digital signal processing in clinical neurophysiology is growing steadily, involving clinical researchers and methodologists. There is a need for crossing the gap between these communities by providing efficient delivery of newly designed algorithms to end users. We have developed such a tool which both visualizes and processes data and, additionally, acts as a software development platform. AnyWave was designed to run on all common operating systems. It provides access to a variety of data formats and it employs high fidelity visualization techniques. It also allows using external tools as plug-ins, which can be developed in languages including C++, MATLAB and Python. In the current version, plug-ins allow computation of connectivity graphs (non-linear correlation h2) and time-frequency representation (Morlet wavelets). The software is freely available under the LGPL3 license. AnyWave is designed as an open, highly extensible solution, with an architecture that permits rapid delivery of new techniques to end users. We have developed AnyWave software as an efficient neurophysiological data visualizer able to integrate state of the art techniques. AnyWave offers an interface well suited to the needs of clinical research and an architecture designed for integrating new tools. We expect this software to strengthen the collaboration between clinical neurophysiologists and researchers in biomedical engineering and signal processing. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Parallel Processing with Digital Signal Processing Hardware and Software

    NASA Technical Reports Server (NTRS)

    Swenson, Cory V.

    1995-01-01

    The assembling and testing of a parallel processing system is described which will allow a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described followed by the installation procedure, research topics, and initial program development.

  2. Lateral position detection and control for friction stir systems

    DOEpatents

    Fleming, Paul; Lammlein, David H.; Cook, George E.; Wilkes, Don Mitchell; Strauss, Alvin M.; Delapp, David R.; Hartman, Daniel A.

    2012-06-05

    An apparatus and computer program are disclosed for processing at least one workpiece using a rotary tool with rotating member for contacting and processing the workpiece. The methods include oscillating the rotary tool laterally with respect to a selected propagation path for the rotating member with respect to the workpiece to define an oscillation path for the rotating member. The methods further include obtaining force signals or parameters related to the force experienced by the rotary tool at least while the rotating member is disposed at the extremes of the oscillation. The force signals or parameters associated with the extremes can then be analyzed to determine a lateral position of the selected path with respect to a target path and a lateral offset value can be determined based on the lateral position. The lateral distance between the selected path and the target path can be decreased based on the lateral offset value.

  3. Lateral position detection and control for friction stir systems

    DOEpatents

    Fleming, Paul [Boulder, CO; Lammlein, David H [Houston, TX; Cook, George E [Brentwood, TN; Wilkes, Don Mitchell [Nashville, TN; Strauss, Alvin M [Nashville, TN; Delapp, David R [Ashland City, TN; Hartman, Daniel A [Fairhope, AL

    2011-11-08

    Friction stir methods are disclosed for processing at least one workpiece using a rotary tool with rotating member for contacting and processing the workpiece. The methods include oscillating the rotary tool laterally with respect to a selected propagation path for the rotating member with respect to the workpiece to define an oscillation path for the rotating member. The methods further include obtaining force signals or parameters related to the force experienced by the rotary tool at least while the rotating member is disposed at the extremes of the oscillation. The force signals or parameters associated with the extremes can then be analyzed to determine a lateral position of the selected path with respect to a target path and a lateral offset value can be determined based on the lateral position. The lateral distance between the selected path and the target path can be decreased based on the lateral offset value.

  4. The monitoring of transient regimes on machine tools based on speed, acceleration and active electric power absorbed by motors

    NASA Astrophysics Data System (ADS)

    Horodinca, M.

    2016-08-01

    This paper proposes some new results related to the computer-aided monitoring of transient regimes on machine tools, based on the evolution of the active electrical power absorbed by the electric motor driving the main kinematic chain and on the evolution of the rotational speed and acceleration of the main shaft. The active power is calculated in numerical format from the evolution of the instantaneous voltage and current delivered by the electrical power system to the electric motor. The rotational speed and acceleration of the main shaft are calculated from the signal delivered by a sensor. Three real-time analog signals are acquired with a very simple computer-assisted setup consisting of a voltage transformer, a current transformer, an AC generator used as a rotational speed sensor, a data acquisition system and a personal computer. The data processing and analysis were done using Matlab software. Several different transient regimes were investigated, and several important conclusions related to the advantages of this monitoring technique were formulated. Many other features of the experimental setup are also available: supervising the mechanical loading of machine tools during cutting processes, or diagnosing machine tool condition by frequency-domain analysis of the active electrical power signal.
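
    The active-power computation described above amounts to averaging the instantaneous product u·i over whole mains cycles. An illustrative sketch with assumed 50 Hz mains and synthetic waveforms, not the paper's Matlab code:

```python
import numpy as np

def active_power(u, i, fs, line_freq=50.0):
    """Active power P = mean(u * i), averaged over whole mains cycles."""
    n = int(fs / line_freq)                 # samples per cycle
    usable = (len(u) // n) * n              # truncate to an integer number of cycles
    return np.mean(u[:usable] * i[:usable])

fs = 10_000.0
t = np.arange(0, 0.2, 1 / fs)
u = 325.0 * np.sin(2 * np.pi * 50 * t)               # ~230 V RMS mains voltage
i = 14.1 * np.sin(2 * np.pi * 50 * t - np.pi / 6)    # ~10 A RMS, 30 degrees lagging
p = active_power(u, i, fs)   # Urms * Irms * cos(30 deg), about 1984 W
```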

  5. Development of Advanced Signal Processing and Source Imaging Methods for Superparamagnetic Relaxometry

    PubMed Central

    Huang, Ming-Xiong; Anderson, Bill; Huang, Charles W.; Kunde, Gerd J.; Vreeland, Erika C.; Huang, Jeffrey W.; Matlashov, Andrei N.; Karaulanov, Todor; Nettles, Christopher P.; Gomez, Andrew; Minser, Kayla; Weldon, Caroline; Paciotti, Giulio; Harsh, Michael; Lee, Roland R.; Flynn, Edward R.

    2017-01-01

    Superparamagnetic relaxometry (SPMR) is a highly sensitive technique for the in vivo detection of tumor cells and may improve early-stage detection of cancers. SPMR employs superparamagnetic iron oxide nanoparticles (SPION). After a brief magnetizing pulse is used to align the SPION, SPMR measures the time decay of the SPION using Superconducting Quantum Interference Device (SQUID) sensors. Substantial research has been carried out in developing the SQUID hardware and in improving the properties of the SPION. However, little research has been done on the pre-processing of sensor signals and post-processing source modeling in SPMR. In the present study, we illustrate new pre-processing tools that were developed to: 1) remove trials contaminated with artifacts, 2) evaluate and ensure that a single decay process associated with bound SPION exists in the data, 3) automatically detect and correct flux jumps, and 4) accurately fit the sensor signals with different decay models. Furthermore, we developed an automated approach based on a multi-start dipole imaging technique to obtain the locations and magnitudes of multiple magnetic sources, without initial guesses from the users. A regularization process was implemented to solve the ambiguity issue related to the SPMR source variables. A procedure based on a reduced chi-square cost function was introduced to objectively obtain the adequate number of dipoles that describe the data. The new pre-processing tools and multi-start source imaging approach have been successfully evaluated using phantom data. In conclusion, these tools and the multi-start source modeling approach substantially enhance the accuracy and sensitivity of detecting and localizing sources from SPMR signals. Furthermore, the multi-start approach with regularization provided robust and accurate solutions for a poor SNR condition similar to the SPMR detection sensitivity on the order of 1000 cells. We believe such algorithms will help establish industrial standards for SPMR when applying the technique in pre-clinical and clinical settings. PMID:28072579
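
    The decay-model fitting and reduced chi-square steps can be illustrated with standard tools. The sketch below fits a single-exponential model to synthetic data with `scipy.optimize.curve_fit`; the actual SPMR decay models and noise levels may differ:

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(t, a, tau, c):
    """Single-exponential decay with a constant offset."""
    return a * np.exp(-t / tau) + c

rng = np.random.default_rng(7)
t = np.linspace(0.0, 2.0, 200)
sigma = 0.02                                 # assumed per-sample noise level
y = decay(t, 1.0, 0.5, 0.1) + rng.normal(0.0, sigma, t.size)

popt, pcov = curve_fit(decay, t, y, p0=(1.0, 1.0, 0.0),
                       sigma=np.full(t.size, sigma))
resid = (y - decay(t, *popt)) / sigma
red_chi2 = np.sum(resid**2) / (t.size - len(popt))  # near 1 for an adequate model
```

    Comparing `red_chi2` across candidate models (one decay term, two decay terms, ...) is the kind of objective model-selection criterion the abstract describes.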

  6. Design tool for multiprocessor scheduling and evaluation of iterative dataflow algorithms

    NASA Technical Reports Server (NTRS)

    Jones, Robert L., III

    1995-01-01

    A graph-theoretic design process and software tool is defined for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. Graph-search algorithms and analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool applies the design process to a given problem and includes performance optimization through the inclusion of additional precedence constraints among the schedulable tasks.
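
    One of the performance bounds such graph analysis yields is the longest path through the dataflow DAG, a lower bound on the schedule length with any number of processors. A sketch on a toy signal-processing graph (task names and durations are invented):

```python
from collections import defaultdict

def critical_path_length(tasks, edges):
    """Longest path through a task DAG (durations on nodes): a lower bound
    on the schedule length achievable with any number of processors."""
    succs, preds, indeg = defaultdict(list), defaultdict(list), defaultdict(int)
    for a, b in edges:
        succs[a].append(b)
        preds[b].append(a)
        indeg[b] += 1
    # Kahn topological order
    order, ready = [], [n for n in tasks if indeg[n] == 0]
    while ready:
        n = ready.pop()
        order.append(n)
        for m in succs[n]:
            indeg[m] -= 1
            if indeg[m] == 0:
                ready.append(m)
    # Earliest finish times along the topological order
    finish = {}
    for n in order:
        finish[n] = tasks[n] + max((finish[p] for p in preds[n]), default=0)
    return max(finish.values())

# Toy signal-flow graph: filter -> (fft, energy) -> classify
tasks = {"filter": 2, "fft": 4, "energy": 1, "classify": 3}
edges = [("filter", "fft"), ("filter", "energy"),
         ("fft", "classify"), ("energy", "classify")]
bound = critical_path_length(tasks, edges)   # critical path: filter, fft, classify
```

    A second bound, total work divided by processor count (here 10/2 = 5 for two processors), complements the critical-path bound; the larger of the two constrains any schedule.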

  7. The Seismic Tool-Kit (STK): an open source software for seismology and signal processing.

    NASA Astrophysics Data System (ADS)

    Reymond, Dominique

    2016-04-01

    We present an open source software project (GNU public license), named STK: Seismic ToolKit, that is dedicated mainly to seismology and signal processing. The STK project, which started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the date of writing. The STK project is composed of two main branches. First, a graphical interface dedicated to signal processing in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is proposed). The estimation of the spectral density of the signal is performed via the Fourier transform, with visualization of the power spectral density (PSD) in linear or log scale, and also the evolutive time-frequency representation (or sonagram). The 3-component signals can also be processed to estimate their polarization properties, either for a given window or for evolutive windows along the time axis. This polarization analysis is useful for extracting polarized noise, differentiating P waves, Rayleigh waves, Love waves, etc. Secondly, a panel of utility programs is proposed for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency representations for an entire directory of signals, focal planes and main component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. A MINimum library of Linear AlGebra (MIN-LINAG) is also provided for the main matrix operations: QR/QL decomposition, Cholesky solution of linear systems, finding eigenvalues/eigenvectors, QR-solve/eigen-solve of linear equation systems, etc. STK is developed in C/C++, mainly under Linux OS, and has also been partially implemented under MS-Windows.
    Useful links: http://sourceforge.net/projects/seismic-toolkit/ http://sourceforge.net/p/seismic-toolkit/wiki/browse_pages/
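
    The azimuth-and-distance utility mentioned above rests on classical spherical trigonometry. A sketch of the underlying formulas (not STK's C/C++ code):

```python
import numpy as np

def dist_az(lat1, lon1, lat2, lon2):
    """Great-circle distance (degrees of arc) and azimuth (degrees clockwise
    from north) from point 1 to point 2 on a spherical Earth."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dlon = np.radians(lon2 - lon1)
    cos_d = np.sin(p1) * np.sin(p2) + np.cos(p1) * np.cos(p2) * np.cos(dlon)
    dist = np.degrees(np.arccos(np.clip(cos_d, -1.0, 1.0)))
    az = np.degrees(np.arctan2(
        np.sin(dlon) * np.cos(p2),
        np.cos(p1) * np.sin(p2) - np.sin(p1) * np.cos(p2) * np.cos(dlon)))
    return dist, az % 360.0

d, a = dist_az(0.0, 0.0, 0.0, 90.0)   # a quarter of the equator: 90 deg, due east
```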

  8. Detection of Cutting Tool Wear using Statistical Analysis and Regression Model

    NASA Astrophysics Data System (ADS)

    Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin

    2010-10-01

    This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistics-based method, the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz), was used to develop a regression model and a 3D graphical presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a Colchester Master Tornado T4 CNC turning machine in dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, each with its own I-kaz 3D coefficient. This coefficient was examined and its relationship with the flank wear land (VB) was determined. A regression model was developed from this relationship, and the results show that the I-kaz 3D coefficient value decreases as tool wear increases. The result is then used for real-time tool wear monitoring.
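
    The abstract does not reproduce the I-kaz formula, so the sketch below uses an invented kurtosis-weighted statistic of simulated force signals purely to illustrate the statistic-to-VB regression workflow; neither the statistic, the simulation, nor the sign of the trend should be read as the paper's results:

```python
import numpy as np
from scipy.stats import kurtosis

def force_stat(sig):
    """Invented kurtosis-weighted statistic of a force signal (a stand-in
    for the I-kaz 3D coefficient, whose formula the abstract omits)."""
    return np.std(sig) * kurtosis(sig, fisher=False) ** 0.25

# Simulated force signals: wear modeled here as increasingly impulsive bursts
rng = np.random.default_rng(3)
wear_vb = np.linspace(0.05, 0.30, 6)          # hypothetical flank wear (mm)
stats = []
for vb in wear_vb:
    sig = rng.normal(0.0, 1.0, 5000)
    bursts = rng.random(5000) < 0.01
    sig[bursts] += rng.normal(0.0, 40.0 * vb, bursts.sum())
    stats.append(force_stat(sig))

# Linear regression between the signal statistic and flank wear VB
slope, intercept = np.polyfit(stats, wear_vb, 1)
```

    In a monitoring loop, the fitted line maps each newly computed statistic to an estimated VB, which is the real-time use the abstract describes.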

  9. siGnum: graphical user interface for EMG signal analysis.

    PubMed

    Kaur, Manvinder; Mathur, Shilpi; Bhatia, Dinesh; Verma, Suresh

    2015-01-01

    Electromyography (EMG) signals, which represent the electrical activity of muscles, can be used for various clinical and biomedical applications. These are complicated and highly varying signals that depend on the anatomical location and physiological properties of the muscles. EMG signals acquired from the muscles require advanced methods for detection, decomposition and processing. This paper proposes siGnum, a novel Graphical User Interface (GUI) developed in MATLAB that applies efficient and effective techniques to process raw EMG signals and decompose them in a simpler manner. It can be used independently of MATLAB by employing a deploy tool. This should enable researchers to gain a good understanding of EMG signals and their analysis procedures, which can be utilized for more powerful, flexible and efficient applications in the near future.

  10. Evaluation of the efficiency of continuous wavelet transform as processing and preprocessing algorithm for resolution of overlapped signals in univariate and multivariate regression analyses; an application to ternary and quaternary mixtures.

    PubMed

    Hegazy, Maha A; Lotfy, Hayam M; Mowaka, Shereen; Mohamed, Ekram Hany

    2016-07-05

    Wavelets have been adapted for a vast number of signal-processing applications due to the amount of information that can be extracted from a signal. In this work, a comparative study was conducted on the efficiency of the continuous wavelet transform (CWT) as a signal-processing tool in univariate regression and as a pre-processing tool in multivariate analysis using partial least squares (CWT-PLS). These were applied to complex spectral signals of ternary and quaternary mixtures. The CWT-PLS method succeeded in the simultaneous determination of a quaternary mixture of drotaverine (DRO), caffeine (CAF), paracetamol (PAR) and p-aminophenol (PAP, the major impurity of paracetamol). The univariate CWT, in contrast, failed to determine the quaternary mixture components simultaneously; it was able to determine only PAR and PAP, as well as the ternary mixtures of DRO, CAF, and PAR and of CAF, PAR, and PAP. During the CWT calculations, different wavelet families were tested. The univariate CWT method was validated according to the ICH guidelines. For the development of the CWT-PLS model, a calibration set was prepared by means of an orthogonal experimental design, and the absorption spectra were recorded and processed by CWT. The CWT-PLS model was constructed by regression between the wavelet coefficients and the concentration matrices, and validation was performed with both cross-validation and external validation sets. Both methods were successfully applied to the determination of the studied drugs in pharmaceutical formulations. Copyright © 2016 Elsevier B.V. All rights reserved.
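
    Why CWT helps resolve overlapped signals: the transform is linear, so a reading taken on the wavelet coefficients remains a straight line in concentration, while the band-pass character of the wavelet suppresses broad overlapping bands. A numpy sketch with a Ricker wavelet and invented Gaussian bands (not the paper's spectra or drugs):

```python
import numpy as np

def ricker(points, a):
    """Ricker (Mexican-hat) wavelet of width parameter a."""
    t = np.arange(points) - (points - 1) / 2.0
    norm = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return norm * (1.0 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def cwt_scale(signal, a):
    """CWT at a single scale: correlate the signal with a Ricker wavelet."""
    w = ricker(min(10 * int(a), len(signal)), a)
    return np.convolve(signal, w, mode="same")

# Invented overlapped bands on an arbitrary wavelength grid:
wl = np.linspace(0.0, 100.0, 500)
analyte = np.exp(-((wl - 50.0) ** 2) / 8.0)        # narrow analyte band
interferent = np.exp(-((wl - 55.0) ** 2) / 200.0)  # broad overlapping band

centre = 250                                       # index of the analyte maximum
resp_analyte = cwt_scale(analyte, 10.0)[centre]
resp_interf = cwt_scale(interferent, 10.0)[centre]

# Readings at the analyte centre for a hypothetical calibration series:
conc = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
readings = np.array([cwt_scale(c * analyte + 0.7 * interferent, 10.0)[centre]
                     for c in conc])
slope, intercept = np.polyfit(conc, readings, 1)
```

    The broad band contributes only a small constant to the reading, so the calibration slope recovers the pure-analyte response.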

  11. SignalPlant: an open signal processing software platform.

    PubMed

    Plesinger, F; Jurco, J; Halamek, J; Jurak, P

    2016-07-01

    The growing technical standard of acquisition systems allows the acquisition of large records, often reaching gigabytes or more in size, as is the case with whole-day electroencephalograph (EEG) recordings, for example. Although current 64-bit signal-processing software is able to process (e.g. filter, analyze) such data, visual inspection and labeling will probably suffer from rather long latency when rendering large portions of the recorded signals. For this reason, we have developed SignalPlant, a stand-alone application for signal inspection, labeling and processing. The main motivation was to supply investigators with a tool allowing fast and interactive work with the large multichannel records produced by EEG, electrocardiograph and similar devices. The rendering latency was compared with EEGLAB and proved significantly faster when displaying an image from a large number of samples (e.g. 163 times faster for 75 × 10^6 samples). The presented SignalPlant software is available free and does not depend on any other computation software. Furthermore, it can be extended with plugins by third parties, ensuring its adaptability to future research tasks and new data formats.
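
    The rendering-latency problem is commonly attacked by envelope decimation: draw per-bucket minima and maxima instead of every sample. A generic min-max downsampling sketch (a standard plotting technique, not SignalPlant's actual renderer):

```python
import numpy as np

def minmax_downsample(x, n_buckets):
    """Reduce a long signal to per-bucket (min, max) pairs for plotting.
    Drawing 2*n_buckets points preserves the visual envelope of the trace."""
    x = np.asarray(x)
    usable = (len(x) // n_buckets) * n_buckets
    buckets = x[:usable].reshape(n_buckets, -1)
    out = np.empty(2 * n_buckets, dtype=x.dtype)
    out[0::2] = buckets.min(axis=1)
    out[1::2] = buckets.max(axis=1)
    return out

# A million samples are slow to draw; ~4000 envelope points look the same on screen.
sig = np.sin(np.linspace(0, 200 * np.pi, 1_000_000))
env = minmax_downsample(sig, 2000)
```

    Because every bucket keeps its extrema, spikes survive the decimation, which plain strided subsampling would miss.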

  12. Study of interhemispheric asymmetries in electroencephalographic signals by frequency analysis

    NASA Astrophysics Data System (ADS)

    Zapata, J. F.; Garzón, J.

    2011-01-01

    This study provides a new method, based on wavelet energy, for the detection of interhemispheric asymmetries in patients under continuous video-electroencephalography (EEG) monitoring in the Intensive Care Unit (ICU). We obtained EEG recordings from 42 patients with different pathologies and then performed signal processing in MATLAB, comparing the abnormalities recorded in the neurophysiologist's report, the images of each patient, and the results of signal analysis with the Discrete Wavelet Transform (DWT). Conclusions: the abnormalities found by processing the signals correspond to the clinical findings reported for the patients; accordingly, the methodology used can be a useful tool for the diagnosis and early quantitative detection of interhemispheric asymmetries.
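
    Wavelet energy per decomposition level, compared between homologous channels, yields a simple asymmetry index. A minimal numpy sketch using a hand-rolled Haar DWT (the study used MATLAB; the channel contents below are synthetic and the sampling rate is assumed):

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar DWT: returns (approximation, detail) coefficients."""
    x = np.asarray(x, dtype=float)
    x = x[: len(x) // 2 * 2]
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def band_energies(x, levels=4):
    """Relative wavelet energy of each detail level (deepest level last)."""
    energies, a = [], x
    for _ in range(levels):
        a, d = haar_dwt(a)
        energies.append(np.sum(d ** 2))
    e = np.array(energies)
    return e / e.sum()

def asymmetry_index(left, right, levels=4):
    """Scalar asymmetry between homologous channels: 0 means identical spectra."""
    el, er = band_energies(left, levels), band_energies(right, levels)
    return float(np.sum(np.abs(el - er)))

rng = np.random.default_rng(1)
t = np.arange(4096) / 256.0                        # 16 s at an assumed 256 Hz
left = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(4096)
right = np.sin(2 * np.pi * 3 * t) + 0.1 * rng.standard_normal(4096)  # slowed side
same = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(4096)
```

    A hemisphere whose rhythm has slowed shifts its energy to deeper levels, so the index between `left` and `right` exceeds the index between two matched channels.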

  13. Quantifying Ubiquitin Signaling

    PubMed Central

    Ordureau, Alban; Münch, Christian; Harper, J. Wade

    2015-01-01

    Ubiquitin (UB)-driven signaling systems permeate biology, and are often integrated with other types of post-translational modifications (PTMs), most notably phosphorylation. Flux through such pathways is typically dictated by the fractional stoichiometry of distinct regulatory modifications and protein assemblies as well as the spatial organization of pathway components. Yet, we rarely understand the dynamics and stoichiometry of rate-limiting intermediates along a reaction trajectory. Here, we review how quantitative proteomic tools and enrichment strategies are being used to quantify UB-dependent signaling systems, and to integrate UB signaling with regulatory phosphorylation events. A key regulatory feature of ubiquitylation is that the identity of UB chain linkage types can control downstream processes. We also describe how proteomic and enzymological tools can be used to identify and quantify UB chain synthesis and linkage preferences. The emergence of sophisticated quantitative proteomic approaches will set a new standard for elucidating biochemical mechanisms of UB-driven signaling systems. PMID:26000850

  14. Application of HFCT and UHF Sensors in On-Line Partial Discharge Measurements for Insulation Diagnosis of High Voltage Equipment

    PubMed Central

    Álvarez, Fernando; Garnacho, Fernando; Ortego, Javier; Sánchez-Urán, Miguel Ángel

    2015-01-01

    Partial discharge (PD) measurements provide valuable information for assessing the condition of high voltage (HV) insulation systems, contributing to their quality assurance. Different PD measuring techniques, specially designed to perform on-line measurements, have been developed in recent years. Non-conventional PD methods operating in high frequency bands are usually used when this type of test is carried out. In PD measurements, the signal acquisition, the subsequent signal processing and the capability to obtain an accurate diagnosis are conditioned by the selection of a suitable detection technique and by the implementation of effective signal processing tools. This paper proposes an optimized electromagnetic detection method based on the combined use of wideband PD sensors for measurements performed in the HF and UHF frequency ranges, together with the implementation of powerful processing tools. The effectiveness of the proposed measuring techniques is demonstrated through an example in which several PD sources are measured simultaneously in an HV installation consisting of a cable system connected by a plug-in terminal to a gas insulated substation (GIS) compartment. PMID:25815452

  15. Visualization in simulation tools: requirements and a tool specification to support the teaching of dynamic biological processes.

    PubMed

    Jørgensen, Katarina M; Haddow, Pauline C

    2011-08-01

    Simulation tools are playing an increasingly important role behind advances in the field of systems biology. However, the current generation of biological science students has either little or no experience with such tools. As such, this educational glitch is limiting both the potential use of such tools as well as the potential for tighter cooperation between the designers and users. Although some simulation tool producers encourage their use in teaching, little attempt has hitherto been made to analyze and discuss their suitability as an educational tool for noncomputing science students. In general, today's simulation tools assume that the user has a stronger mathematical and computing background than that which is found in most biological science curricula, thus making the introduction of such tools a considerable pedagogical challenge. This paper provides an evaluation of the pedagogical attributes of existing simulation tools for cell signal transduction based on Cognitive Load theory. Further, design recommendations for an improved educational simulation tool are provided. The study is based on simulation tools for cell signal transduction. However, the discussions are relevant to a broader biological simulation tool set.

  16. The Magnitude Response Learning Tool for DSP Education: A Case Study

    ERIC Educational Resources Information Center

    Kulmer, Florian; Wurzer, Christian Gun; Geiger, Bernhard C.

    2016-01-01

    Many concepts in digital signal processing are intuitive, despite being mathematically challenging. The lecturer not only has to teach the complicated math but should also help students develop intuition about the concept. To aid the lecturer in this task, the Magnitude Response Learning Tool has been introduced, a computer-based learning game…

  17. Trends in non-stationary signal processing techniques applied to vibration analysis of wind turbine drive train - A contemporary survey

    NASA Astrophysics Data System (ADS)

    Uma Maheswari, R.; Umamaheswari, R.

    2017-02-01

    Condition Monitoring Systems (CMS) substantiate potential economic benefits and enable prognostic maintenance in wind turbine-generator failure prevention. Vibration monitoring and analysis is a powerful tool in drive train CMS, enabling the early detection of impending failure or damage. In variable speed drives such as wind turbine-generator drive trains, the acquired vibration signal is non-stationary and non-linear. Traditional stationary signal processing techniques are inefficient at diagnosing machine faults under time-varying conditions. Current research trends in CMS for drive trains focus on developing and improving non-linear, non-stationary feature extraction and fault classification algorithms to improve fault detection/prediction sensitivity and selectivity, thereby reducing misdetection and false alarm rates. In the literature, stationary signal processing algorithms employed in vibration analysis have been reviewed at great length. In this paper, an attempt is made to review the recent research advances in non-linear, non-stationary signal processing algorithms particularly suited to variable speed wind turbines.

  18. Experiences on developing digital down conversion algorithms using Xilinx system generator

    NASA Astrophysics Data System (ADS)

    Xu, Chengfa; Yuan, Yuan; Zhao, Lizhi

    2013-07-01

    The Digital Down Conversion (DDC) algorithm is a classical signal processing method that is widely used in radar and communication systems. In this paper, the DDC function is implemented on an FPGA with the Xilinx System Generator tool. System Generator is an FPGA design tool provided by Xilinx Inc. and MathWorks Inc. It makes it very convenient for programmers to manipulate the design and debug the function, especially for complex algorithms. The development of the DDC function based on System Generator shows that System Generator is a very fast and efficient tool for FPGA design.
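
    The classical DDC structure (NCO mixing, low-pass filtering, decimation) can be prototyped in a few lines before being committed to FPGA fabric. The sample rates below are arbitrary, and a plain moving average stands in for the CIC/FIR filters a real DDC would use:

```python
import numpy as np

fs = 1.0e6                                       # assumed sample rate
f_c = 200e3                                      # carrier to bring to baseband
n = np.arange(8192)
rf = np.cos(2 * np.pi * f_c * n / fs)            # test carrier

# 1) Mix with a complex NCO at -f_c: shifts the carrier to 0 Hz
#    (plus an image at -2*f_c that the low-pass filter must remove).
baseband = rf * np.exp(-2j * np.pi * f_c * n / fs)

# 2) Low-pass filter (moving average here; a real DDC uses CIC/FIR stages).
taps = np.ones(32) / 32.0
filtered = np.convolve(baseband, taps, mode="same")

# 3) Decimate, since the baseband signal needs far less bandwidth.
decim = 16
out = filtered[::decim]
```

    For a pure carrier the result is the constant complex amplitude 0.5, which is what the assertions check away from the filter edges.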

  19. Tool vibration detection with eddy current sensors in machining process and computation of stability lobes using fuzzy classifiers

    NASA Astrophysics Data System (ADS)

    Devillez, Arnaud; Dudzinski, Daniel

    2007-01-01

    Today the knowledge of a process is very important for engineers to find optimal combination of control parameters warranting productivity, quality and functioning without defects and failures. In our laboratory, we carry out research in the field of high speed machining with modelling, simulation and experimental approaches. The aim of our investigation is to develop a software allowing the cutting conditions optimisation to limit the number of predictive tests, and the process monitoring to prevent any trouble during machining operations. This software is based on models and experimental data sets which constitute the knowledge of the process. In this paper, we deal with the problem of vibrations occurring during a machining operation. These vibrations may cause some failures and defects to the process, like workpiece surface alteration and rapid tool wear. To measure on line the tool micro-movements, we equipped a lathe with a specific instrumentation using eddy current sensors. Obtained signals were correlated with surface finish and a signal processing algorithm was used to determine if a test is stable or unstable. Then, a fuzzy classification method was proposed to classify the tests in a space defined by the width of cut and the cutting speed. Finally, it was shown that the fuzzy classification takes into account of the measurements incertitude to compute the stability limit or stability lobes of the process.

  20. The Seismic Tool-Kit (STK): An Open Source Software For Learning the Basis of Signal Processing and Seismology.

    NASA Astrophysics Data System (ADS)

    Reymond, D.

    2016-12-01

    We present an open source software project (GNU public license), named STK: Seismic Tool-Kit, dedicated mainly to learning signal processing and seismology. The STK project, started in 2007, is hosted by SourceForge.net and counts more than 20,000 downloads at the time of writing. The STK project is composed of two main branches. The first is a graphical interface dedicated to signal processing (in the SAC format, SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The passage into the frequency domain via the Fourier transform is used to introduce the estimation of the spectral density of the signal, with visualization of the Power Spectral Density (PSD) on a linear or log scale, as well as an evolving time-frequency representation (or sonogram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for sliding windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. The second branch is a panel of utility programs for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency representations for an entire directory of signals, focal planes and principal component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. STK is developed in C/C++, mainly under Linux, and it has also been partially implemented under MS-Windows. STK has been used in some schools for viewing and plotting seismic records provided by IRIS, and it has served as practical support for teaching the basics of signal processing.
Useful links: http://sourceforge.net/projects/seismic-toolkit/ http://sourceforge.net/p/seismic-toolkit/wiki/browse_pages/

  1. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    NASA Technical Reports Server (NTRS)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
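
    The core control-chart computation described above is simple: estimate the center line and 3-sigma limits from in-control data, then flag points beyond them. A minimal sketch (Shewhart-style limits only; the run-rule patterns the expert system also detects are omitted):

```python
import numpy as np

def control_limits(samples):
    """Center line and 3-sigma control limits from in-control history."""
    mu, sigma = np.mean(samples), np.std(samples)
    return mu, mu - 3 * sigma, mu + 3 * sigma

def out_of_control(x, lcl, ucl):
    """Indices of points signalling abnormal process variation."""
    x = np.asarray(x)
    return np.where((x < lcl) | (x > ucl))[0]

rng = np.random.default_rng(2)
baseline = rng.normal(10.0, 0.5, 200)      # in-control process history
mu, lcl, ucl = control_limits(baseline)

new = rng.normal(10.0, 0.5, 50)
new[30] = 13.0                             # injected shift, well past +3 sigma
flags = out_of_control(new, lcl, ucl)
```
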

  2. Precursor processing for plant peptide hormone maturation by subtilisin-like serine proteinases.

    PubMed

    Schardon, Katharina; Hohl, Mathias; Graff, Lucile; Pfannstiel, Jens; Schulze, Waltraud; Stintzi, Annick; Schaller, Andreas

    2016-12-23

    Peptide hormones that regulate plant growth and development are derived from larger precursor proteins by proteolytic processing. Our study addressed the role of subtilisin-like proteinases (SBTs) in this process. Using tissue-specific expression of proteinase inhibitors as a tool to overcome functional redundancy, we found that SBT activity was required for the maturation of IDA (INFLORESCENCE DEFICIENT IN ABSCISSION), a peptide signal for the abscission of floral organs in Arabidopsis. We identified three SBTs that process the IDA precursor in vitro, and this processing was shown to be required for the formation of mIDA (the mature and bioactive form of IDA) as the endogenous signaling peptide in vivo. Hence, SBTs act as prohormone convertases in plants, and several functionally redundant SBTs contribute to signal biogenesis. Copyright © 2016, American Association for the Advancement of Science.

  3. Analysis of real-time vibration data

    USGS Publications Warehouse

    Safak, E.

    2005-01-01

    In recent years, a few structures have been instrumented to provide continuous vibration data in real time, recording not only large-amplitude motions generated by extreme loads, but also small-amplitude motions generated by ambient loads. The main objective in continuous recording is to track any changes in structural characteristics, and to detect damage after an extreme event, such as an earthquake or explosion. The Fourier-based spectral analysis methods have been the primary tool to analyze vibration data from structures. In general, such methods do not work well for real-time data, because real-time data are mainly composed of ambient vibrations with very low amplitudes and signal-to-noise ratios. The long duration, linearity, and the stationarity of ambient data, however, allow us to utilize statistical signal processing tools, which can compensate for the adverse effects of low amplitudes and high noise. The analysis of real-time data requires tools and techniques that can be applied in real-time; i.e., data are processed and analyzed while being acquired. This paper presents some of the basic tools and techniques for processing and analyzing real-time vibration data. The topics discussed include utilization of running time windows, tracking mean and mean-square values, filtering, system identification, and damage detection.
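
    Tracking running mean and mean-square values, one of the techniques discussed, can be done with exponential forgetting so that each new sample updates the statistics in constant time, as a real-time system requires. A sketch with synthetic ambient data (the forgetting factor is an arbitrary choice):

```python
import numpy as np

def running_stats(x, alpha=0.01):
    """Exponentially weighted running mean and mean-square of a data stream,
    updated sample by sample as they would be in real time."""
    means, msqs = np.empty(len(x)), np.empty(len(x))
    m, q = 0.0, 0.0
    for i, v in enumerate(x):
        m = (1 - alpha) * m + alpha * v
        q = (1 - alpha) * q + alpha * v * v
        means[i], msqs[i] = m, q
    return means, msqs

# A step change in vibration amplitude shows up as a rise in the mean-square.
rng = np.random.default_rng(3)
quiet = 0.1 * rng.standard_normal(5000)          # low-amplitude ambient motion
event = 1.0 * rng.standard_normal(5000)          # stronger shaking
means, msqs = running_stats(np.concatenate([quiet, event]))
```

    Thresholding the tracked mean-square is a rudimentary form of the damage/event detection the paper discusses.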

  4. Smart signal processing for an evolving electric grid

    NASA Astrophysics Data System (ADS)

    Silva, Leandro Rodrigues Manso; Duque, Carlos Augusto; Ribeiro, Paulo F.

    2015-12-01

    Electric grids are interconnected complex systems consisting of generation, transmission, distribution, and active loads, recently called prosumers as they both produce and consume electric energy. Additionally, these encompass a vast array of equipment such as machines, power transformers, capacitor banks, power electronic devices, motors, etc. that are continuously evolving in their demand characteristics. Given these conditions, signal processing is becoming an essential assessment tool to enable the engineer and researcher to understand, plan, design, and operate the complex and smart electric grid of the future. This paper focuses on recent developments associated with signal processing applied to power system analysis in terms of characterization and diagnostics. The following techniques are reviewed and their characteristics and applications discussed: active power system monitoring, sparse representation of power system signals, real-time resampling, and time-frequency (i.e., wavelet) analysis applied to power fluctuations.

  5. Evaluation of Secretion Prediction Highlights Differing Approaches Needed for Oomycete and Fungal Effectors.

    PubMed

    Sperschneider, Jana; Williams, Angela H; Hane, James K; Singh, Karam B; Taylor, Jennifer M

    2015-01-01

    The steadily increasing number of sequenced fungal and oomycete genomes has enabled detailed studies of how these eukaryotic microbes infect plants and cause devastating losses in food crops. During infection, fungal and oomycete pathogens secrete effector molecules which manipulate host plant cell processes to the pathogen's advantage. Proteinaceous effectors are synthesized intracellularly and must be externalized to interact with host cells. Computational prediction of secreted proteins from genomic sequences is an important technique to narrow down the candidate effector repertoire for subsequent experimental validation. In this study, we benchmark secretion prediction tools on experimentally validated fungal and oomycete effectors. We observe that for a set of fungal SwissProt protein sequences, SignalP 4 and the neural network predictors of SignalP 3 (D-score) and SignalP 2 perform best. For effector prediction in particular, the use of a sensitive method can be desirable to obtain the most complete candidate effector set. We show that the neural network predictors of SignalP 2 and 3, as well as TargetP were the most sensitive tools for fungal effector secretion prediction, whereas the hidden Markov model predictors of SignalP 2 and 3 were the most sensitive tools for oomycete effectors. Thus, previous versions of SignalP retain value for oomycete effector prediction, as the current version, SignalP 4, was unable to reliably predict the signal peptide of the oomycete Crinkler effectors in the test set. Our assessment of subcellular localization predictors shows that cytoplasmic effectors are often predicted as not extracellular. This limits the reliability of secretion predictions that depend on these tools. We present our assessment with a view to informing future pathogenomics studies and suggest revised pipelines for secretion prediction to obtain optimal effector predictions in fungi and oomycetes.

  6. Smart surgical tool

    NASA Astrophysics Data System (ADS)

    Huang, Huan; Yang, Lih-Mei; Bai, Shuang; Liu, Jian

    2015-02-01

    A laser-induced breakdown spectroscopy (LIBS) guided smart surgical tool using a femtosecond fiber laser is developed. This system provides real-time material identification by processing and analyzing the peak intensity and ratio of atomic emissions of LIBS signals. Algorithms to identify emissions of different tissues and metals are developed and implemented into the real-time control system. This system provides a powerful smart surgical tool for precise robotic microsurgery applications with real-time feedback and control.
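
    Material identification from LIBS peak intensities can be sketched as a peak-ratio rule. The emission wavelengths, amplitudes and threshold below are invented for illustration, not calibrated values from the paper:

```python
import numpy as np

def gaussian_peak(wl, center, amp, width=0.8):
    """Synthetic atomic emission line on a wavelength grid (nm)."""
    return amp * np.exp(-((wl - center) ** 2) / (2 * width ** 2))

def peak_ratio_classify(spectrum, wl, line_a=393.4, line_b=588.0, thresh=1.5):
    """Classify by the intensity ratio of two emission lines.
    (Lines and threshold are illustrative, not calibrated.)"""
    ia = spectrum[np.argmin(np.abs(wl - line_a))]
    ib = spectrum[np.argmin(np.abs(wl - line_b))]
    return "metal" if ia / ib > thresh else "tissue"

wl = np.linspace(300.0, 700.0, 4000)
metal_like = gaussian_peak(wl, 393.4, 10.0) + gaussian_peak(wl, 588.0, 2.0) + 0.1
tissue_like = gaussian_peak(wl, 393.4, 2.0) + gaussian_peak(wl, 588.0, 10.0) + 0.1
```

    A real-time controller would run such a decision on every laser shot and gate the cutting beam accordingly.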

  7. Real time automatic detection of bearing fault in induction machine using kurtogram analysis.

    PubMed

    Tafinine, Farid; Mokrani, Karim

    2012-11-01

    A proposed signal processing technique for real-time detection of incipient bearing faults based on kurtogram analysis is presented in this paper. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. The technique starts by investigating the resonance signatures over selected frequency bands to extract representative features. Traditional spectral analysis is not appropriate for non-stationary vibration signals or for real-time diagnosis. The performance of the proposed technique is examined through a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective automatic bearing fault detection method and provides a good basis for an integrated induction machine condition monitor.
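
    The kurtogram searches frequency bands for maximal kurtosis of the band-limited signal, since fault-induced impulses concentrate in the excited resonance band. A much simplified sketch with fixed bands and FFT masking, rather than the full kurtogram's filter-bank tree; all signal parameters are invented:

```python
import numpy as np

def band_kurtosis(x, fs, bands):
    """Kurtosis of the signal restricted to each frequency band (FFT masking).
    A simplified stand-in for the kurtogram's band search."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    out = []
    for lo, hi in bands:
        mask = np.where((freqs >= lo) & (freqs < hi), 1.0, 0.0)
        xb = np.fft.irfft(X * mask, n=len(x))
        z = (xb - xb.mean()) / xb.std()
        out.append(float(np.mean(z ** 4)))
    return out

# Simulated bearing fault: a repetitive impact train exciting a 3 kHz resonance.
fs = 20_000
rng = np.random.default_rng(4)
n = 40_000
sig = 0.5 * rng.standard_normal(n)
impulses = (np.arange(n) % 400 == 0).astype(float)        # 50 Hz impact train
ringing = np.exp(-np.arange(100) / 10.0) * np.sin(2 * np.pi * 3000 * np.arange(100) / fs)
sig += 5.0 * np.convolve(impulses, ringing, mode="full")[:n]

bands = [(0, 1000), (1000, 2500), (2500, 3500), (3500, 5000)]
k = band_kurtosis(sig, fs, bands)
best = bands[int(np.argmax(k))]        # the impulsive resonance band stands out
```
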

  8. Exciting (and modulating) very-long-period seismic signals on White Island, New Zealand

    NASA Astrophysics Data System (ADS)

    Neuberg, Jurgen; Jolly, Art

    2014-05-01

    Very-long-period seismic signals (VLPs) on volcanoes can be used to fill the gap between classical seismology and deformation studies. In this contribution we reiterate the principal processing steps to retrieve, from a velocity seismogram, 3D ground displacement with tiny amplitudes well below the resolution of GPS. As a case study we use several seismic and infrasonic signals of volcanic events from White Island, New Zealand. We apply particle motion analysis and deformation modelling tools to the resulting displacement signals and examine the potential link between ground displacement and the modulation of harmonic tremor, which is in turn linked to a hydrothermal system. In this way we demonstrate the full potential of VLPs in the monitoring and modelling of volcanic processes.

  9. Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform

    PubMed Central

    Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong

    2016-01-01

    We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features. PMID:27304979

  10. Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform.

    PubMed

    Wu, Hau-Tieng; Wu, Han-Kuei; Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong

    2016-01-01

    We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features.

  11. A sequential method for spline approximation with variable knots. [recursive piecewise polynomial signal processing

    NASA Technical Reports Server (NTRS)

    Mier Muth, A. M.; Willsky, A. S.

    1978-01-01

    In this paper we describe a method for approximating a waveform by a spline. The method is quite efficient, as the data are processed sequentially. The basis of the approach is to view the approximation problem as a question of estimation of a polynomial in noise, with the possibility of abrupt changes in the highest derivative. This allows us to bring several powerful statistical signal processing tools into play. We also present some initial results on the application of our technique to the processing of electrocardiograms, where the knot locations themselves may be some of the most important pieces of diagnostic information.
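
    A batch analogue of variable-knot spline approximation is the smoothing spline, where the fitting routine chooses the knot count and locations to meet a residual budget. A scipy sketch on a waveform with an abrupt slope change (this is not the paper's sequential estimator, and the noise level and smoothing budget are invented):

```python
import numpy as np
from scipy.interpolate import splrep, splev

# Waveform with an abrupt change in slope (loosely ECG-like):
x = np.linspace(0.0, 10.0, 400)
y = np.where(x < 5.0, 0.2 * x, 1.0 + 2.0 * (x - 5.0))
y_noisy = y + 0.02 * np.random.default_rng(5).standard_normal(400)

# Smoothing spline: FITPACK adds knots until the residual sum of squares
# drops below s, so knots cluster where the waveform changes character.
tck = splrep(x, y_noisy, s=400 * 0.02 ** 2)
knots = tck[0]
fit = splev(x, tck)
rms = float(np.sqrt(np.mean((fit - y) ** 2)))
```

    Inspecting `knots` shows where the fitter spent its degrees of freedom, echoing the paper's point that knot locations themselves carry diagnostic information.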

  12. Nonparametric Simulation of Signal Transduction Networks with Semi-Synchronized Update

    PubMed Central

    Nassiri, Isar; Masoudi-Nejad, Ali; Jalili, Mahdi; Moeini, Ali

    2012-01-01

    Simulating signal transduction in cellular signaling networks provides predictions of network dynamics by quantifying the changes in concentration and activity level of the individual proteins. Since numerical values of kinetic parameters might be difficult to obtain, it is imperative to develop non-parametric approaches that combine the connectivity of a network with the response of individual proteins to signals travelling through the network. The activity levels of signaling proteins computed with existing non-parametric modeling tools do not show significant correlations with the values observed in experiments. In this work we developed a non-parametric computational framework to describe the profile of the evolving process and the time course of the proportion of the active form of molecules in signal transduction networks. The model is also capable of incorporating perturbations. The model was validated on four signaling networks, showing that it can effectively uncover the activity levels and trends of response during the signal transduction process. PMID:22737250

  13. Signal evaluation environment: a new method for the design of peripheral in-vehicle warning signals.

    PubMed

    Werneke, Julia; Vollrath, Mark

    2011-06-01

    An evaluation method called the Signal Evaluation Environment (SEE) was developed for use in the early stages of the design process of peripheral warning signals while driving. Accident analyses have shown that with complex driving situations such as intersections, the visual scan strategies of the driver contribute to overlooking other road users who have the right of way. Salient peripheral warning signals could disrupt these strategies and direct drivers' attention towards these road users. To select effective warning signals, the SEE was developed as a laboratory task requiring visual-cognitive processes similar to those used at intersections. For validation of the SEE, four experiments were conducted using different stimulus characteristics (size, colour contrast, shape, flashing) that influence peripheral vision. The results confirm that the SEE is able to differentiate between the selected stimulus characteristics. The SEE is a useful initial tool for designing peripheral signals, allowing quick and efficient preselection of beneficial signals.

  14. Quantifying ubiquitin signaling.

    PubMed

    Ordureau, Alban; Münch, Christian; Harper, J Wade

    2015-05-21

    Ubiquitin (UB)-driven signaling systems permeate biology, and are often integrated with other types of post-translational modifications (PTMs), including phosphorylation. Flux through such pathways is dictated by the fractional stoichiometry of distinct modifications and protein assemblies as well as the spatial organization of pathway components. Yet, we rarely understand the dynamics and stoichiometry of rate-limiting intermediates along a reaction trajectory. Here, we review how quantitative proteomic tools and enrichment strategies are being used to quantify UB-dependent signaling systems, and to integrate UB signaling with regulatory phosphorylation events, illustrated with the PINK1/PARKIN pathway. A key feature of ubiquitylation is that the identity of UB chain linkage types can control downstream processes. We also describe how proteomic and enzymological tools can be used to identify and quantify UB chain synthesis and linkage preferences. The emergence of sophisticated quantitative proteomic approaches will set a new standard for elucidating biochemical mechanisms of UB-driven signaling systems. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. FPGA-based real-time swept-source OCT systems for B-scan live-streaming or volumetric imaging

    NASA Astrophysics Data System (ADS)

    Bandi, Vinzenz; Goette, Josef; Jacomet, Marcel; von Niederhäusern, Tim; Bachmann, Adrian H.; Duelk, Marcus

    2013-03-01

    We have developed a Swept-Source Optical Coherence Tomography (SS-OCT) system with high-speed, real-time signal processing on a commercially available Data-Acquisition (DAQ) board with a Field-Programmable Gate Array (FPGA). The SS-OCT system simultaneously acquires OCT and k-clock reference signals at 500 MS/s. From the k-clock signal of each A-scan we extract a remap vector for the k-space linearization of the OCT signal. The linear but oversampled interpolation is followed by a 2048-point FFT, additional auxiliary computations, and a data transfer to a host computer for real-time live-streaming of B-scan or volumetric C-scan OCT visualization. We achieve a 100 kHz A-scan rate by parallelization of our hardware algorithms, which run on standard, affordable, commercially available DAQ boards. Our main development tool for signal analysis as well as for hardware synthesis is MATLAB® with add-on toolboxes and 3rd-party tools.
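
    The core of the per-A-scan processing chain described here (resample the fringe onto a uniform k grid, then FFT) can be sketched offline in a few lines. This is an illustrative NumPy version, not the authors' FPGA implementation; the sweep nonlinearity and reflector position below are invented:

```python
import numpy as np

def linearize_and_transform(fringe, k_samples, n_fft=2048):
    """k-space linearization followed by an FFT, as in a swept-source A-scan.

    fringe    : raw interferogram samples (nonuniform in wavenumber)
    k_samples : instantaneous wavenumber of each sample, e.g. derived from a
                k-clock signal (must be monotonically increasing)
    """
    k_uniform = np.linspace(k_samples[0], k_samples[-1], len(fringe))
    fringe_lin = np.interp(k_uniform, k_samples, fringe)   # linear remap
    return np.abs(np.fft.rfft(fringe_lin, n=n_fft))        # depth profile

# toy example: a nonlinear sweep and a single reflector
n = 1024
t = np.linspace(0, 1, n)
k = 2 * np.pi * (100 * t + 40 * t**2)    # invented quadratic sweep nonlinearity
fringe = np.cos(0.5 * k)                 # reflector at "depth" 0.5 (arbitrary units)
depth_profile = linearize_and_transform(fringe, k)
```

    After linearization the single reflector collapses into one narrow peak in the depth profile; without the remap the FFT of the raw fringe smears it across many bins.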

  16. 50 Years of Army Computing From ENIAC to MSRC

    DTIC Science & Technology

    2000-09-01

    processing capability. The scientific visualization program was started in 1984 to provide tools and expertise to help researchers graphically...and materials, forces modeling, nanoelectronics, electromagnetics and acoustics, signal image processing, and simulation and modeling. The ARL...mechanical and electrical calculating equipment, punch card data processing equipment, analog computers, and early digital machines. Before beginning, we

  17. Introduction to the Discrete Fourier Series Considering Both Mathematical and Engineering Aspects--A Linear Algebra Approach

    ERIC Educational Resources Information Center

    Kohaupt, Ludwig

    2015-01-01

    The discrete Fourier series is a valuable tool developed and used by mathematicians and engineers alike. One of the most prominent applications is signal processing. Usually, it is important that the signals be transmitted fast, for example, when transmitting images over large distances such as between the moon and the earth or when generating…

  18. Focus issue: teaching tools and learning opportunities.

    PubMed

    Gough, Nancy R

    2010-04-27

    Science Signaling provides authoring experience for students and resources for educators. Students experience the writing and revision process involved in authoring short commentary articles that are published in the Journal Club section. By publishing peer-reviewed teaching materials, Science Signaling provides instructors with feedback that improves their materials and an outlet to share their tips and techniques and digital resources with other teachers.

  19. Beamforming array techniques for acoustic emission monitoring of large concrete structures

    NASA Astrophysics Data System (ADS)

    McLaskey, Gregory C.; Glaser, Steven D.; Grosse, Christian U.

    2010-06-01

    This paper introduces a novel method of acoustic emission (AE) analysis which is particularly suited for field applications on large plate-like reinforced concrete structures, such as walls and bridge decks. Similar to phased-array signal processing techniques developed for other non-destructive evaluation methods, this technique adapts beamforming tools developed for passive sonar and seismological applications for use in AE source localization and signal discrimination analyses. Instead of relying on the relatively weak P-wave, this method uses the energy-rich Rayleigh wave and requires only a small array of 4-8 sensors. Tests on an in-service reinforced concrete structure demonstrate that the azimuth of an artificial AE source can be determined via this method for sources located up to 3.8 m from the sensor array, even when the P-wave is undetectable. The beamforming array geometry also allows additional signal processing tools to be implemented, such as the VESPA process (VElocity SPectral Analysis), whereby the arrivals of different wave phases are identified by their apparent velocity of propagation. Beamforming AE can reduce sampling rate and time synchronization requirements between spatially distant sensors which in turn facilitates the use of wireless sensor networks for this application.
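
    The underlying delay-and-sum idea can be illustrated with a small far-field sketch: for each candidate azimuth, time-shift the sensor signals by their geometric delays and keep the angle that maximizes the stacked power. This is a generic plane-wave beamformer, not the authors' AE processing chain; the array geometry, wave speed, and sampling rate below are invented:

```python
import numpy as np

def beamform_azimuth(signals, sensor_xy, fs, c, n_angles=360):
    """Delay-and-sum beamformer for a small array (far-field plane wave).

    signals   : (n_sensors, n_samples) waveforms
    sensor_xy : (n_sensors, 2) sensor coordinates in metres
    fs, c     : sampling rate (Hz) and assumed wave speed (m/s)
    Returns the azimuth in degrees maximising the stacked beam power.
    """
    angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    powers = np.empty(n_angles)
    for i, th in enumerate(angles):
        u = np.array([np.cos(th), np.sin(th)])              # hypothesised source direction
        shifts = np.round(sensor_xy @ u / c * fs).astype(int)
        stacked = sum(np.roll(s, d) for s, d in zip(signals, shifts))
        powers[i] = np.sum(stacked**2)                      # coherent gain when aligned
    return np.degrees(angles[np.argmax(powers)])

# synthetic burst arriving from azimuth 0 degrees across a 0.3 m square array
fs, c = 1_000_000, 2000.0          # invented sampling rate and Rayleigh-wave speed
xy = np.array([[0.0, 0.0], [0.3, 0.0], [0.0, 0.3], [0.3, 0.3]])
signals = np.zeros((4, 1024))
for i, x in enumerate(xy):
    arrival = 500 - int(round(x @ np.array([1.0, 0.0]) / c * fs))
    signals[i, arrival:arrival + 5] = 1.0                   # short AE burst
az = beamform_azimuth(signals, xy, fs, c)
```

    Because only the azimuth (not the range) is estimated, a few closely spaced sensors suffice, which is exactly what makes the approach attractive for large structures.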

  20. The neural basis of functional neuroimaging signal with positron and single-photon emission tomography.

    PubMed

    Sestini, S

    2007-07-01

    Functional imaging techniques such as positron and single-photon emission tomography exploit the relationship between neural activity, energy demand and cerebral blood flow to functionally map the brain. Despite the fact that neurobiological processes are not completely understood, several results have revealed the signals that trigger the metabolic and vascular changes accompanying variations in neural activity. Advances in this field have demonstrated that release of the major excitatory neurotransmitter glutamate initiates diverse signaling processes between neurons, astrocytes and blood perfusion, and that this signaling is crucial for the occurrence of brain imaging signals. Better understanding of the neural sites of energy consumption and the temporal correlation between energy demand, energy consumption and associated cerebrovascular hemodynamics gives novel insight into the potential of these imaging tools in the study of metabolic neurodegenerative disorders.

  1. Time-frequency representation of a highly nonstationary signal via the modified Wigner distribution

    NASA Technical Reports Server (NTRS)

    Zoladz, T. F.; Jones, J. H.; Jong, J.

    1992-01-01

    A new signal analysis technique called the modified Wigner distribution (MWD) is presented. The new signal processing tool has been very successful in determining time-frequency representations of highly non-stationary multicomponent signals in both simulations and trials involving actual Space Shuttle Main Engine (SSME) high frequency data. The MWD departs from the classic Wigner distribution (WD) in that it effectively eliminates the cross coupling among positive frequency components in a multiple component signal. This attribute of the MWD, which prevents the generation of 'phantom' spectral peaks, will undoubtedly increase the utility of the WD for real world signal analysis applications, which more often than not involve multicomponent signals.
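
    For readers unfamiliar with the baseline the MWD improves upon, here is a minimal discrete pseudo-Wigner distribution. It uses the analytic signal to suppress interference from negative frequencies, but it is not the MWD itself; the cross-terms between distinct positive-frequency components that the MWD eliminates remain in this sketch:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT method (even-length x assumed)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0], h[n // 2] = 1, 1
    h[1:n // 2] = 2          # double positive frequencies, zero negative ones
    return np.fft.ifft(X * h)

def pseudo_wigner(x, n_tau=128):
    """Discrete pseudo-Wigner distribution W(t, k) of a real signal:
    for each time t, FFT the windowed kernel z(t+tau) z*(t-tau) over tau."""
    z = analytic_signal(x)
    n, half = len(z), n_tau // 2
    W = np.zeros((n, n_tau))
    tau = np.arange(-half, half)
    for t in range(half, n - half):
        kernel = z[t + tau] * np.conj(z[t - tau])
        W[t] = np.abs(np.fft.fft(kernel))
    return W

# a pure tone at normalised frequency 0.125 concentrates at bin 2*f*n_tau = 32
t = np.arange(512)
W = pseudo_wigner(np.cos(2 * np.pi * 0.125 * t))
```

    Running this on a two-tone signal would show the spurious mid-frequency ridge ('phantom' peak) between the components, which is precisely the artifact the MWD removes.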

  2. Prototyping scalable digital signal processing systems for radio astronomy using dataflow models

    NASA Astrophysics Data System (ADS)

    Sane, N.; Ford, J.; Harris, A. I.; Bhattacharyya, S. S.

    2012-05-01

    There is a growing trend toward using high-level tools for design and implementation of radio astronomy digital signal processing (DSP) systems. Such tools, for example those from the Collaboration for Astronomy Signal Processing and Electronics Research (CASPER), are usually platform-specific and lack high-level, platform-independent, portable, scalable application specifications. This limits the designer's ability to experiment with designs at a high level of abstraction and early in the development cycle. We address some of these issues using a model-based design approach employing dataflow models. We demonstrate this approach by applying it to the design of a tunable digital downconverter (TDD) used for narrow-bandwidth spectroscopy. Our design is targeted toward an FPGA platform, called the Interconnect Break-out Board (IBOB), that is available from CASPER. We use the term TDD to refer to a digital downconverter for which the decimation factor and center frequency can be reconfigured without the need for regenerating the hardware code. Such a design is currently not available in the CASPER DSP library. The work presented in this paper focuses on two aspects. First, we introduce and demonstrate a dataflow-based design approach using the dataflow interchange format (DIF) tool for high-level application specification, and we integrate this approach with the CASPER tool flow. Second, we explore the trade-off between the flexibility of TDD designs and the low hardware cost of fixed-configuration digital downconverter (FDD) designs that use the available CASPER DSP library. We further explore this trade-off in the context of a two-stage downconversion scheme employing a combination of TDD or FDD designs.
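
    Functionally, a digital downconverter mixes the input to baseband, low-pass filters it, and decimates; in a TDD the centre frequency and decimation factor are runtime parameters rather than synthesis-time constants. A floating-point NumPy sketch of that behaviour (not the CASPER/DIF dataflow implementation; all rates and tone frequencies below are invented):

```python
import numpy as np

def tunable_ddc(x, fs, f_center, decim, n_taps=101):
    """Tunable digital downconverter: mix to baseband, low-pass, decimate.

    f_center and decim are runtime parameters, mirroring the TDD idea of
    reconfiguring without regenerating hardware code.
    """
    n = np.arange(len(x))
    baseband = x * np.exp(-2j * np.pi * f_center / fs * n)   # complex mix-down
    # windowed-sinc low-pass at the post-decimation Nyquist, fs / (2*decim)
    cutoff = 0.5 / decim                                     # cycles per sample
    t = np.arange(n_taps) - (n_taps - 1) / 2
    taps = np.sinc(2 * cutoff * t) * np.hamming(n_taps)
    taps /= taps.sum()                                       # unit DC gain
    filtered = np.convolve(baseband, taps, mode="same")
    return filtered[::decim]

# a tone 1 kHz above a 100 kHz centre survives downconversion as a 1 kHz tone
fs = 1_000_000
n = np.arange(8192)
x = np.cos(2 * np.pi * 101_000 / fs * n)
y = tunable_ddc(x, fs, f_center=100_000, decim=16)           # output rate 62.5 kHz
spec = np.abs(np.fft.fft(y))                                 # peak near bin 8 of 512
```

    An FDD fixes `f_center` and `decim` at build time, which is cheaper in hardware; the trade-off explored in the paper is exactly this flexibility-versus-cost choice.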

  3. Partial Discharge Spectral Characterization in HF, VHF and UHF Bands Using Particle Swarm Optimization.

    PubMed

    Robles, Guillermo; Fresno, José Manuel; Martínez-Tarifa, Juan Manuel; Ardila-Rey, Jorge Alfredo; Parrado-Hernández, Emilio

    2018-03-01

    The measurement of partial discharge (PD) signals in the radio frequency (RF) range has gained popularity among utilities and specialized monitoring companies in recent years. Unfortunately, on most occasions the data are hidden by noise and coupled interference that hinder their interpretation and render them useless, especially in acquisition systems in the ultra high frequency (UHF) band where the signals of interest are weak. This paper is focused on a method that uses a selective spectral signal characterization to feature each signal, type of partial discharge or interference/noise, with the power contained in the most representative frequency bands. The technique can be considered as a dimensionality reduction problem where all the energy information contained in the frequency components is condensed in a reduced number of UHF or high frequency (HF) and very high frequency (VHF) bands. In general, dimensionality reduction methods make the interpretation of results a difficult task because the inherent physical nature of the signal is lost in the process. The proposed selective spectral characterization is a preprocessing tool that facilitates further main processing. The starting point is a clustering of signals that could form the core of a PD monitoring system. Therefore, the dimensionality reduction technique should discover the best frequency bands to enhance the affinity between signals in the same cluster and the differences between signals in different clusters. This is done by maximizing the minimum Mahalanobis distance between clusters using particle swarm optimization (PSO). The tool is tested with three sets of experimental signals to demonstrate its capabilities in separating noise and PDs with low signal-to-noise ratio and in separating different types of partial discharges measured in the UHF and HF/VHF bands.
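
    The objective being maximized can be written down compactly: the minimum pairwise Mahalanobis distance between cluster means in the candidate band-power space. The sketch below shows only that fitness function (the PSO search over band selections is omitted), and uses a pooled covariance per cluster pair as one plausible reading of the criterion:

```python
import numpy as np

def min_mahalanobis_separation(features, labels):
    """Minimum pairwise Mahalanobis distance between cluster means.

    features : (n_samples, n_bands) band-power features for one candidate
               band selection; labels : cluster assignment per sample.
    A PSO (not shown) would search over band selections maximising this.
    """
    clusters = [features[labels == c] for c in np.unique(labels)]
    best = np.inf
    for i in range(len(clusters)):
        for j in range(i + 1, len(clusters)):
            a, b = clusters[i], clusters[j]
            # pooled covariance of the pair (scatter-weighted average)
            pooled = (np.cov(a.T) * (len(a) - 1) + np.cov(b.T) * (len(b) - 1)) \
                     / (len(a) + len(b) - 2)
            diff = a.mean(axis=0) - b.mean(axis=0)
            best = min(best, np.sqrt(diff @ np.linalg.solve(pooled, diff)))
    return best

# well-separated clusters score higher than overlapping ones (synthetic data)
rng = np.random.default_rng(0)
labels = np.repeat([0, 1], 100)
far = np.vstack([rng.normal(0, 1, (100, 3)), rng.normal(8, 1, (100, 3))])
near = np.vstack([rng.normal(0, 1, (100, 3)), rng.normal(1, 1, (100, 3))])
sep_far = min_mahalanobis_separation(far, labels)
sep_near = min_mahalanobis_separation(near, labels)
```

    Maximizing the worst-case pair, rather than an average, ensures that no two signal classes collapse together in the reduced band space.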

  4. OPTHYLIC: An Optimised Tool for Hybrid Limits Computation

    NASA Astrophysics Data System (ADS)

    Busato, Emmanuel; Calvet, David; Theveneaux-Pelzer, Timothée

    2018-05-01

    A software tool, computing observed and expected upper limits on Poissonian process rates using a hybrid frequentist-Bayesian CLs method, is presented. This tool can be used for simple counting experiments where only signal, background and observed yields are provided or for multi-bin experiments where binned distributions of discriminating variables are provided. It allows the combination of several channels and takes into account statistical and systematic uncertainties, as well as correlations of systematic uncertainties between channels. It has been validated against other software tools and analytical calculations, for several realistic cases.
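
    For the simplest case the abstract mentions, a single-bin counting experiment with known background and no systematic uncertainties, the CLs construction reduces to a ratio of Poisson tail probabilities. The toy below is the textbook frequentist version, not OPTHYLIC's hybrid frequentist-Bayesian treatment with systematics and channel combination:

```python
import math

def poisson_cdf(n, mu):
    """P(N <= n) for N ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n + 1))

def cls(s, b, n_obs):
    """CLs for a single-bin counting experiment (no systematics):
    CLs = P(n <= n_obs | s+b) / P(n <= n_obs | b)."""
    return poisson_cdf(n_obs, s + b) / poisson_cdf(n_obs, b)

def upper_limit(b, n_obs, cl=0.95, tol=1e-6):
    """Upper limit on the signal rate s: smallest s with CLs < 1 - cl,
    found by bisection (CLs is monotonically decreasing in s)."""
    lo, hi = 0.0, 100.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cls(mid, b, n_obs) > 1 - cl:
            lo = mid
        else:
            hi = mid
    return hi
```

    With zero background and zero observed events this reproduces the classic result of about 3 signal events (ln 20 ≈ 3.0) for a 95% CL upper limit.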

  5. Wavelet-Based Peak Detection and a New Charge Inference Procedure for MS/MS Implemented in ProteoWizard’s msConvert

    PubMed Central

    2015-01-01

    We report the implementation of high-quality signal processing algorithms into ProteoWizard, an efficient, open-source software package designed for analyzing proteomics tandem mass spectrometry data. Specifically, a new wavelet-based peak-picker (CantWaiT) and a precursor charge determination algorithm (Turbocharger) have been implemented. These additions into ProteoWizard provide universal tools that are independent of vendor platform for tandem mass spectrometry analyses and have particular utility for intralaboratory studies requiring the advantages of different platforms convergent on a particular workflow or for interlaboratory investigations spanning multiple platforms. We compared results from these tools to those obtained using vendor and commercial software, finding that in all cases our algorithms resulted in a comparable number of identified peptides for simple and complex samples measured on Waters, Agilent, and AB SCIEX quadrupole time-of-flight and Thermo Q-Exactive mass spectrometers. The mass accuracy of matched precursor ions also compared favorably with vendor and commercial tools. Additionally, typical analysis runtimes (∼1–100 ms per MS/MS spectrum) were short enough to enable the practical use of these high-quality signal processing tools for large clinical and research data sets. PMID:25411686

  6. Wavelet-based peak detection and a new charge inference procedure for MS/MS implemented in ProteoWizard's msConvert.

    PubMed

    French, William R; Zimmerman, Lisa J; Schilling, Birgit; Gibson, Bradford W; Miller, Christine A; Townsend, R Reid; Sherrod, Stacy D; Goodwin, Cody R; McLean, John A; Tabb, David L

    2015-02-06

    We report the implementation of high-quality signal processing algorithms into ProteoWizard, an efficient, open-source software package designed for analyzing proteomics tandem mass spectrometry data. Specifically, a new wavelet-based peak-picker (CantWaiT) and a precursor charge determination algorithm (Turbocharger) have been implemented. These additions into ProteoWizard provide universal tools that are independent of vendor platform for tandem mass spectrometry analyses and have particular utility for intralaboratory studies requiring the advantages of different platforms convergent on a particular workflow or for interlaboratory investigations spanning multiple platforms. We compared results from these tools to those obtained using vendor and commercial software, finding that in all cases our algorithms resulted in a comparable number of identified peptides for simple and complex samples measured on Waters, Agilent, and AB SCIEX quadrupole time-of-flight and Thermo Q-Exactive mass spectrometers. The mass accuracy of matched precursor ions also compared favorably with vendor and commercial tools. Additionally, typical analysis runtimes (∼1-100 ms per MS/MS spectrum) were short enough to enable the practical use of these high-quality signal processing tools for large clinical and research data sets.

  7. An adaptive spatio-temporal Gaussian filter for processing cardiac optical mapping data.

    PubMed

    Pollnow, S; Pilia, N; Schwaderlapp, G; Loewe, A; Dössel, O; Lenis, G

    2018-06-04

    Optical mapping is widely used as a tool to investigate cardiac electrophysiology in ex vivo preparations. Digital filtering of fluorescence-optical data is an important requirement for robust subsequent data analysis and still a challenge when processing data acquired from thin mammalian myocardium. Therefore, we propose and investigate the use of an adaptive spatio-temporal Gaussian filter for processing optical mapping signals from these kinds of tissue usually having low signal-to-noise ratio (SNR). We demonstrate how filtering parameters can be chosen automatically without additional user input. For systematic comparison of this filter with standard filtering methods from the literature, we generated synthetic signals representing optical recordings from atrial myocardium of a rat heart with varying SNR. Furthermore, all filter methods were applied to experimental data from an ex vivo setup. Our developed filter outperformed the other filter methods regarding local activation time detection at SNRs smaller than 3 dB which are typical noise ratios expected in these signals. At higher SNRs, the proposed filter performed slightly worse than the methods from literature. In conclusion, the proposed adaptive spatio-temporal Gaussian filter is an appropriate tool for investigating fluorescence-optical data with low SNR. The spatio-temporal filter parameters were automatically adapted in contrast to the other investigated filters.
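
    The non-adaptive core of such a filter is simply a Gaussian kernel with separate temporal and spatial widths applied to the (time, y, x) frame stack. A simplified SciPy sketch with fixed widths supplied by the caller, whereas the published filter chooses its parameters automatically from the data's SNR:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_optical_mapping(frames, sigma_t=2.0, sigma_xy=1.0):
    """Spatio-temporal Gaussian smoothing of a (time, y, x) stack.
    sigma_t acts along the time axis, sigma_xy along both spatial axes."""
    return gaussian_filter(frames, sigma=(sigma_t, sigma_xy, sigma_xy))

# noisy synthetic "action potential" stack: smoothing should cut the noise
rng = np.random.default_rng(1)
t = np.arange(200)
clean = np.tile(np.exp(-((t - 80) / 15.0) ** 2)[:, None, None], (1, 16, 16))
noisy = clean + rng.normal(0, 0.5, clean.shape)
smoothed = smooth_optical_mapping(noisy)
err_before = np.mean((noisy - clean) ** 2)      # raw noise power
err_after = np.mean((smoothed - clean) ** 2)    # residual after filtering
```

    The adaptivity in the paper amounts to picking `sigma_t` and `sigma_xy` from the data itself, so that low-SNR recordings get stronger smoothing without the user tuning anything.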

  8. Studies of acoustic emission from point and extended sources

    NASA Technical Reports Server (NTRS)

    Sachse, W.; Kim, K. Y.; Chen, C. P.

    1986-01-01

    The use of simulated and controlled acoustic emission signals forms the basis of a powerful tool for the detailed study of various deformation and wave interaction processes in materials. The results of experiments and signal analyses of acoustic emission resulting from point sources such as various types of indentation-produced cracks in brittle materials and the growth of fatigue cracks in 7075-T6 aluminum panels are discussed. Recent work dealing with the modeling and subsequent signal processing of an extended source of emission in a material is reviewed. Results of the forward problem and the inverse problem are presented with the example of a source distributed through the interior of a specimen.

  9. Accurate derivation of heart rate variability signal for detection of sleep disordered breathing in children.

    PubMed

    Chatlapalli, S; Nazeran, H; Melarkod, V; Krishnam, R; Estrada, E; Pamula, Y; Cabrera, S

    2004-01-01

    The electrocardiogram (ECG) signal is used extensively as a low cost diagnostic tool to provide information concerning the heart's state of health. Accurate determination of the QRS complex, in particular, reliable detection of the R wave peak, is essential in computer based ECG analysis. ECG data from Physionet's Sleep-Apnea database were used to develop, test, and validate a robust heart rate variability (HRV) signal derivation algorithm. The HRV signal was derived from pre-processed ECG signals by developing an enhanced Hilbert transform (EHT) algorithm with built-in missing beat detection capability for reliable QRS detection. The performance of the EHT algorithm was then compared against that of a popular Hilbert transform-based (HT) QRS detection algorithm. Autoregressive (AR) modeling of the HRV power spectrum for both EHT- and HT-derived HRV signals was achieved and different parameters from their power spectra as well as approximate entropy were derived for comparison. Poincare plots were then used as a visualization tool to highlight the detection of the missing beats in the EHT method. After validation of the EHT algorithm on ECG data from Physionet, the algorithm was further tested and validated on a dataset obtained from children undergoing polysomnography for detection of sleep disordered breathing (SDB). Sensitive measures of accurate HRV signals were then derived to be used in detecting and diagnosing sleep disordered breathing in children. All signal processing algorithms were implemented in MATLAB. We present a description of the EHT algorithm and analyze pilot data for eight children undergoing nocturnal polysomnography. The pilot data demonstrated that the EHT method provides an accurate way of deriving the HRV signal and plays an important role in extraction of reliable measures to distinguish between periods of normal and sleep disordered breathing (SDB) in children.
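
    A toy Hilbert-transform QRS detector conveys the flavour of this chain: differentiate to emphasize QRS slopes, take the envelope of the analytic signal, then pick threshold crossings separated by a refractory period; the intervals between detections form the raw HRV series. This is a generic HT-style detector run on an artificial spike train, not the EHT algorithm with its missing-beat logic:

```python
import numpy as np

def hilbert_envelope(x):
    """|analytic signal| via the FFT method (even-length x assumed)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0], h[n // 2] = 1, 1
    h[1:n // 2] = 2
    return np.abs(np.fft.ifft(X * h))

def detect_r_peaks(ecg, fs, refractory=0.25):
    """Differentiate, take the Hilbert envelope, then threshold with a
    refractory period so each QRS produces exactly one detection."""
    d = np.diff(ecg, prepend=ecg[0])          # emphasise steep QRS slopes
    env = hilbert_envelope(d)
    thresh = 0.5 * env.max()
    peaks, last = [], -np.inf
    for i in np.flatnonzero(env > thresh):
        if i - last > refractory * fs:
            peaks.append(i)
            last = i
    return np.array(peaks)

# synthetic 1 Hz "heartbeat": one narrow spike per second at fs = 250 Hz
fs = 250
ecg = np.zeros(fs * 8)
ecg[::fs] += 1.0
rr = np.diff(detect_r_peaks(ecg, fs)) / fs    # RR intervals, the raw HRV series
```

    On real ECG a band-pass stage would precede the differentiator, and the missing-beat detection that distinguishes the EHT from plain HT detection would patch gaps in the RR series before HRV analysis.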

  10. Load-induced modulation of signal transduction networks.

    PubMed

    Jiang, Peng; Ventura, Alejandra C; Sontag, Eduardo D; Merajver, Sofia D; Ninfa, Alexander J; Del Vecchio, Domitilla

    2011-10-11

    Biological signal transduction networks are commonly viewed as circuits that pass along information--in the process amplifying signals, enhancing sensitivity, or performing other signal-processing tasks--to transcriptional and other components. Here, we report on a "reverse-causality" phenomenon, which we call load-induced modulation. Through a combination of analytical and experimental tools, we discovered that signaling was modulated, in a surprising way, by downstream targets that receive the signal and, in doing so, apply what in physics is called a load. Specifically, we found that non-intuitive changes in response dynamics occurred for a covalent modification cycle when load was present. Loading altered the response time of a system, depending on whether the activity of one of the enzymes was maximal and the other was operating at its minimal rate or whether both enzymes were operating at submaximal rates. These two conditions, which we call "limit regime" and "intermediate regime," were associated with increased or decreased response times, respectively. The bandwidth, the range of frequency in which the system can process information, decreased in the presence of load, suggesting that downstream targets participate in establishing a balance between noise-filtering capabilities and a circuit's ability to process high-frequency stimulation. Nodes in a signaling network are not independent relay devices, but rather are modulated by their downstream targets.

  11. Neural signal registration and analysis of axons grown in microchannels

    NASA Astrophysics Data System (ADS)

    Pigareva, Y.; Malishev, E.; Gladkov, A.; Kolpakov, V.; Bukatin, A.; Mukhina, I.; Kazantsev, V.; Pimashkin, A.

    2016-08-01

    Registration of neuronal bioelectrical signals remains one of the main physical tools for studying fundamental mechanisms of signal processing in the brain. Neurons generate spiking patterns which propagate through a complex map of neural network connectivity. Extracellular recording of isolated axons grown in microchannels provides amplification of the signal for detailed study of spike propagation. In this study we used neuronal hippocampal cultures grown in microfluidic devices combined with microelectrode arrays to investigate changes in electrical activity during neural network development. We found that, five days in vitro after culture plating, spiking activity appeared first in the microchannels and over the next 2-3 days appeared on the electrodes of the overall neural network. We conclude that this approach provides a convenient method to study neural signal processing and functional structure development at the single-cell and network levels of a neuronal culture.

  12. Shallow Water Reverberation Measurement and Prediction

    DTIC Science & Technology

    1994-06-01

    tool. The temporal signal processing consisted of a short-time Fourier transform spectral estimation method applied to data from a single hydrophone...The three-dimensional Hamiltonian Acoustic Ray-tracing Program for the Ocean (HARPO) was used as the primary propagation modeling tool. The temporal...summarizes the work completed and discusses lessons learned. Advice regarding future work to refine the present study will be provided. ...our point source

  13. Surface Enhanced Raman Spectroscopy (SERS) and multivariate analysis as a screening tool for detecting Sudan I dye in culinary spices

    NASA Astrophysics Data System (ADS)

    Di Anibal, Carolina V.; Marsal, Lluís F.; Callao, M. Pilar; Ruisánchez, Itziar

    2012-02-01

    Raman spectroscopy combined with multivariate analysis was evaluated as a tool for detecting Sudan I dye in culinary spices. Three Raman modalities were studied: normal Raman, FT-Raman and SERS. The results show that SERS is the most appropriate modality capable of providing a proper Raman signal when a complex matrix is analyzed. To get rid of the spectral noise and background, Savitzky-Golay smoothing with polynomial baseline correction and wavelet transform were applied. Finally, to check whether unadulterated samples can be differentiated from samples adulterated with Sudan I dye, an exploratory analysis such as principal component analysis (PCA) was applied to raw data and data processed with the two mentioned strategies. The results obtained by PCA show that Raman spectra need to be properly treated if useful information is to be obtained and both spectra treatments are appropriate for processing the Raman signal. The proposed methodology shows that SERS combined with appropriate spectra treatment can be used as a practical screening tool to distinguish samples suspicious to be adulterated with Sudan I dye.
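
    One of the two spectral treatments evaluated here, Savitzky-Golay smoothing followed by polynomial baseline correction, composes naturally with SVD-based PCA for the exploratory step. A hedged sketch in which the window length, polynomial degrees, and the synthetic "marker band" are invented, and the wavelet alternative is not shown:

```python
import numpy as np
from scipy.signal import savgol_filter

def preprocess_raman(spectra, window=11, poly=3, baseline_deg=3):
    """Savitzky-Golay smoothing, then crude polynomial baseline removal,
    applied row-wise to a (n_samples, n_channels) spectral matrix."""
    out = np.empty_like(spectra, dtype=float)
    x = np.arange(spectra.shape[1])
    for i, s in enumerate(spectra):
        smooth = savgol_filter(s, window, poly)
        coeffs = np.polyfit(x, smooth, baseline_deg)   # fit the broad background
        out[i] = smooth - np.polyval(coeffs, x)        # keep the narrow bands
    return out

def pca_scores(data, n_components=2):
    """PCA scores via SVD of the mean-centred data matrix."""
    centred = data - data.mean(axis=0)
    u, s, _ = np.linalg.svd(centred, full_matrices=False)
    return u[:, :n_components] * s[:n_components]

# synthetic pair: background only vs. background plus an adulterant band
rng = np.random.default_rng(0)
x = np.arange(200)
baseline = 0.002 * (x - 100) ** 2               # curved fluorescence background
peak = np.exp(-((x - 120) / 3.0) ** 2)          # invented marker band
spectra = np.vstack([baseline + 0.05 * rng.normal(size=200),
                     baseline + peak + 0.05 * rng.normal(size=200)])
clean = preprocess_raman(spectra)
scores = pca_scores(clean)
```

    After preprocessing, the marker band survives while the curved background is gone, which is what lets PCA separate adulterated from unadulterated samples in the paper.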

  14. Understanding the impact of TV commercials: electrical neuroimaging.

    PubMed

    Vecchiato, Giovanni; Kong, Wanzeng; Maglione, Anton Giulio; Wei, Daming

    2012-01-01

    Today, there is a greater interest in the marketing world in using neuroimaging tools to evaluate the efficacy of TV commercials. This field of research is known as neuromarketing. In this article, we illustrate some applications of electrical neuroimaging, a discipline that uses electroencephalography (EEG) and intensive signal processing techniques for the evaluation of marketing stimuli. We also show how the proper usage of these methodologies can provide information related to memorization and attention while people are watching marketing-relevant stimuli. We note that temporal and frequency patterns of EEG signals are able to provide possible descriptors that convey information about the cognitive process in subjects observing commercial advertisements (ads). Such information could be unobtainable through common tools used in standard marketing research. Evidence of this research shows how EEG methodologies could be employed to better design new products that marketers are going to promote and to analyze the global impact of video commercials already broadcast on TV.

  15. An Open-Source Hardware and Software System for Acquisition and Real-Time Processing of Electrophysiology during High Field MRI

    PubMed Central

    Purdon, Patrick L.; Millan, Hernan; Fuller, Peter L.; Bonmassar, Giorgio

    2008-01-01

    Simultaneous recording of electrophysiology and functional magnetic resonance imaging (fMRI) is a technique of growing importance in neuroscience. Rapidly evolving clinical and scientific requirements have created a need for hardware and software that can be customized for specific applications. Hardware may require customization to enable a variety of recording types (e.g., electroencephalogram, local field potentials, or multi-unit activity) while meeting the stringent and costly requirements of MRI safety and compatibility. Real-time signal processing tools are an enabling technology for studies of learning, attention, sleep, epilepsy, neurofeedback, and neuropharmacology, yet real-time signal processing tools are difficult to develop. We describe an open source system for simultaneous electrophysiology and fMRI featuring low-noise (<0.6 µV p-p input noise), electromagnetic compatibility for MRI (tested up to 7 Tesla), and user-programmable real-time signal processing. The hardware distribution provides the complete specifications required to build an MRI-compatible electrophysiological data acquisition system, including circuit schematics, print circuit board (PCB) layouts, Gerber files for PCB fabrication and robotic assembly, a bill of materials with part numbers, data sheets, and vendor information, and test procedures. The software facilitates rapid implementation of real-time signal processing algorithms. This system has been used in human EEG/fMRI studies at 3 and 7 Tesla examining the auditory system, visual system, sleep physiology, and anesthesia, as well as in intracranial electrophysiological studies of the non-human primate visual system during 3 Tesla fMRI, and in human hyperbaric physiology studies at depths of up to 300 feet below sea level. PMID:18761038

  16. An open-source hardware and software system for acquisition and real-time processing of electrophysiology during high field MRI.

    PubMed

    Purdon, Patrick L; Millan, Hernan; Fuller, Peter L; Bonmassar, Giorgio

    2008-11-15

    Simultaneous recording of electrophysiology and functional magnetic resonance imaging (fMRI) is a technique of growing importance in neuroscience. Rapidly evolving clinical and scientific requirements have created a need for hardware and software that can be customized for specific applications. Hardware may require customization to enable a variety of recording types (e.g., electroencephalogram, local field potentials, or multi-unit activity) while meeting the stringent and costly requirements of MRI safety and compatibility. Real-time signal processing tools are an enabling technology for studies of learning, attention, sleep, epilepsy, neurofeedback, and neuropharmacology, yet real-time signal processing tools are difficult to develop. We describe an open-source system for simultaneous electrophysiology and fMRI featuring low-noise (<0.6microV p-p input noise), electromagnetic compatibility for MRI (tested up to 7T), and user-programmable real-time signal processing. The hardware distribution provides the complete specifications required to build an MRI-compatible electrophysiological data acquisition system, including circuit schematics, print circuit board (PCB) layouts, Gerber files for PCB fabrication and robotic assembly, a bill of materials with part numbers, data sheets, and vendor information, and test procedures. The software facilitates rapid implementation of real-time signal processing algorithms. This system has been used in human EEG/fMRI studies at 3 and 7T examining the auditory system, visual system, sleep physiology, and anesthesia, as well as in intracranial electrophysiological studies of the non-human primate visual system during 3T fMRI, and in human hyperbaric physiology studies at depths of up to 300 feet below sea level.

  17. Method and system for downhole clock synchronization

    DOEpatents

    Hall, David R.; Bartholomew, David B.; Johnson, Monte; Moon, Justin; Koehler, Roger O.

    2006-11-28

    A method and system for use in synchronizing at least two clocks in a downhole network are disclosed. The method comprises determining a total signal latency between a controlling processing element and at least one downhole processing element in a downhole network and sending a synchronizing time over the downhole network to the at least one downhole processing element adjusted for the signal latency. Electronic time stamps may be used to measure latency between processing elements. A system for electrically synchronizing at least two clocks connected to a downhole network comprises a controlling processing element connected to a synchronizing clock in communication over a downhole network with at least one downhole processing element comprising at least one downhole clock. Preferably, the downhole network is integrated into a downhole tool string.
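
    The synchronization idea, measure the round-trip latency with electronic time stamps and then send a set-time adjusted by the one-way latency, can be simulated with two idealized clocks. The sketch assumes a perfectly symmetric link and is a simulation of the general principle, not the patented downhole protocol:

```python
class SimClock:
    """A clock with an unknown offset from true time."""
    def __init__(self, offset):
        self.offset = offset
    def read(self, true_time):
        return true_time + self.offset
    def set(self, true_time, value):
        self.offset = value - true_time

def synchronize(master, slave, link_delay):
    """Ping-pong latency measurement, then an adjusted set-time message."""
    # 1. master measures the round trip with its own time stamps
    t_send = master.read(0.0)
    t_recv = master.read(2 * link_delay)         # echo returns two delays later
    one_way = (t_recv - t_send) / 2              # symmetric-link assumption
    # 2. master sends its current time, pre-advanced by the one-way latency;
    #    the message reaches the slave one link delay later
    message_value = master.read(2 * link_delay) + one_way
    slave.set(3 * link_delay, message_value)
    return one_way

master, slave = SimClock(0.0), SimClock(5.0)     # slave starts 5 s off
latency = synchronize(master, slave, link_delay=0.01)
residual = slave.read(42.0) - master.read(42.0)  # clock error after sync
```

    Without the latency adjustment the slave would always lag by one link delay; pre-advancing the transmitted time stamp cancels it exactly when the link is symmetric.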

  18. Implementation of Bluetooth technology in processing aspheric mirrors

    NASA Astrophysics Data System (ADS)

    Chen, Dong-yun; Li, Xiao-jin

    2010-10-01

    This paper adopts Bluetooth wireless transmission to replace the conducting rings currently used in the active lap process, overcoming the cost and abrasion problems caused by the rings, which is significant for reducing the cost of processing large aspheric mirrors. Based on the actual application requirements, the paper proposes an overall scheme for using Bluetooth for data transmission, covering both the active lap side and the machine tool side. On the machine tool side, the MCU connects separately to the Bluetooth module and the sensor via the UART0 and UART1 serial ports; when the MCU receives signals sent from the sensor, it packs them and sends them through the Bluetooth module. On the active lap side, the CCAL reads out the detected sensor position signals from a dual-port memory via the ports on one side, while the ports on the other side connect to the MCU's high ports P4-P7, so the MCU can unpack and store the position signals received via the Bluetooth module. The paper designs and implements the system's hardware circuit, focusing on the serial and parallel interfaces. Test programs for the Bluetooth wireless transmission were designed on the realized system, and the experimental results, with the active lap processing large aspheric mirrors, showed that Bluetooth technology can meet the requirements of practical applications.

  19. Wavelet Applications for Flight Flutter Testing

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Marty; Freudinger, Lawrence C.

    1999-01-01

    Wavelets present a method for signal processing that may be useful for analyzing responses of dynamical systems. This paper describes several wavelet-based tools that have been developed to improve the efficiency of flight flutter testing. One of the tools uses correlation filtering to identify properties of several modes throughout a flight test for envelope expansion. Another tool uses features in time-frequency representations of responses to characterize nonlinearities in the system dynamics. A third tool uses modulus and phase information from a wavelet transform to estimate modal parameters that can be used to update a linear model and reduce conservatism in robust stability margins.
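
    As a hedged illustration of the modulus/phase idea for modal estimation (using an FFT-based analytic signal in place of the paper's wavelet transform; all function names are ours): for a single damped mode, the log-envelope decays linearly and the phase advances at the damped frequency, so straight-line fits to each recover the modal parameters.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (a discrete Hilbert transform):
    its modulus is the envelope, its angle the instantaneous phase."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    h[1:(N + 1) // 2] = 2.0
    if N % 2 == 0:
        h[N // 2] = 1.0
    return np.fft.ifft(X * h)

def modal_parameters(x, dt, trim):
    """Estimate damped frequency (rad/s) and decay rate of a single mode
    from envelope and phase, discarding `trim` edge samples."""
    a = analytic_signal(x)[trim:-trim]
    t = np.arange(len(a)) * dt
    freq = np.mean(np.diff(np.unwrap(np.angle(a)))) / dt
    decay = -np.polyfit(t, np.log(np.abs(a)), 1)[0]
    return freq, decay

# Synthetic decaying mode: omega_d = 20 rad/s, decay rate 0.5 1/s.
dt = 0.001
t = np.arange(4000) * dt
freq, decay = modal_parameters(np.exp(-0.5 * t) * np.cos(20 * t), dt, trim=200)
```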

  20. Models and signal processing for an implanted ethanol bio-sensor.

    PubMed

    Han, Jae-Joon; Doerschuk, Peter C; Gelfand, Saul B; O'Connor, Sean J

    2008-02-01

    The understanding of drinking patterns leading to alcoholism has been hindered by an inability to unobtrusively measure ethanol consumption over periods of weeks to months in the community environment. An implantable ethanol sensor is under development using microelectromechanical systems technology. For safety and user acceptability issues, the sensor will be implanted subcutaneously and, therefore, measure peripheral-tissue ethanol concentration. Determining ethanol consumption and kinetics in other compartments from the time course of peripheral-tissue ethanol concentration requires sophisticated signal processing based on detailed descriptions of the relevant physiology. A statistical signal processing system based on detailed models of the physiology and using extended Kalman filtering and dynamic programming tools is described which can estimate the time series of ethanol concentration in blood, liver, and peripheral tissue and the time series of ethanol consumption based on peripheral-tissue ethanol concentration measurements.
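
    The published system couples an extended Kalman filter to detailed physiological models. As a minimal, purely illustrative stand-in (not the authors' model), a one-dimensional Kalman filter with a random-walk state shows the predict/update cycle such an estimator runs on each concentration sample:

```python
def kalman_1d(measurements, q=0.01, r=0.5, x0=0.0, p0=1.0):
    """Filter noisy readings under a random-walk state assumption.
    q = process variance, r = measurement variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                  # predict: state drifts as a random walk
        k = p / (p + r)            # Kalman gain
        x = x + k * (z - x)        # update with measurement z
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Noisy concentration-like readings around 1.0.
est = kalman_1d([1.0, 1.2, 0.9, 1.1, 1.0])
```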

  1. The analysis of decimation and interpolation in the linear canonical transform domain.

    PubMed

    Xu, Shuiqing; Chai, Yi; Hu, Youqiang; Huang, Lei; Feng, Li

    2016-01-01

    Decimation and interpolation are the two basic building blocks of multirate digital signal processing systems. As the linear canonical transform (LCT) has been shown to be a powerful tool for optics and signal processing, it is worthwhile to analyze decimation and interpolation in the LCT domain. In this paper, the definition of the equivalent filter in the LCT domain is given first. Then, by applying this definition, direct implementation structures and polyphase networks for the decimator and interpolator in the LCT domain are proposed. Finally, perfect reconstruction expressions for differential filters in the LCT domain are presented as an application. The theorems proposed in this study form the basis for generalizing multirate signal processing to the LCT domain and can advance filter bank theory in that domain.
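
    For the ordinary Fourier case, to which the LCT reduces for particular parameter choices, the polyphase identity the paper generalizes can be sketched as follows (the helper names are ours): filtering with an FIR filter h and keeping every M-th sample equals summing M short polyphase branches, each running at the low rate.

```python
import numpy as np

def decimate_direct(x, h, M):
    """Reference implementation: FIR-filter with h, keep every M-th sample."""
    return np.convolve(x, h)[::M]

def decimate_polyphase(x, h, M):
    """Same output via M polyphase branches:
    subfilter h_k[p] = h[pM + k] convolves the stream x_k[m] = x[mM - k]."""
    n_out = -(-(len(x) + len(h) - 1) // M)           # ceil division
    y = np.zeros(n_out)
    for k in range(M):
        hk = h[k::M]                                  # k-th subfilter of h
        xk = np.concatenate((np.zeros(k), x))[::M]    # delay by k, downsample
        branch = np.convolve(xk, hk)
        n = min(len(branch), n_out)
        y[:n] += branch[:n]
    return y

# Small worked example: both paths give [1, 6, 12, 5].
y1 = decimate_direct(np.array([1., 2., 3., 4., 5.]), np.array([1., 1., 1.]), 2)
y2 = decimate_polyphase(np.array([1., 2., 3., 4., 5.]), np.array([1., 1., 1.]), 2)
```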

  2. Förster resonance energy transfer as a tool to study photoreceptor biology

    PubMed Central

    Hovan, Stephanie C.; Howell, Scott; Park, Paul S.-H.

    2010-01-01

    Vision is initiated in photoreceptor cells of the retina by a set of biochemical events called phototransduction. These events occur via coordinated dynamic processes that include changes in secondary messenger concentrations, conformational changes and post-translational modifications of signaling proteins, and protein-protein interactions between signaling partners. A complete description of the orchestration of these dynamic processes is still unavailable. Described in this work is the first step in the development of tools combining fluorescent protein technology, Förster resonance energy transfer (FRET), and transgenic animals that have the potential to reveal important molecular insights about the dynamic processes occurring in photoreceptor cells. We characterize the fluorescent proteins SCFP3A and SYFP2 for use as a donor-acceptor pair in FRET assays, which will facilitate the visualization of dynamic processes in living cells. We also demonstrate the targeted expression of these fluorescent proteins to the rod photoreceptor cells of Xenopus laevis, and describe a general method for detecting FRET in these cells. The general approaches described here can address numerous types of questions related to phototransduction and photoreceptor biology by providing a platform to visualize dynamic processes in molecular detail within a native context. PMID:21198205

  3. Interrogation of Cellular Innate Immunity by Diamond-Nanoneedle-Assisted Intracellular Molecular Fishing.

    PubMed

    Wang, Zixun; Yang, Yang; Xu, Zhen; Wang, Ying; Zhang, Wenjun; Shi, Peng

    2015-10-14

    Understanding intracellular signaling cascades and networks is one of the core topics in modern biology. Novel tools based on nanotechnology have enabled probing and analyzing intracellular signaling with unprecedented sensitivity and specificity. In this study, we developed a minimally invasive method for in situ probing of specific signaling components of cellular innate immunity in living cells. The technique is based on diamond-nanoneedle arrays functionalized with aptamer-based molecular sensors, which are inserted into the cytoplasm using a centrifugation-controlled process to capture molecular targets. Simultaneously, the diamond nanoneedles facilitate the delivery of double-stranded DNA (dsDNA90) into cells to activate the pathway involving the stimulator of interferon genes (STING). We showed that the nanoneedle-based biosensors can be used to isolate the transcription factor NF-κB from intracellular regions, without damaging the cells, upon STING activation. By using a reversible protocol and repeated probing in living cells, we were able to examine the signaling dynamics of NF-κB, which was quickly translocated from the cytoplasm to the nucleus within ∼40 min of intracellular introduction of dsDNA90 in both A549 cells and neurons. These results demonstrate a versatile tool for targeted in situ dissection of intracellular signaling, with the potential to reveal new insights into various cellular processes.

  4. A high performance biometric signal and image processing method to reveal blood perfusion towards 3D oxygen saturation mapping

    NASA Astrophysics Data System (ADS)

    Imms, Ryan; Hu, Sijung; Azorin-Peris, Vicente; Trico, Michaël.; Summers, Ron

    2014-03-01

    Non-contact imaging photoplethysmography (PPG) is a recent development in the field of physiological data acquisition, currently undergoing a large amount of research to characterize and define the range of its capabilities. Contact-based PPG techniques have been broadly used in clinical scenarios for a number of years to obtain direct information about the degree of oxygen saturation for patients. With the advent of imaging techniques, there is strong potential to enable access to additional information such as multi-dimensional blood perfusion and saturation mapping. The further development of effective opto-physiological monitoring techniques is dependent upon novel modelling techniques coupled with improved sensor design and effective signal processing methodologies. The biometric signal and imaging processing platform (bSIPP) provides a comprehensive set of features for extraction and analysis of recorded iPPG data, enabling direct comparison with other biomedical diagnostic tools such as ECG and EEG. Additionally, utilizing information about the nature of tissue structure has enabled the generation of an engineering model describing the behaviour of light during its travel through the biological tissue. This enables the estimation of the relative oxygen saturation and blood perfusion in different layers of the tissue to be calculated, which has the potential to be a useful diagnostic tool.

  5. Signals and circuits in the purkinje neuron.

    PubMed

    Abrams, Zéev R; Zhang, Xiang

    2011-01-01

    Purkinje neurons (PN) in the cerebellum have over 100,000 inputs organized in an orthogonal geometry, and a single output channel. As the sole output of the cerebellar cortex layer, their complex firing pattern has been associated with motor control and learning. As such they have been extensively modeled and measured using tools ranging from electrophysiology and neuroanatomy, to dynamic systems and artificial intelligence methods. However, there is an alternative approach to analyze and describe the neuronal output of these cells using concepts from electrical engineering, particularly signal processing and digital/analog circuits. By viewing the PN as an unknown circuit to be reverse-engineered, we can use the tools that provide the foundations of today's integrated circuits and communication systems to analyze the Purkinje system at the circuit level. We use Fourier transforms to analyze and isolate the inherent frequency modes in the PN and define three unique frequency ranges associated with the cells' output. Comparing the PN to a signal generator that can be externally modulated adds an entire level of complexity to the functional role of these neurons both in terms of data analysis and information processing, relying on Fourier analysis methods in place of statistical ones. We also re-describe some of the recent literature in the field, using the nomenclature of signal processing. Furthermore, by comparing the experimental data of the past decade with basic electronic circuitry, we can resolve the outstanding controversy in the field, by recognizing that the PN can act as a multivibrator circuit.
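
    In its simplest form, the Fourier-based mode isolation described above amounts to ranking spectral peaks of a firing-rate or membrane-potential trace. A generic sketch (ours, not the authors' analysis code):

```python
import numpy as np

def dominant_frequencies(signal, fs, k=3):
    """Return the k strongest positive-frequency components of a real
    signal as (frequency_Hz, magnitude) pairs, sorted by frequency."""
    spec = np.fft.rfft(signal - np.mean(signal))   # remove DC before ranking
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    order = np.argsort(np.abs(spec))[::-1][:k]     # k largest magnitudes
    return sorted((freqs[i], abs(spec[i])) for i in order)

# Two tones: 5 Hz (amplitude 1) and 30 Hz (amplitude 0.5), 1 s at 200 Hz.
t = np.arange(200) / 200.0
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 30 * t)
top = dominant_frequencies(x, fs=200, k=2)
```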

  6. Research on intelligent monitoring technology of machining process

    NASA Astrophysics Data System (ADS)

    Wang, Taiyong; Meng, Changhong; Zhao, Guoli

    1995-08-01

    Based upon research on the sound and vibration characteristics of tool condition, we explore a multigrade monitoring system that takes single-chip microcomputers as its core hardware. By using specially designed true-signal pickup devices, we can perform intelligent multigrade monitoring and forecasting more effectively and, furthermore, build tool condition models adaptively. This is a key problem in FMS, CIMS, and even IMS.

  7. Visualizing Time: How Linguistic Metaphors Are Incorporated into Displaying Instruments in the Process of Interpreting Time-Varying Signals

    ERIC Educational Resources Information Center

    Garcia-Belmonte, Germà

    2017-01-01

    Spatial visualization is a well-established topic of education research that has helped improve science and engineering students' skills in spatial relations. Connections have been established between visualization as a comprehension tool and instruction in several scientific fields. Learning about dynamic processes mainly relies upon static…

  8. Study on a Real-Time BEAM System for Diagnosis Assistance Based on a System on Chips Design

    PubMed Central

    Sung, Wen-Tsai; Chen, Jui-Ho; Chang, Kung-Wei

    2013-01-01

    As an innovative and interdisciplinary research project, this study analyzed brain signals to establish BrainIC as an auxiliary tool for physician diagnosis. Cognitive behavioral science, embedded technology, system-on-chip (SOC) design, and physiological signal processing are integrated in this work. Moreover, a chip is built for real-time electroencephalography (EEG) processing and a Brain Electrical Activity Mapping (BEAM) system, and a knowledge database is constructed so that a fuzzy inference engine can diagnose psychosis and learning impairments by contrasting various behaviors and signals. The work is completed with a medical support system developed for the mentally disabled or the elderly. PMID:23681095

  9. Ramanujan sums for signal processing of low-frequency noise.

    PubMed

    Planat, Michel; Rosu, Haret; Perrine, Serge

    2002-11-01

    An aperiodic (low-frequency) spectrum may originate from the error term in the mean value of an arithmetical function such as Möbius function or Mangoldt function, which are coding sequences for prime numbers. In the discrete Fourier transform the analyzing wave is periodic and not well suited to represent the low-frequency regime. In place we introduce a different signal processing tool based on the Ramanujan sums c(q)(n), well adapted to the analysis of arithmetical sequences with many resonances p/q. The sums are quasiperiodic versus the time n and aperiodic versus the order q of the resonance. Different results arise from the use of this Ramanujan-Fourier transform in the context of arithmetical and experimental signals.
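
    The sums c_q(n) can be computed directly from their definition as a sum of exp(2πian/q) over residues a coprime to q; a short sketch, which also exhibits the standard special values c_q(1) = μ(q) (Möbius) and c_q(q) = φ(q) (Euler totient) and the quasiperiodicity in n noted above:

```python
from math import gcd, cos, pi

def ramanujan_sum(q, n):
    """c_q(n) = sum of exp(2*pi*i*a*n/q) over 1 <= a <= q with gcd(a, q) = 1.
    The imaginary parts cancel in conjugate pairs, so the cosine sum
    suffices, and the result is always an integer."""
    return round(sum(cos(2 * pi * a * n / q)
                     for a in range(1, q + 1) if gcd(a, q) == 1))

# One period of c_6(n): quasiperiodic versus n with period q = 6.
row = [ramanujan_sum(6, n) for n in range(1, 7)]
```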

  10. Ramanujan sums for signal processing of low-frequency noise

    NASA Astrophysics Data System (ADS)

    Planat, Michel; Rosu, Haret; Perrine, Serge

    2002-11-01

    An aperiodic (low-frequency) spectrum may originate from the error term in the mean value of an arithmetical function such as Möbius function or Mangoldt function, which are coding sequences for prime numbers. In the discrete Fourier transform the analyzing wave is periodic and not well suited to represent the low-frequency regime. In place we introduce a different signal processing tool based on the Ramanujan sums cq(n), well adapted to the analysis of arithmetical sequences with many resonances p/q. The sums are quasiperiodic versus the time n and aperiodic versus the order q of the resonance. Different results arise from the use of this Ramanujan-Fourier transform in the context of arithmetical and experimental signals.

  11. Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)

    DTIC Science & Technology

    2005-02-01

    method, Model Order Reduction (MOR) tools, system-level mixed-signal circuit synthesis and optimization tools, and parasitic extraction tools. Mission Area: Command and Control. Keywords: mixed-signal circuit simulation; parasitic extraction; time-domain simulation; IC design flow; model order reduction.

  12. Experimental evaluation of tool run-out in micro milling

    NASA Astrophysics Data System (ADS)

    Attanasio, Aldo; Ceretti, Elisabetta

    2018-05-01

    This paper deals with micro milling cutting process focusing the attention on tool run-out measurement. In fact, among the effects of the scale reduction from macro to micro (i.e., size effects) tool run-out plays an important role. This research is aimed at developing an easy and reliable method to measure tool run-out in micro milling based on experimental tests and an analytical model. From an Industry 4.0 perspective this measuring strategy can be integrated into an adaptive system for controlling cutting forces, with the objective of improving the production quality, the process stability, reducing at the same time the tool wear and the machining costs. The proposed procedure estimates the tool run-out parameters from the tool diameter, the channel width, and the phase angle between the cutting edges. The cutting edge phase measurement is based on the force signal analysis. The developed procedure has been tested on data coming from micro milling experimental tests performed on a Ti6Al4V sample. The results showed that the developed procedure can be successfully used for tool run-out estimation.
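
    As a simplified illustration of the geometry involved (ours, not the authors' full procedure, which also uses the phase angle between cutting edges estimated from the force signal): for pure axis-offset run-out with no tool tilt, the offset follows from the tool diameter and the measured channel width alone.

```python
def runout_offset(channel_width, tool_diameter):
    """Pure axis-offset run-out e: the edge on the far side of the offset
    sweeps radius D/2 + e, so the milled channel width is W = D + 2e,
    giving e = (W - D) / 2.  Tilt run-out is ignored in this sketch."""
    return (channel_width - tool_diameter) / 2.0

# A 0.52 mm wide channel cut by a nominally 0.50 mm tool
# implies a 0.01 mm run-out offset.
e = runout_offset(0.52, 0.50)
```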

  13. Nonlinear digital signal processing in mental health: characterization of major depression using instantaneous entropy measures of heartbeat dynamics.

    PubMed

    Valenza, Gaetano; Garcia, Ronald G; Citi, Luca; Scilingo, Enzo P; Tomaz, Carlos A; Barbieri, Riccardo

    2015-01-01

    Nonlinear digital signal processing methods that address system complexity have provided useful computational tools for helping in the diagnosis and treatment of a wide range of pathologies. More specifically, nonlinear measures have been successful in characterizing patients with mental disorders such as Major Depression (MD). In this study, we propose the use of instantaneous measures of entropy, namely the inhomogeneous point-process approximate entropy (ipApEn) and the inhomogeneous point-process sample entropy (ipSampEn), to describe a novel characterization of MD patients undergoing affective elicitation. Because these measures are built within a nonlinear point-process model, they allow for the assessment of complexity in cardiovascular dynamics at each moment in time. Heartbeat dynamics were characterized from 48 healthy controls and 48 patients with MD while emotionally elicited through either neutral or arousing audiovisual stimuli. Experimental results coming from the arousing tasks show that ipApEn measures are able to instantaneously track heartbeat complexity as well as discern between healthy subjects and MD patients. Conversely, standard heart rate variability (HRV) analysis performed in both time and frequency domains did not show any statistical significance. We conclude that measures of entropy based on nonlinear point-process models might contribute to devising useful computational tools for care in mental health.
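
    The instantaneous point-process entropies (ipApEn, ipSampEn) are model-based; for orientation, the standard (non-instantaneous) sample entropy they extend can be sketched as follows, with the usual tolerance r expressed as a fraction of the signal's standard deviation:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Standard sample entropy: -log of the ratio of (m+1)-point to m-point
    template matches under a Chebyshev tolerance of r * std(x)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def matches(length):
        emb = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        count = 0
        for i in range(len(emb) - 1):          # pairs i < j: no self-matches
            d = np.max(np.abs(emb[i + 1:] - emb[i]), axis=1)
            count += int(np.sum(d <= tol))
        return count

    a, b = matches(m + 1), matches(m)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

# A strictly periodic series is highly predictable (entropy near 0);
# white noise is not.
se_periodic = sample_entropy(np.tile([0.0, 1.0], 50))
se_noise = sample_entropy(np.random.default_rng(0).standard_normal(200))
```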

  14. Process and system - A dual definition, revisited with consequences in metrology

    NASA Astrophysics Data System (ADS)

    Ruhm, K. H.

    2010-07-01

    Let us assert that life in metrology could be easier, scientifically as well as technologically, if we intentionally made an explicit distinction between two outstanding domains: the given, really existing domain of processes, and the merely virtually existing domain of systems, the latter of which is designed and used by the human mind. The abstract domain of models, by which we map the manifold reality of processes, is itself part of the domain of systems. Models support comprehension and communication, although they are normally extreme simplifications of the properties and behaviour of a concrete reality. Thus, systems and signals represent processes and quantities, which are described by means of signal and system theory as well as by stochastics and statistics. The following presentation of this new, demanding and somewhat irritating definition of the terms process and system as a dual pair is unusual indeed, but it opens the door widely to a better and more consistent discussion and understanding of manifold scientific tools in many areas. Metrology [4] is one of the important fields of concern, for many reasons: one group of the soft and hard links between the domain of processes and the domain of systems is realised by concepts of measurement science on the one hand and by instrumental tools of measurement technology on the other.

  15. Acoustic/seismic signal propagation and sensor performance modeling

    NASA Astrophysics Data System (ADS)

    Wilson, D. Keith; Marlin, David H.; Mackay, Sean

    2007-04-01

    Performance, optimal employment, and interpretation of data from acoustic and seismic sensors depend strongly and in complex ways on the environment in which they operate. Software tools for guiding non-expert users of acoustic and seismic sensors are therefore much needed. However, such tools require that many individual components be constructed and correctly connected together. These components include the source signature and directionality, representation of the atmospheric and terrain environment, calculation of the signal propagation, characterization of the sensor response, and mimicking of the data processing at the sensor. Selection of an appropriate signal propagation model is particularly important, as there are significant trade-offs between output fidelity and computation speed. Attenuation of signal energy, random fading, and (for array systems) variations in wavefront angle-of-arrival should all be considered. Characterization of the complex operational environment is often the weak link in sensor modeling: important issues for acoustic and seismic modeling activities include the temporal/spatial resolution of the atmospheric data, knowledge of the surface and subsurface terrain properties, and representation of ambient background noise and vibrations. Design of software tools that address these challenges is illustrated with two examples: a detailed target-to-sensor calculation application called the Sensor Performance Evaluator for Battlefield Environments (SPEBE) and a GIS-embedded approach called Battlefield Terrain Reasoning and Awareness (BTRA).

  16. Engineered Chloroplast Genome just got Smarter

    PubMed Central

    Jin, Shuangxia; Daniell, Henry

    2015-01-01

    Chloroplasts are known to sustain life on earth by providing food, fuel and oxygen through the process of photosynthesis. However, the chloroplast genome has also been smartly engineered to confer valuable agronomic traits and/or serve as bioreactors for production of industrial enzymes, biopharmaceuticals, bio-products or vaccines. The recent breakthrough in hyper-expression of biopharmaceuticals in edible leaves has facilitated the advancement to clinical studies by major pharmaceutical companies. This review critically evaluates progress in developing new tools to enhance or simplify expression of targeted genes in chloroplasts. These tools hold the promise to further the development of novel fuels and products, enhance the photosynthetic process, and increase our understanding of retrograde signaling and cellular processes. PMID:26440432

  17. PPM Receiver Implemented in Software

    NASA Technical Reports Server (NTRS)

    Gray, Andrew; Kang, Edward; Lay, Norman; Vilnrotter, Victor; Srinivasan, Meera; Lee, Clement

    2010-01-01

    A computer program has been written as a tool for developing optical pulse-position- modulation (PPM) receivers in which photodetector outputs are fed to analog-to-digital converters (ADCs) and all subsequent signal processing is performed digitally. The program can be used, for example, to simulate an all-digital version of the PPM receiver described in Parallel Processing of Broad-Band PPM Signals (NPO-40711), which appears elsewhere in this issue of NASA Tech Briefs. The program can also be translated into a design for digital PPM receiver hardware. The most notable innovation embodied in the software and the underlying PPM-reception concept is a digital processing subsystem that performs synchronization of PPM time slots, even though the digital processing is, itself, asynchronous in the sense that no attempt is made to synchronize it with the incoming optical signal a priori and there is no feedback to analog signal processing subsystems or ADCs. Functions performed by the software receiver include time-slot synchronization, symbol synchronization, coding preprocessing, and diagnostic functions. The program is written in the MATLAB and Simulink software system. The software receiver is highly parameterized and, hence, programmable: for example, slot- and symbol-synchronization filters have programmable bandwidths.
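
    The slot-level decision logic can be illustrated with a toy maximum-count PPM demodulator plus a brute-force symbol-boundary search; this is a simple stand-in for the receiver's actual slot-synchronization filters, and all names are ours.

```python
import numpy as np

def ppm_demodulate(samples, slots_per_symbol):
    """Decode each symbol as the index of its strongest slot."""
    n_sym = len(samples) // slots_per_symbol
    frames = np.reshape(samples[:n_sym * slots_per_symbol],
                        (n_sym, slots_per_symbol))
    return np.argmax(frames, axis=1)

def estimate_slot_offset(samples, slots_per_symbol, max_offset):
    """Crude boundary search: keep the offset whose frames concentrate
    the most energy in their peak slots."""
    best, best_score = 0, -np.inf
    for off in range(max_offset):
        n_sym = (len(samples) - off) // slots_per_symbol
        frames = np.reshape(samples[off:off + n_sym * slots_per_symbol],
                            (n_sym, slots_per_symbol))
        score = frames.max(axis=1).sum()
        if score > best_score:
            best, best_score = off, score
    return best

# Three 4-slot symbols with pulses in slots 1, 0, 3.
rx = np.array([0., 5., 0., 0.,  5., 0., 0., 0.,  0., 0., 0., 5.])
symbols = ppm_demodulate(rx, slots_per_symbol=4)
```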

  18. Simulink/PARS Integration Support

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vacaliuc, B.; Nakhaee, N.

    2013-12-18

    The state of the art for signal processor hardware has far outpaced the development tools for placing applications on that hardware. In addition, signal processors are available in a variety of architectures, each uniquely capable of handling specific types of signal processing efficiently. With these processors becoming smaller and demanding less power, it has become possible to group multiple processors, a heterogeneous set of processors, into single systems. Different portions of the desired problem set can be assigned to different processor types as appropriate. As software development tools do not keep pace with these processors, especially when multiple processors of different types are used, a method is needed to enable software code portability among multiple processors and multiple types of processors along with their respective software environments. Sundance DSP, Inc. has developed a software toolkit called “PARS”, whose objective is to provide a framework that uses suites of tools provided by different vendors, along with modeling tools and a real-time operating system, to build an application that spans different processor types. The software language used to express the behavior of the system is a very high level modeling language, “Simulink”, a MathWorks product. ORNL has used this toolkit to effectively implement several deliverables. This CRADA describes the collaboration between ORNL and Sundance DSP, Inc.

  19. Innovations for the future of pharmacovigilance.

    PubMed

    Almenoff, June S

    2007-01-01

    Post-marketing pharmacovigilance involves the review and management of safety information from many sources. Among these sources, spontaneous adverse event reporting systems are among the most challenging and resource-intensive to manage. Traditionally, efforts to monitor spontaneous adverse event reporting systems have focused on review of individual case reports. The science of pharmacovigilance could be enhanced with the availability of systems-based tools that facilitate analysis of aggregate data for purposes of signal detection, signal evaluation and knowledge management. GlaxoSmithKline (GSK) recently implemented Online Signal Management (OSM) as a data-driven framework for managing the pharmacovigilance of marketed products. This pioneering work builds upon the strong history GSK has of innovation in this area. OSM is a software application co-developed by GSK and Lincoln Technologies that integrates traditional pharmacovigilance methods with modern quantitative statistical methods and data visualisation tools. OSM enables the rapid identification of trends from the individual adverse event reports received by GSK. OSM also provides knowledge-management tools to ensure the successful tracking of emerging safety issues. GSK has developed standard procedures and 'best practices' around the use of OSM to ensure the systematic evaluation of complex safety datasets. In summary, the implementation of OSM provides new tools and efficient processes to advance the science of pharmacovigilance.

  20. Modeling selective attention using a neuromorphic analog VLSI device.

    PubMed

    Indiveri, G

    2000-12-01

    Attentional mechanisms are required to overcome the problem of flooding a limited processing capacity system with information. They are present in biological sensory systems and can be a useful engineering tool for artificial visual systems. In this article we present a hardware model of a selective attention mechanism implemented on a very large-scale integration (VLSI) chip, using analog neuromorphic circuits. The chip exploits a spike-based representation to receive, process, and transmit signals. It can be used as a transceiver module for building multichip neuromorphic vision systems. We describe the circuits that carry out the main processing stages of the selective attention mechanism and provide experimental data for each circuit. We demonstrate the expected behavior of the model at the system level by stimulating the chip with both artificially generated control signals and signals obtained from a saliency map, computed from an image containing several salient features.

  1. Acoustic emission from single point machining: Part 2, Signal changes with tool wear

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heiple, C.R.; Carpenter, S.H.; Armentrout, D.L.

    1989-01-01

    Changes in acoustic emission signal characteristics with tool wear were monitored during single point machining of 4340 steel and Ti-6Al-4V heat treated to several strength levels, 6061-T6 aluminum, 304 stainless steel, 17-4PH stainless steel, 410 stainless steel, lead, and Teflon. No signal characteristic changed in the same way with tool wear for all materials tested; a single change in a particular AE signal characteristic with tool wear valid for all materials probably does not exist. Nevertheless, the changes in various signal characteristics with wear for a given material may be sufficient to monitor tool wear.

  2. Generalization of the Poincare sphere to process 2D displacement signals

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.; Lamberti, Luciano

    2017-06-01

    Traditionally, the multiple phase method has been considered an essential tool for phase information recovery. The in-quadrature phase method, theoretically an alternative pathway to the same goal, has failed in actual applications. In a previous paper dealing with 1D signals, the authors showed that, properly implemented, the in-quadrature method yields phase values with the same accuracy as the multiple phase method. The present paper extends the methodology developed in 1D to 2D. This extension is not straightforward and requires the introduction of a number of additional concepts and developments. The concept of the monogenic function provides the tools required for the extension. The monogenic function has a graphic representation through the Poincaré sphere, familiar from the field of photoelasticity and, through the developments introduced in this paper, connected to the analysis of displacement fringe patterns. The paper is illustrated with examples of application showing that the multiple phase method and the in-quadrature method are two aspects of the same basic theoretical model.

  3. Laser production of articles from powders

    DOEpatents

    Lewis, Gary K.; Milewski, John O.; Cremers, David A.; Nemec, Ronald B.; Barbe, Michael R.

    1998-01-01

    Method and apparatus for forming articles from materials in particulate form in which the materials are melted by a laser beam and deposited at points along a tool path to form an article of the desired shape and dimensions. Preferably the tool path and other parameters of the deposition process are established using computer-aided design and manufacturing techniques. A controller comprised of a digital computer directs movement of a deposition zone along the tool path and provides control signals to adjust apparatus functions, such as the speed at which a deposition head which delivers the laser beam and powder to the deposition zone moves along the tool path.

  4. Laser production of articles from powders

    DOEpatents

    Lewis, G.K.; Milewski, J.O.; Cremers, D.A.; Nemec, R.B.; Barbe, M.R.

    1998-11-17

    Method and apparatus for forming articles from materials in particulate form in which the materials are melted by a laser beam and deposited at points along a tool path to form an article of the desired shape and dimensions. Preferably the tool path and other parameters of the deposition process are established using computer-aided design and manufacturing techniques. A controller comprised of a digital computer directs movement of a deposition zone along the tool path and provides control signals to adjust apparatus functions, such as the speed at which a deposition head which delivers the laser beam and powder to the deposition zone moves along the tool path. 20 figs.

  5. Dataflow Design Tool: User's Manual

    NASA Technical Reports Server (NTRS)

    Jones, Robert L., III

    1996-01-01

    The Dataflow Design Tool is a software tool for selecting a multiprocessor scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. The software tool implements graph-search algorithms and analysis techniques based on the dataflow paradigm. Dataflow analyses provided by the software are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool provides performance optimization through the inclusion of artificial precedence constraints among the schedulable tasks. The user interface and tool capabilities are described. Examples are provided to demonstrate the analysis, scheduling, and optimization functions facilitated by the tool.
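
    One concrete performance bound such a dataflow analysis yields is the graph's critical-path latency: the longest chain of dependent task times, which lower-bounds the completion time of any schedule regardless of processor count. A generic sketch (the names are illustrative, not the tool's API):

```python
def critical_path(tasks, deps):
    """tasks: {name: duration}; deps: {name: [predecessor names]}.
    Returns the longest path duration through the (acyclic) dataflow graph."""
    memo = {}

    def finish(n):
        # Earliest finish of task n = its duration plus the latest
        # finish among its predecessors (memoized recursion).
        if n not in memo:
            memo[n] = tasks[n] + max((finish(p) for p in deps.get(n, [])),
                                     default=0.0)
        return memo[n]

    return max(finish(n) for n in tasks)

# c depends on a and b: the bound is 3 + 4 = 7 time units.
bound = critical_path({"a": 2.0, "b": 3.0, "c": 4.0}, {"c": ["a", "b"]})
```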

  6. Redox-capacitor to connect electrochemistry to redox-biology.

    PubMed

    Kim, Eunkyoung; Leverage, W Taylor; Liu, Yi; White, Ian M; Bentley, William E; Payne, Gregory F

    2014-01-07

    It is well-established that redox-reactions are integral to biology for energy harvesting (oxidative phosphorylation), immune defense (oxidative burst) and drug metabolism (phase I reactions), yet there is emerging evidence that redox may play broader roles in biology (e.g., redox signaling). A critical challenge is the need for tools that can probe biologically-relevant redox interactions simply, rapidly and without the need for a comprehensive suite of analytical methods. We propose that electrochemistry may provide such a tool. In this tutorial review, we describe recent studies with a redox-capacitor film that can serve as a bio-electrode interface that can accept, store and donate electrons from mediators commonly used in electrochemistry and also in biology. Specifically, we (i) describe the fabrication of this redox-capacitor from catechols and the polysaccharide chitosan, (ii) discuss the mechanistic basis for electron exchange, (iii) illustrate the properties of this redox-capacitor and its capabilities for promoting redox-communication between biology and electrodes, and (iv) suggest the potential for enlisting signal processing strategies to "extract" redox information. We believe these initial studies indicate broad possibilities for enlisting electrochemistry and signal processing to acquire "systems level" redox information from biology.

  7. Assessing denoising strategies to increase signal to noise ratio in spinal cord and in brain cortical and subcortical regions

    NASA Astrophysics Data System (ADS)

    Maugeri, L.; Moraschi, M.; Summers, P.; Favilla, S.; Mascali, D.; Cedola, A.; Porro, C. A.; Giove, F.; Fratini, M.

    2018-02-01

    Functional Magnetic Resonance Imaging (fMRI) based on Blood Oxygenation Level Dependent (BOLD) contrast has become one of the most powerful tools in neuroscience research. On the other hand, fMRI approaches have seen limited use in the study of the spinal cord and subcortical brain regions (such as the brainstem and portions of the diencephalon). Indeed, obtaining a good BOLD signal in these areas still represents a technical and scientific challenge, due to poor control of physiological noise and to the limited overall quality of the functional series. A solution can be found in the combination of optimized experimental procedures at the acquisition stage and well-adapted artifact mitigation procedures in the data processing. In this framework, we studied two different data processing strategies to reduce physiological noise in cortical and subcortical brain regions and in the spinal cord, based on the aCompCor and RETROICOR denoising tools, respectively. The study, performed in healthy subjects, was carried out using an ad hoc isometric motor task. We observed an increased signal-to-noise ratio in the denoised functional time series in the spinal cord and in the subcortical brain region.
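
    The component-regression idea behind aCompCor can be caricatured in a few lines of numpy. The noise-ROI construction, regressor extraction, and all data below are synthetic illustrative assumptions, not the study's actual pipeline: the first principal component of noise-ROI voxel time series is regressed out of a signal of interest.

```python
import numpy as np

# A toy aCompCor-style cleanup: take the first principal component of
# voxel time series from a noise ROI and regress it out of a signal of
# interest. All data below are synthetic and purely illustrative.
rng = np.random.default_rng(2)
T = 120
drift = np.sin(np.linspace(0, 6 * np.pi, T))          # shared physiological noise
noise_roi = drift[:, None] * rng.uniform(0.5, 1.5, 20) \
            + rng.normal(0.0, 0.1, (T, 20))           # 20 noise-ROI voxels
task = np.repeat([0.0, 1.0], T // 2)                  # block-design activation
signal = 0.5 * task + drift + rng.normal(0.0, 0.1, T)

# First principal component of the (centered) noise ROI -> nuisance regressor.
U, S, Vt = np.linalg.svd(noise_roi - noise_roi.mean(axis=0), full_matrices=False)
X = np.column_stack([U[:, 0], np.ones(T)])            # regressor + intercept
beta, *_ = np.linalg.lstsq(X, signal, rcond=None)
cleaned = signal - X @ beta                           # residual keeps the task effect
```

    After regression, the cleaned series correlates far less with the shared drift while retaining the task-locked component.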

  8. Tool Wear Prediction in Ti-6Al-4V Machining through Multiple Sensor Monitoring and PCA Features Pattern Recognition.

    PubMed

    Caggiano, Alessandra

    2018-03-09

    Machining of titanium alloys is characterised by extremely rapid tool wear due to the high cutting temperature and the strong adhesion at the tool-chip and tool-workpiece interfaces, caused by the low thermal conductivity and high chemical reactivity of Ti alloys. With the aim of monitoring the tool conditions during dry turning of Ti-6Al-4V alloy, a machine learning procedure based on the acquisition and processing of cutting force, acoustic emission and vibration sensor signals during turning is implemented. A number of sensorial features are extracted from the acquired sensor signals in order to feed machine learning paradigms based on artificial neural networks. To reduce the large dimensionality of the sensorial features, an advanced feature extraction methodology based on Principal Component Analysis (PCA) is proposed. PCA made it possible to identify a smaller number of features (k = 2), the principal component scores, obtained through linear projection of the original d features into a new space of reduced dimensionality k = 2, sufficient to describe the variance of the data. By feeding artificial neural networks with the PCA features, an accurate diagnosis of tool flank wear (VBmax) was achieved, with predicted values very close to the measured tool wear values.
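
    The projection step described here, from d sensorial features down to k = 2 principal component scores, can be sketched in plain numpy. The random stand-in data are illustrative, not the paper's sensor features.

```python
import numpy as np

# PCA scores via SVD: project d features onto the first k = 2 principal
# components. Random stand-in data; not the paper's sensorial features.
def pca_scores(X, k=2):
    Xc = X - X.mean(axis=0)                      # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                         # (n_samples, k) PC scores

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))                     # 50 samples, d = 8 features
scores = pca_scores(X, k=2)
```

    The scores would then feed the neural network in place of the original d features.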

  9. Tool Wear Prediction in Ti-6Al-4V Machining through Multiple Sensor Monitoring and PCA Features Pattern Recognition

    PubMed Central

    2018-01-01

    Machining of titanium alloys is characterised by extremely rapid tool wear due to the high cutting temperature and the strong adhesion at the tool-chip and tool-workpiece interfaces, caused by the low thermal conductivity and high chemical reactivity of Ti alloys. With the aim of monitoring the tool conditions during dry turning of Ti-6Al-4V alloy, a machine learning procedure based on the acquisition and processing of cutting force, acoustic emission and vibration sensor signals during turning is implemented. A number of sensorial features are extracted from the acquired sensor signals in order to feed machine learning paradigms based on artificial neural networks. To reduce the large dimensionality of the sensorial features, an advanced feature extraction methodology based on Principal Component Analysis (PCA) is proposed. PCA made it possible to identify a smaller number of features (k = 2), the principal component scores, obtained through linear projection of the original d features into a new space of reduced dimensionality k = 2, sufficient to describe the variance of the data. By feeding artificial neural networks with the PCA features, an accurate diagnosis of tool flank wear (VBmax) was achieved, with predicted values very close to the measured tool wear values. PMID:29522443

  10. A New Method for Suppressing Periodic Narrowband Interference Based on the Chaotic van der Pol Oscillator

    NASA Astrophysics Data System (ADS)

    Lu, Jia; Zhang, Xiaoxing; Xiong, Hao

    The chaotic van der Pol oscillator is a powerful tool for detecting defects in electric systems through online partial discharge (PD) monitoring. This paper focuses on detecting weak PD signals buried in strong periodic narrowband interference, exploiting the chaotic system's high sensitivity to periodic narrowband interference signals and its immunity to white noise and PD signals. A new approach to removing the periodic narrowband interference with a van der Pol chaotic oscillator is described by analyzing the motion characteristics of the chaotic oscillator on the basis of the van der Pol equation. Furthermore, the Floquet index for measuring the amplitude of periodic narrowband signals is redefined. The signal denoised by the chaotic van der Pol oscillators is further processed by wavelet analysis. Finally, the denoising results verify that the periodic narrowband and white noise interference can be removed efficiently by combining the theory of the chaotic van der Pol oscillator with wavelet analysis.
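
    For readers unfamiliar with the oscillator itself, a minimal numerical sketch of the forced van der Pol equation follows. The explicit-Euler scheme and all parameter values are illustrative assumptions, not the detection configuration used in the paper.

```python
import numpy as np

# Minimal Euler integration of the forced van der Pol equation
#   x'' - mu*(1 - x^2)*x' + x = F*cos(w*t),
# the system underlying the chaotic detector described above. Parameters
# and the integration scheme are illustrative, not the paper's setup.
def van_der_pol(mu=2.0, F=1.0, w=1.0, dt=1e-3, steps=20000):
    x, v = 0.1, 0.0
    xs = np.empty(steps)
    for i in range(steps):
        a = mu * (1.0 - x * x) * v - x + F * np.cos(w * i * dt)
        v += a * dt                  # explicit Euler step
        x += v * dt
        xs[i] = x
    return xs

trace = van_der_pol()                # 20 time units of simulated oscillation
```

    The nonlinear damping term keeps the trajectory on a bounded limit cycle, which is the property that makes the oscillator's response a sensitive indicator of periodic forcing.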

  11. Editorial: Mathematical Methods and Modeling in Machine Fault Diagnosis

    DOE PAGES

    Yan, Ruqiang; Chen, Xuefeng; Li, Weihua; ...

    2014-12-18

    Modern mathematics has commonly been utilized as an effective tool to model mechanical equipment so that their dynamic characteristics can be studied analytically. This helps identify potential failures of mechanical equipment by observing changes in the equipment’s dynamic parameters. On the other hand, dynamic signals are also important and provide reliable information about the equipment’s working status. Modern mathematics has also provided us with a systematic way to design and implement various signal processing methods, which are used to analyze these dynamic signals and to enhance intrinsic signal components that are directly related to machine failures. This special issue is aimed at stimulating not only new insights on mathematical methods for modeling but also recently developed signal processing methods, such as sparse decomposition, with potential applications in machine fault diagnosis. Finally, the papers included in this special issue provide a glimpse into some of the research and applications in the field of machine fault diagnosis through applications of modern mathematical methods.

  13. Uniform, optimal signal processing of mapped deep-sequencing data.

    PubMed

    Kumar, Vibhor; Muratani, Masafumi; Rayan, Nirmala Arul; Kraus, Petra; Lufkin, Thomas; Ng, Huck Hui; Prabhakar, Shyam

    2013-07-01

    Despite their apparent diversity, many problems in the analysis of high-throughput sequencing data are merely special cases of two general problems, signal detection and signal estimation. Here we adapt formally optimal solutions from signal processing theory to analyze signals of DNA sequence reads mapped to a genome. We describe DFilter, a detection algorithm that identifies regulatory features in ChIP-seq, DNase-seq and FAIRE-seq data more accurately than assay-specific algorithms. We also describe EFilter, an estimation algorithm that accurately predicts mRNA levels from as few as 1-2 histone profiles (R ∼0.9). Notably, the presence of regulatory motifs in promoters correlates more with histone modifications than with mRNA levels, suggesting that histone profiles are more predictive of cis-regulatory mechanisms. We show by applying DFilter and EFilter to embryonic forebrain ChIP-seq data that regulatory protein identification and functional annotation are feasible despite tissue heterogeneity. The mathematical formalism underlying our tools facilitates integrative analysis of data from virtually any sequencing-based functional profile.
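
    The detection problem DFilter addresses can be caricatured as linear filtering of a read-count track followed by thresholding. This is a toy sketch: the uniform kernel and the mean-plus-z-sigma threshold rule are assumptions for illustration, not DFilter's formally optimal filter.

```python
import numpy as np

# Caricature of filter-then-threshold peak calling: smooth a read-count
# track with a uniform kernel and flag bins well above the track mean.
# The kernel and threshold rule are assumptions, not DFilter's filter.
def detect_peaks(counts, width=5, z=2.0):
    kernel = np.ones(width) / width
    smooth = np.convolve(counts, kernel, mode="same")
    threshold = smooth.mean() + z * smooth.std()
    return smooth > threshold

counts = np.zeros(200)
counts[90:110] = 10.0          # one synthetic enriched region
calls = detect_peaks(counts)   # boolean mask of "peak" bins
```

    The filter's shape is exactly what signal detection theory optimizes; here it is a crude boxcar, whereas an optimal detector matches the filter to the known feature profile and noise statistics.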

  14. A novel DTI-QA tool: Automated metric extraction exploiting the sphericity of an agar filled phantom.

    PubMed

    Chavez, Sofia; Viviano, Joseph; Zamyadi, Mojdeh; Kingsley, Peter B; Kochunov, Peter; Strother, Stephen; Voineskos, Aristotle

    2018-02-01

    To develop a quality assurance (QA) tool (acquisition guidelines and automated processing) for diffusion tensor imaging (DTI) data using a common agar-based phantom used for fMRI QA. The goal is to produce a comprehensive set of automated, sensitive and robust QA metrics. A readily available agar phantom was scanned with and without parallel imaging reconstruction. Other scanning parameters were matched to the human scans. A central slab made up of either a thick slice or an average of a few slices, was extracted and all processing was performed on that image. The proposed QA relies on the creation of two ROIs for processing: (i) a preset central circular region of interest (ccROI) and (ii) a signal mask for all images in the dataset. The ccROI enables computation of average signal for SNR calculations as well as average FA values. The production of the signal masks enables automated measurements of eddy current and B0 inhomogeneity induced distortions by exploiting the sphericity of the phantom. Also, the signal masks allow automated background localization to assess levels of Nyquist ghosting. The proposed DTI-QA was shown to produce eleven metrics which are robust yet sensitive to image quality changes within site and differences across sites. It can be performed in a reasonable amount of scan time (~15min) and the code for automated processing has been made publicly available. A novel DTI-QA tool has been proposed. It has been applied successfully on data from several scanners/platforms. The novelty lies in the exploitation of the sphericity of the phantom for distortion measurements. Other novel contributions are: the computation of an SNR value per gradient direction for the diffusion weighted images (DWIs) and an SNR value per non-DWI, an automated background detection for the Nyquist ghosting measurement and an error metric reflecting the contribution of EPI instability to the eddy current induced shape changes observed for DWIs. 
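
    The per-image SNR metric described (mean signal in the ccROI over background noise) might be sketched as follows. The ROI geometry, the corner-based background mask, and the synthetic phantom image are illustrative assumptions, not the published QA code.

```python
import numpy as np

# Toy per-image SNR: mean signal inside a central circular ROI divided by
# the standard deviation of corner "background" pixels. ROI geometry and
# the synthetic spherical-phantom image are illustrative assumptions.
def snr(image, roi_radius=10):
    h, w = image.shape
    yy, xx = np.mgrid[:h, :w]
    r2 = (yy - h / 2.0) ** 2 + (xx - w / 2.0) ** 2
    cc_roi = r2 <= roi_radius ** 2                 # central circular ROI
    background = r2 >= (0.45 * min(h, w)) ** 2     # image corners only
    return image[cc_roi].mean() / image[background].std()

rng = np.random.default_rng(0)
img = rng.normal(0.0, 1.0, (64, 64))               # unit-variance noise floor
yy, xx = np.mgrid[:64, :64]
img[(yy - 32) ** 2 + (xx - 32) ** 2 <= 100] += 100.0   # bright "phantom" disc
```

    In the actual QA, one such SNR value is computed per diffusion-weighted image and per non-DWI.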

  15. Advances in two photon scanning and scanless microscopy technologies for functional neural circuit imaging.

    PubMed

    Schultz, Simon R; Copeland, Caroline S; Foust, Amanda J; Quicke, Peter; Schuck, Renaud

    2017-01-01

    Recent years have seen substantial developments in technology for imaging neural circuits, raising the prospect of large scale imaging studies of neural populations involved in information processing, with the potential to lead to step changes in our understanding of brain function and dysfunction. In this article we will review some key recent advances: improved fluorophores for single cell resolution functional neuroimaging using a two photon microscope; improved approaches to the problem of scanning active circuits; and the prospect of scanless microscopes which overcome some of the bandwidth limitations of current imaging techniques. These advances in technology for experimental neuroscience have in themselves led to technical challenges, such as the need for the development of novel signal processing and data analysis tools in order to make the most of the new experimental tools. We review recent work in some active topics, such as region of interest segmentation algorithms capable of demixing overlapping signals, and new highly accurate algorithms for calcium transient detection. These advances motivate the development of new data analysis tools capable of dealing with spatial or spatiotemporal patterns of neural activity, that scale well with pattern size.

  16. Advances in two photon scanning and scanless microscopy technologies for functional neural circuit imaging

    PubMed Central

    Schultz, Simon R.; Copeland, Caroline S.; Foust, Amanda J.; Quicke, Peter; Schuck, Renaud

    2017-01-01

    Recent years have seen substantial developments in technology for imaging neural circuits, raising the prospect of large scale imaging studies of neural populations involved in information processing, with the potential to lead to step changes in our understanding of brain function and dysfunction. In this article we will review some key recent advances: improved fluorophores for single cell resolution functional neuroimaging using a two photon microscope; improved approaches to the problem of scanning active circuits; and the prospect of scanless microscopes which overcome some of the bandwidth limitations of current imaging techniques. These advances in technology for experimental neuroscience have in themselves led to technical challenges, such as the need for the development of novel signal processing and data analysis tools in order to make the most of the new experimental tools. We review recent work in some active topics, such as region of interest segmentation algorithms capable of demixing overlapping signals, and new highly accurate algorithms for calcium transient detection. These advances motivate the development of new data analysis tools capable of dealing with spatial or spatiotemporal patterns of neural activity, that scale well with pattern size. PMID:28757657

  17. Enhancement of MS Signal Processing For Improved Cancer Biomarker Discovery

    NASA Astrophysics Data System (ADS)

    Si, Qian

    Technological advances in proteomics have shown great potential for detecting cancer at the earliest stages. One approach is to use time-of-flight mass spectrometry to identify biomarkers, early disease indicators related to the cancer. Pattern analysis of time-of-flight mass spectra from blood and tissue samples offers great hope for the identification of potential biomarkers among complex mixtures of biological and chemical samples for early cancer detection. One of the key issues is the pre-processing of raw mass spectra. Many challenges must be addressed: the unknown noise characteristics associated with the large volume of data, high variability in mass spectrometry measurements, a poorly understood signal background, and so on. This dissertation focuses on developing statistical algorithms and creating data mining tools for computationally improved signal processing of mass spectrometry data. I introduce an accurate estimate of the noise model and a semi-supervised method of mass spectrum data processing that requires little prior knowledge about the data.

  18. SERS as a tool for in vitro toxicology.

    PubMed

    Fisher, Kate M; McLeish, Jennifer A; Jamieson, Lauren E; Jiang, Jing; Hopgood, James R; McLaughlin, Stephen; Donaldson, Ken; Campbell, Colin J

    2016-06-23

    Measuring markers of stress such as pH and redox potential are important when studying toxicology in in vitro models because they are markers of oxidative stress, apoptosis and viability. While surface enhanced Raman spectroscopy is ideally suited to the measurement of redox potential and pH in live cells, the time-intensive nature and perceived difficulty in signal analysis and interpretation can be a barrier to its broad uptake by the biological community. In this paper we detail the development of signal processing and analysis algorithms that allow SERS spectra to be automatically processed so that the output of the processing is a pH or redox potential value. By automating signal processing we were able to carry out a comparative evaluation of the toxicology of silver and zinc oxide nanoparticles and correlate our findings with qPCR analysis. The combination of these two analytical techniques sheds light on the differences in toxicology between these two materials from the perspective of oxidative stress.

  19. Automatic welding detection by an intelligent tool pipe inspection

    NASA Astrophysics Data System (ADS)

    Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.

    2015-07-01

    This work provides a model based on machine learning techniques for weld recognition, using signals obtained through an in-line inspection tool called a “smart pig” in oil and gas pipelines. The model uses a signal noise reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets and the performance was measured with cross validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
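
    The ROC analysis used for validation can be illustrated with a minimal numpy AUC computation in its rank-statistic form. The scores and labels below are synthetic, not the survey data or tooling used in the study.

```python
import numpy as np

# Minimal ROC AUC in rank-statistic form: the probability that a random
# positive (weld) scores above a random negative (non-weld). Scores and
# labels are synthetic, not the survey data used in the study.
def roc_auc(scores, labels):
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum()      # pairwise wins
    ties = (pos[:, None] == neg[None, :]).sum()     # ties count half
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

scores = np.array([0.9, 0.8, 0.3, 0.2])             # classifier outputs
labels = np.array([1, 1, 0, 0])                     # 1 = weld, 0 = non-weld
auc = roc_auc(scores, labels)                       # perfect separation here
```

    An AUC of 1.0 means every weld outscored every non-weld; 0.5 would mean chance-level ranking.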

  20. Using stamping punch force variation for the identification of changes in lubrication and wear mechanism

    NASA Astrophysics Data System (ADS)

    Voss, B. M.; Pereira, M. P.; Rolfe, B. F.; Doolan, M. C.

    2017-09-01

    The growth in use of Advanced High Strength Steels in the automotive industry for light-weighting and safety has increased the rates of tool wear in sheet metal stamping. This is an issue that adds significant costs to production in terms of manual inspection and part refinishing. To reduce these costs, a tool condition monitoring system is required and a firm understanding of process signal variation must form the foundation for any such monitoring system. Punch force is a stamping process signal that is widely collected by industrial presses and has been linked closely to part quality and tool condition, making it an ideal candidate as a tool condition monitoring signal. In this preliminary investigation, the variation of punch force due to different lubrication conditions and progressive wear is examined. Linking specific punch force signature changes to developing lubrication and wear events is valuable for die wear and stamping condition monitoring. A series of semi-industrial channel forming trials were conducted under different lubrication regimes and progressive die wear. Punch force signatures were captured for each part and Principal Component Analysis (PCA) was applied to determine the key Principal Components of the signature data sets. These Principal Components were linked to the evolution of friction conditions over the course of the stroke for the different lubrication regimes and to the mechanism of galling wear. As a result, variations in punch force signatures were correlated with the wear mechanism dominant on the formed part, either abrasion or adhesion, and with changes in lubrication mechanism. The outcomes of this study provide important insights into punch force signature variation that will provide a foundation for future work into the development of die wear and lubrication monitoring systems for sheet metal stamping.

  1. Two Dimensional Processing Of Speech And Ecg Signals Using The Wigner-Ville Distribution

    NASA Astrophysics Data System (ADS)

    Boashash, Boualem; Abeysekera, Saman S.

    1986-12-01

    The Wigner-Ville Distribution (WVD) has been shown to be a valuable tool for the analysis of non-stationary signals such as speech and electrocardiogram (ECG) data. The one-dimensional real data are first transformed into a complex analytic signal using the Hilbert Transform, and then a 2-dimensional image is formed using the Wigner-Ville Transform. For speech signals, a contour plot is determined and used as a basic feature for a pattern recognition algorithm. This method is compared with the classical Short Time Fourier Transform (STFT) and is shown to recognize isolated words better in a noisy environment. The same method, together with the concept of the instantaneous frequency of the signal, is applied to the analysis of ECG signals. This technique allows one to classify diseased heart-beat signals. Examples are shown.
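
    The pipeline described, Hilbert transform to an analytic signal followed by the Wigner-Ville transform, can be sketched in numpy. A toy linear chirp stands in for speech or ECG data, and the zero-padded full-lag discretization used here is one common convention, not necessarily the authors' implementation.

```python
import numpy as np

def analytic(x):
    """FFT-based Hilbert transform: zero out the negative frequencies."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    h[1:N // 2] = 2.0          # double the positive frequencies
    if N % 2 == 0:
        h[N // 2] = 1.0
    return np.fft.ifft(X * h)

def wvd(z):
    """Discrete Wigner-Ville distribution (full-lag form, zero-padded edges)."""
    N = len(z)
    W = np.empty((N, N))
    for n in range(N):
        lo = min(n, N - 1 - n)                   # largest symmetric lag at time n
        tau = np.arange(-lo, lo + 1)
        k = np.zeros(N, dtype=complex)
        k[tau % N] = z[n + tau] * np.conj(z[n - tau])
        W[n] = np.fft.fft(k).real                # FFT over the lag variable
    return W

t = np.arange(256)
x = np.cos(2 * np.pi * (0.05 + 0.0005 * t) * t)  # toy linear chirp
W = wvd(analytic(x))
```

    For a chirp, the ridge of W tracks the rising instantaneous frequency, which is exactly the property the contour-plot features exploit.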

  2. Acoustic emission from single point machining: Part 2, Signal changes with tool wear. Revised

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heiple, C.R.; Carpenter, S.H.; Armentrout, D.L.

    1989-12-31

    Changes in acoustic emission signal characteristics with tool wear were monitored during single point machining of 4340 steel and Ti-6Al-4V heat treated to several strength levels, 6061-T6 aluminum, 304 stainless steel, 17-4PH stainless steel, 410 stainless steel, lead, and teflon. No signal characteristic changed in the same way with tool wear for all materials tested. A single change in a particular AE signal characteristic with tool wear valid for all materials probably does not exist. Nevertheless, changes in various signal characteristics with wear for a given material may be sufficient to monitor tool wear.

  3. Computational models of human vision with applications

    NASA Technical Reports Server (NTRS)

    Wandell, B. A.

    1985-01-01

    Perceptual problems in aeronautics were studied. The mechanism by which color constancy is achieved in human vision was examined. A computable algorithm was developed to model the arrangement of retinal cones in spatial vision. The spatial frequency spectra are similar to those of actual cone mosaics. The Hartley transform was evaluated as a tool for image processing, and it is suggested that it could be used in signal and image processing applications.

  4. Surface roughness model based on force sensors for the prediction of the tool wear.

    PubMed

    de Agustina, Beatriz; Rubio, Eva María; Sebastián, Miguel Ángel

    2014-04-04

    In this study, a methodology has been developed with the objective of evaluating the surface roughness obtained during turning processes by measuring the signals detected by a force sensor under the same cutting conditions. In this way, the surface quality achieved along the process is correlated to several parameters of the cutting forces (thrust forces, feed forces and cutting forces), so the effect that the tool wear causes on the surface roughness is evaluated. In a first step, the best cutting conditions (cutting parameters and radius of tool) for a certain surface quality requirement were found for pieces of UNS A97075. Next, with this selection a model of surface roughness based on the cutting forces was developed for different states of wear that simulate the behaviour of the tool throughout its life. The validation of this model revealed that it was effective for approximately 70% of the surface roughness values obtained.
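
    A roughness-versus-forces model of the kind described can be sketched as an ordinary least-squares fit of roughness against the three force components. All coefficients, force ranges, and noise levels below are synthetic stand-ins, not the study's measurements.

```python
import numpy as np

# Sketch of a linear roughness-vs-forces model fitted by least squares.
# Coefficients, force ranges and noise level are synthetic stand-ins.
rng = np.random.default_rng(1)
F = rng.uniform(50.0, 300.0, size=(40, 3))      # thrust, feed, cutting forces (N)
ra_true = 0.002 * F[:, 0] + 0.004 * F[:, 1] + 0.001 * F[:, 2]
ra = ra_true + rng.normal(0.0, 0.01, 40)        # "measured" roughness (um)
A = np.column_stack([F, np.ones(40)])           # design matrix with intercept
coef, *_ = np.linalg.lstsq(A, ra, rcond=None)   # fitted coefficients
pred = A @ coef                                 # model predictions
```

    Refitting such a model at different wear states is one simple way to track how the force-roughness relationship drifts as the tool degrades.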

  5. Ridge extraction from the time-frequency representation (TFR) of signals based on an image processing approach: application to the analysis of uterine electromyogram AR TFR.

    PubMed

    Terrien, Jérémy; Marque, Catherine; Germain, Guy

    2008-05-01

    Time-frequency representations (TFRs) of signals are increasingly being used in biomedical research. Analysis of such representations is sometimes difficult, however, and is often reduced to the extraction of ridges, or local energy maxima. In this paper, we describe a new ridge extraction method based on the image processing technique of active contours or snakes. We have tested our method on several synthetic signals and for the analysis of uterine electromyogram or electrohysterogram (EHG) recorded during gestation in monkeys. We have also evaluated a postprocessing algorithm that is especially suited for EHG analysis. Parameters are evaluated on real EHG signals in different gestational periods. The presented method gives good results when applied to synthetic as well as EHG signals. We have been able to obtain smaller ridge extraction errors when compared to two other methods specially developed for EHG. The gradient vector flow (GVF) snake method, or GVF-snake method, appears to be a good ridge extraction tool, which could be used on TFR of mono or multicomponent signals with good results.

  6. Analysis of the reflection of a micro drop fiber sensor

    NASA Astrophysics Data System (ADS)

    Sun, Weimin; Liu, Qiang; Zhao, Lei; Li, Yingjuan; Yuan, Libo

    2005-01-01

    Micro drop fiber sensors are effective tools for measuring the characteristics of liquids. These types of sensors are widely used in the biotechnology, beverage and food markets. For a fiber micro drop sensor, the output light signal is normally wavy with two peaks. Careful analysis of this wavy process can identify the liquid components. Understanding how this wavy signal forms is important for designing a suitable sensing head and choosing a suitable signal-processing method. The dripping process of a liquid depends on the characteristics of the liquid and the shape of the sensing head. The quasi-Gauss model of the light field from the input-fiber end is used to analyze the distribution of the light field in the liquid drop. In addition, considering the characteristics of the liquid to be measured, the dripping process of the optical signal from the output-fiber end can be predicted. The reflection surface of the micro drop varies as a series of spheres with different radii and centers. The intensity of the reflected light changes with the shape of the surface, and the varying process of the intensity is related to the surface tension, refractive index, transmission, and other properties. To support the above analysis, an experimental system was established. In the system, an LED is chosen as the light source and a PIN photodiode transforms the light signal into an electrical signal, which is collected by a data acquisition card. An on-line testing system was built to check the theory discussed above.

  7. Some uses of wavelets for imaging dynamic processes in live cochlear structures

    NASA Astrophysics Data System (ADS)

    Boutet de Monvel, J.

    2007-09-01

    A variety of image and signal processing algorithms based on wavelet filtering tools have been developed during the last few decades, that are well adapted to the experimental variability typically encountered in live biological microscopy. A number of processing tools are reviewed, that use wavelets for adaptive image restoration and for motion or brightness variation analysis by optical flow computation. The usefulness of these tools for biological imaging is illustrated in the context of the restoration of images of the inner ear and the analysis of cochlear motion patterns in two and three dimensions. I also report on recent work that aims at capturing fluorescence intensity changes associated with vesicle dynamics at synaptic zones of sensory hair cells. This latest application requires one to separate the intensity variations associated with the physiological process under study from the variations caused by motion of the observed structures. A wavelet optical flow algorithm for doing this is presented, and its effectiveness is demonstrated on artificial and experimental image sequences.

  8. CMOS-micromachined, two-dimensional transistor arrays for neural recording and stimulation.

    PubMed

    Lin, J S; Chang, S R; Chang, C H; Lu, S C; Chen, H

    2007-01-01

    In-plane microelectrode arrays have proven to be useful tools for studying the connectivity and functions of neural tissues. However, microelectrode arrays are seldom monolithically integrated with signal-processing circuits, without which the maximum number of electrodes is limited by compromises with routing complexity and interference. This paper proposes a CMOS-compatible, two-dimensional array of oxide-semiconductor field-effect transistors (OSFETs), capable of both recording and stimulating neuronal activities. The fabrication of the OSFETs not only requires a simple die-level, post-CMOS micromachining process, but also retains metal layers for monolithic integration with signal-processing circuits. A CMOS microsystem containing the OSFET arrays and gain-programmable recording circuits has been fabricated and tested. The preliminary testing results are presented and discussed.

  9. Call recognition and individual identification of fish vocalizations based on automatic speech recognition: An example with the Lusitanian toadfish.

    PubMed

    Vieira, Manuel; Fonseca, Paulo J; Amorim, M Clara P; Teixeira, Carlos J C

    2015-12-01

    The study of acoustic communication in animals often requires not only the recognition of species-specific acoustic signals but also the identification of individual subjects, all in a complex acoustic background. Moreover, when very long recordings are to be analyzed, automatic recognition and identification processes are invaluable tools to extract the relevant biological information. A pattern recognition methodology based on hidden Markov models is presented, inspired by successful results obtained with the most widely known and complex acoustic communication signal: human speech. This methodology was applied here for the first time to the detection and recognition of fish acoustic signals, specifically in a stream of round-the-clock recordings of Lusitanian toadfish (Halobatrachus didactylus) in their natural estuarine habitat. The results show that this methodology is able not only to detect the mating sounds (boatwhistles) but also to identify individual male toadfish, reaching an identification rate of ca. 95%. Moreover, this method also proved to be a powerful tool to assess signal durations in large data sets. However, the system failed in recognizing other sound types.

  10. A molecular signaling approach to linking intraspecific variation and macro-evolutionary patterns.

    PubMed

    Swanson, Eli M; Snell-Rood, Emilie C

    2014-11-01

    Macro-evolutionary comparisons are a valued tool in evolutionary biology. Nevertheless, our understanding of how systems involved in molecular signaling change in concert with phenotypic diversification has lagged. We argue that integrating our understanding of the evolution of molecular signaling systems with phylogenetic comparative methods is an important step toward understanding the processes linking variation among individuals with variation among species. Focusing mostly on the endocrine system, we discuss how the complexity and mechanistic nature of molecular signaling systems may influence the application and interpretation of macro-evolutionary comparisons. We also detail five hypotheses concerning the role that physiological mechanisms can play in shaping macro-evolutionary patterns, and discuss ways in which these hypotheses could influence phenotypic diversification. Finally, we review a series of tools able to analyze the complexity of physiological systems and the way they change in concert with the phenotypes whose development they coordinate. © The Author 2014. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.

  11. Trifunctional lipid probes for comprehensive studies of single lipid species in living cells

    PubMed Central

    Nadler, André; Haberkant, Per; Kirkpatrick, Joanna; Schifferer, Martina; Stein, Frank; Hauke, Sebastian; Porter, Forbes D.; Schultz, Carsten

    2017-01-01

    Lipid-mediated signaling events regulate many cellular processes. Investigations of the complex underlying mechanisms are difficult because several different methods need to be used under varying conditions. Here we introduce multifunctional lipid derivatives to study lipid metabolism, lipid−protein interactions, and intracellular lipid localization with a single tool per target lipid. The probes are equipped with two photoreactive groups to allow photoliberation (uncaging) and photo–cross-linking in a sequential manner, as well as a click-handle for subsequent functionalization. We demonstrate the versatility of the design for the signaling lipids sphingosine and diacylglycerol; uncaging of the probe for these two species triggered calcium signaling and intracellular protein translocation events, respectively. We performed proteomic screens to map the lipid-interacting proteome for both lipids. Finally, we visualized a sphingosine transport deficiency in patient-derived Niemann−Pick disease type C fibroblasts by fluorescence as well as correlative light and electron microscopy, pointing toward the diagnostic potential of such tools. We envision that this type of probe will become important for analyzing and ultimately understanding lipid signaling events in a comprehensive manner. PMID:28154130

  12. Link Analysis in the Mission Planning Lab

    NASA Technical Reports Server (NTRS)

    McCarthy, Jessica A.; Cervantes, Benjamin W.; Daugherty, Sarah C.; Arroyo, Felipe; Mago, Divyang

    2011-01-01

    The legacy communications link analysis software currently used at Wallops Flight Facility involves separate processes for command destruct, radar, and telemetry. There is a clear advantage to developing an easy-to-use tool that combines all of these processes in one application. Link Analysis in the Mission Planning Lab (MPL) uses custom software and algorithms integrated with Analytical Graphics Inc. Satellite Toolkit (AGI STK). The MPL link analysis tool uses pre/post-mission data to conduct a dynamic link analysis between ground assets and the launch vehicle. Just as the legacy methods do, the MPL link analysis tool calculates signal strength and signal-to-noise ratio according to the accepted processes for command destruct, radar, and telemetry assets. Graphs and other custom data are generated rapidly in formats suitable for reports and presentations. STK is used for analysis as well as to depict plume angles and antenna gain patterns in 3D. The MPL has developed two interfaces with the STK software (see figure). The first interface is an HTML utility, developed in Visual Basic, that enhances analysis for plume modeling and offers a more user-friendly, flexible tool. A graphical user interface (GUI) written in MATLAB (see figure, upper right-hand corner) is also used to quickly depict link budget information for multiple ground assets. This new method yields a dramatic decrease in the time it takes to provide launch managers with the link budgets required to make critical pre-mission decisions. The software code used for these two custom utilities is a product of NASA's MPL.
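    The link budgets described above rest on standard calculations. As an illustrative sketch only (not the MPL code, whose internals are not given here), the Friis free-space path-loss term and a simple margin check can be written as:

```python
import math

def free_space_path_loss_db(distance_km, freq_mhz):
    """Free-space path loss (Friis) in dB, for distance in km and frequency in MHz."""
    return 32.45 + 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz)

def link_margin_db(eirp_dbm, rx_gain_dbi, distance_km, freq_mhz,
                   misc_losses_db, rx_sensitivity_dbm):
    """Received power minus receiver sensitivity; a positive margin means the link closes."""
    p_rx = (eirp_dbm + rx_gain_dbi
            - free_space_path_loss_db(distance_km, freq_mhz)
            - misc_losses_db)
    return p_rx - rx_sensitivity_dbm
```

    For example, at 2400 MHz over 1 km the free-space loss is about 100 dB; every tenfold increase in range adds exactly 20 dB. Real launch-vehicle budgets add antenna pattern, plume attenuation, and polarization terms on top of this.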

  13. Angular approach combined to mechanical model for tool breakage detection by eddy current sensors

    NASA Astrophysics Data System (ADS)

    Ritou, M.; Garnier, S.; Furet, B.; Hascoet, J. Y.

    2014-02-01

    The paper presents a complete new approach to Tool Condition Monitoring (TCM) in milling. The aim is the early detection of small damage so that catastrophic tool failures are prevented. A versatile in-process monitoring system is introduced for reliability reasons. The tool condition is determined from estimates of the radial eccentricity of the teeth, and an adequate criterion is proposed that combines a mechanical model of milling with an angular approach. A new solution is then proposed for estimating the cutting force using eddy current sensors implemented close to the spindle nose. Signals are analysed in the angular domain, notably by the synchronous averaging technique, and phase shifts induced by changes of machining direction are compensated. Results are compared with cutting forces measured with a dynamometer table. The proposed method is implemented in an industrial case of a pocket machining operation. One of the cutting edges was slightly damaged during the machining, as shown by a direct measurement of the tool. A control chart is established with the estimates of cutter eccentricity obtained during machining from the eddy current sensor signals. The efficiency and reliability of the method are demonstrated by a successful detection of the damage.
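    Synchronous averaging in the angular domain can be sketched as follows. This is a generic illustration assuming a fixed, known number of samples per spindle revolution (real systems resample against an angular reference first); it is not the authors' implementation:

```python
import numpy as np

def synchronous_average(signal, samples_per_rev):
    """Average an angularly sampled signal over complete revolutions.

    Components locked to shaft rotation (e.g. tooth-passing forces) are
    reinforced, while asynchronous noise averages toward zero, improving
    the signal-to-noise ratio roughly as the square root of the number
    of revolutions.
    """
    n_revs = len(signal) // samples_per_rev
    revs = signal[:n_revs * samples_per_rev].reshape(n_revs, samples_per_rev)
    return revs.mean(axis=0)
```

    Tooth-to-tooth differences in the averaged profile are then what the eccentricity estimate is built from.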

  14. In-situ sensing using mass spectrometry and its use for run-to-run control on a W-CVD cluster tool

    NASA Astrophysics Data System (ADS)

    Gougousi, T.; Sreenivasan, R.; Xu, Y.; Henn-Lecordier, L.; Rubloff, G. W.; Kidder, J. N.; Zafiriou, E.

    2001-01-01

    A 300 amu closed-ion-source RGA (Leybold-Inficon Transpector 2) sampling gases directly from the reactor of an ULVAC ERA-1000 cluster tool has been used for real time process monitoring of a W CVD process. The process involves H2 reduction of WF6 at a total pressure of 67 Pa (0.5 torr) to produce W films on Si wafers heated at temperatures around 350 °C. The normalized RGA signals for the H2 reagent depletion and the HF product generation were correlated with the W film weight as measured post-process with an electronic microbalance for the establishment of thin-film weight (thickness) metrology. The metrology uncertainty (about 7% for the HF product) was limited primarily by the very low conversion efficiency of the W CVD process (around 2-3%). The HF metrology was then used to drive a robust run-to-run control algorithm, with the deposition time selected as the manipulated (or controlled) variable. For that purpose, during a 10 wafer run, a systematic process drift was introduced as a -5 °C processing temperature change for each successive wafer, in an otherwise unchanged process recipe. Without adjustment of the deposition time the W film weight (thickness) would have declined by about 50% by the 10th wafer. With the aid of the process control algorithm, an adjusted deposition time was computed so as to maintain constant HF sensing signal, resulting in weight (thickness) control comparable to the accuracy of the thickness metrology. These results suggest that in-situ chemical sensing, and particularly mass spectrometry, provide the basis for wafer state metrology as needed to achieve run-to-run control. Furthermore, since the control accuracy was consistent with the metrology accuracy, we anticipate significant improvements for processes as used in manufacturing, where conversion rates are much higher (40-50%) and corresponding signals for metrology will be much larger.
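    A run-to-run controller of the kind described, which adjusts deposition time to hold the sensed HF signal at its target despite a per-run process drift, can be sketched with a simple EWMA update. This is an illustrative model only (the paper's exact algorithm and parameters are not reproduced here); `rate` stands in for the drifting, unobserved deposition rate:

```python
def run_to_run(target, runs, ewma_lambda=0.5):
    """EWMA run-to-run controller: scale deposition time so the sensed
    signal (assumed ~ rate * time) stays at its target while 'rate'
    drifts from run to run.
    """
    time = 1.0            # normalized deposition time (the manipulated variable)
    rate_est = target     # estimated deposition rate (signal per unit time)
    signals = []
    for rate in runs:                      # true (unknown) rate for each run
        signal = rate * time               # metrology reading for this run
        rate_est = ewma_lambda * (signal / time) + (1 - ewma_lambda) * rate_est
        time = target / rate_est           # recipe update for the next run
        signals.append(signal)
    return signals
```

    With a 5% per-run rate drop, the uncontrolled signal would fall to roughly 63% of target over ten runs, while the controlled signal settles near 90% of target; the small residual offset reflects the one-run estimation lag inherent to run-to-run control.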

  15. A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions

    PubMed Central

    Kashihara, Koji

    2014-01-01

    Unlike assistive technology for verbal communication, the brain-machine or brain-computer interface (BMI/BCI) has not been established as a non-verbal communication tool for amyotrophic lateral sclerosis (ALS) patients. Face-to-face communication enables access to rich emotional information, but individuals suffering from neurological disorders, such as ALS and autism, may not express their emotions or communicate their negative feelings. Although emotions may be inferred by looking at facial expressions, emotional prediction for neutral faces necessitates advanced judgment. The process that underlies brain neuronal responses to neutral faces and causes emotional changes remains unknown. To address this problem, therefore, this study attempted to decode conditioned emotional reactions to neutral face stimuli. This direction was motivated by the assumption that if electroencephalogram (EEG) signals can be used to detect patients' emotional responses to specific inexpressive faces, the results could be incorporated into the design and development of BMI/BCI-based non-verbal communication tools. To these ends, this study investigated how a neutral face associated with a negative emotion modulates rapid central responses in face processing and then identified cortical activities. The conditioned neutral face-triggered event-related potentials that originated from the posterior temporal lobe statistically significantly changed during late face processing (600–700 ms) after stimulus, rather than in early face processing activities, such as P1 and N170 responses. Source localization revealed that the conditioned neutral faces increased activity in the right fusiform gyrus (FG). This study also developed an efficient method for detecting implicit negative emotional responses to specific faces by using EEG signals. A classification method based on a support vector machine enables the easy classification of neutral faces that trigger specific individual emotions. 
In accordance with this classification, a face on a computer morphs into a sad or displeased countenance. The proposed method could be incorporated as a part of non-verbal communication tools to enable emotional expression. PMID:25206321

  16. A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions.

    PubMed

    Kashihara, Koji

    2014-01-01

    Unlike assistive technology for verbal communication, the brain-machine or brain-computer interface (BMI/BCI) has not been established as a non-verbal communication tool for amyotrophic lateral sclerosis (ALS) patients. Face-to-face communication enables access to rich emotional information, but individuals suffering from neurological disorders, such as ALS and autism, may not express their emotions or communicate their negative feelings. Although emotions may be inferred by looking at facial expressions, emotional prediction for neutral faces necessitates advanced judgment. The process that underlies brain neuronal responses to neutral faces and causes emotional changes remains unknown. To address this problem, therefore, this study attempted to decode conditioned emotional reactions to neutral face stimuli. This direction was motivated by the assumption that if electroencephalogram (EEG) signals can be used to detect patients' emotional responses to specific inexpressive faces, the results could be incorporated into the design and development of BMI/BCI-based non-verbal communication tools. To these ends, this study investigated how a neutral face associated with a negative emotion modulates rapid central responses in face processing and then identified cortical activities. The conditioned neutral face-triggered event-related potentials that originated from the posterior temporal lobe statistically significantly changed during late face processing (600-700 ms) after stimulus, rather than in early face processing activities, such as P1 and N170 responses. Source localization revealed that the conditioned neutral faces increased activity in the right fusiform gyrus (FG). This study also developed an efficient method for detecting implicit negative emotional responses to specific faces by using EEG signals. A classification method based on a support vector machine enables the easy classification of neutral faces that trigger specific individual emotions. 
In accordance with this classification, a face on a computer morphs into a sad or displeased countenance. The proposed method could be incorporated as a part of non-verbal communication tools to enable emotional expression.

  17. Reproducible and sustained regulation of Gαs signalling using a metazoan opsin as an optogenetic tool.

    PubMed

    Bailes, Helena J; Zhuang, Ling-Yu; Lucas, Robert J

    2012-01-01

    Originally developed to regulate neuronal excitability, optogenetics is increasingly also used to control other cellular processes with unprecedented spatiotemporal resolution. Optogenetic modulation of all major G-protein signalling pathways (Gq, Gi and Gs) has been achieved using variants of mammalian rod opsin. We show here that the light response driven by such rod opsin-based tools dissipates under repeated exposure, consistent with the known bleaching characteristics of this photopigment. We go on to show that replacing rod opsin with JellyOp, a bleach-resistant opsin from the box jellyfish Carybdea rastonii, overcomes this limitation. Visible light induced high-amplitude, reversible, and reproducible increases in cAMP in mammalian cells expressing JellyOp. While single flashes produced a brief cAMP spike, repeated stimulation could sustain elevated levels for tens of minutes. JellyOp was more photosensitive than currently available optogenetic tools, responding to white light at irradiances ≥1 µW/cm(2). We conclude that JellyOp is a promising new tool for mimicking the activity of Gs-coupled G-protein-coupled receptors with fine spatiotemporal resolution.

  18. Statistical process control: separating signal from noise in emergency department operations.

    PubMed

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
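    For the individuals/moving-range (X-mR) chart the review recommends, the control limits are computed from the mean of the moving ranges between consecutive points. A minimal sketch:

```python
def xmr_limits(data):
    """Individuals (X) chart limits from an individuals/moving-range chart.

    The constant 2.66 is 3 / d2 with d2 = 1.128, the standard bias-correction
    factor for moving ranges of two consecutive observations.
    """
    mr = [abs(b - a) for a, b in zip(data, data[1:])]   # moving ranges
    mr_bar = sum(mr) / len(mr)
    mean = sum(data) / len(data)
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar
```

    For daily median ED lengths of stay of 10, 12, 11, 13, and 12 (hypothetical numbers, e.g. in tens of minutes), the limits come out to 7.61 and 15.59 around a mean of 11.6; points outside these limits signal special cause variation requiring action.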

  19. Implementation of a Portable Personal EKG Signal Monitoring System

    NASA Astrophysics Data System (ADS)

    Tan, Tan-Hsu; Chang, Ching-Su; Chen, Yung-Fu; Lee, Cheng

    This research develops a portable personal EKG signal monitoring system that helps patients monitor their EKG signals in real time so that emergencies can be detected early. The system is built from two main units: a signal processing unit and a monitoring and evaluation unit. The first unit consists of an EKG signal sensor, a signal amplifier, a digitization circuit, and related control circuits. The second unit is a software tool developed on an embedded Linux platform (called CSA). Experimental results indicate that the proposed system has practical potential for users in health monitoring, and it is demonstrated to be more convenient and more portable than conventional PC-based EKG signal monitoring systems. Furthermore, all the application units embedded in the system are built from open-source code, so no license fees are required for the operating system or applications. The building cost is therefore much lower than that of traditional systems.

  20. Brain Tissue Responses to Neural Implants Impact Signal Sensitivity and Intervention Strategies

    PubMed Central

    2015-01-01

    Implantable biosensors are valuable scientific tools for basic neuroscience research and clinical applications. Neurotechnologies provide direct readouts of neurological signal and neurochemical processes. These tools are generally most valuable when performance capacities extend over months and years to facilitate the study of memory, plasticity, and behavior or to monitor patients’ conditions. These needs have generated a variety of device designs from microelectrodes for fast scan cyclic voltammetry (FSCV) and electrophysiology to microdialysis probes for sampling and detecting various neurochemicals. Regardless of the technology used, the breaching of the blood–brain barrier (BBB) to insert devices triggers a cascade of biochemical pathways resulting in complex molecular and cellular responses to implanted devices. Molecular and cellular changes in the microenvironment surrounding an implant include the introduction of mechanical strain, activation of glial cells, loss of perfusion, secondary metabolic injury, and neuronal degeneration. Changes to the tissue microenvironment surrounding the device can dramatically impact electrochemical and electrophysiological signal sensitivity and stability over time. This review summarizes the magnitude, variability, and time course of the dynamic molecular and cellular level neural tissue responses induced by state-of-the-art implantable devices. Studies show that insertion injuries and foreign body response can impact signal quality across all implanted central nervous system (CNS) sensors to varying degrees over both acute (seconds to minutes) and chronic periods (weeks to months). Understanding the underlying biological processes behind the brain tissue response to the devices at the cellular and molecular level leads to a variety of intervention strategies for improving signal sensitivity and longevity. PMID:25546652

  1. A new algorithm for epilepsy seizure onset detection and spread estimation from EEG signals

    NASA Astrophysics Data System (ADS)

    Quintero-Rincón, Antonio; Pereyra, Marcelo; D'Giano, Carlos; Batatia, Hadj; Risk, Marcelo

    2016-04-01

    Appropriate diagnosis and treatment of epilepsy is a major public health issue. Patients suffering from this disease often exhibit different physical manifestations, which result from the synchronous and excessive discharge of a group of neurons in the cerebral cortex. Extracting this information from EEG signals is an important problem in biomedical signal processing. In this work we propose a new algorithm for seizure onset detection and spread estimation in epilepsy patients. The algorithm is based on a multilevel 1-D wavelet decomposition that captures the physiological brain frequency bands, coupled with a generalized Gaussian model. Preliminary experiments with signals from 30 epileptic seizures and 11 subjects suggest that the proposed methodology is a powerful tool for detecting the onset of epileptic seizures and estimating their spread across the brain.
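    A multilevel 1-D wavelet decomposition of the kind described can be sketched with the orthonormal Haar wavelet; the paper's actual mother wavelet is not specified here, so Haar is used purely for illustration. Each level halves the frequency band, which is how such decompositions separate the EEG into band-limited components:

```python
import numpy as np

def haar_dwt_multilevel(x, levels):
    """Multilevel 1-D orthonormal Haar decomposition.

    Returns [approx_L, detail_L, ..., detail_1]: the coarsest approximation
    followed by detail coefficients from coarsest to finest band.
    Assumes len(x) is divisible by 2**levels.
    """
    coeffs = []
    approx = np.asarray(x, dtype=float)
    for _ in range(levels):
        even, odd = approx[0::2], approx[1::2]
        detail = (even - odd) / np.sqrt(2.0)   # high-pass half-band
        approx = (even + odd) / np.sqrt(2.0)   # low-pass half-band
        coeffs.append(detail)
    return [approx] + coeffs[::-1]
```

    Because the transform is orthonormal, the total energy of the coefficients equals the energy of the signal, so per-band statistics (such as the generalized Gaussian parameters used in the paper) partition the signal's energy cleanly.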

  2. Unsupervised pattern recognition methods in ciders profiling based on GCE voltammetric signals.

    PubMed

    Jakubowska, Małgorzata; Sordoń, Wanda; Ciepiela, Filip

    2016-07-15

    This work presents a complete methodology for distinguishing between different brands of cider and their degrees of ageing, based on voltammetric signals, using dedicated data preprocessing procedures and unsupervised multivariate analysis. It was demonstrated that voltammograms recorded on a glassy carbon electrode in Britton-Robinson buffer at pH 2 are reproducible for each brand. By applying clustering algorithms and principal component analysis, visibly homogeneous clusters were obtained. An advanced signal processing strategy, which included automatic baseline correction, interval scaling, and a continuous wavelet transform with a dedicated mother wavelet, was a key step in the correct recognition of the objects. The results show that voltammetry combined with optimized univariate and multivariate data processing is a sufficient tool to distinguish between ciders of various brands and to evaluate their freshness. Copyright © 2016 Elsevier Ltd. All rights reserved.
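    The principal component analysis step reduces to a singular value decomposition of the mean-centered data matrix (rows: samples, columns: preprocessed signal points). A minimal sketch, illustrative rather than the authors' pipeline:

```python
import numpy as np

def pca(X, n_components=2):
    """Principal component scores and explained-variance ratios via SVD.

    X has one row per sample; scores are the projections onto the top
    principal directions, in which clusters are typically visualized.
    """
    Xc = X - X.mean(axis=0)                       # mean-center each column
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T             # project onto top components
    explained = S ** 2 / np.sum(S ** 2)           # variance fraction per component
    return scores, explained[:n_components]
```

    Plotting the first two score columns and coloring by brand is the usual way such "visibly homogeneous clusters" are displayed.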

  3. A review of the semiconductor storage of television signals. Part 2: Applications 1975-1986

    NASA Astrophysics Data System (ADS)

    Riley, J. L.

    1987-08-01

    This is the second of two reports. The first described the semiconductor memory technology that emerged over the preceding two decades and some of the important operational characteristics of each ensuing generation of device, together with the design philosophy for forming the devices into useful tools for the storage of television signals. This second report describes some of the applications. These include improved television synchronizers, high-quality PAL decoders, television noise reducers, film dirt concealment equipment, and buffer storage for television picture processing equipment such as stills stores. Continuing developments in the technology promise still further increases in memory capacity, and there is a proposal to build a mass semiconductor television picture sequence store, initially as a research tool.

  4. Adaptive noise cancelling and time-frequency techniques for rail surface defect detection

    NASA Astrophysics Data System (ADS)

    Liang, B.; Iwnicki, S.; Ball, A.; Young, A. E.

    2015-03-01

    Adaptive noise cancelling (ANC) is a technique that is very effective at removing additive noise from contaminated signals. It has been widely used in the fields of telecommunications, radar, and sonar signal processing, but it was seldom used for the surveillance and diagnosis of mechanical systems before the late 1990s. As a promising technique it has gradually been exploited for the purposes of condition monitoring and fault diagnosis. Time-frequency analysis is another useful tool for condition monitoring and fault diagnosis because it retains both time and frequency information simultaneously. This paper presents an application of ANC and time-frequency analysis to railway wheel flat and rail surface defect detection. The experimental results from a scaled roller test rig show that this approach can significantly reduce unwanted interference and extract weak signals from strong background noise. The combination of ANC and time-frequency analysis may thus provide a useful tool for condition monitoring and fault diagnosis of railway vehicles.
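    The classic ANC arrangement feeds a reference noise input through an adaptive FIR filter, updated with the least-mean-squares (LMS) rule, and takes the error signal as the cleaned output. A minimal sketch under a simulated two-tap noise path (illustrative only, not the authors' configuration):

```python
import numpy as np

def lms_cancel(primary, reference, n_taps=4, mu=0.01):
    """LMS adaptive noise canceller.

    The filter learns the path from the reference noise to the noise
    component of the primary input; the error output is the cleaned signal.
    """
    w = np.zeros(n_taps)
    out = np.zeros(len(primary))
    for k in range(n_taps - 1, len(primary)):
        x = reference[k - n_taps + 1:k + 1][::-1]   # most recent sample first
        e = primary[k] - w @ x                      # error = primary minus noise estimate
        w += mu * e * x                             # stochastic-gradient weight update
        out[k] = e
    return out

rng = np.random.default_rng(1)
n = 20000
s = 0.5 * np.sin(0.03 * np.arange(n))        # wanted (e.g. defect-related) signal
noise = rng.normal(0, 1, n)                  # reference noise measurement
corrupt = s.copy()
corrupt[1:] += 0.9 * noise[:-1]              # noise reaches the primary sensor
corrupt += 0.5 * noise                       # through a simple two-tap path
clean = lms_cancel(corrupt, noise)
```

    After convergence the error output tracks the wanted signal even though it was buried well below the interference, which is the property exploited for weak defect signatures.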

  5. System technology for laser-assisted milling with tool integrated optics

    NASA Astrophysics Data System (ADS)

    Hermani, Jan-Patrick; Emonts, Michael; Brecher, Christian

    2013-02-01

    High-strength metal alloys and ceramics offer a huge potential for increased efficiency (e.g., in engine components for aerospace or components for gas turbines). However, mass application is still hampered by cost- and time-consuming end-machining due to long processing times and high tool wear. Laser-induced heating shortly before machining can reduce the material strength and improve machinability significantly. The Fraunhofer IPT has developed and successfully realized a new approach to laser-assisted milling with spindle- and tool-integrated, co-rotating optics. The novel optical system inside the tool consists of one deflection prism to position the laser spot in front of the cutting insert and one focusing lens. Using a fiber laser with high beam quality, the laser spot diameter can be precisely adjusted to the chip size. A highly dynamic adaptation of the laser power signal according to the engagement condition of the cutting tool was realized in order not to irradiate already-machined workpiece material. During tool engagement the laser power is controlled in proportion to the current material removal rate, which has to be calculated continuously. The needed geometric values are generated by a CAD/CAM program and converted into a laser power signal by a real-time controller. The developed milling tool with integrated optics and the algorithm for laser power control enable multi-axis laser-assisted machining of complex parts.

  6. Vector coding of wavelet-transformed images

    NASA Astrophysics Data System (ADS)

    Zhou, Jun; Zhi, Cheng; Zhou, Yuanhua

    1998-09-01

    The wavelet transform, a relatively new tool in signal processing, has gained broad recognition. Using the wavelet transform, we obtain octave-divided frequency bands with specific orientations, which combine well with the properties of the Human Visual System. In this paper, we discuss a classified vector quantization method for multiresolution-represented images.
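    Codebook design for vector quantization is typically done with Lloyd's (k-means) algorithm over training vectors drawn from the wavelet subbands. A minimal sketch, with a deterministic initialization chosen purely for illustration:

```python
import numpy as np

def kmeans_codebook(vectors, n_codes, n_iter=20):
    """Train a VQ codebook with plain k-means (Lloyd's algorithm).

    Each input vector is later encoded as the index of its nearest codeword,
    which is what yields the compression.
    """
    codebook = vectors[:n_codes].astype(float)   # deterministic init: first vectors
    for _ in range(n_iter):
        # assign each vector to its nearest codeword (Euclidean distance)
        d = np.linalg.norm(vectors[:, None, :] - codebook[None, :, :], axis=2)
        assign = d.argmin(axis=1)
        # move each codeword to the centroid of its assigned vectors
        for j in range(n_codes):
            if np.any(assign == j):
                codebook[j] = vectors[assign == j].mean(axis=0)
    return codebook, assign
```

    A classified VQ, as in the paper, trains a separate codebook per subband class so that each codebook specializes in one orientation and scale.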

  7. Dataflow Integration and Simulation Techniques for DSP System Design Tools

    DTIC Science & Technology

    2007-01-01

    Lebak, M. Richards, and D. Campbell, "VSIPL: An object-based open standard API for vector, signal, and image processing," in Proceedings of the...Inc., document Version 0.98a. [56] P. Marwedel and G. Goossens, Eds., Code Generation for Embedded Processors. Kluwer Academic Publishers, 1995. [57

  8. A comparative study of the svm and k-nn machine learning algorithms for the diagnosis of respiratory pathologies using pulmonary acoustic signals

    PubMed Central

    2014-01-01

    Background: Pulmonary acoustic parameters extracted from recorded respiratory sounds provide valuable information for the detection of respiratory pathologies. The automated analysis of pulmonary acoustic signals can serve as a differential diagnosis tool for medical professionals, a learning tool for medical students, and a self-management tool for patients. In this context, we intend to evaluate and compare the performance of the support vector machine (SVM) and K-nearest neighbour (K-nn) classifiers in diagnosing respiratory pathologies using respiratory sounds from the R.A.L.E database. Results: The pulmonary acoustic signals used in this study were obtained from the R.A.L.E lung sound database. The pulmonary acoustic signals were manually categorised into three different groups, namely normal, airway obstruction pathology, and parenchymal pathology. The mel-frequency cepstral coefficient (MFCC) features were extracted from the pre-processed pulmonary acoustic signals. The MFCC features were analysed by one-way ANOVA and then fed separately into the SVM and K-nn classifiers. The performances of the classifiers were analysed using the confusion matrix technique. The statistical analysis of the MFCC features using one-way ANOVA showed that the extracted MFCC features are significantly different (p < 0.001). The classification accuracies of the SVM and K-nn classifiers were found to be 92.19% and 98.26%, respectively. Conclusion: Although the data used to train and test the classifiers are limited, the classification accuracies found are satisfactory. The K-nn classifier performed better than the SVM classifier in discriminating between pulmonary acoustic signals from pathological and normal subjects obtained from the R.A.L.E database. PMID:24970564

  9. A comparative study of the SVM and K-nn machine learning algorithms for the diagnosis of respiratory pathologies using pulmonary acoustic signals.

    PubMed

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-06-27

    Pulmonary acoustic parameters extracted from recorded respiratory sounds provide valuable information for the detection of respiratory pathologies. The automated analysis of pulmonary acoustic signals can serve as a differential diagnosis tool for medical professionals, a learning tool for medical students, and a self-management tool for patients. In this context, we intend to evaluate and compare the performance of the support vector machine (SVM) and K-nearest neighbour (K-nn) classifiers in diagnosing respiratory pathologies using respiratory sounds from the R.A.L.E database. The pulmonary acoustic signals used in this study were obtained from the R.A.L.E lung sound database. The pulmonary acoustic signals were manually categorised into three different groups, namely normal, airway obstruction pathology, and parenchymal pathology. The mel-frequency cepstral coefficient (MFCC) features were extracted from the pre-processed pulmonary acoustic signals. The MFCC features were analysed by one-way ANOVA and then fed separately into the SVM and K-nn classifiers. The performances of the classifiers were analysed using the confusion matrix technique. The statistical analysis of the MFCC features using one-way ANOVA showed that the extracted MFCC features are significantly different (p < 0.001). The classification accuracies of the SVM and K-nn classifiers were found to be 92.19% and 98.26%, respectively. Although the data used to train and test the classifiers are limited, the classification accuracies found are satisfactory. The K-nn classifier performed better than the SVM classifier in discriminating between pulmonary acoustic signals from pathological and normal subjects obtained from the R.A.L.E database.
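    The K-nn decision rule itself is simple: a majority vote among the k closest training vectors. A minimal NumPy sketch (illustrative only; the study's MFCC feature extraction is not reproduced here):

```python
import numpy as np

def knn_predict(train_X, train_y, test_X, k=3):
    """Classify each test vector by majority vote among its k nearest
    training vectors under Euclidean distance."""
    preds = []
    for x in test_X:
        dists = np.linalg.norm(train_X - x, axis=1)      # distance to every training vector
        nearest = train_y[np.argsort(dists)[:k]]         # labels of the k closest
        vals, counts = np.unique(nearest, return_counts=True)
        preds.append(vals[counts.argmax()])              # majority vote
    return np.array(preds)
```

    In the study, the rows of `train_X` would be MFCC feature vectors and the labels the three diagnostic groups; accuracy is then read off the confusion matrix.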

  10. Image Processing and Computer Aided Diagnosis in Computed Tomography of the Breast

    DTIC Science & Technology

    2007-03-01

    TERMS: breast imaging, breast CT, scatter compensation, denoising, CAD, cone-beam CT ...clinical projection images. The CAD tool based on the signal-known-exactly (SKE) scenario is under development. Task 6: Test and compare the...performances of the CAD developed in Task 5 applied to processed projection data from Task 1 with the CAD performance on the projection data without Bayesian

  11. Improving the All-Hazards Homeland Security Enterprise Through the Use of an Emergency Management Intelligence Model

    DTIC Science & Technology

    2013-09-01

    Office of the Inspector General; OSINT Open Source Intelligence; PPD Presidential Policy Directive; SIGINT Signals Intelligence; SLFC State/Local Fusion...Geospatial Intelligence (GEOINT) from Geographic Information Systems (GIS), and Open Source Intelligence (OSINT) from Social Media. GIS is widely...and monitor make it a feasible tool to capitalize on for OSINT. A formalized EM intelligence process would help expedite the processing of such

  12. Using fMRI to study reward processing in humans: past, present, and future

    PubMed Central

    Wang, Kainan S.; Smith, David V.

    2016-01-01

    Functional magnetic resonance imaging (fMRI) is a noninvasive tool used to probe cognitive and affective processes. Although fMRI provides indirect measures of neural activity, the advent of fMRI has allowed for 1) the corroboration of significant animal findings in the human brain and 2) the expansion of models to include more common human attributes that inform behavior. In this review, we briefly consider the neural basis of the blood oxygenation level-dependent signal to set up a discussion of how fMRI studies have applied it in examining cognitive models in humans, and of the promise of using fMRI to advance such models. Specifically, we illustrate the contribution that fMRI has made to the study of reward processing, focusing on the role of the striatum in encoding reward-related learning signals that drive anticipatory and consummatory behaviors. For instance, we discuss how fMRI can be used to link neural signals (e.g., striatal responses to rewards) to individual differences in behavior and traits. While this functional segregation approach has been constructive to our understanding of reward-related functions, many fMRI studies have also benefitted from a functional integration approach that takes into account how interconnected regions (e.g., corticostriatal circuits) contribute to reward processing. We contend that future work using fMRI will profit from a multimodal approach, such as combining fMRI with noninvasive brain stimulation tools (e.g., transcranial electrical stimulation), that can identify causal mechanisms underlying reward processing. Consequently, advancements in implementing fMRI promise new translational opportunities to inform our understanding of psychopathologies. PMID:26740530

  13. Launching GUPPI: the Green Bank Ultimate Pulsar Processing Instrument

    NASA Astrophysics Data System (ADS)

    DuPlain, Ron; Ransom, Scott; Demorest, Paul; Brandt, Patrick; Ford, John; Shelton, Amy L.

    2008-08-01

    The National Radio Astronomy Observatory (NRAO) is launching the Green Bank Ultimate Pulsar Processing Instrument (GUPPI), a prototype flexible digital signal processor designed for pulsar observations with the Robert C. Byrd Green Bank Telescope (GBT). GUPPI uses field programmable gate array (FPGA) hardware and design tools developed by the Center for Astronomy Signal Processing and Electronics Research (CASPER) at the University of California, Berkeley. The NRAO has been concurrently developing GUPPI software and hardware using minimal software resources. The software handles instrument monitor and control, data acquisition, and hardware interfacing. GUPPI is currently an expert-only spectrometer, but supports future integration with the full GBT production system. The NRAO was able to take advantage of the unique flexibility of the CASPER FPGA hardware platform, develop hardware and software in parallel, and build a suite of software tools for monitoring, controlling, and acquiring data with a new instrument over a short timeline of just a few months. The NRAO interacts regularly with CASPER and its users, and GUPPI stands as an example of what reconfigurable computing and open-source development can do for radio astronomy. GUPPI is modular for portability, and the NRAO provides the results of development as an open-source resource.

  14. Separation of Intercepted Multi-Radar Signals Based on Parameterized Time-Frequency Analysis

    NASA Astrophysics Data System (ADS)

    Lu, W. L.; Xie, J. W.; Wang, H. M.; Sheng, C.

    2016-09-01

    Modern radars use complex waveforms to obtain high detection performance and low probabilities of interception and identification. Signals intercepted from multiple radars overlap considerably in both the time and frequency domains and are difficult to separate using primary time parameters. Time-frequency analysis (TFA), as a key signal-processing tool, can provide better insight into a signal than conventional methods. In particular, among the various types of TFA, parameterized time-frequency analysis (PTFA) has shown great potential for investigating the time-frequency features of such non-stationary signals. In this paper, we propose a PTFA procedure to separate overlapped radar signals; it comprises five steps: initialization, parameterized time-frequency analysis, demodulation of the signal of interest, adaptive filtering, and recovery of the signal. The effectiveness of the method was verified with simulated data and an intercepted radar signal received in a microwave laboratory. The results show that the proposed method performs well and has potential in electronic reconnaissance applications such as electronic intelligence, electronic warfare support measures, and radar warning.
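    The demodulation, filtering, and recovery steps of the procedure can be illustrated on two synthetic overlapped chirps. The phase of the signal of interest is taken as known here, standing in for the PTFA estimate, and the adaptive filter is replaced by a plain moving-average lowpass; all parameters are illustrative:

```python
import numpy as np

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)

# Two overlapped radar-like chirps (analytic signals); parameters are
# illustrative, not from the paper.
phase1 = 2 * np.pi * (50 * t + 100 * t**2)      # chirp rate +200 Hz/s
phase2 = 2 * np.pi * (300 * t - 120 * t**2)     # chirp rate -240 Hz/s
s1 = np.exp(1j * phase1)
s2 = np.exp(1j * phase2)
mixture = s1 + s2

# Demodulate with the phase estimated for the signal of interest
# (here taken as known), shifting s1 down to near-DC.
baseband = mixture * np.exp(-1j * phase1)

# Filtering step, sketched as a simple moving-average lowpass that
# rejects the now-high-frequency interfering chirp.
kernel = np.ones(51) / 51
filtered = np.convolve(baseband, kernel, mode="same")

# Recover the signal of interest by remodulating.
recovered = filtered * np.exp(1j * phase1)

# Normalized correlation with the true component measures separation quality.
rho = np.abs(np.vdot(s1, recovered)) / (
    np.linalg.norm(s1) * np.linalg.norm(recovered))
```

    The correlation `rho` of the recovered component with the true `s1` is close to 1, whereas the raw mixture correlates noticeably worse.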

  15. Asymptotic Cramer-Rao bounds for Morlet wavelet filter bank transforms of FM signals

    NASA Astrophysics Data System (ADS)

    Scheper, Richard

    2002-03-01

    Wavelet filter banks are potentially useful tools for analyzing and extracting information from frequency-modulated (FM) signals in noise. Chief among the advantages of such filter banks is the tendency of wavelet transforms to concentrate signal energy while simultaneously dispersing noise energy over the time-frequency plane, thus raising the effective signal-to-noise ratio of filtered signals. Over the past decade, much effort has gone into devising new algorithms to extract the relevant information from transformed signals while identifying and discarding the transformed noise. Therefore, estimates of the ultimate performance bounds on such algorithms would serve as valuable benchmarks in the process of choosing optimal algorithms for given signal classes. Discussed here is the specific case of FM signals analyzed by Morlet wavelet filter banks. By making use of the stationary phase approximation of the Morlet transform, and assuming that the measured signals are well resolved digitally, the asymptotic form of the Fisher information matrix is derived. From this, Cramer-Rao bounds are derived analytically for simple cases.
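    The energy-concentrating behaviour described above can be demonstrated numerically: a Morlet filter bank applied to an FM signal produces a ridge of energy that tracks the instantaneous frequency. This sketch builds the bank by direct convolution (numpy only); the centre frequency w0 = 6 and the test signal are illustrative choices, not taken from the paper:

```python
import numpy as np

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
w0 = 6.0                       # Morlet centre frequency (a common choice)

# FM test signal: slow sinusoidal frequency modulation around 100 Hz.
f_inst = 100 + 20 * np.sin(2 * np.pi * 2 * t)
phase = 2 * np.pi * np.cumsum(f_inst) / fs
signal = np.cos(phase)

def morlet_row(s):
    """One filter-bank channel: CWT at scale s by direct convolution.
    Because the Gaussian envelope is even, psi(u/s) equals the required
    kernel conj(psi(-u/s)), so we can convolve with psi directly."""
    tw = np.arange(-4 * s, 4 * s + 1 / fs, 1 / fs)
    psi = np.pi**-0.25 * np.exp(1j * w0 * tw / s) * np.exp(-(tw / s) ** 2 / 2)
    return np.convolve(signal, psi, mode="same") / np.sqrt(s)

# Filter bank: one channel per analysis frequency, f = w0 / (2*pi*s).
freqs = np.arange(60.0, 141.0, 2.0)
scales = w0 / (2 * np.pi * freqs)
scalogram = np.abs(np.vstack([morlet_row(s) for s in scales]))

# Ridge: the frequency of maximum energy at each instant tracks f_inst.
ridge = freqs[np.argmax(scalogram, axis=0)]
mid = slice(100, 900)          # ignore edge effects of the convolution
err = np.mean(np.abs(ridge[mid] - f_inst[mid]))
```

    The ridge stays within a few hertz of the true instantaneous frequency, which is the concentration property the Cramer-Rao analysis above exploits.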

  16. New instrument for on-line viscosity measurement of fermentation media.

    PubMed

    Picque, D; Corrieu, G

    1988-01-01

    In an attempt to resolve the difficult problem of on-line determination of the viscosity of non-Newtonian fermentation media, the authors have used a vibrating-rod sensor mounted on a bioreactor. The sensor signal decreases nonlinearly with increasing apparent viscosity. Electronic filtering of the signal damps the interfering effects of aeration and mechanical agitation. Sensor drift is very low (0.03% of measured value per hour). Rheologically, the sensor is primarily an empirical tool that must be specifically calibrated for each fermentation process. Once this is accomplished, it becomes possible to establish linear or second-degree correlations between the electrical signal from the sensor and the essential parameters of the fermentation process in question (pH of a fermented milk during acidification, concentration of extracellular polysaccharide). In addition, by applying the power law to describe the rheological behavior of fermentation media, we observe a second-order polynomial correlation between the sensor signal and the behavior index (n).
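    The power-law description mentioned above can be sketched as a log-log fit: for the Ostwald-de Waele model tau = K * gamma_dot**n, a straight-line fit of log(tau) against log(gamma_dot) recovers the behaviour index n. The data and constants below are synthetic, not the paper's measurements:

```python
import numpy as np

# Power-law (Ostwald-de Waele) model: tau = K * gamma_dot**n.
# Synthetic shear data with illustrative K and n (not from the paper).
K_true, n_true = 2.5, 0.6
gamma_dot = np.logspace(0, 3, 30)            # shear rate, 1/s
rng = np.random.default_rng(1)
tau = K_true * gamma_dot**n_true * np.exp(rng.normal(0, 0.02, gamma_dot.size))

# log(tau) = log(K) + n*log(gamma_dot): a straight-line fit on log-log
# axes recovers the behaviour index n and the consistency K.
n_hat, logK_hat = np.polyfit(np.log(gamma_dot), np.log(tau), 1)
K_hat = np.exp(logK_hat)
```

    With a few percent of multiplicative noise over three decades of shear rate, the fit recovers n to within a few hundredths.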

  17. Use of Machine Learning Classifiers and Sensor Data to Detect Neurological Deficit in Stroke Patients.

    PubMed

    Park, Eunjeong; Chang, Hyuk-Jae; Nam, Hyo Suk

    2017-04-18

    The pronator drift test (PDT), a neurological examination, is widely used in clinics to measure motor weakness in stroke patients. The aim of this study was to develop a PDT tool with machine learning classifiers to detect stroke symptoms based on quantification of proximal arm weakness using inertial sensors and signal processing. We extracted features of drift and pronation from accelerometer signals of wearable devices on the inner wrists of 16 stroke patients and 10 healthy controls. Signal processing and feature selection approaches were applied to discriminate the PDT features used to classify stroke patients. A series of machine learning techniques, namely support vector machine (SVM), radial basis function network (RBFN), and random forest (RF), were implemented to discriminate stroke patients from controls with leave-one-out cross-validation. Signal processing by the PDT tool extracted a total of 12 PDT features from the sensors. Feature selection abstracted the major attributes from the 12 PDT features to elucidate the dominant characteristics of proximal weakness in stroke patients using machine learning classification. Our proposed PDT classifiers had an area under the receiver operating characteristic curve (AUC) of .806 (SVM), .769 (RBFN), and .900 (RF) without feature selection; feature selection improved the AUCs to .913 (SVM), .956 (RBFN), and .975 (RF), representing an average performance enhancement of 15.3%. Sensors and machine learning methods can reliably detect stroke signs and quantify proximal arm weakness. Our proposed solution will facilitate pervasive monitoring of stroke patients. ©Eunjeong Park, Hyuk-Jae Chang, Hyo Suk Nam. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 18.04.2017.
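    The leave-one-out evaluation protocol used above can be sketched as follows. The feature table is synthetic (16 "patients" and 10 "controls" with 12 features each, matching the paper's counts but not its data), and the nearest-centroid rule is a deliberately simple stand-in for the SVM/RBFN/RF classifiers:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical PDT feature table: 16 "patients" and 10 "controls" with
# 12 features each (the paper's counts; the values here are synthetic).
patients = rng.normal(1.5, 1.0, size=(16, 12))
controls = rng.normal(-1.5, 1.0, size=(10, 12))
X = np.vstack([patients, controls])
y = np.array([1] * 16 + [0] * 10)

def nearest_centroid(X_train, y_train, x):
    """A stand-in classifier: assign x to the class with the closer mean."""
    d0 = np.linalg.norm(x - X_train[y_train == 0].mean(axis=0))
    d1 = np.linalg.norm(x - X_train[y_train == 1].mean(axis=0))
    return int(d1 < d0)

# Leave-one-out cross-validation: each sample is held out in turn and
# classified by a model trained on the remaining 25.
hits = sum(
    nearest_centroid(np.delete(X, i, axis=0), np.delete(y, i), X[i]) == y[i]
    for i in range(len(X))
)
loo_accuracy = hits / len(X)
```

    Leave-one-out is the natural choice here because the cohort is small (26 subjects); every subject serves once as the test case.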

  18. Comparative muscle study fatigue with sEMG signals during the isotonic and isometric tasks for diagnostics purposes.

    PubMed

    Sarmiento, Jhon F; Benevides, Alessandro B; Moreira, Marcelo H; Elias, Arlindo; Bastos, Teodiano F; Silva, Ian V; Pelegrina, Claudinei C

    2011-01-01

    The study of fatigue is an important tool for diagnostics in disease, sports, ergonomics, and robotics. This work analyses the most important sEMG muscle-fatigue indicators, using signal processing in isometric and isotonic tasks, with the purpose of standardizing a fatigue protocol for data acquisition and processing for diagnostic purposes. As a result, the slopes of the RMS, ARV, and MNF indicators successfully described the expected fatigue behaviour, whereas the MDF and AIF indicators failed to describe fatigue. Similarly, the use of a constant load for sEMG data acquisition was the best strategy in both tasks.
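    The RMS, ARV, and MNF indicators named above are straightforward to compute per window. The sEMG-like trace below is synthetic, built only to show the classic fatigue pattern (rising amplitude, falling mean frequency); none of its parameters come from the paper:

```python
import numpy as np

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(3)

# Synthetic sEMG-like signal: a tone whose amplitude grows and whose
# frequency slows over time, mimicking the classic fatigue pattern.
carrier_f = 80 - 15 * t / t[-1]              # spectral slowing, Hz
x = (1 + 0.5 * t / t[-1]) * np.sin(2 * np.pi * np.cumsum(carrier_f) / fs)
x += 0.05 * rng.standard_normal(t.size)

def indicators(seg, fs):
    """RMS, ARV (average rectified value) and MNF (mean frequency)."""
    rms = np.sqrt(np.mean(seg**2))
    arv = np.mean(np.abs(seg))
    spec = np.abs(np.fft.rfft(seg)) ** 2
    f = np.fft.rfftfreq(seg.size, 1 / fs)
    mnf = np.sum(f * spec) / np.sum(spec)
    return rms, arv, mnf

# Compare the first and second halves of the recording.
first = indicators(x[: t.size // 2], fs)
last = indicators(x[t.size // 2 :], fs)
```

    On this trace, RMS and ARV increase from the first window to the last while MNF decreases, which is the fatigue signature whose slope the study evaluates.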

  19. Power-law statistics of neurophysiological processes analyzed using short signals

    NASA Astrophysics Data System (ADS)

    Pavlova, Olga N.; Runnova, Anastasiya E.; Pavlov, Alexey N.

    2018-04-01

    We discuss the problem of quantifying the power-law statistics of complex processes from short signals. Based on the analysis of electroencephalograms (EEG), we compare three interrelated approaches to characterizing the power spectral density (PSD) and show that applying detrended fluctuation analysis (DFA) or the wavelet-transform modulus maxima (WTMM) method is a useful way to characterize PSD features indirectly from short data sets. We conclude that, although DFA- and WTMM-based measures can be obtained from the estimated PSD, these tools outperform standard spectral analysis when the analyzed regime must be characterized from a very limited amount of data.
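    A minimal DFA implementation illustrates the kind of indirect PSD characterization discussed above. For white noise the scaling exponent alpha is expected near 0.5, and for its cumulative sum (a random walk) near 1.5; the window sizes and signal length are illustrative:

```python
import numpy as np

def dfa(x, window_sizes):
    """Detrended fluctuation analysis: return the scaling exponent alpha
    from a log-log fit of fluctuation F(n) against window size n."""
    profile = np.cumsum(x - np.mean(x))
    F = []
    for n in window_sizes:
        n_win = len(profile) // n
        segs = profile[: n_win * n].reshape(n_win, n)
        i = np.arange(n)
        # Remove a linear trend from each window, keep the residuals.
        resid = []
        for seg in segs:
            a, b = np.polyfit(i, seg, 1)
            resid.append(seg - (a * i + b))
        # F(n) is the RMS of the detrended profile at this window size.
        F.append(np.sqrt(np.mean(np.concatenate(resid) ** 2)))
    alpha, _ = np.polyfit(np.log(window_sizes), np.log(F), 1)
    return alpha

rng = np.random.default_rng(4)
windows = np.array([16, 32, 64, 128, 256])
alpha_white = dfa(rng.standard_normal(8192), windows)            # ~0.5
alpha_walk = dfa(np.cumsum(rng.standard_normal(8192)), windows)  # ~1.5
```

    The exponent alpha relates to the PSD power-law slope beta via beta = 2*alpha - 1, which is why DFA can stand in for direct spectral estimation on short records.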

  20. Role of the ceramide-signaling pathways in ionizing radiation-induced apoptosis.

    PubMed

    Vit, Jean-Philippe; Rosselli, Filippo

    2003-11-27

    Exposure to ionizing radiation (IR) damages several cellular targets. How signals from different targets are integrated to determine cell fate remains a controversial issue. Understanding the pathway(s) responsible for the cell-killing effect of IR exposure is of prime importance given the use of radiation as an anticancer agent or diagnostic tool. In this study, we established that IR-induced cell damage initiates two independent signaling pathways that lead to a biphasic intracellular ceramide increase. A transitory increase of ceramide is observed within minutes after IR exposure as a consequence of DNA damage-independent acid sphingomyelinase activation. Several hours after irradiation, a second wave of ceramide accumulation is observed, depending on the DNA damage-dependent activation of ceramide synthase, which requires a signaling pathway involving ATM. Importantly, we demonstrated that the late ceramide accumulation also depends on the first one and is rate limiting for the apoptotic process induced by IR. In conclusion, our observations suggest that ceramide is a major determinant of the IR-induced apoptotic process at the cross-point of different signal transduction pathways.

  1. A comprehensive map of the mTOR signaling network

    PubMed Central

    Caron, Etienne; Ghosh, Samik; Matsuoka, Yukiko; Ashton-Beaucage, Dariel; Therrien, Marc; Lemieux, Sébastien; Perreault, Claude; Roux, Philippe P; Kitano, Hiroaki

    2010-01-01

    The mammalian target of rapamycin (mTOR) is a central regulator of cell growth and proliferation. mTOR signaling is frequently dysregulated in oncogenic cells and is thus an attractive target for anticancer therapy. Using CellDesigner, modeling-support software for graphical notation, we present herein a comprehensive map of the mTOR signaling network, which includes 964 species connected by 777 reactions. The map complies with both the Systems Biology Markup Language (SBML) and the Systems Biology Graphical Notation (SBGN) for computational analysis and graphical representation, respectively. As captured in the mTOR map, we review and discuss our current understanding of the mTOR signaling network and highlight the impact of mTOR feedback and crosstalk regulation on drug-based cancer therapy. The map is available on the Payao platform, a Web 2.0-based community-wide interactive process for creating more accurate and information-rich databases. Thus, this comprehensive map of the mTOR network will serve as a tool to facilitate systems-level study of up-to-date mTOR network components and signaling events, toward the discovery of novel regulatory processes and therapeutic strategies for cancer. PMID:21179025

  2. Dynamic ultrasonic contact detection using acoustic emissions.

    PubMed

    Turner, S L; Rabani, A; Axinte, D A; King, C W

    2014-03-01

    For a non-contact ultrasonic material removal process, the control of the standoff position can be crucial to process performance; particularly where the requirement is for a standoff of the order of <20 μm. The standoff distance relative to the surface to be machined can be set by first contacting the ultrasonic tool tip with the surface and then withdrawing the tool to the required position. Determination of this contact point in a dynamic system at ultrasonic frequencies (>20 kHz) is achieved by force measurement or by detection of acoustic emissions (AE). However, where detection of distance from a surface must be determined without contact taking place, an alternative method must be sought. In this paper, the effect of distance from contact of an ultrasonic tool is measured by detection of AE through the workpiece. At the point of contact, the amplitude of the signal at the fundamental frequency increases significantly, but the strength of the 2nd and 3rd harmonic signals increases more markedly. Closer examination of these harmonics shows that an increase in their intensities can be observed in the 10 μm prior to contact, providing a mechanism to detect near contact (<10 μm) without the need to first contact the surface in order to set a standoff. Copyright © 2013 Elsevier B.V. All rights reserved.
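    The harmonic-tracking idea described above can be sketched by reading the fundamental, 2nd-harmonic and 3rd-harmonic amplitudes off an FFT. The two synthetic traces below only mimic the reported behaviour (harmonic growth near contact); the frequencies and amplitudes are illustrative, not measured values:

```python
import numpy as np

fs = 1_000_000.0                       # 1 MHz sampling, illustrative
t = np.arange(0, 0.01, 1 / fs)
f0 = 20_000.0                          # ultrasonic fundamental, 20 kHz

def harmonic_amplitudes(x, fs, f0, n_harmonics=3):
    """Amplitude of the fundamental and its harmonics read off an FFT."""
    spec = np.abs(np.fft.rfft(x)) * 2 / x.size
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    return [spec[np.argmin(np.abs(freqs - k * f0))]
            for k in range(1, n_harmonics + 1)]

# Far from the surface, the AE signal is dominated by the fundamental;
# near contact, nonlinearity injects energy at 2*f0 and 3*f0 (the
# harmonic levels here are invented for illustration).
free = np.sin(2 * np.pi * f0 * t)
near = (np.sin(2 * np.pi * f0 * t)
        + 0.2 * np.sin(2 * np.pi * 2 * f0 * t)
        + 0.1 * np.sin(2 * np.pi * 3 * f0 * t))

a_free = harmonic_amplitudes(free, fs, f0)
a_near = harmonic_amplitudes(near, fs, f0)
```

    Monitoring the 2nd- and 3rd-harmonic amplitudes, rather than the fundamental, is what allows the near-contact condition to be flagged before the tool actually touches the surface.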

  3. Diamond tool wear detection method using cutting force and its power spectrum analysis in ultra-precision fly cutting

    NASA Astrophysics Data System (ADS)

    Zhang, G. Q.; To, S.

    2014-08-01

    Cutting-force measurement and power spectrum analysis have been regarded as an effective method of monitoring tool wear in many cutting processes, and a significant body of research has been conducted in this area. However, relatively little comparable research exists for ultra-precision fly cutting. In this paper, a group of experiments was carried out to investigate the cutting forces and their power spectrum characteristics under different tool wear stages. The results reveal that the cutting force increases as tool wear progresses. The cutting force signals under different tool wear stages were analyzed using power spectrum analysis. The analysis indicates that a characteristic frequency does exist in the power spectrum of the cutting force and that its power spectral density increases with the tool wear level; this characteristic frequency could therefore be adopted to monitor diamond tool wear in ultra-precision fly cutting.
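    The monitoring scheme reduces to locating a spectral line in the force signal and tracking its density over time. In this synthetic sketch the characteristic frequency and the wear-to-amplitude mapping are invented for illustration:

```python
import numpy as np

fs = 5000.0
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(5)
f_char = 640.0                        # hypothetical characteristic frequency

def force_signal(wear_level):
    """Synthetic cutting-force trace: broadband background plus a spectral
    line whose strength grows with the tool-wear level."""
    return (rng.standard_normal(t.size)
            + wear_level * np.sin(2 * np.pi * f_char * t))

def psd_peak(x):
    """Location and height of the dominant line in the periodogram."""
    spec = np.abs(np.fft.rfft(x)) ** 2 / x.size
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    k = np.argmax(spec[1:]) + 1       # skip the DC bin
    return freqs[k], spec[k]

fresh_f, fresh_p = psd_peak(force_signal(0.5))   # lightly worn tool
worn_f, worn_p = psd_peak(force_signal(2.0))     # heavily worn tool
```

    Both traces peak at the same characteristic frequency, but the worn tool's peak density is far larger, which is the trend the paper proposes to exploit for wear monitoring.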

  4. Ku-band signal design study. [space shuttle orbiter data processing network

    NASA Technical Reports Server (NTRS)

    Rubin, I.

    1978-01-01

    Analytical tools, methods, and techniques for assessing the design and performance of the space shuttle orbiter data processing system (DPS) are provided. The computer data processing network is evaluated in the key areas of queueing behavior, synchronization, and network reliability. The structure of the data processing network is described, as well as the system operation principles and the network configuration. The characteristics of the computer systems are indicated. System reliability measures are defined and studied. System and network invulnerability measures are computed. Communication path and network failure analysis techniques are included.

  5. Review of Sparse Representation-Based Classification Methods on EEG Signal Processing for Epilepsy Detection, Brain-Computer Interface and Cognitive Impairment

    PubMed Central

    Wen, Dong; Jia, Peilei; Lian, Qiusheng; Zhou, Yanhong; Lu, Chengbiao

    2016-01-01

    At present, sparse representation-based classification (SRC) has become an important approach in electroencephalograph (EEG) signal analysis, in which the data are sparsely represented on the basis of a fixed or learned dictionary and classified according to reconstruction criteria. SRC methods have been used to analyze the EEG signals of epilepsy, cognitive impairment, and brain-computer interface (BCI) studies, and have made rapid progress in computational accuracy, efficiency, and robustness. However, these methods still have deficiencies in real-time performance and generalization ability, and they depend on labeled samples for EEG signal analysis. This mini-review describes the advantages and disadvantages of SRC methods in EEG signal analysis, with the expectation that they can provide better tools for analyzing EEG signals. PMID:27458376
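    The reconstruction-criterion classification at the heart of SRC can be sketched with per-class dictionaries and residual comparison. The dictionaries below are random subspaces standing in for learned EEG atoms, and the coding step is plain least squares; a true SRC would use an l1-regularized sparse solver:

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy dictionaries: each class spans a different low-dimensional subspace
# of a 30-dimensional "EEG feature" space (purely illustrative).
dim, atoms = 30, 8
D = {c: np.linalg.qr(rng.standard_normal((dim, atoms)))[0] for c in range(3)}

def src_classify(x):
    """Residual-based classification: represent x over each class
    dictionary and pick the class with the smallest reconstruction error.
    (Full SRC would use an l1-regularised solver for the coding step.)"""
    residuals = {}
    for c, Dc in D.items():
        coeffs, *_ = np.linalg.lstsq(Dc, x, rcond=None)
        residuals[c] = np.linalg.norm(x - Dc @ coeffs)
    return min(residuals, key=residuals.get)

# A test sample generated from class 1's subspace plus a little noise.
x = D[1] @ rng.standard_normal(atoms) + 0.05 * rng.standard_normal(dim)
predicted = src_classify(x)
```

    Because each class dictionary reconstructs its own signals far better than foreign ones, the minimum-residual rule recovers the generating class; the sparsity constraint in real SRC sharpens exactly this separation.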

  6. The structural basis of arrestin-mediated regulation of G-protein-coupled receptors

    PubMed Central

    Gurevich, Vsevolod V.; Gurevich, Eugenia V.

    2008-01-01

    The four mammalian arrestins serve as almost universal regulators of the largest known family of signaling proteins, G-protein-coupled receptors (GPCRs). Arrestins terminate receptor interactions with G proteins, redirect signaling to a variety of alternative pathways, and orchestrate receptor internalization and subsequent intracellular trafficking. The elucidation of the structural basis and fine molecular mechanisms of the arrestin–receptor interaction paved the way to the targeted manipulation of this interaction from both sides to produce very stable or extremely transient complexes that have helped in understanding the regulation of many biologically important processes initiated by active GPCRs. The elucidation of the structural basis of arrestin interactions with numerous non-receptor binding partners is long overdue. It will allow the construction of fully functional arrestins in which the ability to interact with individual partners is specifically disrupted or enhanced by targeted mutagenesis. These "custom-designed" arrestin mutants will be valuable tools in defining the role of various interactions in the intricate interplay of multiple signaling pathways in the living cell. The identification of arrestin-binding sites for various signaling molecules will also set the stage for designing molecular tools for therapeutic intervention that may prove useful in numerous disorders associated with congenital or acquired dysregulation of GPCR signaling. PMID:16460808

  7. Detecting the spatial chirp signals by fractional Fourier lens with transformation materials

    NASA Astrophysics Data System (ADS)

    Chen, J.; Hu, J.

    2018-02-01

    The fractional Fourier transform (FrFT) is a generalization of the Fourier transform and an important tool in signal processing. As one typical application of the FrFT, detecting the chirp rate (CR, the rate of frequency change) of a chirp signal is important in many optical measurements. An optical FrFT based on a graded-index lens fails to detect high-CR chirps because the short propagation distance of the impulse in the lens weakens the paraxial approximation. With the help of transformation optics (TO), an improved FrFT lens is proposed to adjust the high CR as well as the impulse location of a given input chirp signal. The designed transformation materials implement a space compression that makes the input chirp signal equivalent to one with a lower CR, so the system better satisfies the paraxial approximation. As a result, the lens improves the detection precision for high-CR signals. Numerical simulations verified the design. The proposed device may have both theoretical and practical value, and the design demonstrates the ability and flexibility of TO in spatial signal processing.

  8. Discrete Logic Modelling Optimization to Contextualize Prior Knowledge Networks Using PRUNET

    PubMed Central

    Androsova, Ganna; del Sol, Antonio

    2015-01-01

    High-throughput technologies have led to the generation of an increasing amount of data in different areas of biology. Datasets capturing the cell’s response to its intra- and extra-cellular microenvironment allow such data to be incorporated as signed and directed graphs or influence networks. These prior knowledge networks (PKNs) represent our current knowledge of the causality of cellular signal transduction. New signalling data are often examined and interpreted in conjunction with PKNs. However, different biological contexts, such as cell types or disease states, may have distinct variants of signalling pathways, resulting in the misinterpretation of new data. The identification of inconsistencies between measured data and signalling topologies, as well as the training of PKNs using context-specific datasets (PKN contextualization), are necessary conditions for constructing reliable predictive models, which are current challenges in the systems biology of cell signalling. Here we present PRUNET, a user-friendly software tool designed to address the contextualization of PKNs to specific experimental conditions. As input, the algorithm takes a PKN and the expression profiles of two given stable steady states or cellular phenotypes. The PKN is iteratively pruned using an evolutionary algorithm to perform an optimization process. This optimization rests on a match between the attractors predicted by a discrete logic (Boolean) model and a Booleanized representation of the phenotypes, within a population of alternative subnetworks that evolves iteratively. We validated the algorithm by applying PRUNET to four biological examples and using the resulting contextualized networks to predict missing expression values and to simulate well-characterized perturbations. PRUNET constitutes a tool for the automatic curation of a PKN to make it suitable for describing biological processes under particular experimental conditions. The general applicability of the implemented algorithm makes PRUNET suitable for a variety of biological processes, for instance cellular reprogramming or transitions between healthy and disease states. PMID:26058016

  9. Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining

    PubMed Central

    Liang, Qiaokang; Zhang, Dan; Wu, Wanneng; Zou, Kunlin

    2016-01-01

    Multi-component cutting-force sensing systems applied to cutting tools in manufacturing processes are gradually becoming the most significant monitoring indicators. Their signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems capable of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of the cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing. PMID:27854322

  10. Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining.

    PubMed

    Liang, Qiaokang; Zhang, Dan; Wu, Wanneng; Zou, Kunlin

    2016-11-16

    Multi-component cutting-force sensing systems applied to cutting tools in manufacturing processes are gradually becoming the most significant monitoring indicators. Their signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems capable of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of the cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing.

  11. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation.

    PubMed

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C; Wong, Willy; Daskalakis, Zafiris J; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in the adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprising multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and an interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with the flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) the capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in the initial setup and configuration of parameters for each processing step. This is partly accomplished through close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features, and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research.

  12. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation

    PubMed Central

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C.; Wong, Willy; Daskalakis, Zafiris J.; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in the adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprising multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and an interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with the flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) the capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in the initial setup and configuration of parameters for each processing step. This is partly accomplished through close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features, and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research. PMID:27774054

  13. Integrated flexible manufacturing program for manufacturing automation and rapid prototyping

    NASA Technical Reports Server (NTRS)

    Brooks, S. L.; Brown, C. W.; King, M. S.; Simons, W. R.; Zimmerman, J. J.

    1993-01-01

    The Kansas City Division of Allied Signal Inc., as part of the Integrated Flexible Manufacturing Program (IFMP), is developing an integrated manufacturing environment. Several systems are being developed to produce standards and automation tools for specific activities within the manufacturing environment. The Advanced Manufacturing Development System (AMDS) is concentrating on information standards (STEP) and product data transfer; the Expert Cut Planner system (XCUT) is concentrating on machining operation process planning standards and automation capabilities; the Advanced Numerical Control system (ANC) is concentrating on NC data preparation standards and NC data generation tools; the Inspection Planning and Programming Expert system (IPPEX) is concentrating on inspection process planning, coordinate measuring machine (CMM) inspection standards and CMM part program generation tools; and the Intelligent Scheduling and Planning System (ISAPS) is concentrating on planning and scheduling tools for a flexible manufacturing system environment. All of these projects are working together to address information exchange, standardization, and information sharing to support rapid prototyping in a Flexible Manufacturing System (FMS) environment.

  14. Surface Roughness Model Based on Force Sensors for the Prediction of the Tool Wear

    PubMed Central

    de Agustina, Beatriz; Rubio, Eva María; Sebastián, Miguel Ángel

    2014-01-01

    In this study, a methodology has been developed with the objective of evaluating the surface roughness obtained during turning processes by measuring the signals detected by a force sensor under the same cutting conditions. In this way, the surface quality achieved along the process is correlated to several parameters of the cutting forces (thrust forces, feed forces and cutting forces), so the effect that the tool wear causes on the surface roughness is evaluated. In a first step, the best cutting conditions (cutting parameters and radius of tool) for a certain quality surface requirement were found for pieces of UNS A97075. Next, with this selection a model of surface roughness based on the cutting forces was developed for different states of wear that simulate the behaviour of the tool throughout its life. The validation of this model reveals that it was effective for approximately 70% of the surface roughness values obtained. PMID:24714391
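
    A force-to-roughness model of the kind described above can be sketched as a least-squares regression from cutting-force features to Ra. This is a generic linear sketch for illustration; the paper's actual model form and coefficients are not reproduced here.

```python
import numpy as np

def fit_roughness_model(forces, ra):
    """Least-squares fit of surface roughness Ra to cutting-force
    features (e.g. thrust, feed and cutting force components).
    Hypothetical linear form, for illustration only."""
    X = np.column_stack([forces, np.ones(len(forces))])  # add intercept
    coef, *_ = np.linalg.lstsq(X, ra, rcond=None)
    return coef

def predict_roughness(coef, forces):
    """Predict Ra from force features with the fitted coefficients."""
    X = np.column_stack([forces, np.ones(len(forces))])
    return X @ coef
```

    Refitting such a model for several tool-wear states, as the study does, lets one track how the force-roughness relationship drifts over the tool's life.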

  15. Methods to Manipulate and Monitor Wnt Signaling in Human Pluripotent Stem Cells.

    PubMed

    Huggins, Ian J; Brafman, David; Willert, Karl

    2016-01-01

    Human pluripotent stem cells (hPSCs) may revolutionize medical practice by providing: (a) a renewable source of cells for tissue replacement therapies, (b) a powerful system to model human diseases in a dish, and (c) a platform for examining efficacy and safety of novel drugs. Furthermore, these cells offer a unique opportunity to study early human development in vitro, in particular, the process by which a seemingly uniform cell population interacts to give rise to the three main embryonic lineages: ectoderm, endoderm, and mesoderm. This process of lineage allocation is regulated by a number of inductive signals that are mediated by growth factors, including FGF, TGFβ, and Wnt. In this book chapter, we introduce a set of tools, methods, and protocols to specifically manipulate the Wnt signaling pathway with the intention of altering the cell fate outcome of hPSCs.

  16. The TryPIKinome of five human pathogenic trypanosomatids: Trypanosoma brucei, Trypanosoma cruzi, Leishmania major, Leishmania braziliensis and Leishmania infantum--new tools for designing specific inhibitors.

    PubMed

    Bahia, Diana; Oliveira, Luciana Márcia; Lima, Fabio Mitsuo; Oliveira, Priscila; Silveira, José Franco da; Mortara, Renato Arruda; Ruiz, Jerônimo Conceição

    2009-12-18

    Phosphatidylinositol (PI) kinases are at the heart of one of the major pathways of intracellular signal transduction. Herein, we present the first report of a survey for phosphatidylinositol and related kinases (TryPIKs), made by similarity searches against the genomes available to date of the five human pathogenic trypanosomatids Trypanosoma brucei, Trypanosoma cruzi, Leishmania major, Leishmania braziliensis and Leishmania infantum. In addition to generating a panel called "The TryPIKinome", we propose a model of signaling pathways for these TryPIKs. The involvement of TryPIKs in fundamental pathways, such as intracellular signal transduction and host invasion processes, makes the study of TryPIKs an important area for further inquiry. New subtype-specific inhibitors are expected to work on individual members of the PIK family and, therefore, can presumably neutralize trypanosomatid invasion processes.

  17. A Universal De-Noising Algorithm for Ground-Based LIDAR Signal

    NASA Astrophysics Data System (ADS)

    Ma, Xin; Xiang, Chengzhi; Gong, Wei

    2016-06-01

    Ground-based lidar, working as an effective remote sensing tool, plays an irreplaceable role in the study of the atmosphere, since it can provide vertical atmospheric profiles. However, noise in a lidar signal is unavoidable, which leads to difficulties and complexities when searching for more information. Every de-noising method has its own characteristics but also certain limitations, since the lidar signal varies as the atmosphere changes. In this paper, a universal de-noising algorithm based on signal segmentation and reconstruction is proposed to enhance the SNR of a ground-based lidar signal. The signal segmentation, serving as the keystone of the algorithm, splits the lidar signal into three different parts, which are processed by different de-noising methods according to their own characteristics. The signal reconstruction is a relatively simple procedure that splices the signal sections end to end. Finally, a series of simulated signal tests and a real dual field-of-view lidar signal show the feasibility of the universal de-noising algorithm.
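
    The segment-then-reconstruct idea can be sketched as follows. The cut points and smoothing windows below are illustrative assumptions, not the paper's segmentation criteria, and plain moving averages stand in for the per-segment de-noising methods.

```python
import numpy as np

def smooth(x, w):
    """Moving average with an odd window length w."""
    return np.convolve(x, np.ones(w) / w, mode="same")

def segmented_denoise(profile, cuts=(0.3, 0.7), windows=(3, 7, 15)):
    """Split the range profile into three sections, de-noise each with a
    different strength, then splice the sections back end to end.
    Cut points and window sizes are illustrative assumptions."""
    n = len(profile)
    i1, i2 = int(cuts[0] * n), int(cuts[1] * n)
    parts = (profile[:i1], profile[i1:i2], profile[i2:])
    return np.concatenate([smooth(p, w) for p, w in zip(parts, windows)])
```

    Heavier smoothing is applied to the far-range section, where the lidar return is weakest and noisiest.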

  18. Quantification of the power changes in BOLD signals using Welch spectrum method during different single-hand motor imageries.

    PubMed

    Zhang, Jiang; Yuan, Zhen; Huang, Jin; Yang, Qin; Chen, Huafu

    2014-12-01

    Motor imagery is an experimental paradigm implemented in cognitive neuroscience and cognitive psychology. To investigate the asymmetry of the strength of cortical functional activity due to different single-hand motor imageries, functional magnetic resonance imaging (fMRI) data from right-handed normal subjects were recorded and analyzed during both left-hand and right-hand motor imagery processes. Then the average power of blood oxygenation level-dependent (BOLD) signals in the temporal domain was calculated using a tool that combines the Welch power spectrum with the integral of the power spectrum of BOLD signal changes during motor imagery. Power change analysis indicated that cortical activity exhibited stronger power in the precentral gyrus and medial frontal gyrus during left-hand motor imagery tasks compared with right-hand motor imagery tasks. These observations suggest that right-handed normal subjects mobilize more cortical nerve cells for left-hand motor imagery. Our findings also suggest that the approach based on power differences of BOLD signals is a suitable tool for quantifying the asymmetry of brain activity intensity during motor imagery tasks. Copyright © 2014 Elsevier Inc. All rights reserved.
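
    A minimal sketch of the described power measure: integrate the Welch power spectral density of a BOLD time series over frequency. The sampling rate (fs = 0.5 Hz, i.e. an assumed TR of 2 s) and the `welch` settings are assumptions, not parameters from the paper.

```python
import numpy as np
from scipy.signal import welch

def bold_average_power(ts, fs=0.5):
    """Average power of a BOLD time series: integrate its Welch power
    spectral density over frequency. fs = 0.5 Hz assumes a TR of 2 s."""
    f, pxx = welch(ts, fs=fs, nperseg=min(64, len(ts)))
    return np.sum(pxx) * (f[1] - f[0])  # rectangle-rule integral of the PSD
```

    Comparing this quantity between left-hand and right-hand imagery runs, voxel by voxel, gives the kind of power-difference map the study reports.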

  19. Discovering relationships between nuclear receptor signaling pathways, genes, and tissues in Transcriptomine.

    PubMed

    Becnel, Lauren B; Ochsner, Scott A; Darlington, Yolanda F; McOwiti, Apollo; Kankanamge, Wasula H; Dehart, Michael; Naumov, Alexey; McKenna, Neil J

    2017-04-25

    We previously developed a web tool, Transcriptomine, to explore expression profiling data sets involving small-molecule or genetic manipulations of nuclear receptor signaling pathways. We describe advances in biocuration, query interface design, and data visualization that enhance the discovery of uncharacterized biology in these pathways using this tool. Transcriptomine currently contains about 45 million data points encompassing more than 2000 experiments in a reference library of nearly 550 data sets retrieved from public archives and systematically curated. To make the underlying data points more accessible to bench biologists, we classified experimental small molecules and gene manipulations into signaling pathways and experimental tissues and cell lines into physiological systems and organs. Incorporation of these mappings into Transcriptomine enables the user to readily evaluate tissue-specific regulation of gene expression by nuclear receptor signaling pathways. Data points from animal and cell model experiments and from clinical data sets elucidate the roles of nuclear receptor pathways in gene expression events accompanying various normal and pathological cellular processes. In addition, data sets targeting non-nuclear receptor signaling pathways highlight transcriptional cross-talk between nuclear receptors and other signaling pathways. We demonstrate with specific examples how data points that exist in isolation in individual data sets validate each other when connected and made accessible to the user in a single interface. In summary, Transcriptomine allows bench biologists to routinely develop research hypotheses, validate experimental data, or model relationships between signaling pathways, genes, and tissues. Copyright © 2017, American Association for the Advancement of Science.

  20. Optimal visual-haptic integration with articulated tools.

    PubMed

    Takahashi, Chie; Watt, Simon J

    2017-05-01

    When we feel and see an object, the nervous system integrates visual and haptic information optimally, exploiting the redundancy in multiple signals to estimate properties more precisely than is possible from either signal alone. We examined whether optimal integration is similarly achieved when using articulated tools. Such tools (tongs, pliers, etc) are a defining characteristic of human hand function, but complicate the classical sensory 'correspondence problem' underlying multisensory integration. Optimal integration requires establishing the relationship between signals acquired by different sensors (hand and eye) and, therefore, in fundamentally unrelated units. The system must also determine when signals refer to the same property of the world-seeing and feeling the same thing-and only integrate those that do. This could be achieved by comparing the pattern of current visual and haptic input to known statistics of their normal relationship. Articulated tools disrupt this relationship, however, by altering the geometrical relationship between object properties and hand posture (the haptic signal). We examined whether different tool configurations are taken into account in visual-haptic integration. We indexed integration by measuring the precision of size estimates, and compared our results to optimal predictions from a maximum-likelihood integrator. Integration was near optimal, independent of tool configuration/hand posture, provided that visual and haptic signals referred to the same object in the world. Thus, sensory correspondence was determined correctly (trial-by-trial), taking tool configuration into account. This reveals highly flexible multisensory integration underlying tool use, consistent with the brain constructing internal models of tools' properties.

  1. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    PubMed

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open-source, web-based software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end-user satisfaction.
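
    A representative disproportionality statistic of the kind such tools compute is the proportional reporting ratio (PRR). The abstract does not name the exact statistic used, so this is only an illustrative sketch of how a drug-event pair is scored from a 2x2 contingency table of report counts.

```python
def prr(a, b, c, d):
    """Proportional reporting ratio for a drug-event pair from a 2x2
    contingency table: a = reports with drug and event, b = drug without
    event, c = event without drug, d = neither drug nor event."""
    return (a / (a + b)) / (c / (c + d))
```

    A PRR well above 1 suggests the event is reported disproportionately often with the drug, flagging a candidate safety signal for reviewer triage.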

  2. Evidence of emotion-antecedent appraisal checks in electroencephalography and facial electromyography

    PubMed Central

    Scherer, Klaus R.; Schuller, Björn W.

    2018-01-01

    In the present study, we applied Machine Learning (ML) methods to identify psychobiological markers of cognitive processes involved in the process of emotion elicitation as postulated by the Component Process Model (CPM). In particular, we focused on the automatic detection of five appraisal checks—novelty, intrinsic pleasantness, goal conduciveness, control, and power—in electroencephalography (EEG) and facial electromyography (EMG) signals. We also evaluated the effects on classification accuracy of averaging the raw physiological signals over different numbers of trials, and whether the use of minimal sets of EEG channels localized over specific scalp regions of interest are sufficient to discriminate between appraisal checks. We demonstrated the effectiveness of our approach on two data sets obtained from previous studies. Our results show that novelty and power appraisal checks can be consistently detected in EEG signals above chance level (binary tasks). For novelty, the best classification performance in terms of accuracy was achieved using features extracted from the whole scalp, and by averaging across 20 individual trials in the same experimental condition (UAR = 83.5 ± 4.2; N = 25). For power, the best performance was obtained by using the signals from four pre-selected EEG channels averaged across all trials available for each participant (UAR = 70.6 ± 5.3; N = 24). Together, our results indicate that accurate classification can be achieved with a relatively small number of trials and channels, but that averaging across a larger number of individual trials is beneficial for the classification for both appraisal checks. We were not able to detect any evidence of the appraisal checks under study in the EMG data. 
The proposed methodology is a promising tool for the study of the psychophysiological mechanisms underlying emotional episodes, and for the development of computerized tools (e.g., Brain-Computer Interfaces) for the study of cognitive processes involved in emotions. PMID:29293572

  3. Evidence of emotion-antecedent appraisal checks in electroencephalography and facial electromyography.

    PubMed

    Coutinho, Eduardo; Gentsch, Kornelia; van Peer, Jacobien; Scherer, Klaus R; Schuller, Björn W

    2018-01-01

    In the present study, we applied Machine Learning (ML) methods to identify psychobiological markers of cognitive processes involved in the process of emotion elicitation as postulated by the Component Process Model (CPM). In particular, we focused on the automatic detection of five appraisal checks-novelty, intrinsic pleasantness, goal conduciveness, control, and power-in electroencephalography (EEG) and facial electromyography (EMG) signals. We also evaluated the effects on classification accuracy of averaging the raw physiological signals over different numbers of trials, and whether the use of minimal sets of EEG channels localized over specific scalp regions of interest are sufficient to discriminate between appraisal checks. We demonstrated the effectiveness of our approach on two data sets obtained from previous studies. Our results show that novelty and power appraisal checks can be consistently detected in EEG signals above chance level (binary tasks). For novelty, the best classification performance in terms of accuracy was achieved using features extracted from the whole scalp, and by averaging across 20 individual trials in the same experimental condition (UAR = 83.5 ± 4.2; N = 25). For power, the best performance was obtained by using the signals from four pre-selected EEG channels averaged across all trials available for each participant (UAR = 70.6 ± 5.3; N = 24). Together, our results indicate that accurate classification can be achieved with a relatively small number of trials and channels, but that averaging across a larger number of individual trials is beneficial for the classification for both appraisal checks. We were not able to detect any evidence of the appraisal checks under study in the EMG data. 
The proposed methodology is a promising tool for the study of the psychophysiological mechanisms underlying emotional episodes, and for the development of computerized tools (e.g., Brain-Computer Interfaces) for the study of cognitive processes involved in emotions.

  4. Holostrain system: a powerful tool for experimental mechanics

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.; Bhat, Gopalakrishna K.

    1992-09-01

    A portable holographic interferometer that can be used to measure displacements and strains in all kinds of mechanical components and structures is described. The holostrain system captures images on a TV camera that detects interference patterns produced by laser illumination. The video signals are digitized, and the digitized interferograms are processed by a fast processing system. The outputs of the system are the strains or stresses of the observed mechanical component or structure.

  5. Development and biological applications of optical tweezers and Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Xie, Chang'an

    Optical tweezers are a three-dimensional manipulation tool that employs the gradient force of a single highly focused laser beam. Raman spectroscopy is a molecular analytical tool that can give a highly unique "fingerprint" for each substance by measuring the unique vibrations of its molecules. The combination of these two optical techniques offers a new tool for the manipulation and identification of single biological cells and microscopic particles. In this thesis, we designed and implemented a Laser-Tweezers-Raman-Spectroscopy (LTRS) system, also called Raman tweezers, for the simultaneous capture and analysis of both biological and non-biological particles. We show that microparticles can be conveniently captured at the focus of a laser beam and that the Raman spectra of trapped particles can be acquired with high quality. The LTRS system overcomes the intrinsic Brownian motion and motility of microparticles in solution and provides a promising tool for identifying suspicious agents in situ. To increase the signal-to-noise ratio, several schemes were employed in the LTRS system to reduce the blank noise and the fluorescence signal coming from analytes and the surrounding background. These techniques include near-infrared excitation, optical levitation, confocal microscopy, and frequency-shifted Raman difference. The LTRS system has been applied to the study of cell biology at the single-cell level. With the Raman-tweezers system, we studied the dynamic physiological processes of single living cells, including the cell cycle, the transcription and translation of recombinant protein in transgenic yeast cells, and T cell activation. We also studied cell damage and associated biochemical processes caused by optical traps and UV radiation, and evaluated heating by near-infrared Raman spectroscopy.
These studies show that the Raman-tweezers system can provide rapid and reliable diagnosis of cellular disorders, can be used as a valuable tool to study cellular processes within single living cells or intracellular organelles, and may aid research in molecular and cellular biology.

  6. Identifying Image Manipulation Software from Image Features

    DTIC Science & Technology

    2015-03-26

    Popescu, Alin and Hany Farid. "Exposing digital forgeries by detecting traces of resampling". IEEE Transactions on Signal Processing, 53(2):758–767, 2005. Popescu, Alin and Hany Farid. "Statistical tools for digital forensics". Information

  7. Exploring Listeners' Real-Time Reactions to Regional Accents

    ERIC Educational Resources Information Center

    Watson, Kevin; Clark, Lynn

    2015-01-01

    Evaluative reactions to language stimuli are presumably dynamic events, constantly changing through time as the signal unfolds, yet the tools we usually use to capture these reactions provide us with only a snapshot of this process by recording reactions at a single point in time. This paper outlines and evaluates a new methodology which employs…

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cashion, Avery Ted; Cieslewski, Grzegorz

    New generations of high-temperature (HT) sensors and electronics are enabling increased measurement speed and accuracy, allowing collection of more accurate and relevant data by downhole tools. Unfortunately, this increased capability is often not realized because of the bottleneck in uplink data transmission rates due to the poor signal characteristics of HT wireline. The objective of this project is to enable high transmission rates of raw data from downhole tools, such as acoustic logging tools and seismic measurement devices, to minimize the need for downhole signal processing. To achieve this objective, Sandia has undertaken the effort to develop an asymmetric high-temperature (HT), high-speed data link system for downhole tools capable of operating at temperatures of 210°C while taking advantage of existing wireline transmission channels. Current data rates over HT single-conductor wireline are limited to approximately 200 kbps. The goal is a system capable of transmitting data from the tool to the surface (uplink) at rates of >1 Mbps over 5,000 feet of single-conductor wireline, as well as automatically adapting the data rate to longer wirelines by adapting modern telecommunications techniques to operate on high-temperature electronics. The data rate from the surface to the tool (downlink) will be significantly smaller but sufficient for command and control functions. While 5,000 feet of cable is the benchmark for this effort, improvements apply to all lengths of cable.

  9. Novel texture-based descriptors for tool wear condition monitoring

    NASA Astrophysics Data System (ADS)

    Antić, Aco; Popović, Branislav; Krstanović, Lidija; Obradović, Ratko; Milošević, Mijodrag

    2018-01-01

    All state-of-the-art tool condition monitoring (TCM) systems for the tool wear recognition task, especially those that use vibration sensors, depend heavily on the choice of descriptors extracted from the sensor signals that carry information about the tool wear state. No post-processing technique can increase the recognition precision if those descriptors are not discriminative enough. In this work, we propose a tool wear monitoring strategy that relies on novel texture-based descriptors. We consider the module of the Short-Term Discrete Fourier Transform (STDFT) spectra obtained from a particular vibration sensor signal utterance as a 2D textured image, identifying the STDFT time scale as the first dimension and the frequency scale as the second dimension of the textured image. The obtained textured image is then divided into 2D texture patches, each covering a part of the frequency range of interest. After applying the appropriate filter bank, 2D textons are extracted for each predefined frequency band. By averaging in time, we extract from the textons of each band of interest the information regarding the probability density function (PDF) in the form of lower-order moments, thus obtaining robust tool wear state descriptors. We validate the proposed features by experiments conducted on a real TCM system, obtaining high recognition accuracy.
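
    A simplified sketch of the descriptor idea: treat the magnitude of the short-time Fourier transform as a 2D texture, split it into frequency bands, and summarize each band with low-order moments averaged over time. The paper's filter bank and texton-extraction steps are omitted here, and the band count and window length are assumptions.

```python
import numpy as np
from scipy.signal import stft

def band_texture_moments(sig, fs, n_bands=4, nperseg=64):
    """Treat |STFT| as a 2D time-frequency texture, split it into
    frequency bands, and summarize each band with low-order moments
    (mean and standard deviation here) over all time frames."""
    _, _, Z = stft(sig, fs=fs, nperseg=nperseg)
    S = np.abs(Z)                               # time-frequency magnitude image
    bands = np.array_split(S, n_bands, axis=0)  # split along the frequency axis
    feats = []
    for band in bands:
        feats.extend([band.mean(), band.std()])
    return np.array(feats)
```

    The resulting fixed-length feature vector can then feed any classifier of wear states, which is the role the paper's richer texton-based PDF moments play.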

  10. Intensity-based masking: A tool to improve functional connectivity results of resting-state fMRI.

    PubMed

    Peer, Michael; Abboud, Sami; Hertz, Uri; Amedi, Amir; Arzy, Shahar

    2016-07-01

    Seed-based functional connectivity (FC) of resting-state functional MRI data is a widely used methodology, enabling the identification of functional brain networks in health and disease. Based on signal correlations across the brain, FC measures are highly sensitive to noise. A somewhat neglected source of noise is the fMRI signal attenuation found in cortical regions in close vicinity to sinuses and air cavities, mainly in the orbitofrontal, anterior frontal and inferior temporal cortices. BOLD signal recorded at these regions suffers from dropout due to susceptibility artifacts, resulting in an attenuated signal with reduced signal-to-noise ratio in as many as 10% of cortical voxels. Nevertheless, signal attenuation is largely overlooked during FC analysis. Here we first demonstrate that signal attenuation can significantly influence FC measures by introducing false functional correlations and diminishing existing correlations between brain regions. We then propose a method for the detection and removal of the attenuated signal ("intensity-based masking") by fitting a Gaussian-based model to the signal intensity distribution and calculating an intensity threshold tailored per subject. Finally, we apply our method on real-world data, showing that it diminishes false correlations caused by signal dropout, and significantly improves the ability to detect functional networks in single subjects. Furthermore, we show that our method increases inter-subject similarity in FC, enabling reliable distinction of different functional networks. We propose to include the intensity-based masking method as a common practice in the pre-processing of seed-based functional connectivity analysis, and provide software tools for the computation of intensity-based masks on fMRI data. Hum Brain Mapp 37:2407-2418, 2016. © 2016 Wiley Periodicals, Inc.
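
    The intensity-based masking step can be sketched as modeling the bulk of the voxel mean-intensity distribution as a Gaussian and dropping voxels far below it. The moment-based fit and the n_sigma cutoff below are assumptions for illustration, not the authors' exact per-subject fitting procedure.

```python
import numpy as np

def intensity_mask(mean_img, n_sigma=2.0):
    """Keep voxels whose mean intensity is not far below the bulk of the
    distribution, modeled as a Gaussian via its moments. The fit and the
    n_sigma cutoff are illustrative assumptions."""
    vals = mean_img[mean_img > 0]           # ignore background zeros
    mu, sigma = vals.mean(), vals.std()     # crude moment-based Gaussian fit
    return mean_img > mu - n_sigma * sigma  # True = voxel kept
```

    Because mu and sigma are estimated from each subject's own data, the threshold adapts per subject, which is the key property the paper's method provides.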

  11. Reflectometric measurement of plasma imaging and applications

    NASA Astrophysics Data System (ADS)

    Mase, A.; Ito, N.; Oda, M.; Komada, Y.; Nagae, D.; Zhang, D.; Kogi, Y.; Tobimatsu, S.; Maruyama, T.; Shimazu, H.; Sakata, E.; Sakai, F.; Kuwahara, D.; Yoshinaga, T.; Tokuzawa, T.; Nagayama, Y.; Kawahata, K.; Yamaguchi, S.; Tsuji-Iio, S.; Domier, C. W.; Luhmann, N. C., Jr.; Park, H. K.; Yun, G.; Lee, W.; Padhi, S.; Kim, K. W.

    2012-01-01

    Progress in microwave and millimeter-wave technologies has enabled advanced diagnostics in various fields, such as plasma diagnostics, radio astronomy, alien substance detection, airborne and spaceborne imaging radars known as synthetic aperture radars, and living body measurements. Transmission, reflection, scattering, and radiation processes of electromagnetic waves are utilized as diagnostic tools. In this report we focus on reflectometric measurements and their application to biological signals (vital signal detection and breast cancer detection) as well as plasma diagnostics, specifically by use of imaging and ultra-wideband radar techniques.

  12. Picosecond imaging of signal propagation in integrated circuits

    NASA Astrophysics Data System (ADS)

    Frohmann, Sven; Dietz, Enrico; Dittrich, Helmar; Hübers, Heinz-Wilhelm

    2017-04-01

    Optical analysis of integrated circuits (ICs) is a powerful tool for analyzing security functions that are implemented in an IC. We present a photon emission microscope for picosecond imaging of hot carrier luminescence in ICs in the near-infrared spectral range from 900 to 1700 nm. It allows for semi-invasive signal tracking in fully operational ICs at the gate or transistor level with a timing precision of approximately 6 ps. The capabilities of the microscope are demonstrated by imaging the operation of two ICs made in 180 nm and 60 nm process technologies.

  13. An Automated Tool for Supporting FMEAs of Digital Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yue,M.; Chu, T.-L.; Martinez-Guridi, G.

    2008-09-07

    Although designs of digital systems can be very different from each other, they typically use many of the same types of generic digital components. Determining the impacts of the failure modes of these generic components on a digital system can support development of a reliability model of the system. A novel approach was proposed for this purpose: decomposing the system to the level of generic digital components and propagating failure modes to the system level, which generally is time-consuming and difficult to implement. To overcome the issues associated with implementing the proposed FMEA approach, an automated tool for a digital feedwater control system (DFWCS) has been developed in this study. The automated FMEA tool is in essence a simulation platform developed by using or recreating the original source code of the different software modules, interfaced through input and output variables that represent physical signals exchanged between modules, the system, and the controlled process. For any given failure mode, its impacts on associated signals are determined first, and the variables that correspond to these signals are modified accordingly by the simulation. Criteria are also developed, as part of the simulation platform, to determine whether the system has lost its automatic control function, which is defined as a system failure in this study. The conceptual development of the automated FMEA support tool can be generalized and applied to support FMEAs for reliability assessment of complex digital systems.

  14. Freeze-drying process monitoring using a cold plasma ionization device.

    PubMed

    Mayeresse, Y; Veillon, R; Sibille, P H; Nomine, C

    2007-01-01

    A cold plasma ionization device has been designed to monitor freeze-drying processes in situ by monitoring lyophilization chamber moisture content. This plasma device, which consists of a probe that can be mounted directly on the lyophilization chamber, depends upon the ionization of nitrogen and water molecules using a radiofrequency generator and spectrometric signal collection. The study performed on this probe shows that it is steam sterilizable, simple to integrate, reproducible, and sensitive. The limitations include suitable positioning in the lyophilization chamber, calibration, and signal integration. Sensitivity was evaluated in relation to the quantity of vials and the probe positioning, and correlation with existing methods, such as microbalance, was established. These tests verified signal reproducibility through three freeze-drying cycles. Scaling-up studies demonstrated a similar product signature for the same product using pilot-scale and larger-scale equipment. On an industrial scale, the method efficiently monitored the freeze-drying cycle, but in a larger industrial freeze-dryer the signal was slightly modified. This was mainly due to the positioning of the plasma device, in relation to the vapor flow pathway, which is not necessarily homogeneous within the freeze-drying chamber. The plasma tool is a relevant method for monitoring freeze-drying processes and may in the future allow the verification of current thermodynamic freeze-drying models. This plasma technique may ultimately represent a process analytical technology (PAT) approach for the freeze-drying process.

  15. A review of channel selection algorithms for EEG signal processing

    NASA Astrophysics Data System (ADS)

    Alotaiby, Turky; El-Samie, Fathi E. Abd; Alshebeili, Saleh A.; Ahmad, Ishtiaq

    2015-12-01

    Digital processing of electroencephalography (EEG) signals has now been popularly used in a wide variety of applications such as seizure detection/prediction, motor imagery classification, mental task classification, emotion classification, sleep state classification, and drug effects diagnosis. With the large number of EEG channels acquired, it has become apparent that efficient channel selection algorithms are needed, with varying importance from one application to another. The main purpose of the channel selection process is threefold: (i) to reduce the computational complexity of any processing task performed on EEG signals by selecting the relevant channels and hence extracting the features of major importance, (ii) to reduce the amount of overfitting that may arise from the utilization of unnecessary channels, thereby improving performance, and (iii) to reduce the setup time in some applications. Signal processing tools such as time-domain analysis, power spectral estimation, and the wavelet transform have been used for feature extraction and hence for channel selection in most channel selection algorithms. In addition, different evaluation approaches such as filtering, wrapper, embedded, hybrid, and human-based techniques have been widely used for the evaluation of the selected subset of channels. In this paper, we survey the recent developments in the field of EEG channel selection methods along with their applications and classify these methods according to the evaluation approach.
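
    As a minimal example of the filter-type evaluation approach mentioned above, channels can be ranked by a simple relevance score and the top k kept. Variance is used here purely for illustration; the surveyed methods employ richer criteria (power spectral features, wavelet features, wrapper classification accuracy, etc.).

```python
import numpy as np

def select_channels_by_variance(eeg, k):
    """Rank channels (rows of eeg, shape (channels, samples)) by signal
    variance and keep the top k. Variance is a stand-in relevance score."""
    scores = eeg.var(axis=1)
    return np.argsort(scores)[::-1][:k]  # indices of the k highest-scoring channels
```

    Any scalar per-channel relevance measure can be substituted for the variance score without changing the surrounding selection logic.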

  16. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems.

    PubMed

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun; Wang, Gi-Nam

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes often occur in manufacturing systems, and their real-time identification is a necessity in the manufacturing industry. In this paper, we present an automated tool, called the PLC Log-Data Analysis Tool (PLAT), which detects them using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash-table-based indexing and searching scheme for this purpose. Our experiments show that PLAT is significantly fast, provides real-time identification of operational faults and behavioural anomalies, and executes within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel with the data-logging system to identify operational faults and behavioural anomalies effectively.
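The idea of learning a nominal model from logs and then flagging deviations can be sketched generically. This is not PLAT's actual scheme, only a minimal illustration: a Python dict serves as the hash table, indexing every signal-state transition seen in nominal log data, so that an unseen transition is reported as an anomaly in O(1) lookup time.

```python
def build_nominal_index(log_records):
    """Index every (previous_state, next_state) transition observed in
    nominal PLC log data.  A dict acts as the hash table."""
    index = {}
    for prev, nxt in zip(log_records, log_records[1:]):
        index.setdefault(prev, set()).add(nxt)
    return index

def find_anomalies(index, log_records):
    """Report transitions never observed in the nominal model."""
    anomalies = []
    for prev, nxt in zip(log_records, log_records[1:]):
        if nxt not in index.get(prev, set()):
            anomalies.append((prev, nxt))
    return anomalies

# Hypothetical machining-cycle states, not taken from the paper.
nominal = ["idle", "clamp", "drill", "unclamp", "idle",
           "clamp", "drill", "unclamp"]
index = build_nominal_index(nominal)
observed = ["idle", "clamp", "unclamp", "idle"]   # skips the "drill" step
anomalies = find_anomalies(index, observed)
```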

  17. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems

    PubMed Central

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes take place often in a manufacturing system. Real time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called PLC Log-Data Analysis Tool (PLAT) that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash table based indexing and searching scheme to satisfy those purposes. Our experiments show that PLAT is significantly fast, provides real time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively. PMID:27974882

  18. Real-Time Data Display

    NASA Technical Reports Server (NTRS)

    Pedings, Marc

    2007-01-01

    RT-Display is a MATLAB-based data acquisition environment designed to use a variety of commercial off-the-shelf (COTS) hardware to digitize analog signals into a standard data format usable by other post-acquisition data analysis tools. This software presents the acquired data in real time using a variety of signal-processing algorithms. The acquired data is stored in a standard Operator Interactive Signal Processing Software (OISPS) data-formatted file. RT-Display is primarily configured to use the Agilent VXI (or equivalent) data acquisition boards used in such systems as MIDDAS (Multi-channel Integrated Dynamic Data Acquisition System). The software is generalized and deployable in almost any testing environment, without limitations or proprietary configuration for a specific test program or project. With the Agilent hardware configured and in place, users can start the program and, in one step, immediately begin digitizing multiple channels of data. Once the acquisition is completed, data is converted into a common binary format that can also be translated to specific formats used by external analysis software, such as OISPS and PC-Signal (a product of AI Signal Research Inc.). At the time of this reporting, RT-Display was certified on Agilent hardware capable of acquiring up to 196,608 samples per second. Data signals are presented to the user on-screen simultaneously for 16 channels. Each channel can be viewed individually, with a maximum capability of 160 signal channels (depending on the hardware configuration). Current signal presentations include time data, fast Fourier transforms (FFT), and power spectral density (PSD) plots. Additional processing algorithms can be easily incorporated into this environment.
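The FFT and PSD displays mentioned above rest on a standard computation. A minimal sketch of a single-segment periodogram in Python (RT-Display itself is MATLAB-based; this is only an illustration of the underlying estimate):

```python
import numpy as np

def periodogram(x, fs):
    """One-sided power spectral density estimate of a real signal via
    the FFT (a simple Hann-windowed periodogram, no Welch averaging)."""
    n = len(x)
    window = np.hanning(n)
    spectrum = np.fft.rfft(x * window)
    # Standard periodogram scaling: normalize by fs and window power.
    psd = (np.abs(spectrum) ** 2) / (fs * np.sum(window ** 2))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, psd

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 120 * t)          # 120 Hz test tone
freqs, psd = periodogram(x, fs)
peak_freq = freqs[np.argmax(psd)]        # should land on the tone
```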

  19. Plant hormone signaling during development: insights from computational models.

    PubMed

    Oliva, Marina; Farcot, Etienne; Vernoux, Teva

    2013-02-01

    Recent years have seen an impressive increase in our knowledge of the topology of plant hormone signaling networks. The complexity of these topologies has motivated the development of models for several hormones to aid understanding of how signaling networks process hormonal inputs. Such work has generated essential insights into the mechanisms of hormone perception and of regulation of cellular responses such as transcription in response to hormones. In addition, modeling approaches have contributed significantly to exploring how spatio-temporal regulation of hormone signaling contributes to plant growth and patterning. New tools have also been developed to obtain quantitative information on hormone distribution during development and to test model predictions, opening the way for quantitative understanding of the developmental roles of hormones. Copyright © 2012 Elsevier Ltd. All rights reserved.

  20. Computer assisted blast design and assessment tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cameron, A.R.; Kleine, T.H.; Forsyth, W.W.

    1995-12-31

    In general the software required by a blast designer includes tools that graphically present blast designs (surface and underground), can analyze a design or predict its result, and can assess blasting results. As computers develop and computer literacy continues to rise, the development and use of such tools will spread. Examples of the tools that are becoming available include: automatic blast pattern generation and underground ring design; blast design evaluation in terms of explosive distribution and detonation simulation; fragmentation prediction; blast vibration prediction and minimization; blast monitoring for assessment of dynamic performance; vibration measurement, display and signal processing; evaluation of blast results in terms of fragmentation; and risk- and reliability-based blast assessment. The authors have identified a set of criteria that are essential in choosing appropriate software blasting tools.

  1. Application and Evaluation of Independent Component Analysis Methods to Generalized Seizure Disorder Activities Exhibited in the Brain.

    PubMed

    George, S Thomas; Balakrishnan, R; Johnson, J Stanly; Jayakumar, J

    2017-07-01

    EEG records the spontaneous electrical activity of the brain using multiple electrodes placed on the scalp, and it provides a wealth of information related to brain function. Nevertheless, the signals from the electrodes cannot be directly applied to a diagnostic tool such as brain mapping, because they undergo a "mixing" process caused by the volume conduction effect in the scalp. A pervasive problem in neuroscience is determining which regions of the brain are active, given voltage measurements at the scalp. Consequently, there has been a surge of interest in the biosignal processing community in investigating the mixing and unmixing processes to identify the underlying active sources. According to the assumptions of independent component analysis (ICA) algorithms, the mixture recorded at the scalp can be closely approximated by a linear combination of the "actual" EEG signals emanating from the underlying sources of electrical activity in the brain. As a consequence, using these well-known ICA techniques to preprocess EEG signals prior to clinical application could support the development of diagnostic tools such as quantitative EEG, which in turn can give neurologists noninvasive access to patient-specific cortical activity and thus help in treating neuropathologies such as seizure disorders. Popular and proven ICA schemes from the literature (including Infomax, JADE, and SOBI) were selected and applied to generalized seizure disorder samples using the EEGLAB toolbox in the MATLAB environment to assess their usefulness in source separation, and the results were validated by an expert neurologist for clinical relevance in terms of pathologies of brain function. The performance of the Infomax method was found to be superior to the other ICA schemes applied to the EEG, as established by the validations carried out by the expert neurologist for generalized seizure and its clinical correlation. The results are encouraging for furthering studies toward developing useful brain mapping tools based on ICA methods.
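The linear mixing assumption underlying all three ICA schemes can be made concrete with a toy model. In the sketch below, volume conduction is approximated by a known mixing matrix A; ICA's actual job is to estimate the unmixing matrix W from the scalp data alone, but here the known inverse is used purely to illustrate the model (the matrix values are invented for the example):

```python
import numpy as np

# Two independent "source" signals (e.g., cortical rhythm and an artifact).
t = np.linspace(0, 1, 500)
sources = np.vstack([np.sin(2 * np.pi * 7 * t),
                     np.sign(np.sin(2 * np.pi * 3 * t))])

# Volume conduction acts (approximately) as a linear mixing matrix A:
A = np.array([[0.8, 0.3],
              [0.4, 0.9]])
scalp = A @ sources          # what the electrodes actually record

# ICA estimates an unmixing matrix W ~ inv(A) from `scalp` alone
# (e.g., by maximizing statistical independence); here we use the
# known inverse only to demonstrate the mixing/unmixing model.
W = np.linalg.inv(A)
recovered = W @ scalp
```

In practice W is only recoverable up to permutation and scaling of the sources, which is why clinical interpretation of the separated components still requires expert review.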

  2. Signal Detection Techniques for Diagnostic Monitoring of Space Shuttle Main Engine Turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, Thomas; Jong, Jen-Yi

    1986-01-01

    An investigation to develop, implement, and evaluate signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery is reviewed. A brief description of the Space Shuttle Main Engine (SSME) test/measurement program is presented. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques have been implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. A unique coherence function (the hyper-coherence) was developed through the course of this investigation, which appears promising as a diagnostic tool. This technique and several other non-linear methods of signal analysis are presented and illustrated by application. Software for application of these techniques has been installed on the signal processing system at the NASA/MSFC Systems Dynamics Laboratory.
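The hyper-coherence generalizes the ordinary coherence function; the ordinary version, which measures linear correlation between two signals per frequency band, can be sketched directly. This is a generic segment-averaged magnitude-squared coherence estimate, not the hyper-coherence developed in the investigation:

```python
import numpy as np

def msc(x, y, nseg, seglen):
    """Magnitude-squared coherence of x and y, averaged over `nseg`
    non-overlapping segments of length `seglen`.  Values lie in [0, 1]."""
    Sxx = Syy = Sxy = 0.0
    for i in range(nseg):
        xs = np.fft.rfft(x[i * seglen:(i + 1) * seglen])
        ys = np.fft.rfft(y[i * seglen:(i + 1) * seglen])
        Sxx = Sxx + np.abs(xs) ** 2          # auto-spectra, accumulated
        Syy = Syy + np.abs(ys) ** 2
        Sxy = Sxy + xs * np.conj(ys)         # cross-spectrum
    return np.abs(Sxy) ** 2 / (Sxx * Syy)

# Two sensors seeing the same excitation plus independent noise
# should show coherence near 1 at all frequencies.
rng = np.random.default_rng(2)
n, seglen = 4096, 256
shared = rng.standard_normal(n)
x = shared + 0.1 * rng.standard_normal(n)
y = shared + 0.1 * rng.standard_normal(n)
coh = msc(x, y, n // seglen, seglen)
```

Averaging over segments is essential: with a single segment the estimator is identically 1 regardless of the data.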

  3. On-line tool breakage monitoring of vibration tapping using spindle motor current

    NASA Astrophysics Data System (ADS)

    Li, Guangjun; Lu, Huimin; Liu, Gang

    2008-10-01

    The input current of the driving motor has been employed successfully to monitor the cutting state in manufacturing processes for more than a decade. In vibration tapping, however, on-line monitoring of the motor current has not been reported. In this paper, a tap failure prediction method is proposed to monitor the vibration tapping process using the electrical current signal of the spindle motor. The process of vibration tapping is first described. Then the relationship between the vibration tapping torque and the motor current is investigated by theoretical derivation and experimental measurement. Based on these results, a tool-breakage monitoring method is proposed that tracks the ratio of the current amplitudes in adjacent vibration tapping periods. Finally, a low-frequency vibration tapping system with motor current monitoring is built using a servo motor B-106B and its driver CR06. The proposed method is demonstrated with experimental data from vibration tapping in titanium alloys. The results show that the method is feasible for tool-breakage monitoring when tapping small threaded holes: with an amplitude-ratio threshold of 1.2 and at least two overruns within 50 adjacent periods, it avoids tool breakage while producing few false alarms.
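The decision rule described above (ratio of adjacent-period current amplitudes exceeding 1.2 at least twice within 50 periods) is simple enough to sketch directly. The amplitude sequences below are invented for illustration:

```python
def breakage_alarm(amplitudes, threshold=1.2, min_overruns=2, window=50):
    """Flag possible tap breakage when the ratio of current amplitudes in
    adjacent vibration-tapping periods exceeds `threshold` at least
    `min_overruns` times within `window` adjacent periods."""
    overruns = [i for i in range(1, len(amplitudes))
                if amplitudes[i] / amplitudes[i - 1] > threshold]
    for first in overruns:
        # Count overruns falling inside a window starting at this one.
        if len([j for j in overruns if first <= j < first + window]) >= min_overruns:
            return True
    return False

steady = [1.0] * 60                                   # normal tapping
faulty = [1.0] * 20 + [1.5, 1.0, 1.6] + [1.0] * 20    # two nearby jumps
```

Requiring multiple overruns within a window, rather than alarming on a single threshold crossing, is what keeps the false-alarm rate low.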

  4. An open source tool for automatic spatiotemporal assessment of calcium transients and local ‘signal-close-to-noise’ activity in calcium imaging data

    PubMed Central

    Martin, Corinna; Jablonka, Sibylle

    2018-01-01

    Local and spontaneous calcium signals play important roles in neurons and neuronal networks. Spontaneous or cell-autonomous calcium signals may be difficult to assess because they appear in an unpredictable spatiotemporal pattern and in very small neuronal loci of axons or dendrites. We developed an open source bioinformatics tool for an unbiased assessment of calcium signals in x,y-t imaging series. The tool bases its algorithm on continuous wavelet transform-guided peak detection to identify calcium signal candidates. The highly sensitive calcium event definition is based on identification of peaks in 1D data through analysis of a 2D wavelet transform surface. For spatial analysis, the tool uses a grid to separate the x,y-image field into independently analyzed grid windows. A document containing a graphical summary of the data is automatically created and displays the loci of activity for a wide range of signal intensities. Furthermore, the number of activity events is summed to create an estimated total activity value, which can be used to compare different experimental situations, such as calcium activity before or after an experimental treatment. All traces and data of active loci are documented. The tool can also compute the signal variance in a sliding window to visualize activity-dependent signal fluctuations. We applied the calcium signal detector to monitor activity states of cultured mouse neurons. Our data show that both the total activity value and the variance area created by a sliding window can distinguish experimental manipulations of neuronal activity states. Notably, the tool is powerful enough to compute local calcium events and ‘signal-close-to-noise’ activity in small loci of distal neurites, which persist during pharmacological blockade of neuronal activity with inhibitors such as tetrodotoxin (to block action potential firing) or inhibitors of ionotropic glutamate receptors. The tool can also provide information about local homeostatic calcium activity events in neurites. PMID:29601577
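The sliding-window variance mentioned above is a simple but effective way to localize activity in a trace. A minimal sketch on a synthetic fluorescence trace (the trace and window length are invented for illustration, not taken from the tool):

```python
import numpy as np

def sliding_variance(trace, window):
    """Variance of a fluorescence trace in a sliding window, used to
    visualize activity-dependent signal fluctuations."""
    return np.array([trace[i:i + window].var()
                     for i in range(len(trace) - window + 1)])

rng = np.random.default_rng(3)
trace = 0.05 * rng.standard_normal(300)      # baseline noise
trace[100:140] += np.hanning(40) * 2.0       # one local calcium transient
var = sliding_variance(trace, window=20)
active_locus = int(np.argmax(var))           # window where activity peaks
```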

  5. Digital seismo-acoustic signal processing aboard a wireless sensor platform

    NASA Astrophysics Data System (ADS)

    Marcillo, O.; Johnson, J. B.; Lorincz, K.; Werner-Allen, G.; Welsh, M.

    2006-12-01

    We are developing a low-power, low-cost wireless sensor array to conduct real-time signal processing of earthquakes at active volcanoes. The sensor array, which integrates data from both seismic and acoustic sensors, is based on Moteiv TMote Sky wireless sensor nodes (www.moteiv.com). The nodes feature a Texas Instruments MSP430 microcontroller, 48 Kbytes of program memory, 10 Kbytes of static RAM, 1 Mbyte of external flash memory, and a 2.4-GHz Chipcon CC2420 IEEE 802.15.4 radio. The TMote Sky is programmed in TinyOS. Basic signal processing occurs on an array of three peripheral sensor nodes. These nodes are tied into a dedicated GPS receiver node, which is focused on time synchronization, and a central communications node, which handles data integration and additional processing. The sensor nodes incorporate dual 12-bit digitizers sampling a seismic sensor and a pressure transducer at 100 samples per second. The wireless capabilities of the system allow flexible array geometry, with a maximum aperture of 200 m. We have already developed the digital signal processing routines on board the Moteiv TMote sensor nodes. The developed routines accomplish Real-time Seismic-Amplitude Measurement (RSAM), Seismic Spectral-Amplitude Measurement (SSAM), and a user-configured Short-Term Average / Long-Term Average (STA/LTA) ratio, which is used to detect first arrivals. The processed data from individual nodes are transmitted back to a central node, where additional processing may be performed. Such processing will include back-azimuth determination and other wave field analyses. Future on-board signal processing will focus on event characterization utilizing pattern recognition and spectral characterization. The processed data are intended as low-bandwidth information that can be transmitted periodically and at low cost through satellite telemetry to a web server. The processing is limited by the computational capabilities (RAM, ROM) of the nodes. Nevertheless, we envision this product to be a useful tool for assessing the state of unrest at remote volcanoes.
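The STA/LTA trigger named above is a classic event detector: a short-term average of signal amplitude rises quickly at an event onset while the long-term average lags, so their ratio spikes. A minimal sketch on a synthetic trace (the window lengths and threshold are illustrative choices, not the deployment's configuration):

```python
import numpy as np

def sta_lta(signal, sta_len, lta_len):
    """Short-term-average / long-term-average trigger ratio, computed on
    the absolute amplitude of the signal."""
    env = np.abs(signal)
    ratio = np.zeros(len(signal))
    for i in range(lta_len, len(signal)):
        sta = env[i - sta_len:i].mean()   # reacts fast to an onset
        lta = env[i - lta_len:i].mean()   # tracks the background level
        ratio[i] = sta / lta
    return ratio

rng = np.random.default_rng(4)
trace = 0.1 * rng.standard_normal(2000)                       # background
trace[1200:1400] += np.sin(np.linspace(0, 40 * np.pi, 200))   # "event"
ratio = sta_lta(trace, sta_len=20, lta_len=400)
onset = int(np.argmax(ratio > 3.0))       # first sample above threshold
```

On a memory-constrained mote, both averages would be maintained recursively rather than by re-summing windows as done here.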

  6. Laplace Transform Based Radiative Transfer Studies

    NASA Astrophysics Data System (ADS)

    Hu, Y.; Lin, B.; Ng, T.; Yang, P.; Wiscombe, W.; Herath, J.; Duffy, D.

    2006-12-01

    Multiple scattering is the major uncertainty in the data analysis of space-based lidar measurements. Until now, accurate quantitative lidar data analysis has been limited to very thin objects dominated by single scattering, where photons from the laser beam scatter only once off particles in the atmosphere before reaching the receiver and a simple linear relationship exists between physical properties and the lidar signal. In reality, multiple scattering is always a factor in space-based lidar measurement, and it dominates space-based lidar returns from clouds, dust aerosols, vegetation canopy, and phytoplankton. While multiple scattering returns are clear signals, the lack of a fast-enough lidar multiple scattering computation tool forces us to treat them as unwanted "noise" and to use simple multiple scattering correction schemes to remove them. Such treatments waste the multiple scattering signals and may cause errors of orders of magnitude in retrieved physical properties. Thus the lack of fast and accurate time-dependent radiative transfer tools significantly limits lidar remote sensing capabilities. Analyzing lidar multiple scattering signals requires fast and accurate time-dependent radiative transfer computations. Currently, multiple scattering is computed with Monte Carlo simulations, which take minutes to hours, are too slow for interactive satellite data analysis, and can only be used to support system/algorithm design and error assessment. We present an innovative physics approach to solve the time-dependent radiative transfer problem. The technique utilizes FPGA-based reconfigurable computing hardware. The approach is as follows. 1. Physics solution: perform a Laplace transform on the time and spatial dimensions and a Fourier transform on the viewing azimuth dimension, converting the solution of the radiative transfer differential equation into a fast matrix inversion problem. The majority of the radiative transfer computation then goes into matrix inversion, FFTs, and inverse Laplace transforms. 2. Hardware solution: perform the well-defined matrix inversions, FFTs, and Laplace transforms on highly parallel, reconfigurable computing hardware. This physics-based computational tool leads to accurate quantitative analysis of space-based lidar signals and improves the data quality of current lidar missions such as CALIPSO. This presentation will introduce the basic idea of the approach, preliminary results based on SRC's FPGA-based Mapstation, and how it may be applied to CALIPSO data analysis.
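The role of the Laplace transform in step 1 can be illustrated schematically. The following is a generic one-dimensional form, not necessarily the authors' exact formulation: transforming in time replaces the time derivative with multiplication by the transform variable $s$, so each value of $s$ yields a stationary problem.

```latex
% Generic 1-D time-dependent radiative transfer equation (schematic):
\[
\frac{1}{c}\frac{\partial I}{\partial t}
  + \mu \frac{\partial I}{\partial z} + \sigma_t I
  = \frac{\sigma_s}{2}\int_{-1}^{1} p(\mu,\mu')\, I(z,\mu',t)\, d\mu' + q .
\]
% Applying the Laplace transform in time, with I(t=0)=0 for an
% initially dark atmosphere, the time derivative becomes s/c:
\[
\left(\frac{s}{c} + \sigma_t\right)\hat{I}
  + \mu \frac{\partial \hat{I}}{\partial z}
  = \frac{\sigma_s}{2}\int_{-1}^{1} p(\mu,\mu')\, \hat{I}(z,\mu',s)\, d\mu'
    + \hat{q} .
\]
```

Each fixed $s$ thus gives a stationary equation that, after angular and spatial discretization, reduces to a linear system solvable by matrix inversion — exactly the operation that maps well onto FPGA hardware.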

  7. SeeGH--a software tool for visualization of whole genome array comparative genomic hybridization data.

    PubMed

    Chi, Bryan; DeLeeuw, Ronald J; Coe, Bradley P; MacAulay, Calum; Lam, Wan L

    2004-02-09

    Array comparative genomic hybridization (CGH) is a technique that detects copy number differences in DNA segments. Complete sequencing of the human genome and the development of an array representing a tiling set of tens of thousands of DNA segments spanning the entire human genome have made high-resolution copy number analysis throughout the genome possible. Since array CGH provides a signal ratio for each DNA segment, visualization requires the reassembly of individual data points into chromosome profiles. We have developed a visualization tool for displaying whole genome array CGH data in the context of chromosomal location. SeeGH is an application that translates spot signal ratio data from array CGH experiments into displays of high-resolution chromosome profiles. Data is imported from a simple tab-delimited text file obtained from standard microarray image analysis software. SeeGH processes the signal ratio data and graphically displays it in a conventional CGH karyotype diagram with the added features of magnification and DNA segment annotation. In this process, SeeGH imports the data into a database, calculates the average ratio and standard deviation for each replicate spot, and links them to chromosome regions for graphical display. Once the data is displayed, users have the option of hiding or flagging DNA segments based on user-defined criteria, and of retrieving annotation information such as clone name, NCBI sequence accession number, ratio, base pair position on the chromosome, and standard deviation. SeeGH represents a novel software tool for viewing and analyzing array CGH data. The software gives users the ability to view the data in an overall genomic view as well as to magnify specific chromosomal regions, facilitating the precise localization of genetic alterations. SeeGH is easily installed and runs on Microsoft Windows 2000 or later.
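The replicate-averaging step described above is a simple aggregation. A minimal sketch, assuming rows already parsed from a tab-delimited image-analysis export; the clone names and ratio values are hypothetical:

```python
import statistics

def summarize_replicates(spots):
    """Collapse replicate spot ratios into (mean, standard deviation)
    per clone, as done before plotting a chromosome profile."""
    by_clone = {}
    for clone, ratio in spots:
        by_clone.setdefault(clone, []).append(ratio)
    return {clone: (statistics.mean(r),
                    statistics.stdev(r) if len(r) > 1 else 0.0)
            for clone, r in by_clone.items()}

# (clone, signal ratio) pairs; a ratio near 1.0 means no copy change,
# while ~1.5 suggests a single-copy gain.
spots = [("RP11-1A4", 1.02), ("RP11-1A4", 0.98),
         ("RP11-2B6", 1.51), ("RP11-2B6", 1.49)]
summary = summarize_replicates(spots)
```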

  8. Microphone Array

    NASA Astrophysics Data System (ADS)

    Bader, Rolf

    This chapter deals with microphone arrays. It is arranged according to the different methods available to proceed through the different problems and through the different mathematical methods. After discussing general properties of different array types, such as plane arrays, spherical arrays, or scanning arrays, it proceeds to the signal processing tools that are most used in speech processing. In the third section, backpropagating methods based on the Helmholtz-Kirchhoff integral are discussed, which result in spatial radiation patterns of vibrating bodies or air.

  9. Setting the standards for signal transduction research.

    PubMed

    Saez-Rodriguez, Julio; Alexopoulos, Leonidas G; Stolovitzky, Gustavo

    2011-02-15

    Major advances in high-throughput technology platforms, coupled with increasingly sophisticated computational methods for systematic data analysis, have provided scientists with tools to better understand the complexity of signaling networks. In this era of massive and diverse data collection, standardization efforts that streamline data gathering, analysis, storage, and sharing are becoming a necessity. Here, we give an overview of current technologies to study signal transduction. We argue that along with the opportunities the new technologies open, their heterogeneous nature poses critical challenges for data handling that are further increased when data are to be integrated into mathematical models. Efficient standardization through markup languages and data annotation is a sine qua non condition for a systems-level analysis of signaling processes. It remains to be seen to what extent, and how quickly, the emerging standardization efforts will be embraced by the signaling community.

  10. Investigation on the generation characteristic of pressure pulse wave signal during the measurement-while-drilling process

    NASA Astrophysics Data System (ADS)

    Changqing, Zhao; Kai, Liu; Tong, Zhao; Takei, Masahiro; Weian, Ren

    2014-04-01

    The mud-pulse logging instrument is an advanced measurement-while-drilling (MWD) tool widely used in industry worldwide. In order to improve the signal transmission rate, ensure the accurate transmission of information, and address the issue of weak surface signals at oil and gas wells, the signal generator should send out strong mud-pulse signals of maximum amplitude. With the rotary-valve pulse generator as the study object, the three-dimensional Reynolds-averaged Navier-Stokes equations and the standard k-ε turbulence model were used as the mathematical model. The pressure-velocity coupling was computed with the SIMPLE algorithm to obtain the pressure-wave amplitudes for different flow rates and axial clearances. Tests were performed to verify the characteristics of the pressure signals, with the pressure signal captured by the standpipe pressure monitoring system. The study showed that as the axial clearance increased, the pressure-wave amplitude decreased, weakening the pulse signal; as the flow rate increased, the pressure-wave amplitude increased and the signal was enhanced.

  11. A scalable neuroinformatics data flow for electrophysiological signals using MapReduce.

    PubMed

    Jayapandian, Catherine; Wei, Annan; Ramesh, Priya; Zonjy, Bilal; Lhatoo, Samden D; Loparo, Kenneth; Zhang, Guo-Qiang; Sahoo, Satya S

    2015-01-01

    Data-driven neuroscience research is providing new insights into the progression of neurological disorders and supporting the development of improved treatment approaches. However, the volume, velocity, and variety of neuroscience data generated by sophisticated recording instruments and acquisition methods have exacerbated the limited scalability of existing neuroinformatics tools. This makes it difficult for neuroscience researchers to effectively leverage the growing multi-modal neuroscience data to advance research in serious neurological disorders, such as epilepsy. We describe the development of the Cloudwave data flow, which uses new data partitioning techniques to store and analyze electrophysiological signals in distributed computing infrastructure. The Cloudwave data flow uses the MapReduce parallel programming model to implement an integrated signal data processing pipeline that scales with the large volume of data generated at high velocity. Using an epilepsy domain ontology together with an epilepsy-focused extensible data representation format called Cloudwave Signal Format (CSF), the data flow addresses the challenge of data heterogeneity and is interoperable with existing neuroinformatics data representation formats, such as HDF5. The scalability of the Cloudwave data flow was evaluated using a 30-node cluster installed with the open source Hadoop software stack. The results demonstrate that the Cloudwave data flow can process increasing volumes of signal data by leveraging Hadoop Data Nodes to reduce the total data processing time. The Cloudwave data flow is a template for developing highly scalable neuroscience data processing pipelines using MapReduce algorithms to support a variety of user applications.
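The MapReduce model underlying Cloudwave can be illustrated without Hadoop: a map step emits partial results per signal segment, and a reduce step merges them. The sketch below counts threshold-crossing samples per channel; the channel names, threshold, and feature are invented for the example, not Cloudwave's actual pipeline:

```python
from functools import reduce

def map_segment(segment):
    """Map step: emit a per-channel feature count for one signal segment
    (here, how many samples exceed an amplitude threshold)."""
    channel, samples = segment
    return {channel: sum(1 for s in samples if abs(s) > 1.0)}

def reduce_counts(a, b):
    """Reduce step: merge partial counts arriving from different nodes."""
    merged = dict(a)
    for k, v in b.items():
        merged[k] = merged.get(k, 0) + v
    return merged

# Segments as (channel, samples); in Hadoop these would be distributed
# across Data Nodes and mapped in parallel.
segments = [("C3", [0.2, 1.5, -1.2]), ("C4", [0.1, 0.3]), ("C3", [2.0, 0.0])]
totals = reduce(reduce_counts, map(map_segment, segments), {})
```

Because the reduce step is associative and commutative, partial results can be merged in any order — the property that lets the framework parallelize freely.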

  12. A scalable neuroinformatics data flow for electrophysiological signals using MapReduce

    PubMed Central

    Jayapandian, Catherine; Wei, Annan; Ramesh, Priya; Zonjy, Bilal; Lhatoo, Samden D.; Loparo, Kenneth; Zhang, Guo-Qiang; Sahoo, Satya S.

    2015-01-01

    Data-driven neuroscience research is providing new insights into the progression of neurological disorders and supporting the development of improved treatment approaches. However, the volume, velocity, and variety of neuroscience data generated by sophisticated recording instruments and acquisition methods have exacerbated the limited scalability of existing neuroinformatics tools. This makes it difficult for neuroscience researchers to effectively leverage the growing multi-modal neuroscience data to advance research in serious neurological disorders, such as epilepsy. We describe the development of the Cloudwave data flow, which uses new data partitioning techniques to store and analyze electrophysiological signals in distributed computing infrastructure. The Cloudwave data flow uses the MapReduce parallel programming model to implement an integrated signal data processing pipeline that scales with the large volume of data generated at high velocity. Using an epilepsy domain ontology together with an epilepsy-focused extensible data representation format called Cloudwave Signal Format (CSF), the data flow addresses the challenge of data heterogeneity and is interoperable with existing neuroinformatics data representation formats, such as HDF5. The scalability of the Cloudwave data flow was evaluated using a 30-node cluster installed with the open source Hadoop software stack. The results demonstrate that the Cloudwave data flow can process increasing volumes of signal data by leveraging Hadoop Data Nodes to reduce the total data processing time. The Cloudwave data flow is a template for developing highly scalable neuroscience data processing pipelines using MapReduce algorithms to support a variety of user applications. PMID:25852536

  13. Raf Kinase Inhibitory Protein (RKIP) as a Metastasis Suppressor: Regulation of Signaling Networks in Cancer

    PubMed Central

    Yesilkanal, Ali E.; Rosner, Marsha R.

    2015-01-01

    Cancer is one of the deadliest diseases worldwide, accounting for about 8 million deaths a year. For solid tumors, cancer patients die as a result of the metastatic spread of the tumor to the rest of the body. Therefore, there is a clinical need for understanding the molecular and cellular basis of metastasis, identifying patients whose tumors are more likely to metastasize, and developing effective therapies against metastatic progression. Over the years, Raf kinase inhibitory protein (RKIP) has emerged as a natural suppressor of the metastatic process, constituting a tool for studying metastasis and its clinical outcomes. Here, we review RKIP’s role as a metastasis suppressor and the signaling networks and genes regulated by RKIP in metastatic, triple-negative breast cancer. We also highlight the clinical implications and power of building gene signatures based on RKIP-regulated signaling modules in identifying cancer patients that are at higher risk for metastases. Finally, we highlight the potential of RKIP as a tool for developing new therapeutic strategies in cancer treatment. PMID:25597354

  14. On the use of fractional order PK-PD models

    NASA Astrophysics Data System (ADS)

    Ionescu, Clara; Copot, Dana

    2017-01-01

    Quantifying and controlling depth of anesthesia is a challenging process due to the lack of technology for directly measuring the effects of drug supply in the body. Efforts are being made to develop new sensor techniques, and new horizons are being explored for modeling this intricate process. This paper introduces emerging tools available on the 'engineering market', imported from the area of fractional calculus. A novel interpretation of the classical drug-effect curve is given, enabling linear control. This broadens the horizon of signal processing and control techniques and suggests future research lines.

  15. NaviSE: superenhancer navigator integrating epigenomics signal algebra.

    PubMed

    Ascensión, Alex M; Arrospide-Elgarresta, Mikel; Izeta, Ander; Araúzo-Bravo, Marcos J

    2017-06-06

    Superenhancers are crucial structural genomic elements that determine cell fate, and they are also involved in several diseases, such as cancer and neurodegeneration. Although there are pipelines that chain independent pieces of software to predict the presence of superenhancers from genome-wide chromatin marks or DNA-interacting protein binding sites, there has been no integrated software tool that automatically processes algebraic combinations of raw sequencing data into a comprehensive, annotated final report of predicted superenhancers. We have developed NaviSE, a user-friendly, streamlined tool that performs fully automated parallel processing of genome-wide epigenomics data, from sequencing files to a final report built from a comprehensive set of annotated files that are navigated through a graphical user interface dynamically generated by NaviSE. NaviSE also implements an 'epigenomics signal algebra' that allows the combination of multiple activation and repression epigenomics signals. NaviSE provides an interactive chromosomal landscape of superenhancer locations, which can be navigated to obtain annotated information about the superenhancer signal profile, associated genes, gene ontology enrichment analysis, motifs of transcription factor binding sites enriched in superenhancers, graphs of the metrics evaluating superenhancer quality, protein-protein interaction networks and enriched metabolic pathways, among other features. We have parallelised the most time-consuming tasks, achieving a runtime reduction of up to 30% on a 15-CPU machine. We have optimized the default parameters of NaviSE to facilitate its use. NaviSE allows different entry levels of data processing, from sra-fastq files to bed files, and unifies the processing of multiple replicates. NaviSE outperforms the more time-consuming non-integrated pipelines. Alongside its high performance, NaviSE provides biological insights, predicting cell-type-specific markers such as SOX2 and ZIC3 in embryonic stem cells, CDK5R1 and REST in neurons, and CD86 and TLR2 in monocytes. NaviSE is a user-friendly, streamlined solution for superenhancer analysis, annotation and navigation, requiring only basic computer and next-generation sequencing knowledge. NaviSE binaries and documentation are available at: https://sourceforge.net/projects/navise-superenhancer/ .
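
    The 'epigenomics signal algebra' idea can be illustrated with a toy sketch: binned activation and repression coverage tracks combined into a single activity signal. The operator below (sum of activation marks minus sum of repression marks, floored at zero) is an assumption for illustration, not NaviSE's actual algebra:

    ```python
    import numpy as np

    def combine_signals(activation, repression):
        """Combine binned activation and repression coverage tracks into one
        enhancer-activity signal (hypothetical 'signal algebra' operator)."""
        act = np.sum(activation, axis=0)      # sum activation marks per genomic bin
        rep = np.sum(repression, axis=0)      # sum repression marks per genomic bin
        return np.clip(act - rep, 0.0, None)  # negative activity floored at zero

    # toy example: 2 activation marks and 1 repression mark over 5 genomic bins
    act = np.array([[3.0, 5.0, 0.0, 2.0, 1.0],
                    [1.0, 4.0, 0.0, 0.0, 1.0]])
    rep = np.array([[0.0, 1.0, 2.0, 3.0, 0.0]])
    print(combine_signals(act, rep))  # [4. 8. 0. 0. 2.]
    ```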

  16. An adaptive unsaturated bistable stochastic resonance method and its application in mechanical fault diagnosis

    NASA Astrophysics Data System (ADS)

    Qiao, Zijian; Lei, Yaguo; Lin, Jing; Jia, Feng

    2017-02-01

    In mechanical fault diagnosis, most traditional signal processing methods attempt to suppress or cancel the noise embedded in vibration signals in order to extract weak fault characteristics, whereas stochastic resonance (SR), as a potential signal processing tool, is able to utilize the noise to enhance fault characteristics. The classical bistable SR (CBSR), one of the most widely used SR methods, however, has the disadvantage of inherent output saturation. Output saturation not only reduces the output signal-to-noise ratio (SNR) but also limits the enhancement capability for fault characteristics. To overcome this shortcoming, a novel method based on a piecewise bistable potential model is proposed to extract fault characteristics. Simulated signals are used to illustrate the effectiveness of the proposed method, and the results show that the method is able to extract weak fault characteristics and has good enhancement performance and anti-noise capability. Finally, the method is applied to fault diagnosis of bearings and planetary gearboxes, respectively. The diagnosis results demonstrate that the proposed method obtains a larger output SNR, higher spectral peaks at the fault characteristic frequencies, and therefore better fault recognizability than the CBSR method.
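
    The bistable SR dynamics discussed above can be sketched with a Langevin-type simulation. This is a minimal illustration of the classical (saturating) system, not the paper's piecewise-potential variant; all parameter values are assumptions:

    ```python
    import numpy as np

    def bistable_sr(signal, noise_std, a=1.0, b=1.0, dt=1e-3, seed=0):
        """Euler-Maruyama integration of the classical bistable SR system
        dx/dt = a*x - b*x**3 + s(t) + noise. The output saturates near the
        potential wells at +/- sqrt(a/b), the limitation the paper addresses."""
        rng = np.random.default_rng(seed)
        x = np.zeros(len(signal))
        for i in range(1, len(signal)):
            drift = a * x[i - 1] - b * x[i - 1] ** 3 + signal[i - 1]
            x[i] = x[i - 1] + drift * dt + noise_std * np.sqrt(dt) * rng.standard_normal()
        return x

    t = np.arange(0.0, 1.0, 1e-3)
    weak = 0.3 * np.sin(2 * np.pi * 5 * t)   # weak periodic fault signature
    out = bistable_sr(weak, noise_std=0.4)   # noise-assisted response
    ```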

  17. Mathematical modeling of a radio-frequency path for IEEE 802.11ah based wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Tyshchenko, Igor; Cherepanov, Alexander; Dmitrii, Vakhnin; Popova, Mariia

    2017-09-01

    This article discusses the process of creating a mathematical model of the radio-frequency path for an IEEE 802.11ah based wireless sensor network using MATLAB Simulink CAD tools. In addition, it describes the perturbing effects that occur and the detection of a useful signal in the received mixture.

  18. On the bandwidth of the plenoptic function.

    PubMed

    Do, Minh N; Marchand-Maillet, Davy; Vetterli, Martin

    2012-02-01

    The plenoptic function (POF) provides a powerful conceptual tool for describing a number of problems in image/video processing, vision, and graphics. For example, image-based rendering is shown as sampling and interpolation of the POF. In such applications, it is important to characterize the bandwidth of the POF. We study a simple but representative model of the scene where band-limited signals (e.g., texture images) are "painted" on smooth surfaces (e.g., of objects or walls). We show that, in general, the POF is not band limited unless the surfaces are flat. We then derive simple rules to estimate the essential bandwidth of the POF for this model. Our analysis reveals that, in addition to the maximum and minimum depths and the maximum frequency of painted signals, the bandwidth of the POF also depends on the maximum surface slope. With a unifying formalism based on multidimensional signal processing, we can verify several key results in POF processing, such as induced filtering in space and depth-corrected interpolation, and quantify the necessary sampling rates. © 2011 IEEE

  19. Passive coherent location system simulation and evaluation

    NASA Astrophysics Data System (ADS)

    Slezák, Libor; Kvasnička, Michael; Pelant, Martin; Vávra, Jiř; Plšek, Radek

    2006-02-01

    Passive Coherent Location (PCL) is becoming an important and promising technique for the passive location of non-cooperative and stealth targets. It works with illuminators of opportunity. PCL is intended to be part of the mobile Air Command and Control System (ACCS) as a Deployable ACCS Component (DAC). The company ERA has been working on a PCL system parameter verification programme, built around a complete PCL simulator, since 2003, with financial participation from the Czech DoD. The simulator tool currently covers moving-target scenarios, RCS calculation by the method of moments, ground clutter scattering and the signal processing methods (the bottleneck of PCL). The digital signal processing (DSP) algorithms are applied both to simulated data and to real data measured by the NATO C3 Agency in their The Hague experiment. The Institute of Information Theory and Automation of the Academy of Sciences of the Czech Republic takes part in the implementation of the DSP algorithms in FPGA. The paper describes the simulator and the signal processing structure, together with results on both simulated and measured data.

  20. Quantitative Aspects of Single Molecule Microscopy

    PubMed Central

    Ober, Raimund J.; Tahmasbi, Amir; Ram, Sripad; Lin, Zhiping; Ward, E. Sally

    2015-01-01

    Single molecule microscopy is a relatively new optical microscopy technique that allows the detection of individual molecules such as proteins in a cellular context. This technique has generated significant interest among biologists, biophysicists and biochemists, as it holds the promise to provide novel insights into subcellular processes and structures that otherwise cannot be gained through traditional experimental approaches. Single molecule experiments place stringent demands on experimental and algorithmic tools due to the low signal levels and the presence of significant extraneous noise sources. Consequently, this has necessitated the use of advanced statistical signal and image processing techniques for the design and analysis of single molecule experiments. In this tutorial paper, we provide an overview of single molecule microscopy from early works to current applications and challenges. Specific emphasis will be on the quantitative aspects of this imaging modality, in particular single molecule localization and resolvability, which will be discussed from an information theoretic perspective. We review the stochastic framework for image formation, different types of estimation techniques and expressions for the Fisher information matrix. We also discuss several open problems in the field that demand highly non-trivial signal processing algorithms. PMID:26167102

  1. Encephalolexianalyzer

    DOEpatents

    Altschuler, E.L.; Dowla, F.U.

    1998-11-24

    The encephalolexianalyzer uses digital signal processing techniques on electroencephalograph (EEG) brain waves to determine whether or not someone is thinking about moving, e.g., tapping their fingers, or, alternatively, whether someone is actually moving, e.g., tapping their fingers, or at rest, i.e., not moving and not thinking of moving. The mu waves measured by a pair of electrodes placed over the motor cortex are signal processed to determine the power spectrum. At rest, the peak value of the power spectrum in the 8-13 Hz range is high, while when moving or thinking of moving, the peak value of the power spectrum in the 8-13 Hz range is low. This measured change in signal power spectrum is used to produce a control signal. The encephalolexianalyzer can be used to communicate either directly using Morse code, or via a cursor controlling a remote control; the encephalolexianalyzer can also be used to control other devices. The encephalolexianalyzer will be of great benefit to people with various handicaps and disabilities, and also has enormous commercial potential, as well as being an invaluable tool for studying the brain. 14 figs.
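
    The core measurement above, the peak power spectral value in the mu band used as a control signal, can be sketched as follows; the sampling rate, threshold, and synthetic signals are illustrative assumptions:

    ```python
    import numpy as np

    def mu_band_peak(eeg, fs, lo=8.0, hi=13.0):
        """Peak power spectral value in the mu band (8-13 Hz),
        computed from a simple FFT periodogram."""
        spec = np.abs(np.fft.rfft(eeg - eeg.mean())) ** 2 / len(eeg)
        freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
        band = (freqs >= lo) & (freqs <= hi)
        return spec[band].max()

    fs = 256
    t = np.arange(0.0, 2.0, 1.0 / fs)
    rest = np.sin(2 * np.pi * 10 * t)         # strong 10 Hz mu rhythm at rest
    moving = 0.1 * np.sin(2 * np.pi * 10 * t) # mu rhythm suppressed while moving
    # low mu power relative to the resting baseline -> "move" control signal
    control = mu_band_peak(moving, fs) < 0.5 * mu_band_peak(rest, fs)
    print(control)  # True
    ```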

  2. Encephalolexianalyzer

    DOEpatents

    Altschuler, Eric L.; Dowla, Farid U.

    1998-01-01

    The encephalolexianalyzer uses digital signal processing techniques on electroencephalograph (EEG) brain waves to determine whether or not someone is thinking about moving, e.g., tapping their fingers, or, alternatively, whether someone is actually moving, e.g., tapping their fingers, or at rest, i.e., not moving and not thinking of moving. The mu waves measured by a pair of electrodes placed over the motor cortex are signal processed to determine the power spectrum. At rest, the peak value of the power spectrum in the 8-13 Hz range is high, while when moving or thinking of moving, the peak value of the power spectrum in the 8-13 Hz range is low. This measured change in signal power spectrum is used to produce a control signal. The encephalolexianalyzer can be used to communicate either directly using Morse code, or via a cursor controlling a remote control; the encephalolexianalyzer can also be used to control other devices. The encephalolexianalyzer will be of great benefit to people with various handicaps and disabilities, and also has enormous commercial potential, as well as being an invaluable tool for studying the brain.

  3. An open-hardware platform for optogenetics and photobiology

    PubMed Central

    Gerhardt, Karl P.; Olson, Evan J.; Castillo-Hair, Sebastian M.; Hartsough, Lucas A.; Landry, Brian P.; Ekness, Felix; Yokoo, Rayka; Gomez, Eric J.; Ramakrishnan, Prabha; Suh, Junghae; Savage, David F.; Tabor, Jeffrey J.

    2016-01-01

    In optogenetics, researchers use light and genetically encoded photoreceptors to control biological processes with unmatched precision. However, outside of neuroscience, the impact of optogenetics has been limited by a lack of user-friendly, flexible, accessible hardware. Here, we engineer the Light Plate Apparatus (LPA), a device that can deliver two independent 310 to 1550 nm light signals to each well of a 24-well plate with intensity control over three orders of magnitude and millisecond resolution. Signals are programmed using an intuitive web tool named Iris. All components can be purchased for under $400 and the device can be assembled and calibrated by a non-expert in one day. We use the LPA to precisely control gene expression from blue, green, and red light responsive optogenetic tools in bacteria, yeast, and mammalian cells and simplify the entrainment of cyanobacterial circadian rhythm. The LPA dramatically reduces the entry barrier to optogenetics and photobiology experiments. PMID:27805047

  4. An open-hardware platform for optogenetics and photobiology

    DOE PAGES

    Gerhardt, Karl P.; Olson, Evan J.; Castillo-Hair, Sebastian M.; ...

    2016-11-02

    In optogenetics, researchers use light and genetically encoded photoreceptors to control biological processes with unmatched precision. However, outside of neuroscience, the impact of optogenetics has been limited by a lack of user-friendly, flexible, accessible hardware. Here, we engineer the Light Plate Apparatus (LPA), a device that can deliver two independent 310 to 1550 nm light signals to each well of a 24-well plate with intensity control over three orders of magnitude and millisecond resolution. Signals are programmed using an intuitive web tool named Iris. All components can be purchased for under $400 and the device can be assembled and calibrated by a non-expert in one day. We use the LPA to precisely control gene expression from blue, green, and red light responsive optogenetic tools in bacteria, yeast, and mammalian cells and simplify the entrainment of cyanobacterial circadian rhythm. Lastly, the LPA dramatically reduces the entry barrier to optogenetics and photobiology experiments.

  5. An open-hardware platform for optogenetics and photobiology.

    PubMed

    Gerhardt, Karl P; Olson, Evan J; Castillo-Hair, Sebastian M; Hartsough, Lucas A; Landry, Brian P; Ekness, Felix; Yokoo, Rayka; Gomez, Eric J; Ramakrishnan, Prabha; Suh, Junghae; Savage, David F; Tabor, Jeffrey J

    2016-11-02

    In optogenetics, researchers use light and genetically encoded photoreceptors to control biological processes with unmatched precision. However, outside of neuroscience, the impact of optogenetics has been limited by a lack of user-friendly, flexible, accessible hardware. Here, we engineer the Light Plate Apparatus (LPA), a device that can deliver two independent 310 to 1550 nm light signals to each well of a 24-well plate with intensity control over three orders of magnitude and millisecond resolution. Signals are programmed using an intuitive web tool named Iris. All components can be purchased for under $400 and the device can be assembled and calibrated by a non-expert in one day. We use the LPA to precisely control gene expression from blue, green, and red light responsive optogenetic tools in bacteria, yeast, and mammalian cells and simplify the entrainment of cyanobacterial circadian rhythm. The LPA dramatically reduces the entry barrier to optogenetics and photobiology experiments.

  6. An open-hardware platform for optogenetics and photobiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhardt, Karl P.; Olson, Evan J.; Castillo-Hair, Sebastian M.

    In optogenetics, researchers use light and genetically encoded photoreceptors to control biological processes with unmatched precision. However, outside of neuroscience, the impact of optogenetics has been limited by a lack of user-friendly, flexible, accessible hardware. Here, we engineer the Light Plate Apparatus (LPA), a device that can deliver two independent 310 to 1550 nm light signals to each well of a 24-well plate with intensity control over three orders of magnitude and millisecond resolution. Signals are programmed using an intuitive web tool named Iris. All components can be purchased for under $400 and the device can be assembled and calibrated by a non-expert in one day. We use the LPA to precisely control gene expression from blue, green, and red light responsive optogenetic tools in bacteria, yeast, and mammalian cells and simplify the entrainment of cyanobacterial circadian rhythm. Lastly, the LPA dramatically reduces the entry barrier to optogenetics and photobiology experiments.

  7. “Seeing” electroencephalogram through the skull: imaging prefrontal cortex with fast optical signal

    NASA Astrophysics Data System (ADS)

    Medvedev, Andrei V.; Kainerstorfer, Jana M.; Borisov, Sergey V.; Gandjbakhche, Amir H.; Vanmeter, John

    2010-11-01

    Near-infrared spectroscopy is a novel imaging technique potentially sensitive to both brain hemodynamics (slow signal) and neuronal activity (fast optical signal, FOS). The big challenge of measuring FOS noninvasively lies in the presumably low signal-to-noise ratio. Thus, detectability of the FOS has been controversially discussed. We present reliable detection of FOS from 11 individuals concurrently with electroencephalogram (EEG) during a Go-NoGo task. Probes were placed bilaterally over prefrontal cortex. Independent component analysis (ICA) was used for artifact removal. Correlation coefficient in the best correlated FOS-EEG ICA pairs was highly significant (p < 10⁻⁸), and event-related optical signal (EROS) was found in all subjects. Several EROS components were similar to the event-related potential (ERP) components. The most robust “optical N200” at t = 225 ms coincided with the N200 ERP; both signals showed significant difference between targets and nontargets, and their timing correlated with subject's reaction time. Correlation between FOS and EEG even in single trials provides further evidence that at least some FOS components “reflect” electrical brain processes directly. The data provide evidence for the early involvement of prefrontal cortex in rapid object recognition. EROS is highly localized and can provide cost-effective imaging tools for cortical mapping of cognitive processes.

  8. “Seeing” electroencephalogram through the skull: imaging prefrontal cortex with fast optical signal

    PubMed Central

    Medvedev, Andrei V.; Kainerstorfer, Jana M.; Borisov, Sergey V.; Gandjbakhche, Amir H.; VanMeter, John

    2010-01-01

    Near-infrared spectroscopy is a novel imaging technique potentially sensitive to both brain hemodynamics (slow signal) and neuronal activity (fast optical signal, FOS). The big challenge of measuring FOS noninvasively lies in the presumably low signal-to-noise ratio. Thus, detectability of the FOS has been controversially discussed. We present reliable detection of FOS from 11 individuals concurrently with electroencephalogram (EEG) during a Go-NoGo task. Probes were placed bilaterally over prefrontal cortex. Independent component analysis (ICA) was used for artifact removal. Correlation coefficient in the best correlated FOS–EEG ICA pairs was highly significant (p < 10⁻⁸), and event-related optical signal (EROS) was found in all subjects. Several EROS components were similar to the event-related potential (ERP) components. The most robust “optical N200” at t = 225 ms coincided with the N200 ERP; both signals showed significant difference between targets and nontargets, and their timing correlated with subject’s reaction time. Correlation between FOS and EEG even in single trials provides further evidence that at least some FOS components “reflect” electrical brain processes directly. The data provide evidence for the early involvement of prefrontal cortex in rapid object recognition. EROS is highly localized and can provide cost-effective imaging tools for cortical mapping of cognitive processes. PMID:21198150

  9. Model-Based Infrared Metrology for Advanced Technology Nodes and 300 mm Wafer Processing

    NASA Astrophysics Data System (ADS)

    Rosenthal, Peter A.; Duran, Carlos; Tower, Josh; Mazurenko, Alex; Mantz, Ulrich; Weidner, Peter; Kasic, Alexander

    2005-09-01

    The use of infrared spectroscopy for production semiconductor process monitoring has evolved recently from primarily unpatterned, i.e. blanket test wafer measurements in a limited historical application space of blanket epitaxial, BPSG, and FSG layers to new applications involving patterned product wafer measurements, and new measurement capabilities. Over the last several years, the semiconductor industry has adopted a new set of materials associated with copper/low-k interconnects, and new structures incorporating exotic materials including silicon germanium, SOI substrates and high aspect ratio trenches. The new device architectures and more chemically sophisticated materials have raised new process control and metrology challenges that are not addressed by current measurement technology. To address the challenges we have developed a new infrared metrology tool designed for emerging semiconductor production processes, in a package compatible with modern production and R&D environments. The tool incorporates recent advances in reflectance instrumentation including highly accurate signal processing, optimized reflectometry optics, and model-based calibration and analysis algorithms. To meet the production requirements of the modern automated fab, the measurement hardware has been integrated with a fully automated 300 mm platform incorporating front opening unified pod (FOUP) interfaces, automated pattern recognition and high throughput ultra clean robotics. The tool employs a suite of automated dispersion-model analysis algorithms capable of extracting a variety of layer properties from measured spectra. The new tool provides excellent measurement precision, tool matching, and a platform for deploying many new production and development applications. In this paper we will explore the use of model based infrared analysis as a tool for characterizing novel bottle capacitor structures employed in high density dynamic random access memory (DRAM) chips. We will explore the capability of the tool for characterizing multiple geometric parameters associated with the manufacturing process that are important to the yield and performance of advanced bottle DRAM devices.

  10. A graph signal filtering-based approach for detection of different edge types on airborne lidar data

    NASA Astrophysics Data System (ADS)

    Bayram, Eda; Vural, Elif; Alatan, Aydin

    2017-10-01

    Airborne laser scanning is a well-known remote sensing technology which provides a dense and highly accurate, yet unorganized, point cloud of the earth's surface. During the last decade, extracting information from the data generated by airborne LiDAR systems has been addressed by many studies in geospatial analysis and urban monitoring applications. However, the processing of LiDAR point clouds is challenging due to their irregular structure and 3D geometry. In this study, we propose a novel framework for the detection of the boundaries of an object or scene captured by LiDAR. Our approach is motivated by edge detection techniques in vision research and is established on graph signal filtering, an exciting and promising field of signal processing for irregular data types. Because graph signal processing tools apply conveniently to unstructured point clouds, we achieve the detection of edge points directly on the 3D data, using a graph representation constructed specifically to meet the requirements of the application. Moreover, by treating the elevation data as the (graph) signal, we leverage the aerial characteristics of airborne LiDAR data. The proposed method can be employed both for discovering jump edges in a segmentation problem and for exploring crease edges of a LiDAR object in a reconstruction/modeling problem, by only adjusting the filter characteristics.
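
    The idea of high-pass graph filtering of the elevation signal can be sketched minimally, assuming a simple k-nearest-neighbour graph and a toy height-jump scene (the paper's actual graph construction and filters are more elaborate):

    ```python
    import numpy as np

    def edge_response(points, k=4):
        """Elementary high-pass graph filter on the elevation signal:
        response_i = |z_i - mean(z over the k nearest xy-neighbours)|,
        i.e. one application of (I - W) z with a k-NN averaging operator.
        Points on height discontinuities respond strongly."""
        xy, z = points[:, :2], points[:, 2]
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
        np.fill_diagonal(d, np.inf)            # a point is not its own neighbour
        nbr = np.argsort(d, axis=1)[:, :k]     # k nearest neighbours per point
        return np.abs(z - z[nbr].mean(axis=1))

    # toy scene: a 10x10 ground grid with a raised block starting at x = 5
    g = np.stack(np.meshgrid(np.arange(10.0), np.arange(10.0)), -1).reshape(-1, 2)
    z = np.where(g[:, 0] >= 5, 3.0, 0.0)       # 3 m height jump at x = 5
    resp = edge_response(np.column_stack([g, z]))  # large only near the jump
    ```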

  11. Identification and classification of failure modes in laminated composites by using a multivariate statistical analysis of wavelet coefficients

    NASA Astrophysics Data System (ADS)

    Baccar, D.; Söffker, D.

    2017-11-01

    Acoustic emission (AE) is a suitable method for monitoring the health of composite structures in real time. However, AE-based failure mode identification and classification remain complex to apply because AE waves are generally released simultaneously from all AE-emitting damage sources. Hence, advanced signal processing techniques must be used in combination with pattern recognition approaches. In this paper, AE signals generated from a laminated carbon fiber reinforced polymer (CFRP) subjected to an indentation test are examined and analyzed. A new pattern recognition approach, involving a number of processing steps that can be implemented in real time, is developed. Unlike common classification approaches, only continuous wavelet transform (CWT) coefficients are extracted as relevant features. First, the CWT is applied to the AE signals. Then, dimensionality reduction using principal component analysis (PCA) is carried out on the coefficient matrices. The PCA-based feature distribution is analyzed using kernel density estimation (KDE), allowing the determination of a specific pattern for each fault-specific AE signal. Moreover, the waveform and frequency content of the AE signals are examined in depth and compared with fundamental assumptions reported in this field. A correlation between the identified patterns and failure modes is achieved. The introduced method improves damage classification and can be used as a non-destructive evaluation tool.
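
    The feature-extraction chain, CWT coefficients followed by PCA, can be sketched on synthetic bursts standing in for fault-specific AE signals; the Ricker wavelet and all parameters are assumptions, and the KDE step is omitted:

    ```python
    import numpy as np

    def ricker(points, a):
        # Ricker (Mexican-hat) wavelet of width a
        t = np.arange(points) - (points - 1) / 2.0
        return (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

    def cwt(sig, widths):
        # CWT coefficient matrix (one row per scale) by direct convolution
        return np.array([np.convolve(sig, ricker(10 * w, w), mode='same')
                         for w in widths])

    def pca_project(X, n=2):
        # project rows of X onto the first n principal components via SVD
        Xc = X - X.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n].T

    # two synthetic "failure modes": low- vs high-frequency noisy bursts
    rng = np.random.default_rng(0)
    t = np.arange(256)
    sigs = []
    for i in range(10):
        f = 0.05 if i < 5 else 0.15
        burst = np.sin(2 * np.pi * f * t) * np.exp(-((t - 128) / 40.0) ** 2)
        sigs.append(burst + 0.1 * rng.standard_normal(256))
    F = np.array([np.abs(cwt(s, [2, 4, 8, 16])).ravel() for s in sigs])
    Y = pca_project(F)   # the two modes separate along the first component
    ```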

  12. Towards Automatic Classification of Exoplanet-Transit-Like Signals: A Case Study on Kepler Mission Data

    NASA Astrophysics Data System (ADS)

    Valizadegan, Hamed; Martin, Rodney; McCauliff, Sean D.; Jenkins, Jon Michael; Catanzarite, Joseph; Oza, Nikunj C.

    2015-08-01

    Building new catalogues of planetary candidates, astrophysical false alarms, and non-transiting phenomena is a challenging task that currently requires a reviewing team of astrophysicists and astronomers. These scientists need to examine more than 100 diagnostic metrics and associated graphics for each candidate exoplanet-transit-like signal to classify it into one of the three classes. Considering that the NASA Explorer Program's TESS mission and ESA's PLATO mission will survey even larger areas of the sky, the classification of their transit-like signals will be even more time-consuming for human agents and a bottleneck to constructing the new catalogues in a timely manner. This encourages building automatic classification tools that can quickly and reliably classify the new signal data from these missions. The standard tool for building automatic classification systems is supervised machine learning, which requires a large set of highly accurate labeled examples in order to build an effective classifier. This requirement cannot easily be met for classifying transit-like signals because not only are existing labeled signals very limited, but the current labels may also be unreliable (because the labeling process is a subjective task). Our experiments with using different supervised classifiers to categorize transit-like signals verify that the labeled signals are not rich enough to give the classifier the power to generalize well beyond the observed cases (e.g. to unseen or test signals). This motivated us to utilize a new category of learning techniques, so-called semi-supervised learning, which combines the label information from the costly labeled signals with distribution information from the cheaply available unlabeled signals in order to construct more effective classifiers. Our study on the Kepler Mission data shows that semi-supervised learning can significantly improve the results of multiple base classifiers (e.g. Support Vector Machines, AdaBoost, and Decision Trees) and is a good technique for the automatic classification of exoplanet-transit-like signals.
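
    One common flavour of semi-supervised learning, self-training, can be sketched as follows; this toy nearest-centroid loop illustrates the general idea of combining few labels with unlabeled-data structure, not the classifiers used in the study:

    ```python
    import numpy as np

    def self_train(X_lab, y_lab, X_unlab, rounds=5):
        """Toy two-class self-training: fit a nearest-centroid classifier on
        the labelled set, pseudo-label the most confident unlabelled points
        (largest distance margin between the two centroids), and repeat."""
        X, y = X_lab.copy(), y_lab.copy()
        pool = X_unlab.copy()
        for _ in range(rounds):
            if len(pool) == 0:
                break
            cents = np.array([X[y == c].mean(axis=0) for c in np.unique(y)])
            d = np.linalg.norm(pool[:, None, :] - cents[None, :, :], axis=2)
            conf = np.abs(d[:, 0] - d[:, 1])   # two-class confidence margin
            pick = np.argsort(conf)[-max(1, len(pool) // rounds):]
            X = np.vstack([X, pool[pick]])
            y = np.concatenate([y, np.argmin(d[pick], axis=1)])
            pool = np.delete(pool, pick, axis=0)
        return X, y

    # two well-separated blobs, only one labelled example per class
    rng = np.random.default_rng(0)
    a = rng.normal([-2.0, 0.0], 0.3, (50, 2))
    b = rng.normal([2.0, 0.0], 0.3, (50, 2))
    X_lab = np.array([[-2.0, 0.0], [2.0, 0.0]])
    y_lab = np.array([0, 1])
    Xl, yl = self_train(X_lab, y_lab, np.vstack([a, b]))
    ```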

  13. An Application of X-Ray Fluorescence as Process Analytical Technology (PAT) to Monitor Particle Coating Processes.

    PubMed

    Nakano, Yoshio; Katakuse, Yoshimitsu; Azechi, Yasutaka

    2018-06-01

    An attempt was made to apply X-ray fluorescence (XRF) analysis as a Process Analytical Technology (PAT) to evaluate a small-particle coating process. The XRF analysis was used to monitor the coating level during the small-particle coating process in an at-line manner. A small-particle coating process usually consists of multiple coating steps. This study was conducted with simple coated particles prepared by a first coating of a model compound (DL-methionine) and a second coating of talc on spherical microcrystalline cellulose cores. Particles with these two coating layers are sufficient to demonstrate the small-particle coating process. From the results, it was found that the XRF signals from the first coating (layering) and the second coating (mask coating) reflect the extent of coating through different mechanisms. Furthermore, the coating of particles of different sizes was also investigated to evaluate the size effect in these coating processes. From these results, it was concluded that XRF can be used as a PAT tool for monitoring particle coating processes and can become a powerful tool in pharmaceutical manufacturing.

  14. Modeling biochemical transformation processes and information processing with Narrator.

    PubMed

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

    Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation and biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the systems biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible mechanism which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is specifically intended for users aiming to construct and simulate dynamic models of biology without recourse to extensive mathematical detail. Its design facilitates mappings to different formal languages and frameworks. This combined set of features makes Narrator unique among tools of its kind. Narrator is implemented as a Java program and is available as open source from http://www.narrator-tool.org.
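
    The kind of mapping described, from a graphical transformation model to ordinary differential equations, can be sketched with a toy two-species system: a hypothetical A→B transformation inhibited by its product, integrated by the Euler method. This is a generic illustration, not Narrator's own Co-dependence semantics:

    ```python
    import numpy as np

    def simulate(rates, y0, t_end=10.0, dt=0.01):
        """Euler integration of dA/dt = -v, dB/dt = +v with the transformation
        rate v = k*A / (1 + ki*B), i.e. product-inhibited conversion A -> B.
        Total mass A + B is conserved by construction."""
        k, ki = rates
        y = np.array(y0, float)
        traj = [y.copy()]
        for _ in range(int(t_end / dt)):
            v = k * y[0] / (1.0 + ki * y[1])  # inhibition grows with product B
            y = y + dt * np.array([-v, v])
            traj.append(y.copy())
        return np.array(traj)

    traj = simulate((1.0, 0.5), [1.0, 0.0])   # start with all mass in species A
    ```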

  15. A feasibility study on age-related factors of wrist pulse using principal component analysis.

    PubMed

    Jang-Han Bae; Young Ju Jeon; Sanghun Lee; Jaeuk U Kim

    2016-08-01

    Various analysis methods for examining wrist pulse characteristics are needed for accurate pulse diagnosis. In this feasibility study, principal component analysis (PCA) was performed to observe age-related factors of the wrist pulse across various analysis parameters. Forty subjects in their 20s and 40s participated, and their wrist pulse and respiration signals were acquired with a pulse tonometric device. After pre-processing of the signals, twenty analysis parameters regarded as reflecting pulse characteristics were calculated and PCA was performed. As a result, we could reduce the complex parameters to a lower dimension, and age-related factors of the wrist pulse were observed in the new combined analysis parameters derived from PCA. These results demonstrate that PCA can be a useful tool for analyzing wrist pulse signals.
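
    The dimensionality-reduction step can be sketched as follows. The subjects × parameters matrix here is synthetic (a correlated block standing in for related pulse parameters), and standardising before PCA is an assumption, reasonable since the twenty parameters have different units:

    ```python
    import numpy as np

    def pca_explained(X):
        """Explained-variance ratios of the principal components of a
        subjects x parameters matrix, standardised column-wise first."""
        Z = (X - X.mean(axis=0)) / X.std(axis=0)
        S = np.linalg.svd(Z, compute_uv=False)  # singular values, descending
        var = S ** 2
        return var / var.sum()

    # 40 synthetic subjects, 20 parameters: 5 correlated + 15 independent
    rng = np.random.default_rng(0)
    base = rng.normal(size=(40, 1))            # shared age-like latent factor
    X = np.hstack([base + 0.1 * rng.normal(size=(40, 5)),
                   rng.normal(size=(40, 15))])
    r = pca_explained(X)                       # PC1 captures the correlated block
    ```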

  16. Genomic signal analysis of pathogen variability

    NASA Astrophysics Data System (ADS)

    Cristea, Paul Dan

    2006-02-01

    The paper presents results in the study of pathogen variability by using genomic signals. The conversion of symbolic nucleotide sequences into digital signals offers the possibility to apply signal processing methods to the analysis of genomic data. The method is particularly well suited to characterize small size genomic sequences, such as those found in viruses and bacteria, being a promising tool in tracking the variability of pathogens, especially in the context of developing drug resistance. The paper is based on data downloaded from GenBank [32], and comprises results on the variability of the eight segments of the influenza type A, subtype H5N1, virus genome, and of the Hemagglutinin (HA) gene, for the H1, H2, H3, H4, H5 and H16 types. Data from human and avian virus isolates are used.
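
    The symbolic-to-digital conversion can be sketched with a complex-plane nucleotide mapping and a cumulated-phase signal; the particular mapping below is one common choice in genomic signal analysis and may differ from the paper's:

    ```python
    import numpy as np

    def genomic_signal(seq):
        """Map a nucleotide string to a complex digital signal (one base per
        quadrant) and return its cumulated phase, a simple genomic signal in
        which point mutations show up as trajectory deviations."""
        table = {'A': 1 + 1j, 'C': -1 - 1j, 'G': -1 + 1j, 'T': 1 - 1j}
        z = np.array([table[b] for b in seq.upper()])
        return np.cumsum(np.angle(z))

    phase = genomic_signal("ATGCGTAA")  # toy sequence, length-8 phase signal
    ```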

  17. Embedding Dimension Selection for Adaptive Singular Spectrum Analysis of EEG Signal.

    PubMed

    Xu, Shanzhi; Hu, Hai; Ji, Linhong; Wang, Peng

    2018-02-26

    The recorded electroencephalography (EEG) signal is often contaminated with different kinds of artifacts and noise. Singular spectrum analysis (SSA) is a powerful tool for extracting the brain rhythm from a noisy EEG signal. By analyzing the frequency characteristics of the reconstructed component (RC) and the change rate in the trace of the Toeplitz matrix, it is demonstrated that the embedding dimension is related to the frequency bandwidth of each reconstructed component, consistent with the component mixing in the singular value decomposition step. A method for selecting the embedding dimension is thereby proposed and verified with simulated EEG signals based on the Markov Process Amplitude (MPA) EEG model. Real EEG signals were also collected from the experimental subjects under both eyes-open and eyes-closed conditions. The experimental results show that, based on the embedding dimension selection method, the alpha rhythm can be extracted from the real EEG signal by the adaptive SSA, which can be effectively utilized to distinguish between the eyes-open and eyes-closed states.
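    The SSA decomposition referred to above can be sketched as: embed the signal into a trajectory matrix of lagged copies, take the SVD, and reconstruct one time series per singular component by diagonal averaging. The embedding dimension `L` is supplied by hand here rather than by the paper's adaptive selection rule:

```python
import numpy as np

def ssa_components(x, L):
    """Basic SSA sketch: embed into a trajectory matrix, decompose by
    SVD, and rebuild one series per component by diagonal averaging.
    L is fixed by hand here, not chosen adaptively as in the paper."""
    x = np.asarray(x, float)
    N = len(x)
    K = N - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])  # Hankel matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for i in range(len(s)):
        Xi = s[i] * np.outer(U[:, i], Vt[i])
        # Average each anti-diagonal to map the matrix back to a series.
        rc = np.array([Xi[::-1].diagonal(k).mean() for k in range(-L + 1, K)])
        comps.append(rc)
    return np.array(comps)   # comps.sum(axis=0) recovers x
```

    A rhythm such as alpha is then extracted by summing the reconstructed components whose dominant frequencies fall in the band of interest.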

  18. Force-Mediating Magnetic Nanoparticles to Engineer Neuronal Cell Function

    PubMed Central

    Gahl, Trevor J.; Kunze, Anja

    2018-01-01

    Cellular processes like membrane deformation, cell migration, and transport of organelles are sensitive to mechanical forces. Technically, these cellular processes can be manipulated through operating forces at a spatial precision in the range of nanometers up to a few micrometers through chaperoning force-mediating nanoparticles in electrical, magnetic, or optical field gradients. But which force-mediating tool is more suitable to manipulate cell migration, and which, to manipulate cell signaling? We review here the differences in force sensation to control and engineer cellular processes inside and outside the cell, with a special focus on neuronal cells. In addition, we discuss technical details and limitations of different force-mediating approaches and highlight recent advancements of nanomagnetics in cell organization, communication, signaling, and intracellular trafficking. Finally, we give suggestions about how force-mediating nanoparticles can be used to our advantage in next-generation neurotherapeutic devices. PMID:29867315

  19. A Surrogate Technique for Investigating Deterministic Dynamics in Discrete Human Movement.

    PubMed

    Taylor, Paul G; Small, Michael; Lee, Kwee-Yum; Landeo, Raul; O'Meara, Damien M; Millett, Emma L

    2016-10-01

    Entropy is an effective tool for the investigation of human movement variability. However, before applying entropy, it can be beneficial to employ analyses confirming that the observed data are not solely the result of stochastic processes. This can be achieved by contrasting observed data with data produced using surrogate methods. Unlike for continuous movement, no appropriate surrogate method has been applied to discrete human movement. This article proposes a novel surrogate method for discrete movement data, outlining the processes for determining its critical values. The proposed technique reliably generated surrogates for discrete joint angle time series, destroying the fine-scale dynamics of the observed signal while maintaining its macro-structural characteristics. Comparison of entropy estimates indicated that the observed signals had greater regularity than their surrogates and were the result not only of stochastic but also of deterministic processes. The proposed surrogate method is both a valid and reliable technique for investigating determinism in other discrete human movement time series.
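    The paper proposes its own surrogate method for discrete movement; for orientation, the classic phase-randomization surrogate sketched below illustrates the general principle of destroying fine-scale dynamics while preserving gross (here, spectral) structure:

```python
import numpy as np

def phase_randomized_surrogate(x, rng=None):
    """Classic FFT phase-randomization surrogate (not the paper's
    discrete-movement method): keeps the power spectrum, destroys
    the fine-scale temporal dynamics."""
    rng = np.random.default_rng() if rng is None else rng
    Xf = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=Xf.shape)
    phases[0] = 0.0                    # keep the DC (mean) bin untouched
    if len(x) % 2 == 0:
        phases[-1] = 0.0               # Nyquist bin must stay real
    return np.fft.irfft(np.abs(Xf) * np.exp(1j * phases), n=len(x))
```

    If an entropy estimate on the observed series differs significantly from the distribution of estimates over many such surrogates, the regularity cannot be attributed to a purely stochastic process.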

  20. Assessment of Rho GTPase signaling during neurite outgrowth.

    PubMed

    Feltrin, Daniel; Pertz, Olivier

    2012-01-01

    Rho GTPases are key regulators of the cytoskeleton during the process of neurite outgrowth. Based on overexpression of dominant-positive and dominant-negative Rho GTPase constructs, the classic view is that Rac1 and Cdc42 are important for neurite elongation, whereas RhoA regulates neurite retraction in response to collapsing agents. However, recent work has suggested a much finer control of spatiotemporal Rho GTPase signaling in this process. Understanding this level of complexity necessitates a panel of more sensitive tools than previously used. Here, we discuss a novel assay that enables the biochemical fractionation of the neurite from the soma of differentiating N1E-115 neuronal-like cells. This allows for spatiotemporal characterization of a large number of protein components, interactions, and post-translational modifications using classic biochemical as well as proteomics approaches. We also provide protocols for siRNA-mediated knockdown of genes and sensitive assays that allow quantitative analysis of the neurite outgrowth process.

  1. Force-Mediating Magnetic Nanoparticles to Engineer Neuronal Cell Function.

    PubMed

    Gahl, Trevor J; Kunze, Anja

    2018-01-01

    Cellular processes like membrane deformation, cell migration, and transport of organelles are sensitive to mechanical forces. Technically, these cellular processes can be manipulated through operating forces at a spatial precision in the range of nanometers up to a few micrometers through chaperoning force-mediating nanoparticles in electrical, magnetic, or optical field gradients. But which force-mediating tool is more suitable to manipulate cell migration, and which, to manipulate cell signaling? We review here the differences in force sensation to control and engineer cellular processes inside and outside the cell, with a special focus on neuronal cells. In addition, we discuss technical details and limitations of different force-mediating approaches and highlight recent advancements of nanomagnetics in cell organization, communication, signaling, and intracellular trafficking. Finally, we give suggestions about how force-mediating nanoparticles can be used to our advantage in next-generation neurotherapeutic devices.

  2. Application of wavelet filtering and Barker-coded pulse compression hybrid method to air-coupled ultrasonic testing

    NASA Astrophysics Data System (ADS)

    Zhou, Zhenggan; Ma, Baoquan; Jiang, Jingtao; Yu, Guang; Liu, Kui; Zhang, Dongmei; Liu, Weiping

    2014-10-01

    The air-coupled ultrasonic testing (ACUT) technique has been viewed as a viable solution for defect detection in advanced composites used in the aerospace and aviation industries. However, the large acoustic impedance mismatch at the air-solid interface makes the transmission efficiency of ultrasound low and leads to a poor signal-to-noise ratio (SNR) in the received signal. The utilisation of signal-processing techniques in non-destructive testing is highly appreciated. This paper presents a wavelet filtering and phase-coded pulse compression hybrid method to improve the SNR and output power of the received signal. The wavelet transform is utilised to filter insignificant components from the noisy ultrasonic signal, and a pulse compression process is used to improve the power of the correlated signal based on a cross-correlation algorithm. For the purpose of reasonable parameter selection, different families of wavelets (Daubechies, Symlet and Coiflet) and decomposition levels in the discrete wavelet transform are analysed; different Barker codes (5-13 bits) are also analysed to acquire a higher main-to-side lobe ratio. The performance of the hybrid method was verified on a honeycomb composite sample. Experimental results demonstrated that the proposed method is very efficient in improving the SNR and signal strength, and it appears to be a very promising tool for evaluating the integrity of high-ultrasound-attenuation composite materials using ACUT.
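    Pulse compression with a Barker code amounts to cross-correlating the noisy received signal with the known transmitted code, which concentrates the echo energy into a narrow main lobe. A sketch with invented parameters (echo position, noise level):

```python
import numpy as np

# Barker-13 phase code; its autocorrelation sidelobes have magnitude <= 1,
# so correlation compresses the echo into a sharp, high-SNR main lobe.
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], float)

rng = np.random.default_rng(1)
n = 400
echo = np.zeros(n)
echo[150:150 + 13] = barker13                 # echo buried at sample 150
noisy = echo + rng.normal(scale=0.3, size=n)  # illustrative noise level

compressed = np.correlate(noisy, barker13, mode='valid')
peak = int(np.argmax(np.abs(compressed)))     # main lobe marks the echo
```

    In the paper's hybrid scheme, a wavelet denoising stage precedes this correlation step to further suppress the noise floor.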

  3. Application of the wavelet packet transform to vibration signals for surface roughness monitoring in CNC turning operations

    NASA Astrophysics Data System (ADS)

    García Plaza, E.; Núñez López, P. J.

    2018-01-01

    The wavelet packet transform (WPT) method decomposes a time signal into several independent time-frequency signals called packets. This enables the temporal location of transient events occurring during the monitoring of cutting processes, which is advantageous in condition monitoring and fault diagnosis. This paper proposes the monitoring of surface roughness using a single low-cost sensor that is easily implemented in numerical control machine tools in order to make on-line decisions on workpiece surface finish quality. Packet feature extraction was applied to vibration signals to correlate the sensor signals with measured surface roughness. For the successful application of the WPT method, mother wavelets, the packet decomposition level, and appropriate packet selection methods should be considered, but these are poorly understood aspects in the literature. In this novel contribution, forty mother wavelets, the optimal decomposition level, and packet reduction methods were analysed, and the effective frequency range providing the best packet feature extraction for monitoring surface finish was identified. The results show that mother wavelet biorthogonal 4.4 at decomposition level L3, with the fusion of the orthogonal vibration components (ax + ay + az), was the best option for correlating the vibration signal with surface roughness. The best packets were found in the medium-high frequency DDA (6250-9375 Hz) and high frequency ADA (9375-12500 Hz) ranges, and the feed acceleration component ay was the primary source of information. The packet reduction methods discarded packets with features relevant to the signal, leading to poor results for the prediction of surface roughness. WPT is a robust vibration signal processing method for the monitoring of surface roughness using a single sensor without other information sources; satisfactory results were obtained in comparison to other processing methods, at a low computational cost.

  4. Spacewire on Earth orbiting scatterometers

    NASA Technical Reports Server (NTRS)

    Bachmann, Alex; Lang, Minh; Lux, James; Steffke, Richard

    2002-01-01

    The need for a high-speed, reliable and easy-to-implement communication link has led to the development of a space-flight-oriented version of IEEE 1355 called SpaceWire. SpaceWire is based on high-speed (200 Mbps) serial point-to-point links using Low Voltage Differential Signaling (LVDS). SpaceWire has provisions for routing messages between a large network of processors, using wormhole routing for low overhead and latency. Additionally, space-qualified hybrids are available which provide the link layer to the user's bus. A test bed of multiple digital signal processor breadboards, demonstrating the ability to meet signal processing requirements for an orbiting scatterometer, has been implemented using three Astrium MCM-DSPs; each breadboard consists of a Multi Chip Module (MCM) that combines a space-qualified Digital Signal Processor and peripherals, including IEEE 1355 links. With the addition of appropriate physical layer interfaces and software on the DSP, the SpaceWire link is used to communicate between processors on the test bed, e.g., sending timing references, commands, status, and science data among the processors. Results are presented on development issues surrounding the use of SpaceWire in this environment, from physical layer implementation (cables, connectors, LVDS drivers) to diagnostic tools, driver firmware, and development methodology. The tools, methods, hardware and software challenges, and preliminary performance are investigated and discussed.

  5. SiGe BiCMOS manufacturing platform for mmWave applications

    NASA Astrophysics Data System (ADS)

    Kar-Roy, Arjun; Howard, David; Preisler, Edward; Racanelli, Marco; Chaudhry, Samir; Blaschke, Volker

    2010-10-01

    TowerJazz offers high-volume manufacturable commercial SiGe BiCMOS technology platforms to address the mmWave market. In this paper, first, the SiGe BiCMOS process technology platforms SBC18 and SBC13 are described. These manufacturing platforms integrate a 200 GHz fT/fMAX SiGe NPN with deep trench isolation into 0.18 μm and 0.13 μm node CMOS processes, along with high-density 5.6 fF/μm² stacked MIM capacitors, high-value polysilicon resistors, high-Q metal resistors, lateral PNP transistors, triple-well isolation using a deep n-well for mixed-signal integration, and multiple varactors and compact high-Q inductors for RF needs. Second, design enablement tools that maximize performance and lower costs and time to market, such as scalable PSP and HICUM models, statistical and Xsigma models, reliability modeling tools, process control model tools, an inductor toolbox and transmission line models, are described. Finally, demonstrations in silicon for mmWave applications in the areas of optical networking, mobile broadband, phased array radar, collision avoidance radar and W-band imaging are listed.

  6. Probing biological redox chemistry with large amplitude Fourier transformed ac voltammetry

    PubMed Central

    Adamson, Hope

    2017-01-01

    Biological electron-exchange reactions are fundamental to life on earth. Redox reactions underpin respiration, photosynthesis, molecular biosynthesis, cell signalling and protein folding. Chemical, biomedical and future energy technology developments are also inspired by these natural electron transfer processes. Further developments in techniques and data analysis are required to gain a deeper understanding of the redox biochemistry processes that power Nature. This review outlines the new insights gained from developing Fourier transformed ac voltammetry as a tool for protein film electrochemistry. PMID:28804798

  7. Fluorescent Probes and Selective Inhibitors for Biological Studies of Hydrogen Sulfide- and Polysulfide-Mediated Signaling.

    PubMed

    Takano, Yoko; Echizen, Honami; Hanaoka, Kenjiro

    2017-10-01

    Hydrogen sulfide (H2S) plays roles in many physiological processes, including relaxation of vascular smooth muscles, mediation of neurotransmission, inhibition of insulin signaling, and regulation of inflammation. Also, hydropersulfide (R-S-SH) and polysulfide (-S-Sn-S-) have recently been identified as reactive sulfur species (RSS) that regulate the bioactivities of multiple proteins via S-sulfhydration of cysteine residues (protein Cys-SSH) and show cytoprotection. Chemical tools such as fluorescent probes and selective inhibitors are needed to establish in detail the physiological roles of H2S and polysulfide. Recent Advances: Although many fluorescent probes for H2S are available, fluorescent probes for hydropersulfide and polysulfide have only recently been developed and used to detect these sulfur species in living cells. In this review, we summarize recent progress in developing chemical tools for the study of H2S, hydropersulfide, and polysulfide, covering fluorescent probes based on various design strategies and selective inhibitors of H2S- and polysulfide-producing enzymes (cystathionine γ-lyase, cystathionine β-synthase, and 3-mercaptopyruvate sulfurtransferase), and we summarize their applications in biological studies. Despite recent progress, the precise biological functions of H2S, hydropersulfide, and polysulfide remain to be fully established. Fluorescent probes and selective inhibitors are effective chemical tools to study the physiological roles of these sulfur molecules in living cells and tissues. Therefore, further development of a broad range of practical fluorescent probes and selective inhibitors as tools for studies of RSS biology is currently attracting great interest. Antioxid. Redox Signal. 27, 669-683.

  8. Rapid Structured Volume Grid Smoothing and Adaption Technique

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    2006-01-01

    A rapid, structured volume grid smoothing and adaption technique, based on signal processing methods, was developed and applied to the Shuttle Orbiter at hypervelocity flight conditions in support of the Columbia Accident Investigation. Because of the fast pace of the investigation, computational aerothermodynamicists, applying hypersonic viscous flow solving computational fluid dynamic (CFD) codes, refined and enhanced a grid for an undamaged baseline vehicle to assess a variety of damage scenarios. Of the many methods available to modify a structured grid, most are time-consuming and require significant user interaction. By casting the grid data into different coordinate systems, specifically two computational coordinates with arclength as the third coordinate, signal processing methods are used for filtering the data [Taubin, CG v/29 1995]. Using a reverse transformation, the processed data are used to smooth the Cartesian coordinates of the structured grids. By coupling the signal processing method with existing grid operations within the Volume Grid Manipulator tool, problems related to grid smoothing are solved efficiently and with minimal user interaction. Examples of these smoothing operations are illustrated for reductions in grid stretching and volume grid adaptation. In each of these examples, other techniques existed at the time of the Columbia accident, but the incorporation of signal processing techniques reduced the time to perform the corrections by nearly 60%. This reduction in time to perform the corrections therefore enabled the assessment of approximately twice the number of damage scenarios than previously possible during the allocated investigation time.
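    The cited Taubin filter [Taubin, CG v/29 1995] alternates a shrinking Laplacian step with an inflating one, so high-frequency wiggles decay while the overall shape is not shrunk. A 1-D sketch of the idea the grid tool applies to each coordinate (the parameter values below are illustrative, not the tool's):

```python
import numpy as np

def taubin_smooth(x, lam=0.5, mu=-0.53, iters=20):
    """Taubin lambda|mu smoothing sketch on a 1-D signal: a shrinking
    Laplacian step (lam > 0) alternates with an inflating step
    (mu < -lam), damping high frequencies while preserving low ones.
    Endpoints are held fixed."""
    y = np.asarray(x, float).copy()
    for _ in range(iters):
        for step in (lam, mu):
            lap = np.zeros_like(y)
            lap[1:-1] = 0.5 * (y[:-2] + y[2:]) - y[1:-1]  # interior Laplacian
            y += step * lap
    return y
```

    In the grid tool the same filtering runs over the two computational coordinates with arclength as the third, then the result is transformed back to Cartesian coordinates.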

  9. Rapid Structured Volume Grid Smoothing and Adaption Technique

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    2004-01-01

    A rapid, structured volume grid smoothing and adaption technique, based on signal processing methods, was developed and applied to the Shuttle Orbiter at hypervelocity flight conditions in support of the Columbia Accident Investigation. Because of the fast pace of the investigation, computational aerothermodynamicists, applying hypersonic viscous flow solving computational fluid dynamic (CFD) codes, refined and enhanced a grid for an undamaged baseline vehicle to assess a variety of damage scenarios. Of the many methods available to modify a structured grid, most are time-consuming and require significant user interaction. By casting the grid data into different coordinate systems, specifically two computational coordinates with arclength as the third coordinate, signal processing methods are used for filtering the data [Taubin, CG v/29 1995]. Using a reverse transformation, the processed data are used to smooth the Cartesian coordinates of the structured grids. By coupling the signal processing method with existing grid operations within the Volume Grid Manipulator tool, problems related to grid smoothing are solved efficiently and with minimal user interaction. Examples of these smoothing operations are illustrated for reduction in grid stretching and volume grid adaptation. In each of these examples, other techniques existed at the time of the Columbia accident, but the incorporation of signal processing techniques reduced the time to perform the corrections by nearly 60%. This reduction in time to perform the corrections therefore enabled the assessment of approximately twice the number of damage scenarios than previously possible during the allocated investigation time.

  10. Tannin fingerprinting in vegetable tanned leather by solid state NMR spectroscopy and comparison with leathers tanned by other processes.

    PubMed

    Romer, Frederik H; Underwood, Andrew P; Senekal, Nadine D; Bonnet, Susan L; Duer, Melinda J; Reid, David G; van der Westhuizen, Jan H

    2011-01-28

    Solid state ¹³C-NMR spectra of pure tannin powders from four different sources--mimosa, quebracho, chestnut and tara--are readily distinguishable from each other, both in pure commercial powder form, and in leather which they have been used to tan. Groups of signals indicative of the source, and type (condensed vs. hydrolyzable) of tannin used in the manufacture are well resolved in the spectra of the finished leathers. These fingerprints are compared with those arising from leathers tanned with other common tanning agents. Paramagnetic chromium (III) tanning causes widespread but selective disappearance of signals from the spectrum of leather collagen, including resonances from acidic aspartyl and glutamyl residues, likely bound to Cr (III) structures. Aluminium (III) and glutaraldehyde tanning both cause considerable leather collagen signal sharpening suggesting some increase in molecular structural ordering. The ²⁷Al-NMR signal from the former material is consistent with an octahedral coordination by oxygen ligands. Solid state NMR thus provides easily recognisable reagent specific spectral fingerprints of the products of vegetable and some other common tanning processes. Because spectra are related to molecular properties, NMR is potentially a powerful tool in leather process enhancement and quality or provenance assurance.

  11. A transfer of technology from engineering: use of ROC curves from signal detection theory to investigate information processing in the brain during sensory difference testing.

    PubMed

    Wichchukit, Sukanya; O'Mahony, Michael

    2010-01-01

    This article reviews a beneficial effect of technology transfer from Electrical Engineering to Food Sensory Science. Specifically, it reviews the recent adoption in Food Sensory Science of the receiver operating characteristic (ROC) curve, a tool that is incorporated in the theory of signal detection. Its use allows the information processing that takes place in the brain during sensory difference testing to be studied and understood. The review deals with how Signal Detection Theory, also called Thurstonian modeling, led to the adoption of a more sophisticated way of analyzing the data from sensory difference tests, by introducing the signal-to-noise ratio, d', as a fundamental measure of perceived small sensory differences. Generally, the method of computation of d' is a simple matter for some of the better known difference tests like the triangle, duo-trio and 2-AFC. However, there are occasions when these tests are not appropriate and other tests like the same-different and the A Not-A test are more suitable. Yet, for these, it is necessary to understand how the brain processes information during the test before d' can be computed. It is for this task that the ROC curve has a particular use. © 2010 Institute of Food Technologists®
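    In the equal-variance Gaussian model of signal detection theory, d' is the difference of the z-transformed hit and false-alarm rates taken from a point on the ROC curve. A minimal sketch using only Python's standard library:

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Signal-detection d' under the equal-variance Gaussian model:
    the distance between signal and noise distributions in z-units,
    d' = z(H) - z(F)."""
    z = NormalDist().inv_cdf      # inverse standard-normal CDF
    return z(hit_rate) - z(false_alarm_rate)
```

    For example, a hit rate of 0.84 with a false-alarm rate of 0.16 gives d' of about 1.99, while equal hit and false-alarm rates give d' = 0 (no discrimination). For tests like same-different or A Not-A, the ROC curve is what reveals which decision strategy, and hence which formula, links the response proportions to d'.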

  12. Sensor, signal, and image informatics - state of the art and current topics.

    PubMed

    Lehmann, T M; Aach, T; Witte, H

    2006-01-01

    The number of articles published annually in the fields of biomedical signal and image acquisition and processing is increasing. Based on selected examples, this survey aims at comprehensively demonstrating the recent trends and developments. Four articles are selected for biomedical data acquisition, covering topics such as dose saving in CT, C-arm X-ray imaging systems for volume imaging, and the replacement of dose-intensive CT-based diagnostics with harmonic ultrasound imaging. Regarding biomedical signal analysis (BSA), the four selected articles discuss the equivalence of different time-frequency approaches for signal analysis, an application to cochlear implants, where time-frequency analysis is applied for controlling the replacement system, recent trends in the fusion of different modalities, and the role of BSA as part of brain-machine interfaces. To cover the broad spectrum of publications in the field of biomedical image processing, six papers are discussed. Important topics are content-based image retrieval in medical applications, automatic classification of tongue photographs from traditional Chinese medicine, brain perfusion analysis in single photon emission computed tomography (SPECT), model-based visualization of vascular trees, and virtual surgery, where enhanced visualization and haptic feedback techniques are combined with a sphere-filled model of the organ. The selected papers emphasize the five fields forming the chain of biomedical data processing: (1) data acquisition, (2) data reconstruction and pre-processing, (3) data handling, (4) data analysis, and (5) data visualization. Fields 1 and 2 form sensor informatics, while fields 2 to 5 form signal or image informatics with respect to the nature of the data considered. Biomedical data acquisition and pre-processing, as well as data handling, analysis and visualization, aim at providing reliable tools for decision support that improve the quality of health care. Comprehensive evaluation of the processing methods and their reliable integration in routine applications are future challenges in the field of sensor, signal and image informatics.

  13. A new similarity index for nonlinear signal analysis based on local extrema patterns

    NASA Astrophysics Data System (ADS)

    Niknazar, Hamid; Motie Nasrabadi, Ali; Shamsollahi, Mohammad Bagher

    2018-02-01

    Common similarity measures for time-domain signals, such as cross-correlation and Symbolic Aggregate approximation (SAX), are not appropriate for nonlinear signal analysis because of the high sensitivity of nonlinear systems to initial points. Therefore, a similarity measure for nonlinear signal analysis must be invariant to initial points and quantify the similarity by considering the main dynamics of the signals. The statistical behavior of local extrema (SBLE) method was previously proposed to address this problem. The SBLE similarity index uses quantized amplitudes of local extrema to quantify the dynamical similarity of signals by considering patterns of sequential local extrema. By adding time information of the local extrema and fuzzifying the quantized values, this work proposes a new similarity index for nonlinear and long-term signal analysis that extends the SBLE method. These new features provide more information about the signals and reduce noise sensitivity through fuzzification. A number of practical tests on synthetic data demonstrate the ability of the method in nonlinear signal clustering and classification. In addition, epileptic seizure detection based on electroencephalography (EEG) signal processing was performed using the proposed similarity index to demonstrate the potential of the method as a real-world application tool.

  14. Progress on automated data analysis algorithms for ultrasonic inspection of composites

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Forsyth, David S.; Welter, John T.

    2015-03-01

    Progress is presented on the development and demonstration of automated data analysis (ADA) software to address the burden in interpreting ultrasonic inspection data for large composite structures. The automated data analysis algorithm is presented in detail, which follows standard procedures for analyzing signals for time-of-flight indications and backwall amplitude dropout. New algorithms have been implemented to reliably identify indications in time-of-flight images near the front and back walls of composite panels. Adaptive call criteria have also been applied to address sensitivity to variation in backwall signal level, panel thickness variation, and internal signal noise. ADA processing results are presented for a variety of test specimens that include inserted materials and discontinuities produced under poor manufacturing conditions. Software tools have been developed to support both ADA algorithm design and certification, producing a statistical evaluation of indication results and false calls using a matching process with predefined truth tables. Parametric studies were performed to evaluate detection and false call results with respect to varying algorithm settings.

  15. Discrete Walsh Hadamard transform based visible watermarking technique for digital color images

    NASA Astrophysics Data System (ADS)

    Santhi, V.; Thangavelu, Arunkumar

    2011-10-01

    As the size of the Internet grows enormously, illegal manipulation of digital multimedia data has become very easy with advances in technology tools. To protect such multimedia data from unauthorized access, digital watermarking systems are used. In this paper a new Discrete Walsh Hadamard Transform based visible watermarking system is proposed. As the watermark is embedded in the transform domain, the system is robust to many signal processing attacks. Moreover, in the proposed method the watermark is embedded in a tiling manner across the full range of frequencies to make it robust to compression and cropping attacks. The robustness of the algorithm is tested against noise addition, cropping, compression, histogram equalization and resizing attacks. The experimental results show that the algorithm is robust to common signal processing attacks, and the observed peak signal-to-noise ratio (PSNR) of the watermarked image varies from 20 to 30 dB depending on the size of the watermark.
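    The reported 20-30 dB figure is the standard PSNR metric, computed from the mean squared error between the original and watermarked images. A minimal sketch of the metric for 8-bit images (the embedding step itself is not shown):

```python
import numpy as np

def psnr(original, distorted, peak=255.0):
    """Peak signal-to-noise ratio in dB between two same-shape images;
    higher PSNR means the watermarked image is closer to the original."""
    mse = np.mean((np.asarray(original, float) - np.asarray(distorted, float)) ** 2)
    return float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```

    A uniform distortion of 16 grey levels on an 8-bit image, for instance, yields a PSNR of roughly 24 dB, inside the range the abstract reports for a visible watermark.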

  16. Automated Design of Board and MCM Level Digital Systems.

    DTIC Science & Technology

    1997-10-01

    Partitioning for Multicomponent Synthesis 159 Appendix K: Resource Constrained RTL Partitioning for Synthesis of Multi-FPGA Designs 169 Appendix L...digital signal processing) architectures. These target architectures, illustrated in Figure 1, can contain application-specific ASICs, FPGAs ...synthesis tools for ASIC, FPGA and MCM synthesis (Figure 8). Multicomponent Partitioning Engine The partitioning engine is a hierarchical partitioning

  17. Biophysical modelling of intra-ring variations in tracheid features and wood density of Pinus pinaster trees exposed to seasonal droughts

    Treesearch

    Sarah Wilkinson; Jerome Ogee; Jean-Christophe Domec; Mark Rayment; Lisa Wingate

    2015-01-01

    Process-based models that link seasonally varying environmental signals to morphological features within tree rings are essential tools to predict tree growth response and commercially important wood quality traits under future climate scenarios. This study evaluated model portrayal of radial growth and wood anatomy observations within a mature maritime pine (Pinus...

  18. Statistical Techniques for Signal Processing

    DTIC Science & Technology

    1993-01-12

    functions and extended influence functions of the associated underlying estimators. An interesting application of the influence function and its...and related filter structures. While the influence function is best known for its role in characterizing the robustness of estimators, the mathematical...statistics can be designed and analyzed for performance using the influence function as a tool. In particular, we have examined the mean-median

  19. Recent developments in the use of acoustic sensors and signal processing tools to target early infestations of Red Palm Weevil in agricultural environments

    USDA-ARS?s Scientific Manuscript database

    Much of the damage caused by red palm weevil larvae to date palms, ornamental palms, and palm offshoots could be mitigated by early detection and treatment of infestations. Acoustic technology has potential to enable early detection, but the short, high-frequency sound impulses produced by red palm ...

  20. Simulation Tools Prevent Signal Interference on Spacecraft

    NASA Technical Reports Server (NTRS)

    2014-01-01

    NASA engineers use simulation software to detect and prevent interference between different radio frequency (RF) systems on a rocket and satellite before launch. To speed up the process, Kennedy Space Center awarded SBIR funding to Champaign, Illinois-based Delcross Technologies LLC, which added a drag-and-drop feature to its commercial simulation software, resulting in less time spent preparing for the analysis.

  1. On the use of EEG or MEG brain imaging tools in neuromarketing research.

    PubMed

    Vecchiato, Giovanni; Astolfi, Laura; De Vico Fallani, Fabrizio; Toppi, Jlenia; Aloise, Fabio; Bez, Francesco; Wei, Daming; Kong, Wanzeng; Dai, Jounging; Cincotti, Febo; Mattia, Donatella; Babiloni, Fabio

    2011-01-01

    Here we present an overview of some published papers of interest for marketing research employing electroencephalographic (EEG) and magnetoencephalographic (MEG) methods. The interest in these methodologies lies in their high temporal resolution, as opposed to functional Magnetic Resonance Imaging (fMRI), which is also widely used in marketing research. In addition, EEG and MEG technologies have greatly improved their spatial resolution over the last decades with the introduction of advanced signal processing methodologies. By presenting data gathered through MEG and high-resolution EEG, we show what kind of information can be gathered with these methodologies while participants watch marketing-relevant stimuli. Such information relates to the memorization and pleasantness of those stimuli. We noted that temporal and frequency patterns of brain signals can provide descriptors conveying information about the cognitive and emotional processes of subjects observing commercial advertisements. This information could be unobtainable through the common tools used in standard marketing research. We also show an example of how EEG methodology can be used to analyze cultural differences in the reception of video commercials for carbonated beverages in Western and Eastern countries.

  2. On the Use of EEG or MEG Brain Imaging Tools in Neuromarketing Research

    PubMed Central

    Vecchiato, Giovanni; Astolfi, Laura; De Vico Fallani, Fabrizio; Toppi, Jlenia; Aloise, Fabio; Bez, Francesco; Wei, Daming; Kong, Wanzeng; Dai, Jounging; Cincotti, Febo; Mattia, Donatella; Babiloni, Fabio

    2011-01-01

    Here we present an overview of some published papers of interest for marketing research employing electroencephalographic (EEG) and magnetoencephalographic (MEG) methods. The interest in these methodologies lies in their high temporal resolution, as opposed to functional Magnetic Resonance Imaging (fMRI), which is also widely used in marketing research. In addition, EEG and MEG technologies have greatly improved their spatial resolution over the last decades with the introduction of advanced signal processing methodologies. By presenting data gathered through MEG and high-resolution EEG, we show what kind of information can be gathered with these methodologies while participants watch marketing-relevant stimuli. Such information relates to the memorization and pleasantness of those stimuli. We noted that temporal and frequency patterns of brain signals can provide descriptors conveying information about the cognitive and emotional processes of subjects observing commercial advertisements. This information could be unobtainable through the common tools used in standard marketing research. We also show an example of how EEG methodology can be used to analyze cultural differences in the reception of video commercials for carbonated beverages in Western and Eastern countries. PMID:21960996

  3. Topological properties of flat electroencephalography's state space

    NASA Astrophysics Data System (ADS)

    Ken, Tan Lit; Ahmad, Tahir bin; Mohd, Mohd Sham bin; Ngien, Su Kong; Suwa, Tohru; Meng, Ong Sie

    2016-02-01

    Neuroinverse problems are often associated with complex neuronal activity. They involve locating problematic cells, which is highly challenging. While epileptic foci localization is possible with the aid of EEG signals, it relies greatly on the ability to extract hidden information or patterns within the EEG signals. Flat EEG, an enhancement of EEG, is a way of viewing the electroencephalograph on the real plane. From the perspective of dynamical systems, Flat EEG is equivalent to epileptic seizure, making it a good platform for studying epileptic seizures. Over the years, various mathematical tools have been applied to Flat EEG to extract hidden information that is hardly noticeable by traditional visual inspection. While these tools have given worthy results, a complete understanding of the seizure process has yet to be achieved. Since the underlying structure of Flat EEG is dynamic and is deemed to contain rich information regarding the brainstorm, it is appealing to explore its structures in depth. To better understand the complex seizure process, this paper studies the event of epileptic seizure via Flat EEG in a more general framework by means of topology, particularly on the state space where the event of Flat EEG lies.

  4. Review of functional near-infrared spectroscopy in neurorehabilitation

    PubMed Central

    Mihara, Masahito; Miyai, Ichiro

    2016-01-01

    We provide a brief overview of the research and clinical applications of near-infrared spectroscopy (NIRS) in the neurorehabilitation field. NIRS has several potential advantages and shortcomings as a neuroimaging tool and is suitable for research application in the rehabilitation field. As one of the main applications of NIRS, we discuss its application as a monitoring tool, including investigating the neural mechanism of functional recovery after brain damage and investigating the neural mechanisms for controlling bipedal locomotion and postural balance in humans. In addition to being a monitoring tool, advances in signal processing techniques allow us to use NIRS as a therapeutic tool in this field. With a brief summary of recent studies investigating the clinical application of NIRS using motor imagery task, we discuss the possible clinical usage of NIRS in brain–computer interface and neurofeedback. PMID:27429995

  5. Low-power, high-speed 1-bit inexact Full Adder cell designs applicable to low-energy image processing

    NASA Astrophysics Data System (ADS)

    Zareei, Zahra; Navi, Keivan; Keshavarziyan, Peiman

    2018-03-01

    In this paper, three novel low-power, high-speed 1-bit inexact Full Adder cell designs are presented, based for the first time on current-mode logic in 32 nm carbon nanotube field-effect transistor technology. Circuit-level figures of merit, i.e. power, delay, and power-delay product, as well as an application-level metric, error distance, are considered to assess the efficiency of the proposed cells over their counterparts. The effect of voltage scaling and temperature variation on the proposed cells is studied using the HSPICE tool. Moreover, using MATLAB, the peak signal-to-noise ratio of the proposed cells is evaluated in an image-processing application, a motion detector. Simulation results confirm the efficiency of the proposed cells.
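
    As a rough illustration of the image-quality metric used in this record, the peak signal-to-noise ratio can be computed from the mean squared error between an exact result and its approximate counterpart. The sketch below is a generic PSNR calculation in Python (not the paper's MATLAB setup); the toy images and error magnitudes are invented for illustration.

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio (dB) between two images."""
    reference = np.asarray(reference, dtype=np.float64)
    test = np.asarray(test, dtype=np.float64)
    mse = np.mean((reference - test) ** 2)
    if mse == 0.0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)

# Invented toy data: an "exact" 8-bit image and a slightly
# perturbed "inexact adder" version of it.
rng = np.random.default_rng(0)
exact = rng.integers(0, 256, size=(64, 64))
inexact = np.clip(exact + rng.integers(-2, 3, size=(64, 64)), 0, 255)
print(f"PSNR = {psnr(exact, inexact):.2f} dB")
```

    Higher PSNR means the inexact circuit's output is closer to the exact one; identical images give infinite PSNR.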

  6. Extended wavelet transformation to digital holographic reconstruction: application to the elliptical, astigmatic Gaussian beams.

    PubMed

    Remacha, Clément; Coëtmellec, Sébastien; Brunel, Marc; Lebrun, Denis

    2013-02-01

    Wavelet analysis provides an efficient tool in numerous signal processing problems and has been implemented in optical processing techniques, such as in-line holography. This paper proposes an improvement of this tool for the case of an elliptical, astigmatic Gaussian (AEG) beam. We show that this mathematical operator allows reconstructing an image of a spherical particle without compression of the reconstructed image, which increases the accuracy of the 3D location of particles and of their size measurement. To validate the performance of this operator we have studied the diffraction pattern produced by a particle illuminated by an AEG beam. This study used mutual intensity propagation, and the particle is defined as a chirped Gaussian sum. The proposed technique was applied and the experimental results are presented.

  7. A New Tool for Industry

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The ultrasonic P2L2 bolt monitor is a new industrial tool, developed at Langley Research Center, that is lightweight, portable, extremely accurate (because it is not subject to friction error), and cost-competitive with the least expensive of other types of accurate strain monitors. P2L2 is an acronym for Pulse Phase Locked Loop. The ultrasound system, which measures the stress that occurs when a bolt becomes elongated in the process of tightening, transmits sound waves to the bolt being fastened and receives a return signal indicating changes in bolt stress. Results are translated into a digital reading of the actual stress on the bolt. The device monitors the bolt tensioning process on mine roof bolts, providing increased safety within the mine. It also has utility in industrial applications.

  8. Noise in any frequency range can enhance information transmission in a sensory neuron

    NASA Astrophysics Data System (ADS)

    Levin, Jacob E.

    1997-05-01

    The effect of noise on the neural encoding of broadband signals was investigated in the cricket cercal system, a mechanosensory system sensitive to small near-field air particle disturbances. Known air current stimuli were presented to the cricket through audio speakers in a controlled environment under a variety of background noise conditions. Spike trains from the second layer of neuronal processing, the primary sensory interneurons, were recorded with intracellular electrodes, and the performance of these neurons was characterized with the tools of information theory. SNR, mutual information rates, and other measures of encoding accuracy were calculated for single-frequency, narrowband, and broadband signals over the entire amplitude sensitivity range of the cells, in the presence of an uncorrelated noise background also spanning the cells' frequency and amplitude sensitivity range. Significant enhancements of transmitted information through the addition of external noise were observed regardless of the frequency range of either the signal or noise waveforms, provided both were within the operating range of the cell. Considerable improvements in signal encoding were observed over almost an entire order of magnitude of near-threshold signal amplitudes. This included sinusoidal signals embedded in broadband white noise, broadband signals in broadband noise, and even broadband signals presented with narrowband noise in a completely non-overlapping frequency range. The noise-related increases in mutual information rate for broadband signals were as high as 150%, and up to 600% increases in SNR were observed for sinusoidal signals. Additionally, it was shown that the amount of information about the signal carried, on average, by each spike was increased for small signals when presented with noise, implying that added input noise can, in certain situations, actually improve the accuracy of the encoding process itself.
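
    The noise-enhanced encoding described here is an instance of stochastic resonance, which a minimal threshold-detector model can reproduce: a sub-threshold sinusoid alone produces no output events, while moderate added noise lets threshold crossings carry power at the stimulus frequency. The sketch below is a generic illustration, not a model of the cricket cercal data; all parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
fs, f_sig, n = 1000.0, 10.0, 4000
t = np.arange(n) / fs
signal = 0.5 * np.sin(2 * np.pi * f_sig * t)  # sub-threshold stimulus
threshold = 1.0

def power_at_signal_freq(noise_std):
    """Threshold signal+noise, then measure output spectral power
    in the FFT bin of the stimulus frequency."""
    x = signal + rng.normal(0.0, noise_std, n)
    events = (x > threshold).astype(float)      # crude "spike" train
    spec = np.abs(np.fft.rfft(events - events.mean())) ** 2 / n
    k = int(round(f_sig * n / fs))              # bin index of f_sig
    return spec[k]

p_silent = power_at_signal_freq(0.0)   # no noise: no crossings at all
p_noisy = power_at_signal_freq(0.4)    # moderate noise helps
print(p_silent, p_noisy)
```

    With no noise the output is identically zero; with moderate noise the crossings cluster near the stimulus peaks, so the output spectrum gains power at the stimulus frequency.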

  9. Method and apparatus for transmitting and receiving data to and from a downhole tool

    DOEpatents

    Hall, David R.; Fox, Joe

    2007-03-13

    A transmission line network system for transmitting and/or receiving data from a downhole tool. The invention is achieved by providing one or more transceiving elements, preferably rings, at either end of a downhole tool. A conduit containing a coaxial cable capable of communicating an electrical signal is attached to the transceiving element and extends through a central bore of the downhole tool and through the central bore of any tool intermediate the first transceiving element and a second transceiving element. Upon receiving an electrical signal from the cable, the second transceiving element may convert such signal to a magnetic field. The magnetic field may be detected by a third transceiving element in close proximity to the second transceiving element. In this manner, many different tools may be included in a downhole transmission network without requiring substantial modification, if any, of any particular tool.

  10. Ground Penetrating Radar Imaging of Ancient Clastic Deposits: A Tool for Three-Dimensional Outcrop Studies

    NASA Astrophysics Data System (ADS)

    Akinpelu, Oluwatosin Caleb

    The growing need for better definition of flow units and depositional heterogeneities in petroleum reservoirs and aquifers has stimulated a renewed interest in outcrop studies as reservoir analogues in the last two decades. Despite this surge in interest, outcrop studies remain largely two-dimensional; a major limitation to direct application of outcrop knowledge to the three dimensional heterogeneous world of subsurface reservoirs. Behind-outcrop Ground Penetrating Radar (GPR) imaging provides high-resolution geophysical data, which when combined with two dimensional architectural outcrop observation, becomes a powerful interpretation tool. Due to the high resolution, non-destructive and non-invasive nature of the GPR signal, as well as its reflection-amplitude sensitivity to shaly lithologies, three-dimensional outcrop studies combining two dimensional architectural element data and behind-outcrop GPR imaging hold significant promise with the potential to revolutionize outcrop studies the way seismic imaging changed basin analysis. Earlier attempts at GPR imaging on ancient clastic deposits were fraught with difficulties resulting from inappropriate field techniques and subsequent poorly-informed data processing steps. This project documents advances in GPR field methodology, recommends appropriate data collection and processing procedures and validates the value of integrating outcrop-based architectural-element mapping with GPR imaging to obtain three dimensional architectural data from outcrops. Case studies from a variety of clastic deposits: Whirlpool Formation (Niagara Escarpment), Navajo Sandstone (Moab, Utah), Dunvegan Formation (Pink Mountain, British Columbia), Chinle Formation (Southern Utah) and St. 
Mary River Formation (Alberta) demonstrate the usefulness of this approach for better interpretation of outcrop-scale ancient depositional processes and ultimately as a tool for refining existing facies models, as well as a predictive tool for subsurface reservoir modelling. While this approach is quite promising for detailed three-dimensional outcrop studies, it is not a panacea; thick overburden, poor antenna-ground coupling in the rough terrain typical of outcrops, low penetration, and rapid signal attenuation in mudstone and diagenetic clay-rich deposits often limit the prospects of this novel technique.

  11. Seizure classification in EEG signals utilizing Hilbert-Huang transform

    PubMed Central

    2011-01-01

    Background Classification methods capable of recognizing abnormal brain activity rely on either brain imaging or brain signal analysis. The abnormal activity of interest in this study is characterized by a disturbance caused by changes in neuronal electrochemical activity that result in abnormal synchronous discharges. The method aims at helping physicians discriminate between healthy and seizure electroencephalographic (EEG) signals. Method Discrimination in this work is achieved by analyzing EEG signals obtained from freely accessible databases. MATLAB has been used to implement and test the proposed classification algorithm. The analysis presents a classification of normal and ictal activities using a feature based on the Hilbert-Huang Transform. Through this method, information related to the intrinsic mode functions contained in the EEG signal is extracted to track the local amplitude and frequency of the signal. Based on this local information, weighted frequencies are calculated and a comparison between ictal and seizure-free determinant intrinsic functions is performed. The methods of comparison used are the t-test and Euclidean clustering. Results The t-test results in a P-value < 0.02 and the clustering leads to accurate (94%) and specific (96%) results. The proposed method is also contrasted against Multivariate Empirical Mode Decomposition, which reaches 80% accuracy. The comparison results strengthen the contribution of this paper not only in terms of accuracy but also with respect to fast response and ease of use. Conclusion An original tool for EEG signal processing, giving physicians the possibility of diagnosing brain functionality abnormalities, is presented in this paper. The proposed system bears the potential of providing several credible benefits, such as fast diagnosis, high accuracy, good sensitivity and specificity, time saving, and user friendliness. Furthermore, the classification of mode mixing can be achieved using the extracted instantaneous information of every IMF, but it would most likely be a hard task if only the average value were used. Extra benefits of the proposed system include low cost and ease of interface. All of this indicates the usefulness of the tool and its value as an efficient diagnostic tool. PMID:21609459

  12. Seizure classification in EEG signals utilizing Hilbert-Huang transform.

    PubMed

    Oweis, Rami J; Abdulhay, Enas W

    2011-05-24

    Classification methods capable of recognizing abnormal brain activity rely on either brain imaging or brain signal analysis. The abnormal activity of interest in this study is characterized by a disturbance caused by changes in neuronal electrochemical activity that result in abnormal synchronous discharges. The method aims at helping physicians discriminate between healthy and seizure electroencephalographic (EEG) signals. Discrimination in this work is achieved by analyzing EEG signals obtained from freely accessible databases. MATLAB has been used to implement and test the proposed classification algorithm. The analysis presents a classification of normal and ictal activities using a feature based on the Hilbert-Huang Transform. Through this method, information related to the intrinsic mode functions contained in the EEG signal is extracted to track the local amplitude and frequency of the signal. Based on this local information, weighted frequencies are calculated and a comparison between ictal and seizure-free determinant intrinsic functions is performed. The methods of comparison used are the t-test and Euclidean clustering. The t-test results in a P-value < 0.02 and the clustering leads to accurate (94%) and specific (96%) results. The proposed method is also contrasted against Multivariate Empirical Mode Decomposition, which reaches 80% accuracy. The comparison results strengthen the contribution of this paper not only in terms of accuracy but also with respect to fast response and ease of use. An original tool for EEG signal processing, giving physicians the possibility of diagnosing brain functionality abnormalities, is presented in this paper. The proposed system bears the potential of providing several credible benefits, such as fast diagnosis, high accuracy, good sensitivity and specificity, time saving, and user friendliness. Furthermore, the classification of mode mixing can be achieved using the extracted instantaneous information of every IMF, but it would most likely be a hard task if only the average value were used. Extra benefits of the proposed system include low cost and ease of interface. All of this indicates the usefulness of the tool and its value as an efficient diagnostic tool.
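
    The amplitude-weighted frequency feature described above can be sketched in a few lines. This is a minimal illustration using SciPy's Hilbert transform applied directly to a raw signal; the paper applies it to intrinsic mode functions obtained via empirical mode decomposition, a step omitted here, and the example signals are invented.

```python
import numpy as np
from scipy.signal import hilbert

def weighted_mean_frequency(x, fs):
    """Amplitude-weighted mean instantaneous frequency (Hz) of a
    signal, computed from its analytic (Hilbert) representation."""
    analytic = hilbert(x)
    amplitude = np.abs(analytic)
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) * fs / (2.0 * np.pi)  # length n-1
    weights = amplitude[:-1] ** 2                     # energy weighting
    return np.sum(weights * inst_freq) / np.sum(weights)

fs = 256.0
t = np.arange(0.0, 4.0, 1.0 / fs)
slow = np.sin(2 * np.pi * 3 * t)    # background-like rhythm, ~3 Hz
fast = np.sin(2 * np.pi * 20 * t)   # ictal-like rhythm, ~20 Hz
print(weighted_mean_frequency(slow, fs), weighted_mean_frequency(fast, fs))
```

    Such a scalar summary lets the two activity classes be compared by a t-test or clustered, as in the records above.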

  13. Recent advances in synthetic biology of cyanobacteria.

    PubMed

    Sengupta, Annesha; Pakrasi, Himadri B; Wangikar, Pramod P

    2018-05-09

    Cyanobacteria are attractive hosts that can be engineered for the photosynthetic production of fuels, fine chemicals, and proteins from CO2. Moreover, the responsiveness of these photoautotrophs to different environmental signals, such as light, CO2, the diurnal cycle, and metals, makes them potential hosts for the development of biosensors. However, engineering these hosts proves to be a challenging and lengthy process. Synthetic biology can make the process of biological engineering more predictable through the use of standardized, well-characterized biological parts and tools to assemble them. While significant progress has been made with model heterotrophic organisms, many of the parts and tools are not portable to cyanobacteria. Therefore, efforts are underway to develop and characterize parts derived from cyanobacteria. In this review, we discuss the reported parts and tools with the objective of developing cyanobacteria as cell factories or biosensors. We also discuss issues related to characterization, tunability, and portability, and the need to develop enabling technologies to engineer this "green" chassis.

  14. Battlefield decision aid for acoustical ground sensors with interface to meteorological data sources

    NASA Astrophysics Data System (ADS)

    Wilson, D. Keith; Noble, John M.; VanAartsen, Bruce H.; Szeto, Gregory L.

    2001-08-01

    The performance of acoustical ground sensors depends heavily on the local atmospheric and terrain conditions. This paper describes a prototype physics-based decision aid, called the Acoustic Battlefield Aid (ABFA), for predicting these environmental effects. ABFA integrates advanced models for acoustic propagation, atmospheric structure, and array signal processing into a convenient graphical user interface. The propagation calculations are performed in the frequency domain on user-definable target spectra. The solution method involves a parabolic approximation to the wave equation combined with a terrain diffraction model. Sensor performance is characterized with Cramer-Rao lower bounds (CRLBs). The CRLB calculations include randomization of signal energy and wavefront orientation resulting from atmospheric turbulence. Available performance characterizations include signal-to-noise ratio, probability of detection, direction-finding accuracy for isolated receiving arrays, and location-finding accuracy for networked receiving arrays. A suite of integrated tools allows users to create new target descriptions from standard digitized audio files and to design new sensor array layouts. These tools optionally interface with the ARL Database/Automatic Target Recognition (ATR) Laboratory, providing access to an extensive library of target signatures. ABFA also includes a Java-based capability for network access of near real-time data from surface weather stations or forecasts from the Army's Integrated Meteorological System. As an example, the detection footprint of an acoustical sensor, as it evolves over a 13-hour period, is calculated.
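
    For intuition on the SNR-to-detection-probability mapping that such a decision aid reports, the sketch below evaluates the textbook receiver operating characteristic of a coherent (matched-filter) detector in Gaussian noise. This is a generic relationship, not ABFA's actual sensor model, and the SNR and false-alarm values are illustrative.

```python
from math import sqrt
from scipy.stats import norm

def detection_probability(snr_db, pfa):
    """Textbook coherent-detector ROC point in Gaussian noise:
    Pd = Q(Q^-1(Pfa) - d), with deflection d = sqrt(2 * SNR)."""
    d = sqrt(2.0 * 10.0 ** (snr_db / 10.0))
    return norm.sf(norm.isf(pfa) - d)

for snr_db in (0.0, 5.0, 10.0):
    print(snr_db, detection_probability(snr_db, 1e-3))
```

    As expected, the probability of detection rises monotonically with SNR at a fixed false-alarm rate, which is what makes a detection-footprint map possible once the propagation model supplies SNR versus range.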

  15. Modeling biochemical transformation processes and information processing with Narrator

    PubMed Central

    Mandel, Johannes J; Fuß, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-01-01

    Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with, and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation, and biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible architecture which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language, or Gillespie's direct method. This powerful feature facilitates reuse, interoperability, and conceptual model development.
Conclusion Narrator is a flexible and intuitive systems biology tool. It is specifically intended for users aiming to construct and simulate dynamic models of biology without recourse to extensive mathematical detail. Its design facilitates mappings to different formal languages and frameworks. The combined set of features makes Narrator unique among tools of its kind. Narrator is implemented as a Java software program and is available as open source. PMID:17389034

  16. A G Protein-biased Designer G Protein-coupled Receptor Useful for Studying the Physiological Relevance of Gq/11-dependent Signaling Pathways.

    PubMed

    Hu, Jianxin; Stern, Matthew; Gimenez, Luis E; Wanka, Lizzy; Zhu, Lu; Rossi, Mario; Meister, Jaroslawna; Inoue, Asuka; Beck-Sickinger, Annette G; Gurevich, Vsevolod V; Wess, Jürgen

    2016-04-08

    Designer receptors exclusively activated by a designer drug (DREADDs) are clozapine-N-oxide-sensitive designer G protein-coupled receptors (GPCRs) that have emerged as powerful novel chemogenetic tools to study the physiological relevance of GPCR signaling pathways in specific cell types or tissues. Like endogenous GPCRs, clozapine-N-oxide-activated DREADDs do not only activate heterotrimeric G proteins but can also trigger β-arrestin-dependent (G protein-independent) signaling. To dissect the relative physiological relevance of G protein-mediated versus β-arrestin-mediated signaling in different cell types or physiological processes, the availability of G protein- and β-arrestin-biased DREADDs would be highly desirable. In this study, we report the development of a mutationally modified version of a non-biased DREADD derived from the M3 muscarinic receptor that can activate Gq/11 with high efficacy but lacks the ability to interact with β-arrestins. We also demonstrate that this novel DREADD is active in vivo and that cell type-selective expression of this new designer receptor can provide novel insights into the physiological roles of G protein (Gq/11)-dependent versus β-arrestin-dependent signaling in hepatocytes. Thus, this novel Gq/11-biased DREADD represents a powerful new tool to study the physiological relevance of Gq/11-dependent signaling in distinct tissues and cell types, in the absence of β-arrestin-mediated cellular effects. Such studies should guide the development of novel classes of functionally biased ligands that show high efficacy in various pathophysiological conditions but display a reduced incidence of side effects. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  17. Angular analysis of the cyclic impacting oscillations in a robotic grinding process

    NASA Astrophysics Data System (ADS)

    Rafieian, Farzad; Girardin, François; Liu, Zhaoheng; Thomas, Marc; Hazel, Bruce

    2014-02-01

    In a robotic machining process, a light-weight cutter or grinder is usually held by an articulated robot arm. Material removal is achieved by the rotating cutting tool while the robot end effector ensures that the tool follows a programmed trajectory in order to work on complex curved surfaces or to access hard-to-reach areas. One typical application of such process is maintenance and repair work on hydropower equipment. This paper presents an experimental study of the dynamic characteristics of material removal in robotic grinding, which is unlike conventional grinding due to the lower structural stiffness of the tool-holder robot. The objective of the study is to explore the cyclic nature of this mechanical operation to provide the basis for future development of better process control strategies. Grinding tasks that minimize the number of iterations to converge to the target surface can be better planned based on a good understanding and modeling of the cyclic material removal mechanism. A single degree of freedom dynamic analysis of the process suggests that material removal is performed through high-frequency impacts that mainly last for only a small fraction of the grinding disk rotation period. To detect these discrete cutting events in practice, a grinder is equipped with a rotary encoder. The encoder's signal is acquired through the angular sampling technique. A running cyclic synchronous average is applied to the speed signal to remove its non-cyclic events. The measured instantaneous rotational frequency clearly indicates the impacting nature of the process and captures the transient response excited by these cyclic impacts. The technique also locates the angular positions of cutting impacts in revolution cycles. It is thus possible to draw conclusions about the cyclic nature of dynamic changes in impact-cutting behavior when grinding with a flexible robot. 
The dynamics of the impacting regime and transient responses to impact-cutting excitations captured synchronously using the angular sampling technique provide feedback that can be used to regulate the material removal process. The experimental results also make it possible to correlate the energy required to remove a chip of metal through impacting with the measured drop in angular speed during grinding.
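
    The cyclic synchronous average used above can be sketched as follows. This toy version assumes the signal is already sampled at a fixed number of points per revolution (angular-domain sampling) and uses an invented once-per-revolution impact signature buried in noise, not the authors' encoder data.

```python
import numpy as np

def synchronous_average(x, samples_per_rev):
    """Average a signal over whole revolutions: components locked to
    the rotation cycle survive, asynchronous content averages out."""
    n_rev = len(x) // samples_per_rev
    cycles = x[: n_rev * samples_per_rev].reshape(n_rev, samples_per_rev)
    return cycles.mean(axis=0)

spr, n_rev = 200, 50
angle = np.linspace(0.0, 2.0 * np.pi, spr, endpoint=False)
rng = np.random.default_rng(1)
# Invented data: a once-per-revolution impact signature in heavy noise
per_rev = np.exp(-((angle - np.pi) ** 2) / 0.01)
x = np.tile(per_rev, n_rev) + rng.normal(0.0, 1.0, spr * n_rev)
avg = synchronous_average(x, spr)
print(int(np.argmax(avg)))  # angular index of the recovered impact
```

    Averaging over N revolutions attenuates non-cyclic content by roughly sqrt(N), which is why the angular position of the cutting impacts becomes visible in the averaged speed signal.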

  18. Nested effects models for learning signaling networks from perturbation data.

    PubMed

    Fröhlich, Holger; Tresch, Achim; Beissbarth, Tim

    2009-04-01

    Targeted gene perturbations have become a major tool for gaining insight into complex cellular processes. In combination with the measurement of downstream effects via DNA microarrays, this approach can be used to gain insight into signaling pathways. Nested Effects Models were first introduced by Markowetz et al. as a probabilistic method to reverse engineer signaling cascades based on the nested structure of downstream perturbation effects. The basic framework was substantially extended later on by Fröhlich et al., Markowetz et al., and Tresch and Markowetz. In this paper, we present a review of the complete methodology with a detailed comparison of the algorithms proposed so far, on both a qualitative and quantitative level. As an application, we present results on estimating the signaling network between 13 genes in the ER-alpha pathway of human MCF-7 breast cancer cells. Comparison with the literature shows a substantial overlap.

  19. Atomic layer deposition modified track-etched conical nanochannels for protein sensing.

    PubMed

    Wang, Ceming; Fu, Qibin; Wang, Xinwei; Kong, Delin; Sheng, Qian; Wang, Yugang; Chen, Qiang; Xue, Jianming

    2015-08-18

    Nanopore-based devices have recently become popular tools for detecting biomolecules at the single-molecule level. Unlike long-chain nucleic acids, protein molecules are still quite challenging to detect, since they are much smaller in size and usually travel through the nanopore too fast, yielding a poor signal-to-noise ratio in the induced transport signals. In this work, we demonstrate a new type of nanopore device based on atomic layer deposition (ALD) Al2O3-modified track-etched conical nanochannels for protein sensing. These devices show very promising properties: a high protein (bovine serum albumin) capture rate with well time-resolved transport signals and an excellent signal-to-noise ratio for the transport events. Also, a special mechanism involving a transient process of ion redistribution inside the nanochannel is proposed to explain the unusual biphasic waveshapes of the current change induced by protein transport.

  20. Fault diagnosis of motor bearing with speed fluctuation via angular resampling of transient sound signals

    NASA Astrophysics Data System (ADS)

    Lu, Siliang; Wang, Xiaoxian; He, Qingbo; Liu, Fang; Liu, Yongbin

    2016-12-01

    Transient signal analysis (TSA) has been proven an effective tool for motor bearing fault diagnosis, but has yet to be applied to bearing fault signals with variable rotating speed. In this study, a new TSA-based angular resampling (TSAAR) method is proposed for fault diagnosis under speed-fluctuation conditions via sound signal analysis. By applying the TSAAR method, the frequency-smearing phenomenon is eliminated and the fault characteristic frequency is exposed in the envelope spectrum for bearing fault recognition. The TSAAR method accurately estimates the phase information of the fault-induced impulses using neither complicated time-frequency analysis techniques nor external speed sensors, and hence provides a simple, flexible, data-driven approach to variable-speed motor bearing fault diagnosis. The effectiveness and efficiency of the proposed TSAAR method are verified through a series of simulated and experimental case studies.
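
    The angular-resampling step that methods like TSAAR build on can be sketched as follows: given an estimate of the shaft phase, the signal is interpolated onto a uniform angle grid, so a speed-locked component appears at a constant "order" despite the speed fluctuation. All parameters below are illustrative, not taken from the paper.

```python
import numpy as np

# Simulate a shaft whose speed fluctuates around 30 Hz
fs = 10_000                                   # sampling rate [Hz]
t = np.arange(0, 2.0, 1 / fs)                 # 2 s of data
f_inst = 30 + 5 * np.sin(2 * np.pi * 0.5 * t)          # instantaneous speed [Hz]
theta = 2 * np.pi * np.cumsum(f_inst) / fs             # shaft phase [rad]

# A component locked to 4x the shaft speed (a fault "order" of 4) plus noise
x = np.cos(4 * theta) + 0.1 * np.random.default_rng(0).normal(size=t.size)

# Angular resampling: interpolate x onto a uniform phase grid
samples_per_rev = 64
theta_uniform = np.arange(theta[0], theta[-1], 2 * np.pi / samples_per_rev)
x_angle = np.interp(theta_uniform, theta, x)

# In the angle domain the component sits at a sharp order of 4,
# with no frequency smearing from the speed fluctuation
spectrum = np.abs(np.fft.rfft(x_angle * np.hanning(x_angle.size)))
orders = np.fft.rfftfreq(x_angle.size, d=1 / samples_per_rev)  # cycles/rev
peak_order = orders[np.argmax(spectrum[1:]) + 1]
```

    Running the same FFT on the raw time-domain signal would instead show a line smeared between 100 and 140 Hz, which is the phenomenon angular resampling removes.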

  1. Magnetoencephalographic imaging of deep corticostriatal network activity during a rewards paradigm.

    PubMed

    Kanal, Eliezer Y; Sun, Mingui; Ozkurt, Tolga E; Jia, Wenyan; Sclabassi, Robert

    2009-01-01

    The human rewards network is a complex system spanning both cortical and subcortical regions. While much is known about the functions of the various components of the network, research on the behavior of the network as a whole has been stymied due to an inability to detect signals at a high enough temporal resolution from both superficial and deep network components simultaneously. In this paper, we describe the application of magnetoencephalographic imaging (MEG) combined with advanced signal processing techniques to this problem. Using data collected while subjects performed a rewards-related gambling paradigm demonstrated to activate the rewards network, we were able to identify neural signals which correspond to deep network activity. We also show that this signal was not observable prior to filtration. These results suggest that MEG imaging may be a viable tool for the detection of deep neural activity.

  2. LSST camera readout chip ASPIC: test tools

    NASA Astrophysics Data System (ADS)

    Antilogus, P.; Bailly, Ph; Jeglot, J.; Juramy, C.; Lebbolo, H.; Martin, D.; Moniez, M.; Tocut, V.; Wicek, F.

    2012-02-01

    The LSST camera will have more than 3000 video-processing channels. The readout of this large focal plane requires a very compact readout chain. The Correlated Double Sampling (CDS) technique, which is generally used for the signal readout of CCDs, is also adopted for this application and implemented with the so-called ''dual slope integrator'' method. We have designed and implemented an ASIC for LSST: the Analog Signal Processing asIC (ASPIC). The goal is to amplify the signal close to the output, in order to maximize the signal-to-noise ratio, and to send differential outputs to the digitization stage. Other requirements are that each chip should process the output of half a CCD, that is, 8 channels, and should operate at 173 K. A specific back-end board has been designed especially for lab test purposes. It manages the clock signals, digitizes the analog differential outputs of the ASPIC and stores the data in memory. It contains 8 ADCs (18 bits), 512 kwords of memory and a USB interface. An FPGA manages all signals from/to the components on board and generates the timing sequence for the ASPIC. Its firmware is written in the Verilog and VHDL languages. Internal registers allow various test parameters of the ASPIC to be defined, and a LabVIEW GUI allows these registers to be loaded or updated and proper operation to be checked. Several series of tests, including linearity, noise and crosstalk, have been performed over the past year to characterize the ASPIC at room and cold temperatures. At present, the ASPIC, back-end board and CCD detectors are being integrated to characterize the whole readout chain.
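
    The principle of Correlated Double Sampling can be illustrated with a minimal numerical sketch: each pixel is sampled once at its reset level and once after charge transfer, and the difference cancels the reset offset common to both samples. All noise figures below are invented for illustration and are not the LSST electronics' values.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pixels = 10_000

signal = 500.0                                  # true pixel signal [arb. units]
reset_offset = rng.normal(0, 50, n_pixels)      # kTC-like reset offset, pixel to pixel
read_noise = 5.0                                # uncorrelated amplifier noise

# Two samples per pixel: the reset level, and the level after charge transfer
sample_reset = reset_offset + rng.normal(0, read_noise, n_pixels)
sample_signal = reset_offset + signal + rng.normal(0, read_noise, n_pixels)

# CDS output: the difference removes the common reset offset entirely,
# leaving the signal plus sqrt(2) times the uncorrelated read noise
cds = sample_signal - sample_reset

spread_raw = np.std(sample_signal)   # dominated by the ~50-unit reset offset
spread_cds = np.std(cds)             # ~ sqrt(2) * 5 ≈ 7 units
```

    The dual-slope integrator mentioned in the abstract realizes this subtraction in analog form, by integrating the reset level with one slope polarity and the signal level with the other.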

  3. FITPix COMBO—Timepix detector with integrated analog signal spectrometric readout

    NASA Astrophysics Data System (ADS)

    Holik, M.; Kraus, V.; Georgiev, V.; Granja, C.

    2016-02-01

    The hybrid semiconductor pixel detector Timepix has proven a powerful tool in radiation detection and imaging. Energy loss and directional sensitivity, as well as particle-type resolving power, are possible through high-resolution particle tracking and per-pixel energy and quantum-counting capability. The spectrometric resolving power of the detector can be further enhanced by analyzing the analog signal of the detector's common sensor electrode (also called the back-side pulse). In this work we present a new compact readout interface, based on the FITPix readout architecture, extended with integrated analog electronics for the detector's common sensor signal. Integrating simultaneous operation of the digital per-pixel information with the common-sensor analog pulse processing circuitry in one device enhances the detector capabilities and opens new applications. Thanks to noise suppression and built-in electromagnetic interference shielding, the common hardware platform enables parallel analog spectroscopy on the back-side pulse signal alongside full operation and read-out of the pixelated digital part; the noise level is 600 keV and the spectrometric resolution around 100 keV for 5.5 MeV alpha particles. Self-triggering is implemented with a delay of a few tens of ns, making use of an adjustable low-energy threshold on the analog signal amplitude. The digital pixelated full frame can thus be triggered and recorded together with the common-sensor analog signal. The waveform, sampled at 100 MHz, can be recorded in an adjustable time window, including time prior to the trigger. An integrated software tool provides control, on-line display and read-out of both the analog and digital channels. The pixelated digital record and the analog waveform are synchronized and written out with a common time stamp.

  4. Ultrasonic friction power during Al wire wedge-wedge bonding

    NASA Astrophysics Data System (ADS)

    Shah, A.; Gaul, H.; Schneider-Ramelow, M.; Reichl, H.; Mayer, M.; Zhou, Y.

    2009-07-01

    Al wire bonding, also called ultrasonic wedge-wedge bonding, is a microwelding process used extensively in the microelectronics industry for interconnections to integrated circuits. The bonding wire used is a 25 μm diameter AlSi1 wire. A friction power model is used to derive the ultrasonic friction power during Al wire bonding. Auxiliary measurements include the current delivered to the ultrasonic transducer, the vibration amplitude of the bonding tool tip in free air, and the ultrasonic force acting on the bonding pad during the bond process. The ultrasonic force measurement serves as a signature of the bond, allowing detailed insight into the mechanisms at work during the various phases of the process. It is measured using piezoresistive force microsensors integrated close to the Al bonding pad (Al-Al process) on a custom-made test chip. A clear break-off in the force signal is observed, followed by a relatively constant force for a short duration. A large second-harmonic content is observed, describing a nonsymmetric deviation of the signal waveform from the sinusoidal shape. This deviation might be due to the reduced geometrical symmetry of the wedge tool. For bonds made with typical process parameters, several characteristic values used in the friction power model are determined. The ultrasonic compliance of the bonding system is 2.66 μm/N. A typical maximum value of the relative interfacial amplitude of ultrasonic friction is at least 222 nm. The maximum interfacial friction power is at least 11.5 mW, which is only about 4.8% of the total electrical power delivered to the ultrasonic generator.
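
    A back-of-the-envelope check of numbers of this kind: for Coulomb-type friction with sinusoidal relative sliding of amplitude a at frequency f, the energy dissipated per cycle is 4·F·a, so the mean friction power is P = 4·f·F·a. The ultrasound frequency and the implied friction force below are assumptions for illustration; only the amplitude and power figures come from the abstract, and the paper's actual friction power model may differ.

```python
# Assumed, illustrative values (f is a typical wedge-bonding frequency,
# not stated in the abstract); a and P are quoted in the abstract.
f = 100e3        # ultrasonic frequency [Hz]  (assumption)
a = 222e-9       # relative interfacial amplitude [m]
P = 11.5e-3      # interfacial friction power [W]

# Coulomb-friction estimate: P = 4 * f * F * a  =>  solve for F
F = P / (4 * f * a)      # implied mean interfacial friction force [N]
```

    The result, on the order of 0.1 N, is a plausible scale for the tangential force at a wire bond interface, which is a useful sanity check on the quoted power.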

  5. SPIKE – a database, visualization and analysis tool of cellular signaling pathways

    PubMed Central

    Elkon, Ran; Vesterman, Rita; Amit, Nira; Ulitsky, Igor; Zohar, Idan; Weisz, Mali; Mass, Gilad; Orlev, Nir; Sternberg, Giora; Blekhman, Ran; Assa, Jackie; Shiloh, Yosef; Shamir, Ron

    2008-01-01

    Background: Biological signaling pathways that govern cellular physiology form an intricate web of tightly regulated interlocking processes. Data on these regulatory networks are accumulating at an unprecedented pace. The assimilation, visualization and interpretation of these data have become a major challenge in biological research, and once met, will greatly boost our ability to understand cell functioning on a systems level. Results: To cope with this challenge, we are developing the SPIKE knowledge-base of signaling pathways. SPIKE contains three main software components: 1) A database (DB) of biological signaling pathways. Carefully curated information from the literature and data from large public sources constitute distinct tiers of the DB. 2) A visualization package that allows interactive graphic representations of regulatory interactions stored in the DB and superposition of functional genomic and proteomic data on the maps. 3) An algorithmic inference engine that analyzes the networks for novel functional interplays between network components. SPIKE is designed and implemented as a community tool and therefore provides a user-friendly interface that allows registered users to upload data to SPIKE DB. Our vision is that the DB will be populated by a distributed and highly collaborative effort undertaken by multiple groups in the research community, where each group contributes data in its field of expertise. Conclusion: The integrated capabilities of SPIKE make it a powerful platform for the analysis of signaling networks and the integration of knowledge on such networks with omics data. PMID:18289391

  6. Application of spectral decomposition of ²²²Rn activity concentration signal series measured in Niedźwiedzia Cave to identification of mechanisms responsible for different time-period variations.

    PubMed

    Przylibski, Tadeusz Andrzej; Wyłomańska, Agnieszka; Zimroz, Radosław; Fijałkowska-Lichwa, Lidia

    2015-10-01

    The authors present an application of spectral decomposition of ²²²Rn activity concentration signal series as a mathematical tool for distinguishing the processes that determine temporal changes of radon concentration in cave air. They demonstrate that decomposition of a monitored signal, such as the ²²²Rn activity concentration in cave air, facilitates characterizing the processes affecting changes in the measured concentration of this gas, so that the influence of various processes on radon behaviour in cave air can be better correlated and characterized. Distinguishing and characterizing these processes enables an understanding of radon behaviour in the cave environment, and may also facilitate using radon as a precursor of geodynamic phenomena in the lithosphere. The conducted analyses confirmed the unquestionable influence of convective air exchange between the cave and the atmosphere on seasonal and short-term (diurnal) changes in ²²²Rn activity concentration in cave air. The applied methodology of signal analysis and decomposition also identified a third process affecting these changes: a deterministic process causing changes in radon concentration with a distribution different from the Gaussian. The authors attribute these changes to turbulent air movements caused by the movement of visitors in the caves, which is heterogeneous in terms of the number of visitors per group and the number of groups visiting a cave per day and per year. Such a process elucidates the observed character of the registered changes in one of the decomposed components of the analysed signal. The obtained results encourage further research into the precise relationships between the registered ²²²Rn activity concentration changes and the factors causing them, as well as into using radon as a precursor of geodynamic phenomena in the lithosphere. Copyright © 2015 Elsevier Ltd. All rights reserved.
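
    A toy illustration of the kind of spectral separation described above: a synthetic radon-like series containing an annual and a diurnal component is decomposed in the frequency domain, where the two periodicities appear as distinct lines. The signal, sampling and amplitudes are invented for illustration; only the idea of isolating seasonal and diurnal components comes from the abstract.

```python
import numpy as np

hours_per_year = 24 * 365
t = np.arange(hours_per_year)                    # hourly samples, one year

seasonal = 400 * np.cos(2 * np.pi * t / hours_per_year)   # annual cycle
diurnal = 150 * np.cos(2 * np.pi * t / 24)                # daily cycle
rng = np.random.default_rng(0)
noise = rng.normal(0, 50, t.size)                # non-periodic residual

radon = 1000 + seasonal + diurnal + noise        # Bq/m^3, illustrative scale

spectrum = np.abs(np.fft.rfft(radon - radon.mean()))
freqs = np.fft.rfftfreq(t.size, d=1.0)           # cycles per hour

peak = freqs[np.argmax(spectrum)]                # strongest line: annual
diurnal_bin = np.argmin(np.abs(freqs - 1 / 24))  # line at 1/24 h^-1: diurnal
```

    A real decomposition would go further (e.g. reconstructing each component by band-limited inverse transforms, and testing the residual for non-Gaussianity, as the third, visitor-driven process requires), but the frequency-domain separation above is the first step.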

  7. A software tool for analyzing multichannel cochlear implant signals.

    PubMed

    Lai, Wai Kong; Bögli, Hans; Dillier, Norbert

    2003-10-01

    A useful and convenient means to analyze the radio frequency (RF) signals being sent by a speech processor to a cochlear implant would be to actually capture and display them with appropriate software. This is particularly useful for development or diagnostic purposes. sCILab (Swiss Cochlear Implant Laboratory) is such a PC-based software tool intended for the Nucleus family of Multichannel Cochlear Implants. Its graphical user interface provides a convenient and intuitive means for visualizing and analyzing the signals encoding speech information. Both numerical and graphic displays are available for detailed examination of the captured CI signals, as well as an acoustic simulation of these CI signals. sCILab has been used in the design and verification of new speech coding strategies, and has also been applied as an analytical tool in studies of how different parameter settings of existing speech coding strategies affect speech perception. As a diagnostic tool, it is also useful for troubleshooting problems with the external equipment of the cochlear implant systems.

  8. Harnessing Scientific Literature Reports for Pharmacovigilance

    PubMed Central

    Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-01-01

    Objectives: We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers’ capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. Methods: A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. Results: All usability test participants cited the tool’s ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool’s automated literature search relative to a manual ‘all fields’ PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Conclusions: Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction. PMID:28326432
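
    Disproportionality scores of the kind the tool computes are derived from drug/adverse-event co-occurrence counts. The Proportional Reporting Ratio (PRR) is one standard such measure; the abstract does not specify which scores the prototype uses, and the counts below are invented, so this is only an illustrative sketch.

```python
def prr(a, b, c, d):
    """Proportional Reporting Ratio from a 2x2 contingency table.
    a: reports mentioning both drug and event; b: drug without event;
    c: event without drug; d: neither."""
    rate_drug = a / (a + b)      # event rate among reports mentioning the drug
    rate_other = c / (c + d)     # event rate among all other reports
    return rate_drug / rate_other

# Invented counts for one hypothetical drug-event pair
a, b, c, d = 40, 960, 200, 98_800
score = prr(a, b, c, d)          # values well above 1 suggest a potential signal
```

    In practice such scores are combined with thresholds on the raw count a and often with shrinkage estimators, since small counts make the ratio unstable.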

  9. Analysis of acoustic emission signals at austempering of steels using neural networks

    NASA Astrophysics Data System (ADS)

    Łazarska, Malgorzata; Wozniak, Tadeusz Z.; Ranachowski, Zbigniew; Trafarski, Andrzej; Domek, Grzegorz

    2017-05-01

    Bearing steel 100CrMnSi6-4 and tool steel C105U were used in this research, with the steels being austempered to obtain a martensitic-bainitic structure. During the process a large number of acoustic emission (AE) events were observed. These signals were analysed using neural networks, resulting in the identification of three groups of events, of high, medium and low energy; in addition, their spectral characteristics were plotted. The results were presented as diagrams of AE incidence as a function of time. It was demonstrated that complex transformations of austenite into martensite and bainite occurred when austempering the bearing steel at 160 °C and the tool steel at 130 °C, respectively. The selected isothermal quenching temperatures of the tested steels lay near the Ms temperature, which accounts for the complex course of the phase transition. High AE activity is typical of martensitic transformation, and it is this transformation mechanism that induces the generation of higher-energy AE signals in the first stage of the transition. In the second stage, the initially nucleated martensite accelerates the subsequent bainitic transformation.

  10. Small molecules targeting heterotrimeric G proteins.

    PubMed

    Ayoub, Mohammed Akli

    2018-05-05

    G protein-coupled receptors (GPCRs) represent the largest family of cell surface receptors regulating many human and animal physiological functions. Their implication in human pathophysiology is obvious, with almost 30-40% of the medical drugs commercialized today directly targeting GPCRs as molecular entities. However, upon ligand binding, GPCRs signal inside the cell through many key signaling, adaptor and regulatory proteins, including various classes of heterotrimeric G proteins. Therefore, G proteins are considered interesting targets for the development of pharmacological tools able to modulate their interaction with the receptors, as well as their activation/deactivation processes. In this review, old attempts and recent advances in the development of small molecules that directly target G proteins are described, with an emphasis on their utilization as pharmacological tools to dissect the mechanisms of activation of GPCR-G protein complexes. These molecules constitute a further asset for research in the "hot" areas of GPCR biology, such as multiple G protein coupling/signaling, GPCR-G protein preassembly, and GPCR functional selectivity or bias. Moreover, this review gives particular focus to in vitro and in vivo studies supporting the potential applications of such small molecules in various GPCR/G protein-related diseases. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Envelope analysis of rotating machine vibrations in variable speed conditions: A comprehensive treatment

    NASA Astrophysics Data System (ADS)

    Abboud, D.; Antoni, J.; Sieg-Zieba, S.; Eltabach, M.

    2017-02-01

    Nowadays, the vibration analysis of rotating machine signals is a well-established methodology, rooted in powerful tools offered, in particular, by the theory of cyclostationary (CS) processes. Among them, the squared envelope spectrum (SES) is probably the most popular for detecting random CS components, which are typical symptoms, for instance, of rolling element bearing faults. Recent research has shifted towards the extension of existing CS tools - originally devised for constant-speed conditions - to the case of variable speed. Many of these works combine the SES with computed order tracking after some preprocessing steps. The principal objective of this paper is to organize this dispersed research into a structured, comprehensive framework. Three original contributions are furnished. First, a model of rotating machine signals is introduced which sheds light on the various components to be expected in the SES. Second, a critical comparison is made of three sophisticated methods used for suppressing the deterministic part, namely, the improved synchronous average, cepstrum prewhitening, and the generalized synchronous average. Also, a general envelope-enhancement methodology which combines the latter two techniques with a time-domain filtering operation is revisited. All theoretical findings are experimentally validated on simulated and real-world vibration signals.
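
    The squared envelope spectrum named above can be sketched in a few lines: a bearing-like train of decaying high-frequency bursts is demodulated via the analytic signal (computed here with the standard FFT-based Hilbert construction, as in scipy.signal.hilbert), and the spectrum of the squared envelope exposes the burst repetition ("fault") frequency. All signal parameters are illustrative, not from the paper.

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal of a real, even-length sequence."""
    n = x.size
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = h[n // 2] = 1.0       # keep DC and Nyquist
    h[1:n // 2] = 2.0            # double positive frequencies
    return np.fft.ifft(X * h)

fs = 20_000
n = fs                                           # 1 s of data
fault_freq = 37                                  # bursts per second
burst = np.exp(-np.arange(200) / 30.0) * np.sin(
    2 * np.pi * 3_000 * np.arange(200) / fs)     # decaying 3 kHz resonance

x = np.zeros(n)
for start in range(0, n - 200, round(fs / fault_freq)):
    x[start:start + 200] += burst                # impacts at the fault rate
x += 0.05 * np.random.default_rng(0).normal(size=n)

env2 = np.abs(analytic_signal(x)) ** 2           # squared envelope
ses = np.abs(np.fft.rfft(env2 - env2.mean()))    # squared envelope spectrum
freqs = np.fft.rfftfreq(n, d=1 / fs)
peak = freqs[np.argmax(ses[1:]) + 1]             # lands near fault_freq
```

    The raw spectrum of x is dominated by the 3 kHz resonance; it is the envelope demodulation that brings out the low-frequency repetition rate carrying the diagnostic information.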

  12. Bio-inspired benchmark generator for extracellular multi-unit recordings

    PubMed Central

    Mondragón-González, Sirenia Lizbeth; Burguière, Eric

    2017-01-01

    The analysis of multi-unit extracellular recordings of brain activity has led to the development of numerous tools, ranging from signal processing algorithms to electronic devices and applications. Currently, the evaluation and optimisation of these tools are hampered by the lack of ground-truth databases of neural signals. Such databases must be parameterisable, easy to generate and bio-inspired, i.e. containing features encountered in real electrophysiological recording sessions. Towards that end, this article introduces an original computational approach to create fully annotated and parameterised benchmark datasets, generated from the summation of three components: neural signals from compartmental models and recorded extracellular spikes, non-stationary slow oscillations, and a variety of different types of artefacts. We present three application examples. (1) We reproduced in-vivo extracellular hippocampal multi-unit recordings from either tetrode or polytrode designs. (2) We simulated recordings in two different experimental conditions: anaesthetised and awake subjects. (3) Lastly, we conducted a series of simulations to study the impact of different levels of artefacts on extracellular recordings and their influence in the frequency domain. Beyond the results presented here, such a benchmark dataset generator has many applications, such as the calibration, evaluation and development of both hardware and software architectures. PMID:28233819
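
    The three-component construction described above can be sketched as follows: ground-truth spikes, a non-stationary slow oscillation, and artefacts are generated separately and summed into one annotated trace. The waveforms, rates and amplitudes are invented placeholders, far simpler than the compartmental models and recorded spikes the paper uses.

```python
import numpy as np

rng = np.random.default_rng(42)
fs = 30_000                                  # 30 kHz, typical extracellular rate
n = fs * 2                                   # 2 s trace
t = np.arange(n) / fs

# 1) "Neural" component: a stereotyped template pasted at known (annotated) times
spike = -np.exp(-((np.arange(48) - 10) ** 2) / 30.0)      # crude spike template
spike_times = np.sort(rng.choice(n - 48, size=60, replace=False))
neural = np.zeros(n)
for s in spike_times:
    neural[s:s + 48] += spike

# 2) Non-stationary slow oscillation (amplitude drifts over the trace)
slow = (0.3 + 0.2 * np.sin(2 * np.pi * 0.25 * t)) * np.sin(2 * np.pi * 4 * t)

# 3) Artefacts: a few large transient glitches at random instants
artefacts = np.zeros(n)
artefacts[rng.choice(n, size=5, replace=False)] = rng.normal(0, 5, 5)

benchmark = neural + slow + artefacts + 0.05 * rng.normal(size=n)
annotations = {"spike_samples": spike_times}  # the ground truth kept alongside
```

    Because every spike time is known by construction, a spike-sorting or detection tool run on `benchmark` can be scored exactly, which is the point of such generators.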

  13. Monitoring of Surface Roughness in Aluminium Turning Process

    NASA Astrophysics Data System (ADS)

    Chaijareenont, Atitaya; Tangjitsitcharoen, Somkiat

    2018-01-01

    The turning process is one of the most essential machining processes, and surface roughness is a key measure of workpiece quality. Many factors affect the surface roughness. Hence, the objective of this research is to monitor the relation between the surface roughness and the cutting forces in an aluminium turning process over a wide range of cutting conditions. A coated carbide tool and an aluminium alloy (Al 6063) are used for the experiments. The cutting parameters investigated for their effects on the surface roughness are the cutting speed, the feed rate, the tool nose radius and the depth of cut. A dynamometer is installed in the turret of the CNC turning machine to acquire a signal while turning. The relation between the dynamic cutting forces and the surface roughness profile is examined by applying the Fast Fourier Transform (FFT). The experimentally obtained results show that the cutting force depends on the cutting conditions. The surface roughness improves when the cutting speed and the tool nose radius are increased, and degrades with increasing feed rate and depth of cut. The relation between the cutting parameters and the surface roughness can be explained by the in-process cutting forces, which suggests that the in-process cutting forces can be used to predict the surface roughness in further research.

  14. A neuromorphic VLSI device for implementing 2-D selective attention systems.

    PubMed

    Indiveri, G

    2001-01-01

    Selective attention is a mechanism used to sequentially select and process salient subregions of the input space, while suppressing inputs arriving from nonsalient regions. By processing small amounts of sensory information in a serial fashion, rather than attempting to process all the sensory data in parallel, this mechanism overcomes the problem of flooding limited-capacity processing systems with sensory inputs. It is found in many biological systems and can be a useful engineering tool for developing artificial systems that need to process sensory data in real time. In this paper we present a neuromorphic hardware model of a selective attention mechanism implemented on a very large scale integration (VLSI) chip, using analog circuits. The chip makes use of a spike-based representation for receiving input signals, transmitting output signals, and shifting the selection of the attended input stimulus over time. It can be interfaced to neuromorphic sensors and actuators to implement multi-chip selective attention systems. We describe the characteristics of the circuits used in the architecture and present experimental data measured from the system.
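
    A schematic software analogue of this behaviour is a winner-take-all stage combined with an "inhibition of return" trace: the most salient location wins, is then suppressed, and attention shifts to the next candidate. This is only an illustrative model under those assumptions, not the VLSI circuit itself, whose dynamics are analog and spike-based.

```python
import numpy as np

def attend_sequence(saliency, n_shifts, ior_decay=0.5):
    """Sequentially attended locations from a static saliency map.
    ior_decay controls how fast suppression of past winners fades."""
    saliency = np.asarray(saliency, dtype=float)
    inhibition = np.zeros_like(saliency)
    visited = []
    for _ in range(n_shifts):
        winner = int(np.argmax(saliency - inhibition))   # winner-take-all
        visited.append(winner)
        inhibition *= ior_decay                          # old inhibition fades
        inhibition[winner] += saliency[winner] + 1.0     # suppress the winner
    return visited

order = attend_sequence([0.2, 0.9, 0.1, 0.7], n_shifts=3)
# Attends the most salient location first, then shifts to weaker ones
```

    With the decay term, previously visited locations can eventually be re-attended, mimicking the gradual release from inhibition of return.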

  15. Acoustic emission analysis for the detection of appropriate cutting operations in honing processes

    NASA Astrophysics Data System (ADS)

    Buj-Corral, Irene; Álvarez-Flórez, Jesús; Domínguez-Fernández, Alejandro

    2018-01-01

    In the present paper, acoustic emission was studied in honing experiments with different abrasive densities: 15, 30, 45 and 60. In addition, 2D and 3D roughness, material removal rate and tool wear were determined. To analyse the sound signal emitted during the machining process, two methods were compared: the Fast Fourier Transform (FFT) and the Hilbert-Huang Transform (HHT). When density 15 is used, the number of cutting grains is insufficient for correct cutting, while clogging appears with densities 45 and 60. These results were confirmed by analysis of the sound signal. In addition, a new parameter S was defined as the ratio between the energy in the low and high frequencies of the emitted sound. The selected density of 30 corresponds to S values between 0.1 and 1. Correct cutting in honing processes depends on the density of the abrasive employed, and the density value to be used can be selected by measuring and analysing acoustic emissions during the honing operation. Thus, honing processes can be monitored without stopping the process.
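
    A band-energy ratio of the kind the parameter S represents can be sketched as below: the spectral energy below a split frequency divided by the energy above it. The split frequency and the synthetic test signal are assumptions for illustration; the paper's actual band limits are not given in the abstract.

```python
import numpy as np

def band_energy_ratio(x, fs, f_split):
    """Ratio of spectral energy below f_split to energy above it."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=1 / fs)
    low = spec[(freqs > 0) & (freqs <= f_split)].sum()
    high = spec[freqs > f_split].sum()
    return low / high

fs = 44_100
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(3)

# Synthetic "machining sound": a low-frequency machine hum plus broadband noise
hum = 0.5 * np.sin(2 * np.pi * 120 * t)
grit = 0.3 * rng.normal(size=t.size)
S = band_energy_ratio(hum + grit, fs, f_split=1_000)
```

    A signal dominated by broadband high-frequency content yields a small S, while strong low-frequency components push S up, which is how a scalar of this kind can discriminate cutting regimes.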

  16. Design and implementation of a sigma delta technology based pulse oximeter's acquisition stage

    NASA Astrophysics Data System (ADS)

    Rossi, E. E.; Peñalva, A.; Schaumburg, F.

    2011-12-01

    Pulse oximetry is a widely used tool in medical practice for estimating the fraction of a patient's hemoglobin bonded to oxygen. Conventional oximetry has limitations when the baseline changes or the signals involved have low amplitude. The aim of this paper is to address both constraints simultaneously, and to simplify the circuitry needed, by using ΣΔ technology. For this purpose, a board for the acquisition of the needed signals was developed, together with PC software that controls it and displays and processes the acquired information in real time. Laboratory and field tests were also designed and executed to verify the performance of this equipment in adverse situations. A simple, robust and economical instrument was achieved, capable of obtaining signals even in situations where conventional oximetry fails.

  17. Recent developments in the use of acoustic sensors and signal processing tools to target early infestations of red palm weevil (Coleoptera: Curculionidae) in agricultural environments

    USDA-ARS?s Scientific Manuscript database

    Much of the damage caused by red palm weevil larvae to date palms, ornamental palms, and palm offshoots could be mitigated by early detection and treatment of infestations. Acoustic technology has potential to enable early detection, but the short, high-frequency sound impulses produced by red palm ...

  18. Automated Functional Analysis of Astrocytes from Chronic Time-Lapse Calcium Imaging Data

    PubMed Central

    Wang, Yinxue; Shi, Guilai; Miller, David J.; Wang, Yizhi; Wang, Congchao; Broussard, Gerard; Wang, Yue; Tian, Lin; Yu, Guoqiang

    2017-01-01

    Recent discoveries that astrocytes exert proactive regulatory effects on neural information processing and that they are deeply involved in normal brain development and disease pathology have stimulated broad interest in understanding astrocyte functional roles in brain circuits. Measuring astrocyte functional status is now technically feasible, due to recent advances in modern microscopy and ultrasensitive cell-type-specific genetically encoded Ca2+ indicators for chronic imaging. However, there is a big gap between the capability of generating large datasets via calcium imaging and the availability of sophisticated analytical tools for decoding astrocyte function. Current practice is essentially manual, which not only limits analysis throughput but also risks introducing bias and missing important information latent in complex, dynamic big data. Here, we report a suite of computational tools, called Functional AStrocyte Phenotyping (FASP), for automatically quantifying the functional status of astrocytes. Considering the complex nature of Ca2+ signaling in astrocytes and the low signal-to-noise ratio, FASP is designed with data-driven and probabilistic principles, to flexibly account for various patterns and to perform robustly with noisy data. In particular, FASP explicitly models signal propagation, which rules out the applicability of tools designed for other types of data. We demonstrate the effectiveness of FASP using extensive synthetic and real data sets. The findings by FASP were verified by manual inspection. FASP also detected signals that were missed by purely manual analysis but could be confirmed by more careful manual examination under the guidance of automatic analysis. All algorithms and the analysis pipeline are packaged into a plugin for Fiji (ImageJ), with the source code freely available online at https://github.com/VTcbil/FASP. PMID:28769780

  20. Novel Tool for Complete Digitization of Paper Electrocardiography Data.

    PubMed

    Ravichandran, Lakshminarayan; Harless, Chris; Shah, Amit J; Wick, Carson A; McClellan, James H; Tridandapani, Srini

    We present a Matlab-based tool to convert electrocardiography (ECG) information from paper charts into digital ECG signals. The tool can be used for long-term retrospective studies of cardiac patients to study evolving features with prognostic value. To perform the conversion, we: 1) detect the graphical grid on ECG charts using grayscale thresholding; 2) digitize the ECG signal based on its contour using a column-wise pixel scan; and 3) use template-based optical character recognition to extract patient demographic information from the paper ECG in order to interface the data with the patient's medical record. To validate the digitization technique, we: 1) compute the correlation between directly acquired digital signals and signals digitized from paper ECG; and 2) measure and compare clinically significant ECG parameters from both the paper-based ECG signals and the digitized ECG. The validation demonstrates a correlation value of 0.85-0.9 between the digital ECG signal and the signal digitized from the paper ECG. There is a high correlation in the clinical parameters between the ECG information from the paper charts and the digitized signal, with intra-observer and inter-observer correlations of 0.8-0.9 (p < 0.05), and kappa statistics ranging from 0.85 (inter-observer) to 1.00 (intra-observer). The important features of the ECG signal, especially the QRST complex and the associated intervals, are preserved by obtaining the contour from the paper ECG. The differences between the measures of clinically important features extracted from the original signal and the reconstructed signal are insignificant, highlighting the accuracy of this technique. Such an ECG digitization tool enables retrospective studies of emerging ECG features on large databases that rely on paper ECG records. In addition, this tool can potentially be used to integrate digitized ECG information with digital ECG analysis programs and with the patient's electronic medical record.
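    The column-wise pixel scan of step 2 can be sketched as follows. This is a hypothetical simplification, not the authors' Matlab code: it assumes the chart image has already been binarized and the grid removed, and the calibration factors (`mv_per_px`, `s_per_px`) are illustrative.

```python
import numpy as np

def digitize_trace(binary_img, mv_per_px, s_per_px):
    """Column-wise pixel scan: for each image column, take the mean row
    index of the 'ink' pixels as the trace position. (A simplified sketch;
    the paper's tool also removes the grid and reads demographics by OCR.)"""
    n_rows, n_cols = binary_img.shape
    trace = np.full(n_cols, np.nan)
    for col in range(n_cols):
        rows = np.nonzero(binary_img[:, col])[0]
        if rows.size:
            trace[col] = rows.mean()
    # Row 0 is the image top, so amplitude grows as the row index shrinks.
    baseline = np.nanmean(trace)
    amplitude_mv = (baseline - trace) * mv_per_px
    time_s = np.arange(n_cols) * s_per_px
    return time_s, amplitude_mv

# Tiny synthetic scan: a one-pixel-wide trace rising one row per column.
img = np.zeros((5, 4), dtype=bool)
for c, r in enumerate([3, 2, 1, 0]):
    img[r, c] = True
t, mv = digitize_trace(img, mv_per_px=0.1, s_per_px=0.004)
```

    On a real chart the calibration would come from the detected grid spacing, since one small ECG division conventionally spans 0.1 mV and 0.04 s.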

  1. Wireless Monitoring of the Height of Condensed Water in Steam Pipes

    NASA Technical Reports Server (NTRS)

    Lee, Hyeong Jae; Bar-Cohen, Yoseph; Lih, Shyh-Shiuh; Badescu, Mircea; Dingizian, Arsham; Takano, Nobuyuki; Blosiu, Julian O.

    2014-01-01

    A wireless health monitoring system has been developed for determining the height of condensed water in steam pipes, with data acquisition performed remotely over a wireless network. The system is designed to operate in the harsh environment encountered in manholes and at pipe temperatures of over 200 °C. The test method is ultrasonic pulse-echo, and the hardware includes a pulser, a receiver, and a wireless modem for communication. Data acquisition and signal processing software were developed to determine the water height using adaptive signal processing and data communication that can be controlled while the hardware is installed in a manhole. A statistical decision-making tool is being developed, based on the field test data, to determine the height of the condensed water under high-noise conditions and other environmental factors.
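    At its core, the pulse-echo measurement is round-trip time-of-flight: the ultrasonic pulse travels to the water surface and back, so the height follows from half the echo delay. A minimal sketch, with an illustrative sound speed that would in practice need correction for the condensate temperature:

```python
def water_height(echo_delay_s, sound_speed_m_s=1480.0):
    """Pulse-echo time-of-flight: the pulse travels to the water surface
    and back, so height = speed * delay / 2. The default sound speed is
    for room-temperature water and is illustrative only; it must be
    corrected for the actual condensate temperature."""
    return sound_speed_m_s * echo_delay_s / 2.0

height_m = water_height(200e-6)  # a 200 us round-trip echo delay
```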

  2. Feasibility of an ultra-low power digital signal processor platform as a basis for a fully implantable brain-computer interface system.

    PubMed

    Wang, Po T; Gandasetiawan, Keulanna; McCrimmon, Colin M; Karimi-Bidhendi, Alireza; Liu, Charles Y; Heydari, Payam; Nenadic, Zoran; Do, An H

    2016-08-01

    A fully implantable brain-computer interface (BCI) can be a practical tool to restore independence to those affected by spinal cord injury. We envision that such a BCI system will invasively acquire brain signals (e.g. electrocorticogram) and translate them into control commands for external prostheses. The feasibility of such a system was tested by implementing its benchtop analogue, centered around a commercial, ultra-low power (ULP) digital signal processor (DSP, TMS320C5517, Texas Instruments). A suite of signal processing and BCI algorithms, including (de)multiplexing, fast Fourier transform, power spectral density, principal component analysis, linear discriminant analysis, Bayes rule, and a finite state machine, was implemented and tested on the DSP. The system's signal acquisition fidelity was characterized by acquiring harmonic signals from a function generator. In addition, the BCI decoding performance was tested, first with signals from a function generator, and subsequently with human electroencephalogram (EEG) recorded during an eyes-opening and eyes-closing task. On average, the system took 322 ms to process and analyze 2 s of data. Crosstalk (< -65 dB) and harmonic distortion (~1%) were minimal. Timing jitter averaged 49 μs per 1000 ms. The online BCI decoding accuracies were 100% for both function generator and EEG data. These results show that a complex BCI algorithm can be executed on an ULP DSP without compromising performance. This suggests that the proposed hardware platform may be used as a basis for future, fully implantable BCI systems.
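    The eyes-open/eyes-closed task exploits the fact that closing the eyes elevates alpha-band (8-12 Hz) EEG power. The sketch below is a toy stand-in for the FFT/PSD stage only, on synthetic signals; the paper's full chain additionally applies PCA, LDA and a Bayes-rule classifier on the DSP.

```python
import numpy as np

def alpha_power(x, fs):
    """FFT-based power spectral density, then mean power in the 8-12 Hz
    alpha band. A toy stand-in for the FFT/PSD stage; the paper's chain
    additionally applies PCA, LDA and a Bayes-rule classifier."""
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    band = (freqs >= 8) & (freqs <= 12)
    return psd[band].mean()

fs = 256
t = np.arange(2 * fs) / fs                    # 2 s of data, as in the paper
eyes_closed = np.sin(2 * np.pi * 10 * t)      # strong 10 Hz alpha rhythm
eyes_open = 0.1 * np.sin(2 * np.pi * 10 * t)  # attenuated alpha
closed_detected = alpha_power(eyes_closed, fs) > alpha_power(eyes_open, fs)
```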

  3. Photoacoustic spectroscopy of condensed matter

    NASA Technical Reports Server (NTRS)

    Somoano, R. B.

    1978-01-01

    Photoacoustic spectroscopy is a new analytical tool that provides a simple, nondestructive technique for obtaining information about the electronic absorption spectrum of samples such as powders, semisolids, gels, and liquids. It can also be applied to samples which cannot be examined by conventional optical methods. Numerous applications of this technique in the fields of inorganic and organic semiconductors, biology, and catalysis have been described. Among the advantages of photoacoustic spectroscopy are that the signal is almost insensitive to light scattering by the sample and that information can be obtained about nonradiative deactivation processes. Signal saturation, which can modify the intensity of individual absorption bands in special cases, is a drawback of the method.

  4. Wavelets and molecular structure

    NASA Astrophysics Data System (ADS)

    Carson, Mike

    1996-08-01

    The wavelet method offers possibilities for display, editing, and topological comparison of proteins at a user-specified level of detail. Wavelets are a mathematical tool that first found application in signal processing. The multiresolution analysis of a signal via wavelets provides a hierarchical series of `best' lower-resolution approximations. B-spline ribbons model the protein fold, with one control point per residue. Wavelet analysis sets limits on the information required to define the winding of the backbone through space, suggesting that a recognizable fold can be generated from a number of points equal to one-quarter or fewer of the number of residues. Wavelets applied to surfaces and volumes show promise in structure-based drug design.
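    The hierarchy of `best' lower-resolution approximations is easiest to see with the Haar wavelet, the simplest member of the family. The sketch below is illustrative, not the paper's B-spline ribbon machinery: one analysis step halves the number of points, and the stored detail coefficients restore the original exactly.

```python
import numpy as np

def haar_step(x):
    """One level of the Haar wavelet transform: pairwise averages give a
    half-resolution approximation; pairwise differences give the detail
    needed to restore full resolution."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / 2.0
    detail = (x[0::2] - x[1::2]) / 2.0
    return approx, detail

def haar_inverse(approx, detail):
    """Exact reconstruction from one analysis level."""
    x = np.empty(2 * len(approx))
    x[0::2] = approx + detail
    x[1::2] = approx - detail
    return x

coords = np.array([1.0, 3.0, 2.0, 8.0])  # e.g. one coordinate of control points
a, d = haar_step(coords)                 # a = coarse shape, d = detail
restored = haar_inverse(a, d)
```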

  5. A wideband, high-resolution spectrum analyzer

    NASA Technical Reports Server (NTRS)

    Quirk, M. P.; Wilck, H. C.; Garyantes, M. F.; Grimm, M. J.

    1988-01-01

    A two-million-channel, 40 MHz bandwidth, digital spectrum analyzer under development at the Jet Propulsion Laboratory is described. The analyzer system will serve as a prototype processor for the sky survey portion of NASA's Search for Extraterrestrial Intelligence program and for other applications in the Deep Space Network. The analyzer digitizes an analog input, performs a 2^21-point discrete Fourier transform, accumulates the output power, normalizes the output to remove frequency-dependent gain, and automates simple signal detection algorithms. Due to its built-in frequency-domain processing functions and configuration flexibility, the analyzer is a very powerful tool for real-time signal analysis.
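    The analyzer's chain (transform, power accumulation, gain normalization, threshold detection) can be mimicked at toy scale. This is an illustrative sketch, not the JPL design, which performs this over about two million channels in real time:

```python
import numpy as np

def accumulate_spectrum(frames):
    """Average the power spectra of successive frames (the 'accumulate
    the output power' stage), which suppresses noise variance."""
    return np.mean([np.abs(np.fft.rfft(f)) ** 2 for f in frames], axis=0)

def detect_tones(power, gain, factor=5.0):
    """Divide out a frequency-dependent gain, then flag bins well above
    the median noise floor (a toy threshold detector)."""
    flat = power / gain
    return np.nonzero(flat > factor * np.median(flat))[0]

rng = np.random.default_rng(0)
n, tone_bin = 1024, 100
frames = [np.sin(2 * np.pi * tone_bin * np.arange(n) / n)
          + 0.1 * rng.standard_normal(n) for _ in range(16)]
power = accumulate_spectrum(frames)
hits = detect_tones(power, gain=np.ones(n // 2 + 1))
```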

  6. A wide-band high-resolution spectrum analyzer.

    PubMed

    Quirk, M P; Garyantes, M F; Wilck, H C; Grimm, M J

    1988-12-01

    This paper describes a two-million-channel, 40-MHz-bandwidth, digital spectrum analyzer under development at the Jet Propulsion Laboratory. The analyzer system will serve as a prototype processor for the sky survey portion of NASA's Search for Extraterrestrial Intelligence program and for other applications in the Deep Space Network. The analyzer digitizes an analog input, performs a 2^21-point discrete Fourier transform, accumulates the output power, normalizes the output to remove frequency-dependent gain, and automates simple signal detection algorithms. Due to its built-in frequency-domain processing functions and configuration flexibility, the analyzer is a very powerful tool for real-time signal analysis and detection.

  7. Fluorescent Probes and Selective Inhibitors for Biological Studies of Hydrogen Sulfide- and Polysulfide-Mediated Signaling

    PubMed Central

    Takano, Yoko; Echizen, Honami

    2017-01-01

    Abstract Significance: Hydrogen sulfide (H2S) plays roles in many physiological processes, including relaxation of vascular smooth muscles, mediation of neurotransmission, inhibition of insulin signaling, and regulation of inflammation. Hydropersulfide (R−S−SH) and polysulfide (−S−Sₙ−S−) have also recently been identified as reactive sulfur species (RSS) that regulate the bioactivities of multiple proteins via S-sulfhydration of cysteine residues (protein Cys−SSH) and show cytoprotective effects. Chemical tools such as fluorescent probes and selective inhibitors are needed to establish in detail the physiological roles of H2S and polysulfide. Recent Advances: Although many fluorescent probes for H2S are available, fluorescent probes for hydropersulfide and polysulfide have only recently been developed and used to detect these sulfur species in living cells. Critical Issues: In this review, we summarize recent progress in developing chemical tools for the study of H2S, hydropersulfide, and polysulfide, covering fluorescent probes based on various design strategies and selective inhibitors of H2S- and polysulfide-producing enzymes (cystathionine γ-lyase, cystathionine β-synthase, and 3-mercaptopyruvate sulfurtransferase), and we summarize their applications in biological studies. Future Directions: Despite recent progress, the precise biological functions of H2S, hydropersulfide, and polysulfide remain to be fully established. Fluorescent probes and selective inhibitors are effective chemical tools to study the physiological roles of these sulfur molecules in living cells and tissues. Therefore, further development of a broad range of practical fluorescent probes and selective inhibitors as tools for studies of RSS biology is currently attracting great interest. Antioxid. Redox Signal. 27, 669–683. PMID:28443673

  8. Carrier-interleaved orthogonal multi-electrode multi-carrier resistivity-measurement tool

    NASA Astrophysics Data System (ADS)

    Cai, Yu; Sha, Shuang

    2016-09-01

    This paper proposes a new carrier-interleaved orthogonal multi-electrode multi-carrier resistivity-measurement tool used in a cylindrical borehole environment during oil-based mud drilling processes. The new tool is an orthogonal frequency division multiplexing access-based contactless multi-measurand detection tool. The tool can measure formation resistivity at different azimuthal angles and elevational depths. It can measure many more measurands simultaneously in a specified bandwidth than the legacy frequency division multiplexing multi-measurand tool without a channel-select filter, while avoiding inter-carrier interference. The paper also shows that formation resistivity is not sensitive to frequency in certain frequency bands. Averaging the resistivity collected from N subcarriers can increase the measurement signal-to-noise ratio (SNR) N-fold, given no amplitude clipping in the current-injection electrode. If the clipping limit is taken into account, phase rotation of each single carrier can reduce the amplitude peak-to-average ratio by a factor of 3, and the SNR can achieve a 9/N gain over the single-carrier system. The carrier-interleaving technique is also introduced to counter the carrier frequency offset (CFO) effect, where the CFO would otherwise cause inter-pad interference. A qualitative analysis and simulations demonstrate that block-interleaving performs better than tone-interleaving when coping with a large CFO. The theoretical analysis also suggests that increasing the subcarrier number can increase the measurement speed or enhance elevational resolution without sacrificing receiver performance. The complex orthogonal multi-pad multi-carrier resistivity logging tool, in which all subcarriers are complex signals, can provide a larger available subcarrier pool than other types of transceivers.
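    The claimed N-fold SNR gain from averaging N subcarriers is the standard variance-reduction result for independent noise, which a quick Monte-Carlo check illustrates (synthetic numbers only, not the tool's measurements):

```python
import numpy as np

def averaging_snr_gain(n_subcarriers, trials=20000, seed=1):
    """Monte-Carlo check of the N-fold SNR claim: averaging N independent
    noisy estimates of the same quantity divides the noise variance by N
    (power SNR), assuming uncorrelated noise on each subcarrier."""
    rng = np.random.default_rng(seed)
    single = rng.standard_normal(trials)                        # one subcarrier
    averaged = rng.standard_normal((trials, n_subcarriers)).mean(axis=1)
    return single.var() / averaged.var()                        # ~ N

gain = averaging_snr_gain(8)  # expected near 8 for 8 subcarriers
```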

  9. The GEDI Performance Tool

    NASA Astrophysics Data System (ADS)

    Hancock, S.; Armston, J.; Tang, H.; Patterson, P. L.; Healey, S. P.; Marselis, S.; Duncanson, L.; Hofton, M. A.; Kellner, J. R.; Luthcke, S. B.; Sun, X.; Blair, J. B.; Dubayah, R.

    2017-12-01

    NASA's Global Ecosystem Dynamics Investigation will mount a multi-track, full-waveform lidar on the International Space Station (ISS) that is optimised for the measurement of forest canopy height and structure. GEDI will use ten laser tracks, two 10 mJ "power beams" and eight 5 mJ "coverage beams", to produce global (51.5°S to 51.5°N) maps of above-ground biomass (AGB), canopy height, vegetation structure and other biophysical parameters. The mission has a requirement to generate a 1 km AGB map with 80% of pixels having ≤ 20% standard error or 20 Mg·ha⁻¹, whichever is greater. To assess performance and compare to mission requirements, an end-to-end simulator has been developed. The simulator brings together tools to propagate the effects of measurement and sampling error on GEDI data products. The simulator allows us to evaluate the impact of instrument performance, ISS orbits, processing algorithms and losses of data that may occur due to clouds, snow, leaf-off conditions, and areas with an insufficient signal-to-noise ratio (SNR). By evaluating the consequences of operational decisions on GEDI data products, this tool provides a quantitative framework for decision-making and mission planning. Here we demonstrate the performance tool by using it to evaluate the trade-off between measurement and sampling error on the 1 km AGB data product. Results demonstrate that the use of coverage beams during the day (the lowest GEDI SNR case) over very dense forests (>95% canopy cover) will result in some measurement bias. Omitting these low-SNR cases increased the sampling error. Through this, an SNR threshold for a given expected canopy cover can be set. Other applications of the performance tool are also discussed, such as assessing the impact of decisions made in the AGB modelling and signal processing stages on the accuracy of the final data products.
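    The quoted mission requirement is a simple per-pixel test. The helper below is hypothetical (not part of the GEDI simulator) and shown only to make the "whichever is greater" rule concrete:

```python
def meets_gedi_requirement(agb_mg_ha, stderr_mg_ha):
    """1 km AGB requirement as stated in the abstract: standard error no
    worse than 20% of AGB or 20 Mg/ha, whichever is greater."""
    return stderr_mg_ha <= max(0.2 * agb_mg_ha, 20.0)

ok_dense = meets_gedi_requirement(200.0, 35.0)   # 20% of 200 = 40 allows 35
ok_sparse = meets_gedi_requirement(50.0, 25.0)   # max(10, 20) = 20 rejects 25
```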

  10. Ridding fMRI data of motion-related influences: Removal of signals with distinct spatial and physical bases in multiecho data.

    PubMed

    Power, Jonathan D; Plitt, Mark; Gotts, Stephen J; Kundu, Prantik; Voon, Valerie; Bandettini, Peter A; Martin, Alex

    2018-02-27

    "Functional connectivity" techniques are commonplace tools for studying brain organization. A critical element of these analyses is to distinguish variance due to neurobiological signals from variance due to nonneurobiological signals. Multiecho fMRI techniques are a promising means for making such distinctions based on signal decay properties. Here, we report that multiecho fMRI techniques enable excellent removal of certain kinds of artifactual variance, namely, spatially focal artifacts due to motion. By removing these artifacts, multiecho techniques reveal frequent, large-amplitude blood oxygen level-dependent (BOLD) signal changes present across all gray matter that are also linked to motion. These whole-brain BOLD signals could reflect widespread neural processes or other processes, such as alterations in blood partial pressure of carbon dioxide (pCO₂) due to ventilation changes. By acquiring multiecho data while monitoring breathing, we demonstrate that whole-brain BOLD signals in the resting state are often caused by changes in breathing that co-occur with head motion. These widespread respiratory fMRI signals cannot be isolated from neurobiological signals by multiecho techniques because they occur via the same BOLD mechanism. Respiratory signals must therefore be removed by some other technique to isolate neurobiological covariance in fMRI time series. Several methods for removing global artifacts are demonstrated and compared, and were found to yield fMRI time series essentially free of motion-related influences. These results identify two kinds of motion-associated fMRI variance, with different physical mechanisms and spatial profiles, each of which strongly and differentially influences functional connectivity patterns. Distance-dependent patterns in covariance are nearly entirely attributable to non-BOLD artifacts.

  11. The Run-up to Volcanic Eruption Unveiled by Forensic Petrology and Geophysical Observations

    NASA Astrophysics Data System (ADS)

    Rasmussen, D. J.; Plank, T. A.; Roman, D. C.

    2017-12-01

    Volcanoes often warn of impending eruptions. However, one of the greatest challenges in volcano research is translating precursory geophysical signals into physical magmatic processes. Petrology offers powerful tools to study eruption run-up that benefit from direct response to magmatic forcings. Developing these tools, and tying them to geophysical observations, will help us identify eruption triggers (e.g., magmatic recharge, gas build-up, tectonic events) and understand the significance of monitored signals of unrest. We present an overview of petrologic tools used for studying eruption run-up, highlighting results from our study of the 1999 eruption of Shishaldin volcano. Olivine crystals contain chemical gradients, the consequence of diffusion following magma mixing events, which can be modeled to determine mixing timescales. Modeled timescales provide strong evidence for at least three mixing events, which were triggered by magmatic recharge. Petrologic barometers indicate these events occurred at very shallow depths (within the volcanic edifice). The first mixing event occurred nine months before the eruption and was signaled by a swarm of deep long-period earthquakes. Minor recharge events followed over the next two months, indicated by a second deep long-period earthquake swarm and a change in the local stress orientation measured by shear-wave splitting. Following these events, the system was relatively quiet until a large mixing event occurred 45 days prior to eruption, which was heralded by a large earthquake (M5.2). After this event, geophysical signals of unrest intensified and became continuous. The final mixing event, beginning roughly a week before eruption, represents the final perturbation to the system before eruption. Our findings point to a relatively long run-up, which was subtle at first and intensified several weeks before eruption.
This study highlights the strong link between geophysical signals of volcanic unrest and magmatic events, and helps open the door for the application of forensic petrology to unmonitored eruptions.

  12. Communications During Critical Mission Operations: Preparing for InSight's Landing on Mars

    NASA Technical Reports Server (NTRS)

    Asmar, Sami; Oudrhiri, Kamal; Kurtik, Susan; Weinstein-Weiss, Stacy

    2014-01-01

    Radio communications with deep space missions are often taken for granted because, for decades, the technology and infrastructure have been developed for ground and flight systems to optimize telemetry and commanding. During mission-critical events such as the entry, descent, and landing of a spacecraft on the surface of Mars, the signal's level and frequency dynamics vary significantly and typically exceed the threshold of the budgeted links. The challenge is increased when spacecraft shed antennas with heat shields and other hardware during those risky few minutes. In the past, we have successfully received signals on Earth during critical events, even ones not intended for ground reception. These included the UHF signal transmitted by Curiosity to Mars-orbiting assets. Since NASA's Deep Space Network does not operate in the UHF band, large radio telescopes around the world are utilized. The Australian CSIRO Parkes Radio Telescope supported the Curiosity UHF signal reception, and DSN receivers, tools, and expertise were used in the process. Preparations are now underway to support UHF communications for the InSight mission's landing on Mars in 2016. This paper presents communication scenarios with radio telescopes and the DSN receiver and tools. It also discusses the usefulness of real-time information content for better response time by the mission team towards successful mission operations.

  13. A Digital Lock-In Amplifier for Use at Temperatures of up to 200 °C

    PubMed Central

    Cheng, Jingjing; Xu, Yingjun; Wu, Lei; Wang, Guangwei

    2016-01-01

    Weak voltage signals cannot be reliably measured using currently available logging tools when these tools are subjected to high-temperature (up to 200 °C) environments for prolonged periods. In this paper, we present a digital lock-in amplifier (DLIA) capable of operating at temperatures of up to 200 °C. The DLIA contains a low-noise instrument amplifier together with signal acquisition and the corresponding signal processing electronics. The high-temperature stability of the DLIA is achieved by designing system-in-package (SiP) and multi-chip module (MCM) components with low thermal resistances. An effective look-up-table (LUT) method was developed for the lock-in amplifier algorithm to decrease the complexity of the calculations and generate less heat than traditional approaches. The performance of the design was tested by determining the linearity, gain, Q value, and frequency characteristic of the DLIA between 25 and 200 °C. The maximum nonlinearity error of the DLIA working at 200 °C was about 1.736% when the equivalent input was a sine wave signal with an amplitude of between 94.8 and 1896.0 nV and a frequency of 800 kHz. The tests showed that the proposed DLIA can work effectively in high-temperature environments up to 200 °C. PMID:27845710
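    The LUT idea can be sketched in software as follows. This is an illustrative model of digital lock-in demodulation with precomputed reference tables, not the paper's high-temperature hardware; the sampling rate and table size are assumptions chosen so the table indexes exactly.

```python
import numpy as np

def lockin_lut(signal, fs, f_ref, table_size=64):
    """Digital lock-in demodulation with precomputed sine/cosine look-up
    tables: multiply by the two references and average, which low-pass
    filters the products and yields amplitude and phase."""
    lut_sin = np.sin(2 * np.pi * np.arange(table_size) / table_size)
    lut_cos = np.cos(2 * np.pi * np.arange(table_size) / table_size)
    n = np.arange(len(signal))
    idx = np.round(table_size * f_ref * n / fs).astype(int) % table_size
    i = np.mean(signal * lut_cos[idx])   # in-phase component
    q = np.mean(signal * lut_sin[idx])   # quadrature component
    return 2 * np.hypot(i, q), np.arctan2(q, i)

fs, f_ref = 51_200, 800                  # 64 samples per reference cycle
t = np.arange(6_400) / fs                # an integer number of cycles
amp, phase = lockin_lut(3e-6 * np.cos(2 * np.pi * f_ref * t), fs, f_ref)
```

    Averaging over whole reference cycles recovers the 3 μV test amplitude exactly; in a real DLIA the averaging is replaced by a proper low-pass filter.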

  14. Can I solve my structure by SAD phasing? Planning an experiment, scaling data and evaluating the useful anomalous correlation and anomalous signal.

    PubMed

    Terwilliger, Thomas C; Bunkóczi, Gábor; Hung, Li Wei; Zwart, Peter H; Smith, Janet L; Akey, David L; Adams, Paul D

    2016-03-01

    A key challenge in the SAD phasing method is solving a structure when the anomalous signal-to-noise ratio is low. Here, algorithms and tools for evaluating and optimizing the useful anomalous correlation and the anomalous signal in a SAD experiment are described. A simple theoretical framework [Terwilliger et al. (2016), Acta Cryst. D72, 346-358] is used to develop methods for planning a SAD experiment, scaling SAD data sets and estimating the useful anomalous correlation and anomalous signal in a SAD data set. The phenix.plan_sad_experiment tool uses a database of solved and unsolved SAD data sets and the expected characteristics of a SAD data set to estimate the probability that the anomalous substructure will be found in the SAD experiment and the expected map quality that would be obtained if the substructure were found. The phenix.scale_and_merge tool scales unmerged SAD data from one or more crystals using local scaling and optimizes the anomalous signal by identifying the systematic differences among data sets, and the phenix.anomalous_signal tool estimates the useful anomalous correlation and anomalous signal after collecting SAD data and estimates the probability that the data set can be solved and the likely figure of merit of phasing.

  15. Application of pattern recognition techniques to acousto-ultrasonic testing of Kevlar composite panels

    NASA Astrophysics Data System (ADS)

    Hinton, Yolanda L.

    An acousto-ultrasonic evaluation of panels fabricated from woven Kevlar and PVB/phenolic resin is being conducted. The panels were fabricated with various simulated defects. They were examined by pulsing with one acoustic emission sensor and detecting the signal with another sensor on the same side of the panel at a fixed distance. The acoustic emission signals were filtered through high (400-600 kHz), low (100-300 kHz), and wide (100-1200 kHz) bandpass filters. Acoustic emission signal parameters, including amplitude, counts, rise time, duration, 'energy', RMS, and counts to peak, were recorded. These were statistically analyzed to determine which of the AE parameters best characterize the simulated defects. The wideband-filtered acoustic emission signal was also digitized and recorded for further processing. Seventy-one features of the signals in both the time and frequency domains were calculated and compared to determine which subset of these features uniquely characterizes the defects in the panels. The objective of the program is to develop a database of AE signal parameters and features to be used in pattern recognition as an inspection tool for components fabricated from these materials.
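    Several of the AE parameters listed can be computed directly from a digitized hit. A minimal sketch on a synthetic decaying burst (the threshold, sampling rate and waveform are illustrative, not the study's instrumentation):

```python
import numpy as np

def ae_features(wave, fs, threshold):
    """Classic acoustic-emission hit parameters: peak amplitude, rise
    time (first threshold crossing to peak), duration, threshold-crossing
    counts, and RMS."""
    idx = np.nonzero(np.abs(wave) >= threshold)[0]
    first, last = idx[0], idx[-1]
    peak = int(np.argmax(np.abs(wave)))
    counts = int(np.count_nonzero((np.abs(wave[1:]) >= threshold)
                                  & (np.abs(wave[:-1]) < threshold)))
    return {"amplitude": float(np.abs(wave).max()),
            "rise_time": (peak - first) / fs,
            "duration": (last - first) / fs,
            "counts": counts,
            "rms": float(np.sqrt(np.mean(wave ** 2)))}

fs = 1_000_000                            # 1 MHz sampling (illustrative)
t = np.arange(1000) / fs
burst = np.exp(-8e3 * t) * np.sin(2 * np.pi * 150e3 * t)  # decaying burst
feats = ae_features(burst, fs, threshold=0.1)
```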

  16. A fast algorithm for vertex-frequency representations of signals on graphs

    PubMed Central

    Jestrović, Iva; Coyle, James L.; Sejdić, Ervin

    2016-01-01

    The windowed Fourier transform (short-time Fourier transform) and the S-transform are widely used signal processing tools for extracting frequency information from non-stationary signals. Previously, the windowed Fourier transform had been adapted for signals on graphs and has been shown to be very useful for extracting vertex-frequency information from graphs. However, high computational complexity makes these algorithms impractical. We sought to develop a fast windowed graph Fourier transform and a fast graph S-transform requiring significantly shorter computation time. The proposed schemes have been tested with synthetic test graph signals and real graph signals derived from electroencephalography recordings made during swallowing. The results showed that the proposed schemes provide significantly lower computation time in comparison with the standard windowed graph Fourier transform and graph S-transform. The results also showed that noise does not affect the output of either the fast windowed graph Fourier transform or the fast graph S-transform. Finally, we showed that graphs can be reconstructed from the vertex-frequency representations obtained with the proposed algorithms. PMID:28479645
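    Both transforms build on the graph Fourier transform, in which the eigenvectors of the graph Laplacian play the role of Fourier modes and the eigenvalues the role of frequencies. A minimal sketch on a toy path graph (illustrative, not the EEG-derived graphs used in the paper):

```python
import numpy as np

def graph_fourier(adjacency, signal):
    """Graph Fourier transform via eigendecomposition of the combinatorial
    graph Laplacian L = D - A: project the vertex signal onto the
    Laplacian eigenvectors (graph 'frequencies' = eigenvalues)."""
    A = np.asarray(adjacency, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    eigvals, eigvecs = np.linalg.eigh(L)   # ascending eigenvalues
    return eigvals, eigvecs.T @ signal

# Path graph on 4 vertices; a constant signal puts all its energy in the
# zero-frequency (constant) eigenvector.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
freqs, spectrum = graph_fourier(A, np.ones(4))
```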

  17. The Evolutionary Ecology of Plant Disease: A Phylogenetic Perspective.

    PubMed

    Gilbert, Gregory S; Parker, Ingrid M

    2016-08-04

    An explicit phylogenetic perspective provides useful tools for phytopathology and plant disease ecology because the traits of both plants and microbes are shaped by their evolutionary histories. We present brief primers on phylogenetic signal and the analytical tools of phylogenetic ecology. We review the literature and find abundant evidence of phylogenetic signal in pathogens and plants for most traits involved in disease interactions. Plant nonhost resistance mechanisms and pathogen housekeeping functions are conserved at deeper phylogenetic levels, whereas molecular traits associated with rapid coevolutionary dynamics are more labile at branch tips. Horizontal gene transfer disrupts the phylogenetic signal for some microbial traits. Emergent traits, such as host range and disease severity, show clear phylogenetic signals. Therefore pathogen spread and disease impact are influenced by the phylogenetic structure of host assemblages. Phylogenetically rare species escape disease pressure. Phylogenetic tools could be used to develop predictive tools for phytosanitary risk analysis and reduce disease pressure in multispecies cropping systems.

  18. A functional glycoprotein competitive recognition and signal amplification strategy for carbohydrate-protein interaction profiling and cell surface carbohydrate expression evaluation

    NASA Astrophysics Data System (ADS)

    Wang, Yangzhong; Chen, Zhuhai; Liu, Yang; Li, Jinghong

    2013-07-01

    A simple and sensitive carbohydrate biosensor has been suggested as a potential tool for accurate analysis of cell surface carbohydrate expression as well as carbohydrate-based therapeutics for a variety of diseases and infections. In this work, a sensitive biosensor for carbohydrate-lectin profiling and in situ cell surface carbohydrate expression was designed by taking advantage of the functional glycoprotein glucose oxidase, which acts as both a multivalent recognition unit and a signal amplification probe. Combining gold-nanoparticle-catalyzed luminol electrogenerated chemiluminescence with a nanocarrier for active biomolecules, the number of cell surface carbohydrate groups could be conveniently read out. The apparent dissociation constant between GOx@Au probes and Con A was determined to be 1.64 nM, approximately 5 orders of magnitude smaller than that of mannose and Con A, which would arise from the multivalent effect between the probe and Con A. Both glycoproteins and gold nanoparticles contribute to the high affinity between carbohydrates and lectin. The as-proposed biosensor exhibits excellent analytical performance towards the cytosensing of K562 cells, with a detection limit of 18 cells, and the mannose moieties on a single K562 cell were determined to be 1.8 × 10¹⁰. The biosensor can also act as a useful tool for antibacterial drug screening and mechanism investigation. This strategy integrates the excellent biocompatibility and multivalent recognition of glycoproteins with the enzymatic catalysis and signal amplification of gold nanoparticles, and avoids cell pretreatment and labelling processes.
    This would contribute to glycomic analysis and the understanding of complex native glycan-related biological processes.
Electronic supplementary information (ESI) available: Experimental details; characterization of probes; the influence of electrolyte pH; probe concentration and glucose concentration on the electrode ECL effect. See DOI: 10.1039/c3nr01598j

  19. A lithospheric magnetic field model derived from the Swarm satellite magnetic field measurements

    NASA Astrophysics Data System (ADS)

    Hulot, G.; Thebault, E.; Vigneron, P.

    2015-12-01

    The Swarm constellation of satellites was launched in November 2013 and has since then delivered high quality scalar and vector magnetic field measurements. A consortium of several research institutions was selected by the European Space Agency (ESA) to provide a number of scientific products which will be made available to the scientific community. Within this framework, specific tools were tailor-made to better extract the magnetic signal emanating from the Earth's lithosphere. These tools rely on the scalar gradient measured by the lower pair of Swarm satellites and on a regional modeling scheme that is more sensitive to small spatial scales and weak signals than standard spherical harmonic modeling. In this presentation, we report on various activities related to data analysis and processing. We assess the efficiency of this dedicated chain for modeling the lithospheric magnetic field using more than one year of measurements, and finally discuss refinements that are continuously implemented in order to further improve the robustness and the spatial resolution of the lithospheric field model.

  20. Missile signal processing common computer architecture for rapid technology upgrade

    NASA Astrophysics Data System (ADS)

    Rabinkin, Daniel V.; Rutledge, Edward; Monticciolo, Paul

    2004-10-01

    Interceptor missiles process IR images to locate an intended target and guide the interceptor towards it. Signal processing requirements have increased as sensor bandwidths increase and interceptors operate against more sophisticated targets. A typical interceptor signal processing chain comprises two parts. Front-end video processing operates on all pixels of the image and performs such operations as non-uniformity correction (NUC), image stabilization, frame integration and detection. Back-end target processing, which tracks and classifies targets detected in the image, performs such algorithms as Kalman tracking, spectral feature extraction and target discrimination. In the past, video processing was implemented using ASIC components or FPGAs because computation requirements exceeded the throughput of general-purpose processors. Target processing was performed using hybrid architectures that included ASICs, DSPs and general-purpose processors. The resulting systems tended to be function-specific and required custom software development. They were developed using non-integrated toolsets, and test equipment was developed along with the processor platform. The lifespan of a system utilizing the signal processing platform often spans decades, while the specialized nature of processor hardware and software makes it difficult and costly to upgrade. As a result, the signal processing systems often run on outdated technology, algorithms are difficult to update, and system effectiveness is impaired by the inability to rapidly respond to new threats. A new design approach is made possible by three developments: Moore's-Law-driven improvement in computational throughput; a newly introduced vector computing capability in general-purpose processors; and a modern set of open interface software standards. Today's multiprocessor commercial-off-the-shelf (COTS) platforms have sufficient throughput to support interceptor signal processing requirements. 
This application may be programmed under existing real-time operating systems using parallel processing software libraries, resulting in highly portable code that can be rapidly migrated to new platforms as processor technology evolves. The approach also enables the use of standardized development tools, third-party software upgrades, and rapid replacement of processing components as improved algorithms are developed. The resulting weapon system will have superior processing capability over a custom approach at the time of deployment, as a result of shorter development cycles and the use of newer technology. The signal processing computer may be upgraded over the lifecycle of the weapon system, and can migrate between weapon system variants thanks to the simplicity of modification. This paper presents a reference design using the new approach that utilizes an AltiVec PowerPC parallel COTS platform. It uses a VxWorks-based real-time operating system (RTOS) and application code developed using an efficient parallel vector library (PVL). A quantification of computing requirements and a demonstration of an interceptor algorithm operating on this real-time platform are provided.

  1. Autoresonant control of nonlinear mode in ultrasonic transducer for machining applications.

    PubMed

    Babitsky, V I; Astashev, V K; Kalashnikov, A N

    2004-04-01

    Experiments conducted in several countries have shown that the improvement of machining quality can be promoted through conversion of the cutting process into one involving controllable high-frequency vibration at the cutting zone. This is achieved through the generation and maintenance of ultrasonic vibration of the cutting tool to alter the fracture process of work-piece material cutting to one in which loading of the materials at the tool tip is incremental, repetitive and controlled. It was shown that excitation of the high-frequency vibro-impact mode of the tool-workpiece interaction is the most effective way of ultrasonic influence on the dynamic characteristics of machining. The exploitation of this nonlinear mode needs a new method of adaptive control for excitation and stabilisation of ultrasonic vibration known as autoresonance. An approach has been developed to design an autoresonant ultrasonic cutting unit as an oscillating system with an intelligent electronic feedback controlling self-excitation in the entire mechatronic system. The feedback produces the exciting force by means of transformation and amplification of the motion signal. This allows realisation for robust control of fine resonant tuning to bring the nonlinear high Q-factor systems into technological application. The autoresonant control provides the possibility of self-tuning and self-adaptation mechanisms for the system to keep the nonlinear resonant mode of oscillation under unpredictable variation of load, structure and parameters. This allows simple regulation of intensity of the process whilst keeping maximum efficiency at all times. An autoresonant system with supervisory computer control was developed, tested and used for the control of the piezoelectric transducer during ultrasonically assisted cutting. The system has been developed as combined analog-digital, where analog devices process the control signal, and parameters of the devices are controlled digitally by computer. 
The system was applied for advanced machining of aviation materials.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buckley, Andy (Edinburgh U.); Butterworth, Jonathan

    We review the physics basis, main features and use of general-purpose Monte Carlo event generators for the simulation of proton-proton collisions at the Large Hadron Collider. Topics included are: the generation of hard-scattering matrix elements for processes of interest, at both leading and next-to-leading QCD perturbative order; their matching to approximate treatments of higher orders based on the showering approximation; the parton and dipole shower formulations; parton distribution functions for event generators; non-perturbative aspects such as soft QCD collisions, the underlying event and diffractive processes; the string and cluster models for hadron formation; the treatment of hadron and tau decays; the inclusion of QED radiation and beyond-Standard-Model processes. We describe the principal features of the Ariadne, Herwig++, Pythia 8 and Sherpa generators, together with the Rivet and Professor validation and tuning tools, and discuss the physics philosophy behind the proper use of these generators and tools. This review is aimed at phenomenologists wishing to understand better how parton-level predictions are translated into hadron-level events as well as experimentalists wanting a deeper insight into the tools available for signal and background simulation at the LHC.

  3. Signal connection for a downhole tool string

    DOEpatents

    Hall, David R.; Hall, Jr., H. Tracy; Pixton, David S.; Bradford, Kline; Fox, Joe; Briscoe, Michael

    2006-08-29

    A signal transmission connection for a tool string used in exploration and production of natural resources, namely: oil, gas, and geothermal energy resources. The connection comprises first and second annular elements deployed in cooperative association with each other. The respective elements comprise inductive transducers that are capable of two-way signal transmission between each other, with downhole components of the tool string, and with ground-level equipment. The respective inductive transducers comprise one or more conductive loops housed within ferrite troughs, or within ferrite trough segments. When energized, the conductive loops produce a magnetic field suitable for transmitting the signal. The second element may be rotational in drilling applications. The respective elements may be fitted with electronic equipment to aid and manipulate the transmission of the signal. The first element may also be in communication with the World Wide Web.

  4. Quokka: a comprehensive tool for rapid and accurate prediction of kinase family-specific phosphorylation sites in the human proteome.

    PubMed

    Li, Fuyi; Li, Chen; Marquez-Lago, Tatiana T; Leier, André; Akutsu, Tatsuya; Purcell, Anthony W; Smith, A Ian; Lithgow, Trevor; Daly, Roger J; Song, Jiangning; Chou, Kuo-Chen

    2018-06-27

    Kinase-regulated phosphorylation is a ubiquitous type of post-translational modification (PTM) in both eukaryotic and prokaryotic cells. Phosphorylation plays fundamental roles in many signalling pathways and biological processes, such as protein degradation and protein-protein interactions. Experimental studies have revealed that signalling defects caused by aberrant phosphorylation are highly associated with a variety of human diseases, especially cancers. In light of this, a number of computational methods aiming to accurately predict protein kinase family-specific or kinase-specific phosphorylation sites have been established, thereby facilitating phosphoproteomic data analysis. In this work, we present Quokka, a novel bioinformatics tool that allows users to rapidly and accurately identify human kinase family-regulated phosphorylation sites. Quokka was developed by using a variety of sequence scoring functions combined with an optimized logistic regression algorithm. We evaluated Quokka based on well-prepared up-to-date benchmark and independent test datasets, curated from the Phospho.ELM and UniProt databases, respectively. The independent test demonstrates that Quokka improves the prediction performance compared with state-of-the-art computational tools for phosphorylation prediction. In summary, our tool provides users with high-quality predicted human phosphorylation sites for hypothesis generation and biological validation. The Quokka webserver and datasets are freely available at http://quokka.erc.monash.edu/. Supplementary data are available at Bioinformatics online.
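    The classifier at the core of such a tool is straightforward to sketch. The toy below is an illustration only, with an invented proline-directed motif, a one-hot window encoding, and plain batch gradient descent; it is not Quokka's actual scoring functions or training scheme.

```python
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"
AA_IDX = {a: i for i, a in enumerate(AA)}

def encode(window):
    """One-hot encode a fixed-length peptide window."""
    v = np.zeros(len(window) * len(AA))
    for pos, aa in enumerate(window):
        v[pos * len(AA) + AA_IDX[aa]] = 1.0
    return v

def train_logreg(X, y, lr=0.5, epochs=300):
    """Logistic regression fitted by plain batch gradient descent."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = p - y
        w -= lr * (X.T @ g) / len(y)
        b -= lr * g.mean()
    return w, b

# Toy data: "positives" carry a proline at position +1 after the
# central serine, mimicking a proline-directed kinase motif
rng = np.random.default_rng(3)

def random_window(plus_one):
    w = [AA[i] for i in rng.integers(0, 20, size=9)]
    w[4] = "S"          # central phospho-acceptor
    w[5] = plus_one     # residue at position +1
    return "".join(w)

pos = [random_window("P") for _ in range(80)]
neg = [random_window(rng.choice(list(AA.replace("P", "")))) for _ in range(80)]
X = np.array([encode(s) for s in pos + neg])
y = np.array([1.0] * 80 + [0.0] * 80)
w, b = train_logreg(X, y)
acc = (((X @ w + b) > 0) == (y == 1)).mean()
```

    Real predictors combine many scoring functions and validate on held-out data, as the abstract describes; the point here is only the shape of the pipeline: encode windows, fit, threshold the score.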

  5. Optimization of Surface Roughness Parameters of Al-6351 Alloy in EDC Process: A Taguchi Coupled Fuzzy Logic Approach

    NASA Astrophysics Data System (ADS)

    Kar, Siddhartha; Chakraborty, Sujoy; Dey, Vidyut; Ghosh, Subrata Kumar

    2017-10-01

    This paper investigates the application of the Taguchi method with fuzzy logic for multi-objective optimization of roughness parameters in the electro-discharge coating process of Al-6351 alloy with a powder-metallurgically compacted SiC/Cu tool. A Taguchi L16 orthogonal array was employed to investigate the roughness parameters by varying tool parameters such as composition and compaction load, and electro-discharge machining parameters such as pulse-on time and peak current. Crucial roughness parameters, namely centre-line average roughness, average maximum height of the profile and mean spacing of local peaks of the profile, were measured on the coated specimens. The signal-to-noise ratios were fuzzified to optimize the roughness parameters through a single comprehensive output measure (COM). The best COM was obtained with lower values of compaction load, pulse-on time and peak current, and with a 30:70 (SiC:Cu) tool composition. Analysis of variance was carried out and a significant COM model was observed, with peak current yielding the highest contribution, followed by pulse-on time, compaction load and composition. The deposited layer was characterized by X-ray diffraction analysis, which confirmed the presence of tool materials on the workpiece surface.

  6. Mesoscale brain explorer, a flexible python-based image analysis and visualization tool.

    PubMed

    Haupt, Dirk; Vanni, Matthieu P; Bolanos, Federico; Mitelut, Catalin; LeDue, Jeffrey M; Murphy, Tim H

    2017-07-01

    Imaging of mesoscale brain activity is used to map interactions between brain regions. This work has benefited from the pioneering studies of Grinvald et al., who employed optical methods to image brain function by exploiting the properties of intrinsic optical signals and small molecule voltage-sensitive dyes. Mesoscale interareal brain imaging techniques have been advanced by cell targeted and selective recombinant indicators of neuronal activity. Spontaneous resting state activity is often collected during mesoscale imaging to provide the basis for mapping of connectivity relationships using correlation. However, the information content of mesoscale datasets is vast and is only superficially presented in manuscripts given the need to constrain measurements to a fixed set of frequencies, regions of interest, and other parameters. We describe a new open source tool written in python, termed mesoscale brain explorer (MBE), which provides an interface to process and explore these large datasets. The platform supports automated image processing pipelines with the ability to assess multiple trials and combine data from different animals. The tool provides functions for temporal filtering, averaging, and visualization of functional connectivity relations using time-dependent correlation. Here, we describe the tool and show applications, where previously published datasets were reanalyzed using MBE.

  7. Application of Taguchi Method for Analyzing Factors Affecting the Performance of Coated Carbide Tool When Turning FCD700 in Dry Cutting Condition

    NASA Astrophysics Data System (ADS)

    Ghani, Jaharah A.; Mohd Rodzi, Mohd Nor Azmi; Zaki Nuawi, Mohd; Othman, Kamal; Rahman, Mohd. Nizam Ab.; Haron, Che Hassan Che; Deros, Baba Md

    2011-01-01

    Machining is one of the most important manufacturing processes in modern industry, especially for finishing automotive components after primary manufacturing processes such as casting and forging. In this study, the turning parameters of dry cutting environment (without air, normal air and chilled air), cutting speed, and feed rate are evaluated using a Taguchi optimization methodology. An L27 (3^13) orthogonal array, the signal-to-noise (S/N) ratio and analysis of variance (ANOVA) are employed to analyze the effect of these turning parameters on the performance of a coated carbide tool. The results show that tool life is affected by cutting speed, feed rate and cutting environment, with contributions of 38%, 32% and 27% respectively. For surface roughness, the feed rate dominates, contributing 77%, followed by the cutting environment at 19%. The cutting speed is found to be insignificant in controlling the machined surface produced. The study shows that the cutting environment should be considered in order to obtain longer tool life as well as a good machined surface.
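    The S/N ratios behind such a Taguchi analysis are easy to reproduce. Below is a minimal sketch with made-up replicate measurements (not data from the study): the "larger-the-better" form suits a response to maximize such as tool life, and the "smaller-the-better" form suits one to minimize such as surface roughness.

```python
import numpy as np

def sn_larger_the_better(y):
    """Taguchi S/N ratio (dB) for a response to maximize, e.g. tool life."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

def sn_smaller_the_better(y):
    """Taguchi S/N ratio (dB) for a response to minimize, e.g. roughness."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y**2))

# Hypothetical replicate measurements for one row of an orthogonal array
tool_life_min = [42.0, 45.0, 40.0]   # minutes, to maximize
roughness_um = [0.82, 0.85, 0.79]    # micrometres, to minimize

sn_life = sn_larger_the_better(tool_life_min)
sn_ra = sn_smaller_the_better(roughness_um)
```

    In a full analysis these ratios are computed for every array row, averaged per factor level to rank factor effects, and fed into ANOVA for the percentage contributions quoted above.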

  8. fMRat: an extension of SPM for a fully automatic analysis of rodent brain functional magnetic resonance series.

    PubMed

    Chavarrías, Cristina; García-Vázquez, Verónica; Alemán-Gómez, Yasser; Montesinos, Paula; Pascau, Javier; Desco, Manuel

    2016-05-01

    The purpose of this study was to develop a multi-platform automatic software tool for the full processing of rodent fMRI studies. Existing tools require the use of several different plug-ins, significant user interaction and/or programming skills. Based on a user-friendly interface, the tool provides statistical parametric brain maps (t and Z) and percentage of signal change for user-provided regions of interest. The tool is coded in MATLAB (MathWorks®) and implemented as a plug-in for SPM (Statistical Parametric Mapping, the Wellcome Trust Centre for Neuroimaging). The automatic pipeline loads default parameters that are appropriate for preclinical studies and processes multiple subjects in batch mode (from images in either NIfTI or raw Bruker format). In advanced mode, all processing steps can be selected or deselected and executed independently. Processing parameters and workflow were optimized for rat studies and assessed using 460 male-rat fMRI series, on which we tested five smoothing kernel sizes and three different hemodynamic models. A smoothing kernel of FWHM = 1.2 mm (four times the voxel size) yielded the highest t values at the primary somatosensory cortex, and a boxcar response function provided the lowest residual variance after fitting. fMRat offers the features of a thorough SPM-based analysis combined with the functionality of several SPM extensions in a single automatic pipeline with a user-friendly interface. The code and sample images can be downloaded from https://github.com/HGGM-LIM/fmrat .

  9. Chemiresistive and Gravimetric Dual-Mode Gas Sensor toward Target Recognition and Differentiation.

    PubMed

    Chen, Yan; Zhang, Hao; Feng, Zhihong; Zhang, Hongxiang; Zhang, Rui; Yu, Yuanyuan; Tao, Jin; Zhao, Hongyuan; Guo, Wenlan; Pang, Wei; Duan, Xuexin; Liu, Jing; Zhang, Daihua

    2016-08-24

    We demonstrate a dual-mode gas sensor for simultaneous and independent acquisition of electrical and mechanical signals from the same gas adsorption event. The device integrates a graphene field-effect transistor (FET) with a piezoelectric resonator in a seamless manner by leveraging multiple structural and functional synergies. Dual signals resulting from independent physical processes, i.e., mass attachment and charge transfer, can reflect intrinsic properties of gas molecules and potentially enable target recognition and quantification at the same time. Fabrication of the device is based on standard integrated circuit (IC) foundry processes and is fully compatible with system-on-a-chip (SoC) integration to achieve extremely small form factors. In addition, the ability to simultaneously measure mass adsorption and charge transfer guides us to a more precise understanding of the interactions between graphene and various gas molecules. Besides its practical functions, the device serves as an effective tool to quantitatively investigate the physical processes and sensing mechanisms for a large library of sensing materials and target analytes.

  10. The Caenorhabditis elegans Q neuroblasts: A powerful system to study cell migration at single-cell resolution in vivo.

    PubMed

    Rella, Lorenzo; Fernandes Póvoa, Euclides E; Korswagen, Hendrik C

    2016-04-01

    During development, cell migration plays a central role in the formation of tissues and organs. Understanding the molecular mechanisms that drive and control these migrations is a key challenge in developmental biology that will provide important insights into disease processes, including cancer cell metastasis. In this article, we discuss the Caenorhabditis elegans Q neuroblasts and their descendants as a tool to study cell migration at single-cell resolution in vivo. The highly stereotypical migration of these cells provides a powerful system to study the dynamic cytoskeletal processes that drive migration as well as the evolutionarily conserved signaling pathways (including different Wnt signaling cascades) that guide the cells along their specific trajectories. Here, we provide an overview of what is currently known about Q neuroblast migration and highlight the live-cell imaging, genome editing, and quantitative gene expression techniques that have been developed to study this process. © 2016 Wiley Periodicals, Inc.

  11. Bridging the Gaps: the Promise of Omics Studies in Pediatric Exercise Research

    PubMed Central

    Radom-Aizik, Shlomit; Cooper, Dan M.

    2018-01-01

    In this review, we highlight promising new discoveries that may generate useful and clinically relevant insights into the mechanisms that link exercise with growth during critical periods of development. Growth in childhood and adolescence is unique among mammals, and is a dynamic process regulated by an evolution of hormonal and inflammatory mediators, age-dependent progression of gene expression, and environmentally modulated epigenetic mechanisms. Many of these same processes likely affect molecular transducers of physical activity. How the molecular signaling associated with growth is synchronized with signaling associated with exercise is poorly understood. Recent advances in “omics,” namely, genomics and epigenetics, metabolomics, and proteomics, now provide exciting approaches and tools that can be used for the first time to address this gap. A biologic definition of “healthy” exercise that links the metabolic transducers of physical activity with parallel processes that regulate growth will transform health policy and guidelines that promote optimal use of physical activity. PMID:27137166

  12. Real-time monitoring and massive inversion of source parameters of very long period seismic signals: An application to Stromboli Volcano, Italy

    USGS Publications Warehouse

    Auger, E.; D'Auria, L.; Martini, M.; Chouet, B.; Dawson, P.

    2006-01-01

    We present a comprehensive processing tool for the real-time analysis of the source mechanism of very long period (VLP) seismic data based on waveform inversions performed in the frequency domain for a point source. A search for the source providing the best-fitting solution is conducted over a three-dimensional grid of assumed source locations, in which the Green's functions associated with each point source are calculated by finite differences using the reciprocal relation between source and receiver. Tests performed on 62 nodes of a Linux cluster indicate that the waveform inversion and search for the best-fitting signal over 100,000 point sources require roughly 30 s of processing time for a 2-min-long record. The procedure is applied to post-processing of a data archive and to continuous automatic inversion of real-time data at Stromboli, providing insights into different modes of degassing at this volcano. Copyright 2006 by the American Geophysical Union.
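    The grid search described above can be sketched in miniature. The toy below uses random stand-in Green's functions and a scalar source spectrum; the real procedure computes Green's functions by finite differences using source-receiver reciprocity and inverts for a full point-source mechanism.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: n_rec receivers, n_freq frequency bins, and a grid of n_src
# candidate source points with precomputed (here: random stand-in)
# complex Green's function spectra
n_rec, n_freq, n_src = 5, 64, 200
G = (rng.normal(size=(n_src, n_rec, n_freq))
     + 1j * rng.normal(size=(n_src, n_rec, n_freq)))

true_idx = 137
m_true = rng.normal(size=n_freq) + 1j * rng.normal(size=n_freq)
data = G[true_idx] * m_true          # noise-free synthetic records

def invert_grid(data, G):
    """For each grid node, fit the source spectrum by per-frequency
    least squares and score the normalized residual; return the node
    with the best-fitting solution."""
    best_idx, best_misfit = -1, np.inf
    for i in range(G.shape[0]):
        Gi = G[i]
        # Closed-form least-squares source spectrum at each frequency
        m = np.sum(np.conj(Gi) * data, axis=0) / np.sum(np.abs(Gi)**2, axis=0)
        resid = data - Gi * m
        misfit = np.sum(np.abs(resid)**2) / np.sum(np.abs(data)**2)
        if misfit < best_misfit:
            best_idx, best_misfit = i, misfit
    return best_idx, best_misfit

idx, misfit = invert_grid(data, G)
```

    Because the per-node fit is an independent least-squares problem, the search over the 100,000-node grid mentioned above parallelizes trivially across cluster nodes.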

  13. Application of higher order SVD to vibration-based system identification and damage detection

    NASA Astrophysics Data System (ADS)

    Chao, Shu-Hsien; Loh, Chin-Hsiung; Weng, Jian-Huang

    2012-04-01

    Singular value decomposition (SVD) is a powerful linear algebra tool. It is widely used in many different signal processing methods, such as principal component analysis (PCA), singular spectrum analysis (SSA), frequency domain decomposition (FDD), and subspace and stochastic subspace identification (SI and SSI). In each case, the data is arranged appropriately in matrix form and SVD is used to extract the features of the data set. In this study, three different signal processing and system identification algorithms are proposed: SSA, SSI-COV and SSI-DATA. Based on the subspace and null space extracted from the SVD of the data matrix, damage detection algorithms can be developed. The proposed algorithms are used to process shaking table test data from a six-story steel frame. Features contained in the vibration data are extracted by the proposed methods. Damage detection can then be investigated from the test data of the frame structure through subspace-based and null-space-based damage indices.
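    As one representative of these SVD-based methods, the SSA step can be sketched as follows. This is a generic textbook formulation (Hankel embedding, SVD, diagonal averaging), not the authors' implementation.

```python
import numpy as np

def ssa_components(x, window, n_comp):
    """Decompose a 1-D signal into additive components via singular
    spectrum analysis: Hankel embedding -> SVD -> diagonal averaging."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    K = N - window + 1
    # Trajectory (Hankel) matrix: lagged copies of the signal as columns
    X = np.column_stack([x[j:j + window] for j in range(K)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for k in range(n_comp):
        Xk = s[k] * np.outer(U[:, k], Vt[k])
        # Anti-diagonal averaging maps the rank-1 matrix back to a series
        comp = np.array([np.mean(Xk[::-1].diagonal(i - window + 1))
                         for i in range(N)])
        comps.append(comp)
    return np.array(comps)

# Demo: a sinusoid with a slow trend splits into oscillatory and
# trend-like components
t = np.linspace(0, 8 * np.pi, 200)
x = np.sin(t) + 0.02 * t
comps = ssa_components(x, window=24, n_comp=3)
```

    The singular subspaces computed this way are also the starting point for the subspace-based and null-space-based damage indices mentioned above.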

  14. Metafusion: A breakthrough in metallurgy

    NASA Technical Reports Server (NTRS)

    Joseph, Adrian A.

    1994-01-01

    The Metafuse Process is a patented development in the field of thin film coatings utilizing cold fusion which results in a true inter-dispersion of dissimilar materials along a gradual transition gradient through a boundary of several hundred atomic layers. The process is performed at ambient temperatures and pressures requiring relatively little energy and creating little or no heat. The process permits a remarkable range of material combinations and joining of materials which are normally incompatible. Initial applications include titanium carbide into and onto the copper resistance welding electrodes and tungsten carbide onto the cutting edges of tool steel blades. The process is achieved through application of an RF signal of low power and is based on the theory of vacancy fusion.

  15. Optimization and Surface Modification of Al-6351 Alloy Using SiC-Cu Green Compact Electrode by Electro Discharge Coating Process

    NASA Astrophysics Data System (ADS)

    Chakraborty, Sujoy; Kar, Siddhartha; Dey, Vidyut; Ghosh, Subrata Kumar

    2017-06-01

    This paper introduces the surface modification of Al-6351 alloy by a green compact SiC-Cu electrode using the electro-discharge coating (EDC) process. A Taguchi L16 orthogonal array is employed to investigate the process by varying tool parameters such as composition and compaction load, and electro-discharge machining (EDM) parameters such as pulse-on time and peak current. Material deposition rate (MDR), tool wear rate (TWR) and surface roughness (SR) are measured on the coated specimens. An optimum condition is achieved by formulating an overall evaluation criterion (OEC), which combines the multi-objective task into a single index. The signal-to-noise (S/N) ratio and analysis of variance (ANOVA) are employed to investigate the effect of the relevant process parameters. A confirmation test is conducted based on the optimal process parameters, and experimental results are provided to illustrate the effectiveness of this approach. The modified surface is characterized by optical microscopy and X-ray diffraction (XRD) analysis. XRD analysis of the deposited layer confirmed the transfer of tool materials to the work surface and the formation of inter-metallic phases. The micro-hardness of the resulting composite layer, which is 1.5-3 times that of the work material, was also measured, and a highest layer thickness (LT) of 83.644 μm was achieved.

  16. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. The TSPT uses MODIS metadata to remove and/or correct bad and suspect data. Bad-pixel removal, multiple-satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions become widespread and devastating. 
The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify a variety of plant phenomena and improve monitoring capabilities.
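    A core temporal-processing step of this kind, maximum-value compositing under a quality mask, can be sketched as follows. The data layout, mask semantics, and fill-value choice here are hypothetical; TSPT's actual MODIS metadata handling is more involved.

```python
import numpy as np

def max_value_composite(ndvi_stack, qa_mask):
    """Per-pixel maximum-value composite over a block of daily NDVI
    frames, ignoring pixels flagged bad (cloud, sensor error) by the
    QA mask. Pixels bad on every day receive NaN as a fill value."""
    masked = np.where(qa_mask, ndvi_stack, -np.inf)
    comp = masked.max(axis=0)
    return np.where(np.isfinite(comp), comp, np.nan)

# Hypothetical 4-day, 3x3-pixel NDVI block with one fully cloudy day
rng = np.random.default_rng(2)
stack = rng.uniform(0.1, 0.8, size=(4, 3, 3))
mask = np.ones_like(stack, dtype=bool)
mask[1] = False                  # day 1 fully cloudy
comp = max_value_composite(stack, mask)
```

    Compositing over a multi-day block favors the clearest observation at each pixel, which is why it mitigates the cloud-induced loss of effective temporal resolution described above.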

  17. Admissible Diffusion Wavelets and Their Applications in Space-Frequency Processing.

    PubMed

    Hou, Tingbo; Qin, Hong

    2013-01-01

    As signal processing tools, diffusion wavelets and biorthogonal diffusion wavelets have been propelled by recent research in mathematics. They employ diffusion as a smoothing and scaling process to empower multiscale analysis. However, their applications in graphics and visualization are overshadowed by non-admissible wavelets and their expensive computation. In this paper, our motivation is to broaden the application scope to space-frequency processing of shape geometry and scalar fields. We propose the admissible diffusion wavelets (ADW) on meshed surfaces and point clouds. The ADW are constructed in a bottom-up manner that starts from a local operator at a high frequency and dilates by its dyadic powers to low frequencies. By relaxing orthogonality and enforcing normalization, the wavelets are locally supported and admissible, hence facilitating data analysis and geometry processing. We define a novel rapid reconstruction, which recovers the signal from multiple bands of high frequencies and a low-frequency base in full resolution. It enables operations localized in both space and frequency by manipulating wavelet coefficients through space-frequency filters. This paper aims to build a common theoretic foundation for a host of applications, including saliency visualization, multiscale feature extraction, spectral geometry processing, etc.
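    The dyadic dilation and the reconstruction from high-frequency bands plus a low-frequency base can be illustrated on a toy cycle graph. This is only a telescoping-sum sketch of the idea; the ADW construction on meshed surfaces and point clouds is considerably more elaborate.

```python
import numpy as np

n = 64
# Diffusion operator on a cycle graph: one local averaging (smoothing) step
T = np.zeros((n, n))
for i in range(n):
    T[i, i] = 0.5
    T[i, (i - 1) % n] = 0.25
    T[i, (i + 1) % n] = 0.25

# Dyadic powers T, T^2, T^4, T^8: coarser and coarser smoothers
powers = [np.linalg.matrix_power(T, 2**j) for j in range(4)]
# Band-pass operators from differences of consecutive dyadic powers
bands = [np.eye(n) - powers[0]] + [powers[j] - powers[j + 1] for j in range(3)]
base = powers[-1]                 # low-frequency residual

# A signal with a sharp spike riding on a smooth background
x = np.sin(2 * np.pi * np.arange(n) / n)
x[20] += 2.0

coeffs = [B @ x for B in bands]   # band outputs, fine to coarse
recon = sum(coeffs) + base @ x    # telescoping sum recovers x exactly
```

    The band operators telescope to the identity, so summing all band outputs with the low-frequency base reconstructs the signal in full resolution; filtering individual bands before the sum gives space-frequency-localized edits in the spirit described above.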

  18. A MATLAB-based graphical user interface for the identification of muscular activations from surface electromyography signals.

    PubMed

    Mengarelli, Alessandro; Cardarelli, Stefano; Verdini, Federica; Burattini, Laura; Fioretti, Sandro; Di Nardo, Francesco

    2016-08-01

    In this paper, a graphical user interface (GUI) built in the MATLAB® environment is presented. This interactive tool has been developed for the analysis of surface electromyography (sEMG) signals and in particular for the assessment of muscle activation time intervals. After the signal import, the tool performs a first analysis in a totally user-independent way, providing a reliable computation of the muscular activation sequences. Furthermore, the user has the opportunity to modify each parameter of the on/off identification algorithm implemented in the tool. The user-friendly GUI allows immediate evaluation of the effects that the modification of every single parameter has on the recognition of activation intervals, through real-time updating and visualization of the muscular activation/deactivation sequences. The possibility to accept the initial signal analysis or to modify the on/off identification for each considered signal, with real-time visual feedback, makes this GUI-based tool a valuable instrument in clinical and research applications, and also from an educational perspective.
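    The abstract does not spell out the tool's on/off algorithm, but a generic envelope-plus-threshold detector of the same flavor, with purely illustrative parameter values, might look like this:

```python
import numpy as np

def detect_activations(emg, fs, win_ms=50.0, k=5.0, min_on_ms=100.0):
    """Moving-RMS envelope + threshold detector for sEMG on/off timing.

    The threshold is the mean plus k standard deviations of the envelope
    over a presumed-quiet first half second; bursts shorter than
    min_on_ms are rejected. All parameter values are illustrative."""
    emg = np.asarray(emg, dtype=float)
    n = max(1, int(fs * win_ms / 1000.0))
    env = np.sqrt(np.convolve(emg**2, np.ones(n) / n, mode="same"))
    base = env[: int(0.5 * fs)]
    thr = base.mean() + k * base.std()
    active = env > thr
    # Extract contiguous "on" intervals, in seconds
    edges = np.diff(active.astype(int))
    onsets = np.flatnonzero(edges == 1) + 1
    offsets = np.flatnonzero(edges == -1) + 1
    if active[0]:
        onsets = np.r_[0, onsets]
    if active[-1]:
        offsets = np.r_[offsets, len(active)]
    min_len = int(fs * min_on_ms / 1000.0)
    return [(a / fs, b / fs) for a, b in zip(onsets, offsets)
            if b - a >= min_len]

# Synthetic trace: baseline noise with one burst between 1 s and 2 s
fs = 1000
rng = np.random.default_rng(0)
sig = rng.normal(0.0, 0.05, 3 * fs)
sig[fs:2 * fs] = rng.normal(0.0, 1.0, fs)
intervals = detect_activations(sig, fs)
```

    Exposing parameters such as the window length, threshold multiplier, and minimum burst duration for interactive adjustment, with the detected intervals redrawn on each change, is exactly the kind of workflow the GUI described above supports.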

  19. On the Hilbert-Huang Transform Theoretical Developments

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Blank, Karin; Flatley, Thomas; Huang, Norden E.; Patrick, David; Hestnes, Phyllis

    2005-01-01

    One of the main heritage tools used in scientific and engineering data spectrum analysis is the Fourier Integral Transform and its high-performance digital equivalent, the Fast Fourier Transform (FFT). Both carry strong a priori assumptions about the source data, such as linearity, stationarity, and satisfaction of the Dirichlet conditions. A recent development at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC), known as the Hilbert-Huang Transform (HHT), proposes a novel approach to the nonlinear class of spectrum analysis problems. Using a posteriori data processing based on the Empirical Mode Decomposition (EMD) sifting process (algorithm), followed by the normalized Hilbert Transform of the decomposition data, the HHT allows spectrum analysis of nonlinear and nonstationary data. The EMD sifting process results in a non-constrained decomposition of a real-valued source data vector into a finite set of Intrinsic Mode Functions (IMF). These functions form a nearly orthogonal, adaptive basis that is derived from the data itself. The IMFs can be further analyzed for spectrum interpretation by the classical Hilbert Transform. A new engineering spectrum analysis tool using the HHT, the HHT Data Processing System (HHT-DPS), has been developed at NASA GSFC. As the HHT-DPS has been successfully used and commercialized, new applications pose additional questions about the theoretical basis of the HHT and EMD algorithms. Why is the fastest-changing component of a composite signal sifted out first in the EMD sifting process? Why does the EMD sifting process converge, and why does it converge rapidly? Does an IMF have a distinctive structure? Why are the IMFs nearly orthogonal? We address these questions and develop the initial theoretical background for the HHT. This will contribute to the development of new HHT processing options, such as real-time and 2-D processing using Field Programmable Gate Array (FPGA) computational resources and enhanced HHT synthesis, and will broaden the scope of HHT applications for signal processing.
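    The sifting step whose convergence the authors analyze can be illustrated with a deliberately simplified sketch. It uses piecewise-linear envelopes where the real algorithm uses cubic splines, so it is a toy model of the sifting idea, not the HHT-DPS implementation:

    ```python
    import math

    def _envelope(x, idx):
        """Piecewise-linear interpolation through extrema (real EMD uses
        cubic splines); flat extension beyond the first/last extremum."""
        if len(idx) < 2:
            return list(x)
        env, j = [], 0
        for i in range(len(x)):
            while j < len(idx) - 2 and idx[j + 1] < i:
                j += 1
            a, b = idx[j], idx[j + 1]
            t = min(max((i - a) / (b - a), 0.0), 1.0)
            env.append(x[a] * (1 - t) + x[b] * t)
        return env

    def sift(x, n_iter=8):
        """A few sifting iterations: subtract the mean of the upper and lower
        envelopes until a first IMF candidate remains."""
        h = list(x)
        for _ in range(n_iter):
            maxima = [i for i in range(1, len(h) - 1) if h[i - 1] < h[i] >= h[i + 1]]
            minima = [i for i in range(1, len(h) - 1) if h[i - 1] > h[i] <= h[i + 1]]
            if len(maxima) < 2 or len(minima) < 2:
                break
            upper = _envelope(h, maxima)
            lower = _envelope(h, minima)
            h = [hi - (u + l) / 2 for hi, u, l in zip(h, upper, lower)]
        return h
    ```

    Run on a sum of a fast and a slow sinusoid, the first IMF candidate carries the fast oscillation while the residue carries the slow one, which is the "fastest component is sifted out first" behavior the abstract asks about.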

  20. Reliability improvement methods for sapphire fiber temperature sensors

    NASA Astrophysics Data System (ADS)

    Schietinger, C.; Adams, B.

    1991-08-01

    Mechanical, optical, electrical, and software design improvements can be brought to bear to enhance the reliability of fiber-optic sapphire-fiber temperature measurement tools in harsh environments. The optical fiber thermometry (OFT) equipment discussed is used in numerous process industries and generally involves a sapphire sensor, an optical transmission cable, and a microprocessor-based signal analyzer. OFT technology incorporating sensors for corrosive environments, hybrid sensors, and two-wavelength measurements is discussed.

  1. Self-Defense Distributed Engagement Coordinator

    DTIC Science & Technology

    2016-02-01

    its countermeasures. Whether a missile is defeated with an interceptor, undermined by a signal jammer, or diverted by a decoy, there is Self-Defense ...anti-ship threats and recommends actions to the personnel coordinating ship self-defense. This tool was recognized with a 2015 R&D 100 Award. a cost...reloading process, and may not be possible at all. The Self-Defense Distributed Engagement Coordinator (SDDEC) is designed to provide automated battle

  2. Quantum Tomography Protocols with Positivity are Compressed Sensing Protocols (Open Access)

    DTIC Science & Technology

    2015-12-08

    ARTICLE OPEN Quantum tomography protocols with positivity are compressed sensing protocols Amir Kalev, Robert L. Kosut and Ivan H. Deutsch ...Characterising complex quantum systems is a vital task in quantum information science. Quantum tomography, the standard tool used for this purpose, uses a well-designed measurement record to reconstruct quantum states and processes. It is, however, notoriously inefficient. Recently, the classical signal

  3. EARLINET Single Calculus Chain - technical - Part 2: Calculation of optical products

    NASA Astrophysics Data System (ADS)

    Mattis, Ina; D'Amico, Giuseppe; Baars, Holger; Amodeo, Aldo; Madonna, Fabio; Iarlori, Marco

    2016-07-01

    In this paper we present the automated software tool ELDA (EARLINET Lidar Data Analyzer) for the retrieval of profiles of optical particle properties from lidar signals. This tool is one of the calculus modules of the EARLINET Single Calculus Chain (SCC) which allows for the analysis of the data of many different lidar systems of EARLINET in an automated, unsupervised way. ELDA delivers profiles of particle extinction coefficients from Raman signals as well as profiles of particle backscatter coefficients from combinations of Raman and elastic signals or from elastic signals only. Those analyses start from pre-processed signals which have already been corrected for background, range dependency and hardware-specific effects. An expert group reviewed all algorithms and solutions for critical calculus subsystems which are used within EARLINET with respect to their applicability for automated retrievals. Those methods have been implemented in ELDA. Since the software was designed in a modular way, it is possible to add new or alternative methods in the future. Most of the implemented algorithms are well known and well documented, but some methods have been developed especially for ELDA, e.g., automated vertical smoothing and temporal averaging, the handling of effective vertical resolution in the case of lidar ratio retrievals, or the merging of near-range and far-range products. The accuracy of the retrieved profiles was tested following the procedure of the EARLINET-ASOS algorithm inter-comparison exercise which is based on the analysis of synthetic signals. Mean deviations, mean relative deviations, and normalized root-mean-square deviations were calculated for all possible products and three height layers. In all cases, the deviations were clearly below the maximum allowed values according to the EARLINET quality requirements.
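    The pre-processing mentioned above (background and range-dependency correction) is straightforward to sketch. The function below is a generic illustration of that step, not part of ELDA; the 100-bin far-range background window is an assumed convention, not taken from the SCC:

    ```python
    def range_correct(signal, z, bg_bins=100):
        """Subtract the background estimated from the last `bg_bins` range bins
        (assumed to be signal-free far range) and apply the z^2 range
        correction, yielding the range-corrected signal P(z) * z^2."""
        bg = sum(signal[-bg_bins:]) / bg_bins
        return [(p - bg) * r * r for p, r in zip(signal, z)]
    ```

    On a synthetic elastic profile the corrected signal is proportional to the attenuated backscatter, which is the quantity the retrievals then invert.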

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khan, Bibi Rafeiza; Faure, Lionel; Chapman, Kent D.

    N-Acylethanolamines (NAEs) are a group of fatty acid amides that play signaling roles in diverse physiological processes in eukaryotes. Fatty acid amide hydrolase (FAAH) degrades NAEs into ethanolamine and free fatty acid to terminate their signaling function. In animals, chemical inhibitors of FAAH have been used for the therapeutic treatment of pain and as tools to probe deeper into the biochemical properties of FAAH. In a chemical genetic screen for small molecules that dampened the inhibitory effect of N-lauroylethanolamine (NAE 12:0) on Arabidopsis thaliana seedling growth, we identified 6-(2-methoxyphenyl)-1,3-dimethyl-5-phenyl-1H-pyrrolo[3,4-d]pyrimidine-2,4(3H,6H)-dione (MDPD). MDPD alleviated the growth-inhibitory effects of NAE 12:0, in part by enhancing the enzymatic activity of Arabidopsis FAAH (AtFAAH). In vitro biochemical assays showed that MDPD enhanced the apparent Vmax of AtFAAH but did not alter the affinity of AtFAAH for its NAE substrates. Furthermore, structural analogs of MDPD did not affect AtFAAH activity or dampen the inhibitory effect of NAE 12:0 on seedling growth, indicating that MDPD is a specific synthetic chemical activator of AtFAAH. Our study demonstrates the feasibility of using an unbiased chemical genetic approach to identify new pharmacological tools for manipulating FAAH- and NAE-mediated physiological processes in plants.

  5. Assessment and Calibration of a Crimp Tool Equipped with Ultrasonic Analysis Features

    NASA Technical Reports Server (NTRS)

    Yost, William T. (Inventor); Perey, Daniel F. (Inventor); Cramer, K. Elliott (Inventor)

    2013-01-01

    A method is provided for calibrating ultrasonic signals passed through a crimp formed with respect to a deformable body via an ultrasonically-equipped crimp tool (UECT). The UECT verifies a crimp quality using the ultrasonic signals. The method includes forming the crimp, transmitting a first signal, e.g., a pulse, to a first transducer of the UECT, and converting the first signal, using the first transducer, into a second signal which defines an ultrasonic pulse. This pulse is transmitted through the UECT into the crimp. A second transducer converts the second signal into a third signal, which may be further conditioned, and the ultrasonic signals are calibrated using the third signal or its conditioned variant. An apparatus for calibrating the ultrasonic signals includes a pulse module (PM) electrically connected to the first and second transducers, and an oscilloscope or display electrically connected to the PM for analyzing an electrical output signal therefrom.

  6. Paper-based microreactor array for rapid screening of cell signaling cascades.

    PubMed

    Huang, Chia-Hao; Lei, Kin Fong; Tsang, Ngan-Ming

    2016-08-07

    Investigation of cell signaling pathways is important for the study of the pathogenesis of cancer. However, the related operations used in these studies are time consuming and labor intensive. Thus, the development of effective therapeutic strategies may be hampered. In this work, gel-free cell culture and subsequent immunoassay have been successfully integrated and conducted in a paper-based microreactor array. Study of the activation level of different kinases of cells stimulated by different conditions, i.e., IL-6 stimulation, starvation, and hypoxia, was demonstrated. Moreover, rapid screening of cell signaling cascades after stimulation with HGF, doxorubicin, and UVB irradiation, respectively, was conducted to simultaneously screen 40 kinases and transcription factors. Activation of multiple signaling pathways could be identified, and the correlation between signaling pathways was discussed to provide further information for investigating the entire signaling network. The present technique integrates most of the tedious operations on a single paper substrate, reduces sample and reagent consumption, and shortens the time required by the entire process. Therefore, it provides a first-tier rapid screening tool for the study of complicated signaling cascades. It is expected that the technique can be developed into a routine protocol for conventional biological research laboratories.

  7. A quantitative image cytometry technique for time series or population analyses of signaling networks.

    PubMed

    Ozaki, Yu-ichi; Uda, Shinsuke; Saito, Takeshi H; Chung, Jaehoon; Kubota, Hiroyuki; Kuroda, Shinya

    2010-04-01

    Modeling of cellular functions on the basis of experimental observation is increasingly common in the field of cellular signaling. However, such modeling requires a large amount of quantitative data of signaling events with high spatio-temporal resolution. A novel technique which allows us to obtain such data is needed for systems biology of cellular signaling. We developed a fully automatable assay technique, termed quantitative image cytometry (QIC), which integrates a quantitative immunostaining technique and a high precision image-processing algorithm for cell identification. With the aid of an automated sample preparation system, this device can quantify protein expression, phosphorylation and localization with subcellular resolution at one-minute intervals. The signaling activities quantified by the assay system showed good correlation with, as well as comparable reproducibility to, western blot analysis. Taking advantage of the high spatio-temporal resolution, we investigated the signaling dynamics of the ERK pathway in PC12 cells. The QIC technique appears as a highly quantitative and versatile technique, which can be a convenient replacement for the most conventional techniques including western blot, flow cytometry and live cell imaging. Thus, the QIC technique can be a powerful tool for investigating the systems biology of cellular signaling.

  8. Imaging dynamic redox processes with genetically encoded probes.

    PubMed

    Ezeriņa, Daria; Morgan, Bruce; Dick, Tobias P

    2014-08-01

    Redox signalling plays an important role in many aspects of physiology, including that of the cardiovascular system. Perturbed redox regulation has been associated with numerous pathological conditions; nevertheless, the causal relationships between redox changes and pathology often remain unclear. Redox signalling involves the production of specific redox species at specific times in specific locations. However, until recently, the study of these processes has been impeded by a lack of appropriate tools and methodologies that afford the necessary redox species specificity and spatiotemporal resolution. Recently developed genetically encoded fluorescent redox probes now allow dynamic real-time measurements, of defined redox species, with subcellular compartment resolution, in intact living cells. Here we discuss the available genetically encoded redox probes in terms of their sensitivity and specificity and highlight where uncertainties or controversies currently exist. Furthermore, we outline major goals for future probe development and describe how progress in imaging methodologies will improve our ability to employ genetically encoded redox probes in a wide range of situations. This article is part of a special issue entitled "Redox Signalling in the Cardiovascular System." Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Alpha-band rhythm modulation under the condition of subliminal face presentation: MEG study.

    PubMed

    Sakuraba, Satoshi; Kobayashi, Hana; Sakai, Shinya; Yokosawa, Koichi

    2013-01-01

    The human brain has two streams to process visual information: a dorsal stream and a ventral stream. The negative potential N170, or its magnetic counterpart M170, is known as the face-specific signal originating from the ventral stream. It is possible to present a visual image unconsciously by using continuous flash suppression (CFS), a visual masking technique based on binocular rivalry. In this work, magnetoencephalograms were recorded during presentation of three invisible images: face images, which are processed by the ventral stream; tool images, which could be processed by the dorsal stream; and a blank image. Alpha-band activities detected by sensors that are sensitive to M170 were compared. The alpha-band rhythm was suppressed more during presentation of face images than during presentation of the blank image (p=.028). The suppression remained for about 1 s after the presentations ended. However, no significant difference was observed between tool and other images. These results suggest that the alpha-band rhythm can be modulated by unconscious visual images as well.

  10. Origins of Highly Stable Al-evaporated Solution-processed ZnO Thin Film Transistors: Insights from Low Frequency and Random Telegraph Signal Noise

    NASA Astrophysics Data System (ADS)

    Kim, Joo Hyung; Kang, Tae Sung; Yang, Jung Yup; Hong, Jin Pyo

    2015-11-01

    One long-standing goal in the emerging field of flexible and transparent electronic devices is to meet the demand of key markets, such as enhanced output performance for metal oxide semiconductor thin film transistors (TFTs) prepared by a solution process. While solution-based fabrication techniques are cost-effective and ensure large-area coverage at low temperature, their utilization has the disadvantage of introducing large trap states into TFTs. Such states, the formation of which is induced by intrinsic defects initially produced during preparation, have a significant impact on electrical performance. Therefore, the ability to enhance the electrical characteristics of solution-processed TFTs, along with attaining a firm understanding of their physical nature, remains a key step towards extending their use. In this study, measurements of low-frequency noise and random telegraph signal noise are employed as generic alternative tools to examine the origins of enhanced output performance for solution-processed ZnO TFTs through the control of defect sites by Al evaporation.

  11. A method for acetylcholinesterase staining of brain sections previously processed for receptor autoradiography.

    PubMed

    Lim, M M; Hammock, E A D; Young, L J

    2004-02-01

    Receptor autoradiography using selective radiolabeled ligands allows visualization of brain receptor distribution and density on film. The resolution of specific brain regions on the film often can be difficult to discern owing to the general spread of the radioactive label and the lack of neuroanatomical landmarks on film. Receptor binding is a chemically harsh protocol that can render the tissue virtually unstainable by Nissl and other conventional stains used to delineate neuroanatomical boundaries of brain regions. We describe a method for acetylcholinesterase (AChE) staining of slides previously processed for receptor binding. AChE staining is a useful tool for delineating major brain nuclei and tracts. AChE staining on sections that have been processed for receptor autoradiography provides a direct comparison of brain regions for more precise neuroanatomical description. We report a detailed thiocholine protocol that is a modification of the Koelle-Friedenwald method to amplify the AChE signal in brain sections previously processed for autoradiography. We also describe several temporal and experimental factors that can affect the density and clarity of the AChE signal when using this protocol.

  12. Optical parametric amplification and oscillation assisted by low-frequency stimulated emission.

    PubMed

    Longhi, Stefano

    2016-04-15

    Optical parametric amplification and oscillation provide powerful tools for coherent light generation in spectral regions inaccessible to lasers. Parametric gain is based on a frequency down-conversion process and, thus, it cannot be realized for signal waves at a frequency ω3 higher than the frequency of the pump wave ω1. In this Letter, we suggest a route toward the realization of upconversion optical parametric amplification and oscillation, i.e., amplification of the signal wave by a coherent pump wave of lower frequency, assisted by stimulated emission of the auxiliary idler wave. When the signal field is resonated in an optical cavity, parametric oscillation is obtained. Design parameters for the observation of upconversion optical parametric oscillation at λ3=465 nm are given for a periodically poled lithium-niobate (PPLN) crystal doped with Nd(3+) ions.

  13. Selective disruption of the AKAP signaling complexes.

    PubMed

    Kennedy, Eileen J; Scott, John D

    2015-01-01

    Synthesis of the second messenger cAMP activates a variety of signaling pathways critical for all facets of intracellular regulation. Protein kinase A (PKA) is the major cAMP-responsive effector. Where and when this enzyme is activated has profound implications on the cellular role of PKA. A-Kinase Anchoring Proteins (AKAPs) play a critical role in this process by orchestrating spatial and temporal aspects of PKA action. A popular means of evaluating the impact of these anchored signaling events is to biochemically interfere with the PKA-AKAP interface. Hence, peptide disruptors of PKA anchoring are valuable tools in the investigation of local PKA action. This article outlines the development of PKA isoform-selective disruptor peptides, documents the optimization of cell-soluble peptide derivatives, and introduces alternative cell-based approaches that interrogate other aspects of the PKA-AKAP interface.

  14. Estimating accidental pollutant releases in the built environment from turbulent concentration signals

    NASA Astrophysics Data System (ADS)

    Ben Salem, N.; Salizzoni, P.; Soulhac, L.

    2017-01-01

    We present an inverse atmospheric model to estimate the mass flow rate of an impulsive source of pollutant, whose position is known, from concentration signals registered at receptors placed downwind of the source. The originality of this study is twofold. Firstly, the inversion is performed using high-frequency fluctuating, i.e. turbulent, concentration signals. Secondly, the inverse algorithm is applied to a dispersion process within a dense urban canopy, at the district scale, and a street network model, SIRANERISK, is adopted. The model, which is tested against wind tunnel experiments, simulates the dispersion of short-duration releases of pollutant in different typologies of idealised urban geometries. The results allow us to discuss the reliability of the inverse model as an operational tool for crisis management and for risk assessment related to the accidental release of toxic and flammable substances.
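    In the simplest linear setting, such an inversion reduces to scaling a modelled unit-release concentration signal to the measured one. The sketch below is a deliberately reduced, hypothetical version of that inverse step (SIRANERISK itself solves a much richer problem with turbulent signals and a street-network dispersion model):

    ```python
    def estimate_release(conc, unit_response):
        """Least-squares estimate of the released mass Q under a linear
        dispersion assumption: c(t) ~= Q * g(t), where g is the modelled
        concentration time series for a unit release at the known source."""
        num = sum(c * g for c, g in zip(conc, unit_response))
        den = sum(g * g for g in unit_response)
        return num / den
    ```

    With a measured signal that is a scaled unit response plus a small perturbation, the estimator recovers the scale factor; real turbulent signals require averaging over fluctuations, which is the point of the paper.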

  15. Wavelet transform analysis of transient signals: the seismogram and the electrocardiogram

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anant, K.S.

    1997-06-01

    In this dissertation I quantitatively demonstrate how the wavelet transform can be an effective mathematical tool for the analysis of transient signals. The two key signal processing applications of the wavelet transform, namely feature identification and representation (i.e., compression), are shown by solving important problems involving the seismogram and the electrocardiogram. The seismic feature identification problem involved locating in time the P and S phase arrivals. Locating these arrivals accurately (particularly the S phase) has been a constant issue in seismic signal processing. In Chapter 3, I show that the wavelet transform can be used to locate both the P as well as the S phase using only information from single-station three-component seismograms. This is accomplished by using the basis function (wavelet) of the wavelet transform as a matching filter and by processing information across scales of the wavelet domain decomposition. The "pick" time results are quite promising as compared to analyst picks. The representation application involved the compression of the electrocardiogram, which is a recording of the electrical activity of the heart. Compression of the electrocardiogram is an important problem in biomedical signal processing due to transmission and storage limitations. In Chapter 4, I develop an electrocardiogram compression method that applies vector quantization to the wavelet transform coefficients. The best compression results were obtained by using orthogonal wavelets, due to their ability to represent a signal efficiently. Throughout this thesis the importance of choosing wavelets based on the problem at hand is stressed. In Chapter 5, I introduce a wavelet design method that uses linear prediction in order to design wavelets that are geared to the signal or feature being analyzed. The use of these designed wavelets in a test feature identification application led to positive results. The methods developed in this thesis (the feature identification methods of Chapter 3, the compression methods of Chapter 4, and the wavelet design methods of Chapter 5) are general enough to be easily applied to other transient signals.
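    The compression idea, keeping only the largest transform coefficients, can be illustrated with a one-level Haar transform and naive magnitude thresholding. This is a toy stand-in for the dissertation's scheme, which uses longer orthogonal wavelets and vector quantization rather than simple thresholding:

    ```python
    def haar_dwt(x):
        """One level of the Haar wavelet transform: (approximation, detail).
        Assumes len(x) is even."""
        a = [(x[2 * i] + x[2 * i + 1]) / 2 ** 0.5 for i in range(len(x) // 2)]
        d = [(x[2 * i] - x[2 * i + 1]) / 2 ** 0.5 for i in range(len(x) // 2)]
        return a, d

    def compress(x, keep):
        """Zero all but the `keep` largest-magnitude detail coefficients,
        then invert the transform."""
        a, d = haar_dwt(x)
        thr = sorted(abs(c) for c in d)[len(d) - keep] if keep < len(d) else 0.0
        d2 = [c if abs(c) >= thr else 0.0 for c in d]
        out = []
        for ai, di in zip(a, d2):            # inverse Haar step
            out.append((ai + di) / 2 ** 0.5)
            out.append((ai - di) / 2 ** 0.5)
        return out
    ```

    Because the Haar basis is orthogonal, keeping all coefficients reconstructs the signal exactly, and for a smooth signal most detail coefficients can be discarded with little error, which is the efficiency argument made for orthogonal wavelets above.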

  16. A Comprehensive Review of Sensors and Instrumentation Methods in Devices for Musical Expression

    PubMed Central

    Medeiros, Carolina Brum; Wanderley, Marcelo M.

    2014-01-01

    Digital Musical Instruments (DMIs) are musical instruments typically composed of a control surface where user interaction is measured by sensors whose values are mapped to sound synthesis algorithms. These instruments have gained interest among skilled musicians and performers in recent decades, leading to artistic practices including musical performance, interactive installations and dance. The creation of DMIs typically involves several areas, among them arts, design and engineering. The balance between these areas is an essential task in DMI design so that the resulting instruments are aesthetically appealing, robust, and allow responsive, accurate and repeatable sensing. In this paper, we review the use of sensors in the DMI community as manifested in the proceedings of the International Conference on New Interfaces for Musical Expression (NIME 2009–2013), focusing on the sensor technologies and signal conditioning techniques used by the NIME community. Although it has been claimed that specifications for artistic tools are harder than those for military applications, this study raises a paradox: in most cases, DMIs are based on a few basic sensor types and unsophisticated engineering solutions, not taking advantage of more advanced sensing, instrumentation and signal processing techniques that could dramatically improve their response. We aim to raise awareness of the limitations of any engineering solution and to assert the benefits of advanced electronic instrumentation design in DMIs. To this end, we propose the use of specialized sensors such as strain gages, advanced conditioning circuits and signal processing tools such as sensor fusion. We believe that careful electronic instrumentation design may lead to more responsive instruments. PMID:25068865
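    The sensor-fusion technique the authors advocate can be sketched with the classic complementary filter, which blends a drifting-but-smooth gyroscope integral with a noisy-but-unbiased accelerometer angle. The filter below is a generic textbook example, not an implementation from the paper; the blend factor `alpha` is an illustrative default:

    ```python
    def complementary_filter(gyro_rate, accel_angle, dt, alpha=0.98):
        """Fuse a gyroscope rate (deg/s, possibly biased) with an
        accelerometer-derived angle (deg, noisy) into one angle estimate:
        mostly trust the integrated gyro, slowly correct toward the
        accelerometer to cancel drift."""
        theta = accel_angle[0]
        out = []
        for w, a in zip(gyro_rate, accel_angle):
            theta = alpha * (theta + w * dt) + (1 - alpha) * a
            out.append(theta)
        return out
    ```

    Even with a constant gyro bias that would make pure integration drift without bound, the fused estimate stays near the true angle while suppressing most of the accelerometer noise, which is the kind of responsiveness gain the review argues DMIs leave on the table.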

  17. A comprehensive review of sensors and instrumentation methods in devices for musical expression.

    PubMed

    Medeiros, Carolina Brum; Wanderley, Marcelo M

    2014-07-25

    Digital Musical Instruments (DMIs) are musical instruments typically composed of a control surface where user interaction is measured by sensors whose values are mapped to sound synthesis algorithms. These instruments have gained interest among skilled musicians and performers in recent decades, leading to artistic practices including musical performance, interactive installations and dance. The creation of DMIs typically involves several areas, among them arts, design and engineering. The balance between these areas is an essential task in DMI design so that the resulting instruments are aesthetically appealing, robust, and allow responsive, accurate and repeatable sensing. In this paper, we review the use of sensors in the DMI community as manifested in the proceedings of the International Conference on New Interfaces for Musical Expression (NIME 2009-2013), focusing on the sensor technologies and signal conditioning techniques used by the NIME community. Although it has been claimed that specifications for artistic tools are harder than those for military applications, this study raises a paradox: in most cases, DMIs are based on a few basic sensor types and unsophisticated engineering solutions, not taking advantage of more advanced sensing, instrumentation and signal processing techniques that could dramatically improve their response. We aim to raise awareness of the limitations of any engineering solution and to assert the benefits of advanced electronic instrumentation design in DMIs. To this end, we propose the use of specialized sensors such as strain gages, advanced conditioning circuits and signal processing tools such as sensor fusion. We believe that careful electronic instrumentation design may lead to more responsive instruments.

  18. A remote and non-contact method for obtaining the blood-pulse waveform with a laser Doppler vibrometer

    NASA Astrophysics Data System (ADS)

    Desjardins, Candida L.; Antonelli, Lynn T.; Soares, Edward

    2007-02-01

    The use of lasers to remotely and non-invasively detect the blood pressure waveform of humans and animals would provide a powerful diagnostic tool. Current blood pressure measurement tools, such as the cuff, are not useful for burn and trauma victims, and animals require catheterization to acquire accurate blood pressure information. The purpose of our sensor method and apparatus is to remotely and non-invasively detect the blood pulse waveform of both animals and humans. The device monitors an animal's or human's skin in proximity to an artery using radiation from a laser Doppler vibrometer (LDV). The system measures the velocity (or displacement) of the pulsatile motion of the skin, indicative of physiological parameters of the arterial motion in relation to the cardiac cycle. Tests have been conducted that measure surface velocity with an LDV and a signal-processing unit, with enhanced detection obtained with optional hardware including a retro-reflector dot. The blood pulse waveform is obtained by integrating the velocity signal to get surface displacement using standard signal processing techniques. Continuous recording of the blood pulse waveform yields data containing information on cardiac health and can be analyzed to identify important events in the cardiac cycle, such as heart rate, the timing of peak systole, left ventricular ejection time and aortic valve closure. Experimental results are provided that demonstrate the current capabilities of the optical sensor for continuous, non-contact recording of the blood pulse waveform without causing patient distress.
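    The integration step mentioned above (velocity to displacement) can be sketched with trapezoidal quadrature. This is a generic illustration of that standard technique, not the authors' processing chain; in practice a high-pass filter would also be applied to suppress drift from DC offsets in the velocity signal, whereas the sketch only removes the mean:

    ```python
    def velocity_to_displacement(vel, fs):
        """Trapezoidal integration of an LDV surface-velocity trace (sampled
        at fs Hz) into displacement, with the mean removed afterwards to fix
        the arbitrary integration constant."""
        disp = [0.0]
        for v0, v1 in zip(vel, vel[1:]):
            disp.append(disp[-1] + (v0 + v1) / (2.0 * fs))
        m = sum(disp) / len(disp)
        return [d - m for d in disp]
    ```

    On a synthetic sinusoidal skin-velocity signal, the recovered displacement matches the known waveform closely, and the same operation applied to a real LDV trace yields the blood pulse waveform described above.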

  19. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems

    PubMed Central

    Chylek, Lily A.; Harris, Leonard A.; Tung, Chang-Shung; Faeder, James R.; Lopez, Carlos F.

    2013-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and post-translational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only a large reaction network can capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). PMID:24123887
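    A tiny sketch makes the point about implicit network specification concrete. It uses BioNetGen-style notation for the generated reaction strings, but the expansion code itself is hypothetical: one binding rule that ignores a modification site Y expands into one concrete reaction per combination of unmentioned states, all inheriting the rule's single rate constant.

    ```python
    from itertools import product

    def expand_binding_rule(a_mod_states, b_mod_states, rate="k_on"):
        """Expand one rule, A(b) + B(a) -> A(b!1).B(a!1), into the concrete
        reactions it implies when A and B each carry a modification site Y
        whose state the rule does not mention."""
        return [(f"A(b,Y~{ya}) + B(a,Y~{yb}) -> A(b!1,Y~{ya}).B(a!1,Y~{yb})",
                 rate)
                for ya, yb in product(a_mod_states, b_mod_states)]
    ```

    With two phosphorylation states per molecule, one rule yields four reactions; with n unmentioned binary sites it would yield 2^n, which is exactly the combinatorial growth that makes explicit network specification impractical.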

  20. Role of Nonneuronal TRPV4 Signaling in Inflammatory Processes.

    PubMed

    Rajasekhar, Pradeep; Poole, Daniel P; Veldhuis, Nicholas A

    2017-01-01

    Transient receptor potential (TRP) ion channels are important signaling components in nociceptive and inflammatory pathways. This is attributed to their ability to function as polymodal sensors of environmental stimuli (chemical and mechanical) and as effector molecules in receptor signaling pathways. TRP vanilloid 4 (TRPV4) is a nonselective cation channel that is activated by multiple endogenous stimuli including shear stress, membrane stretch, and arachidonic acid metabolites. TRPV4 contributes to many important physiological processes, and dysregulation of its activity is associated with chronic conditions of metabolism, inflammation, peripheral neuropathies, musculoskeletal development, and cardiovascular regulation. Mechanosensory and receptor- or lipid-mediated signaling functions of TRPV4 have historically been attributed to central and peripheral neurons. However, with the development of potent and selective pharmacological tools, transgenic mice, and improved molecular and imaging techniques, many new roles for TRPV4 have been revealed in nonneuronal cells. In this chapter, we discuss these recent findings and highlight the need for greater characterization of TRPV4-mediated signaling in nonneuronal cell types that are either directly associated with neurons or indirectly control their excitability through release of sensitizing cellular factors. We address the integral role of these cells in sensory and inflammatory processes as well as their importance when considering undesirable on-target effects that may be caused by systemic delivery of TRPV4-selective pharmaceutical agents for treatment of chronic diseases. In the future, this will drive a need for targeted drug delivery strategies to regulate such a diverse and promiscuous protein. © 2017 Elsevier Inc. All rights reserved.

  1. High Accuracy Evaluation of the Finite Fourier Transform Using Sampled Data

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    1997-01-01

    Many system identification and signal processing procedures can be carried out advantageously in the frequency domain. A required preliminary step for this approach is the transformation of sampled time domain data into the frequency domain. The analytical tool used for this transformation is the finite Fourier transform. Inaccuracy in the transformation can degrade system identification and signal processing results. This work presents a method for evaluating the finite Fourier transform using cubic interpolation of sampled time domain data for high accuracy, and the chirp z-transform for arbitrary frequency resolution. The accuracy of the technique is demonstrated in example cases where the transformation can be evaluated analytically. Arbitrary frequency resolution is shown to be important for capturing details of the data in the frequency domain. The technique is demonstrated using flight test data from a longitudinal maneuver of the F-18 High Alpha Research Vehicle.
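    The arbitrary-frequency-resolution idea can be illustrated with a generic Bluestein chirp z-transform sketch in NumPy (a hedged stand-in, not the paper's cubic-interpolation formulation). For w = exp(-2πj/N) and a = 1 it reduces to the ordinary DFT; choosing a smaller angular step for w "zooms" into a narrow frequency band at whatever resolution is desired.

```python
import numpy as np

def czt(x, m, w, a):
    """Bluestein chirp z-transform: X[k] = sum_n x[n] * a**(-n) * w**(n*k), k = 0..m-1."""
    x = np.asarray(x, dtype=complex)
    n = len(x)
    k = np.arange(max(m, n))
    wk2 = w ** (k ** 2 / 2.0)                      # chirp sequence w^(k^2/2)
    nfft = 1 << int(np.ceil(np.log2(n + m - 1)))   # FFT length for linear convolution
    xq = x * a ** (-np.arange(n)) * wk2[:n]
    v = np.zeros(nfft, dtype=complex)              # chirp filter on lags -(n-1)..(m-1)
    v[:m] = 1.0 / wk2[:m]
    v[nfft - n + 1:] = 1.0 / wk2[n - 1:0:-1]
    y = np.fft.ifft(np.fft.fft(xq, nfft) * np.fft.fft(v))
    return wk2[:m] * y[:m]
```

With two FFTs of length O(n + m), the CZT evaluates the transform on a frequency grid that need not be tied to the record length, which is the property the abstract highlights.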

  2. Functional Laser Trimming Of Thin Film Resistors On Silicon ICs

    NASA Astrophysics Data System (ADS)

    Mueller, Michael J.; Mickanin, Wes

    1986-07-01

    Modern Laser Wafer Trimming (LWT) technology achieves exceptional analog circuit performance and precision while maintaining the advantages of high production throughput and yield. Microprocessor-driven instrumentation has both emphasized the role of data conversion circuits and demanded sophisticated signal conditioning functions. Advanced analog semiconductor circuits with bandwidths over 1 GHz and high-precision, trimmable thin-film resistors meet many of today's emerging circuit requirements. Critical to meeting these requirements are optimum choices of laser characteristics, proper materials, trimming process control, accurate modeling of trimmed resistor performance, and appropriate circuit design. Once limited exclusively to hand-crafted, custom integrated circuits, such designs are now available in semi-custom circuit configurations. These are similar to those provided for digital designs and are supported by computer-aided design (CAD) tools. Integrated with fully automated measurement and trimming systems, these quality circuits can now be produced in quantity to meet the requirements of the communications, instrumentation, and signal processing markets.

  3. A survey of tools and resources for the next generation analyst

    NASA Astrophysics Data System (ADS)

    Hall, David L.; Graham, Jake; Catherman, Emily

    2015-05-01

    We have previously argued that a combination of trends in information technology (IT) and changing habits of people using IT provide opportunities for the emergence of a new generation of analysts that can perform effective intelligence, surveillance and reconnaissance (ISR) on a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones, ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling, iii) intelligent interconnections due to advances in "web N" capabilities, and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of the digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools and decision aids. A summary of tools is provided along with links to web sites for tool access.

  4. Can I solve my structure by SAD phasing? Planning an experiment, scaling data and evaluating the useful anomalous correlation and anomalous signal

    PubMed Central

    Terwilliger, Thomas C.; Bunkóczi, Gábor; Hung, Li-Wei; Zwart, Peter H.; Smith, Janet L.; Akey, David L.; Adams, Paul D.

    2016-01-01

    A key challenge in the SAD phasing method is solving a structure when the anomalous signal-to-noise ratio is low. Here, algorithms and tools for evaluating and optimizing the useful anomalous correlation and the anomalous signal in a SAD experiment are described. A simple theoretical framework [Terwilliger et al. (2016 ▸), Acta Cryst. D72, 346–358] is used to develop methods for planning a SAD experiment, scaling SAD data sets and estimating the useful anomalous correlation and anomalous signal in a SAD data set. The phenix.plan_sad_experiment tool uses a database of solved and unsolved SAD data sets and the expected characteristics of a SAD data set to estimate the probability that the anomalous substructure will be found in the SAD experiment and the expected map quality that would be obtained if the substructure were found. The phenix.scale_and_merge tool scales unmerged SAD data from one or more crystals using local scaling and optimizes the anomalous signal by identifying the systematic differences among data sets, and the phenix.anomalous_signal tool estimates the useful anomalous correlation and anomalous signal after collecting SAD data and estimates the probability that the data set can be solved and the likely figure of merit of phasing. PMID:26960123

  5. Association rule mining in the US Vaccine Adverse Event Reporting System (VAERS).

    PubMed

    Wei, Lai; Scott, John

    2015-09-01

    Spontaneous adverse event reporting systems are critical tools for monitoring the safety of licensed medical products. Commonly used signal detection algorithms identify disproportionate product-adverse event pairs and may not be sensitive to more complex potential signals. We sought to develop a computationally tractable multivariate data-mining approach to identify product-multiple adverse event associations. We describe an application of stepwise association rule mining (Step-ARM) to detect potential vaccine-symptom group associations in the US Vaccine Adverse Event Reporting System. Step-ARM identifies strong associations between one vaccine and one or more adverse events. To reduce the number of redundant association rules found by Step-ARM, we also propose a clustering method for the post-processing of association rules. In sample applications to a trivalent intradermal inactivated influenza virus vaccine and to measles, mumps, rubella, and varicella (MMRV) vaccine and in simulation studies, we find that Step-ARM can detect a variety of medically coherent potential vaccine-symptom group signals efficiently. In the MMRV example, Step-ARM appears to outperform univariate methods in detecting a known safety signal. Our approach is sensitive to potentially complex signals, which may be particularly important when monitoring novel medical countermeasure products such as pandemic influenza vaccines. The post-processing clustering algorithm improves the applicability of the approach as a screening method to identify patterns that may merit further investigation. Copyright © 2015 John Wiley & Sons, Ltd.
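    The support/confidence screening at the heart of association rule mining can be sketched minimally as follows (a generic one-antecedent miner over toy reports; the vaccine names, thresholds, and the `mine_rules` helper are illustrative, and this is not the Step-ARM algorithm or its clustering post-processing):

```python
from collections import Counter
from itertools import combinations

def mine_rules(reports, min_support=0.4, min_conf=0.6, max_len=2):
    """Toy vaccine -> symptom-set rule mining by support and confidence."""
    n = len(reports)
    vaccine_counts = Counter(v for v, _ in reports)
    rules = []
    for size in range(1, max_len + 1):
        counts = Counter()
        for vaccine, symptoms in reports:
            for combo in combinations(sorted(symptoms), size):
                counts[(vaccine, combo)] += 1
        for (vaccine, combo), c in counts.items():
            support, conf = c / n, c / vaccine_counts[vaccine]
            if support >= min_support and conf >= min_conf:
                rules.append((vaccine, combo, round(support, 2), round(conf, 2)))
    return rules
```

The multi-symptom rules (size > 1) are what a univariate disproportionality method would miss, which is the motivation given above for a multivariate approach.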

  6. The Role of Dafachronic Acid Signaling in Development and Longevity in Caenorhabditis elegans: Digging Deeper Using Cutting-Edge Analytical Chemistry.

    PubMed

    Aguilaniu, Hugo; Fabrizio, Paola; Witting, Michael

    2016-01-01

    Steroid hormones regulate physiological processes in species ranging from plants to humans. A wide range of steroid hormones exist, and their contributions to processes, such as growth, reproduction, development, and aging, is almost always complex. Understanding the biosynthetic pathways that generate steroid hormones and the signaling pathways that mediate their effects is thus of fundamental importance. In this work, we review recent advances in (i) the biological role of steroid hormones in the roundworm Caenorhabditis elegans and (ii) the development of novel methods to facilitate the detection and identification of these molecules. Our current understanding of steroid signaling in this simple organism serves to illustrate the challenges we face moving forward. First, it seems clear that we have not yet identified all of the enzymes responsible for steroid biosynthesis and/or degradation. Second, perturbation of steroid signaling affects a wide range of phenotypes, and subtly different steroid molecules can have distinct effects. Finally, steroid hormone levels are critically important, and minute variations in quantity can profoundly impact a phenotype. Thus, it is imperative that we develop innovative analytical tools and combine them with cutting-edge approaches including comprehensive and highly selective liquid chromatography coupled to mass spectrometry based on new methods such as supercritical fluid chromatography coupled to mass spectrometry (SFC-MS) if we are to obtain a better understanding of the biological functions of steroid signaling.

  7. Novel Tool for Complete Digitization of Paper Electrocardiography Data

    PubMed Central

    Harless, Chris; Shah, Amit J.; Wick, Carson A.; Mcclellan, James H.

    2013-01-01

    Objective: We present a Matlab-based tool to convert electrocardiography (ECG) information from paper charts into digital ECG signals. The tool can be used for long-term retrospective studies of cardiac patients to study evolving features with prognostic value. Methods and procedures: To perform the conversion, we: 1) detect the graphical grid on ECG charts using grayscale thresholding; 2) digitize the ECG signal based on its contour using a column-wise pixel scan; and 3) use template-based optical character recognition to extract patient demographic information from the paper ECG in order to interface the data with the patient's medical record. To validate the digitization technique: 1) correlations between the digital signals and the signals digitized from paper ECG are computed, and 2) clinically significant ECG parameters are measured and compared from both the paper-based ECG signals and the digitized ECG. Results: The validation demonstrates a correlation value of 0.85–0.9 between the digital ECG signal and the signal digitized from the paper ECG. There is a high correlation in the clinical parameters between the ECG information from the paper charts and the digitized signal, with intra-observer and inter-observer correlations of 0.8–0.9 (p < 0.05), and kappa statistics ranging from 0.85 (inter-observer) to 1.00 (intra-observer). Conclusion: The important features of the ECG signal, especially the QRST complex and the associated intervals, are preserved by obtaining the contour from the paper ECG. The differences between the measures of clinically important features extracted from the original signal and the reconstructed signal are insignificant, thus highlighting the accuracy of this technique.
Clinical impact: By using this type of ECG digitization tool to carry out retrospective studies on large databases that rely on paper ECG records, studies of emerging ECG features can be performed. In addition, this tool can potentially be used to integrate digitized ECG information with digital ECG analysis programs and with the patient's electronic medical record. PMID:26594601
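    The column-wise pixel scan of step 2 can be sketched as follows, assuming the chart has already been binarized and the grid removed (the calibration constants, gap handling, and function name are illustrative, not the tool's actual implementation):

```python
def trace_from_chart(binary, origin_row, mv_per_px):
    """Column-wise pixel scan: average the row indices of trace pixels in each
    column, then convert the pixel offset from the baseline row to millivolts."""
    n_rows, n_cols = len(binary), len(binary[0])
    signal = []
    for c in range(n_cols):
        rows = [r for r in range(n_rows) if binary[r][c]]
        if rows:
            signal.append((origin_row - sum(rows) / len(rows)) * mv_per_px)
        else:
            signal.append(None)            # column lost during grid removal
    for c in range(n_cols):                # carry the last value over small gaps
        if signal[c] is None:
            signal[c] = signal[c - 1] if c else 0.0
    return signal
```

Averaging the trace pixels per column is what preserves the contour (and hence the QRST morphology) that the validation above relies on.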

  8. Adaptive Data Processing Technique for Lidar-Assisted Control to Bridge the Gap between Lidar Systems and Wind Turbines: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlipf, David; Raach, Steffen; Haizmann, Florian

    2015-12-14

    This paper presents first steps toward an adaptive lidar data processing technique crucial for lidar-assisted control in wind turbines. The prediction time and the quality of the wind preview from lidar measurements depend on several factors and are not constant. If the data processing is not continually adjusted, the benefit of lidar-assisted control cannot be fully exploited, and harmful control action can even result. An online analysis of the lidar and turbine data is necessary to continually reassess the prediction time and lidar data quality. In this work, a structured process to develop an analysis tool for the prediction time and a new hardware setup for lidar-assisted control are presented. The tool consists of an online estimation of the rotor effective wind speed from lidar and turbine data and the implementation of an online cross-correlation to determine the time shift between both signals. Further, initial results from an ongoing campaign in which this system was employed to provide lidar preview for feed-forward pitch control are presented.
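    The cross-correlation step can be sketched generically as follows (a lagged cross-correlation that assumes the rotor-effective wind speed lags the lidar preview; the function and parameter names are hypothetical, and the paper's online implementation would update this estimate continually rather than in one batch):

```python
import math

def estimate_delay(lidar, turbine, dt):
    """Return the lag (in seconds) that maximizes the cross-correlation between
    the lidar wind preview and the rotor-effective wind speed from turbine data."""
    n = len(lidar)
    lm, tm = sum(lidar) / n, sum(turbine) / n
    best_lag, best_xc = 0, float("-inf")
    for lag in range(n // 2):              # turbine assumed delayed: lag >= 0
        xc = sum((turbine[i + lag] - tm) * (lidar[i] - lm) for i in range(n - lag))
        if xc > best_xc:
            best_lag, best_xc = lag, xc
    return best_lag * dt
```

The recovered lag is the prediction time: the feed-forward pitch controller should act on lidar samples this far ahead of the rotor response.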

  9. BioNetSim: a Petri net-based modeling tool for simulations of biochemical processes.

    PubMed

    Gao, Junhui; Li, Li; Wu, Xiaolin; Wei, Dong-Qing

    2012-03-01

    BioNetSim, a Petri net-based software package for modeling and simulating biochemical processes, is developed; its design and implementation are presented in this paper, including the logic construction, real-time access to KEGG (Kyoto Encyclopedia of Genes and Genomes), and the BioModel database. Furthermore, glycolysis is simulated as an example of its application. BioNetSim is a helpful tool for researchers to download data, model biological networks, and simulate complicated biochemical processes. Gene regulatory networks, metabolic pathways, signaling pathways, and kinetics of cell interaction are all available in BioNetSim, which makes modeling more efficient and effective. Like other Petri net-based software packages, BioNetSim performs well in graphical presentation and mathematical construction. Moreover, it offers several advantages: (1) it creates models in a database; (2) it provides real-time access to KEGG and BioModel and transfers the data to Petri nets; (3) it provides qualitative analysis, such as computation of constants; and (4) it generates graphs for tracing the concentration of every molecule during the simulation process.
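    The token-passing semantics underlying any such Petri net tool can be sketched in a few lines (a generic firing rule with a hexokinase-like glycolysis step as the example; this is not BioNetSim's internal representation, and the place names are illustrative):

```python
def fire(marking, transition):
    """Fire a transition if enabled: consume input tokens, produce output tokens."""
    inputs, outputs = transition
    if any(marking.get(p, 0) < w for p, w in inputs.items()):
        return None                      # transition not enabled
    new = dict(marking)
    for p, w in inputs.items():
        new[p] -= w
    for p, w in outputs.items():
        new[p] = new.get(p, 0) + w
    return new

# First step of glycolysis as a transition: Glc + ATP -> G6P + ADP
hexokinase = ({"Glc": 1, "ATP": 1}, {"G6P": 1, "ADP": 1})
```

A simulation is then just repeated selection and firing of enabled transitions; quantitative variants attach rates or token counts to the same structure.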

  10. Review of current GPS methodologies for producing accurate time series and their error sources

    NASA Astrophysics Data System (ADS)

    He, Xiaoxing; Montillet, Jean-Philippe; Fernandes, Rui; Bos, Machiel; Yu, Kegen; Hua, Xianghong; Jiang, Weiping

    2017-05-01

    The Global Positioning System (GPS) is an important tool to observe and model geodynamic processes such as plate tectonics and post-glacial rebound. In the last three decades, GPS has seen tremendous advances in the precision of the measurements, which allow researchers to study geophysical signals through a careful analysis of daily time series of GPS receiver coordinates. However, the GPS observations contain errors, and the time series can be described as the sum of a real signal and noise. The signal itself can again be divided into station displacements due to geophysical causes and to disturbing factors. Examples of the latter are errors in the realization and stability of the reference frame and corrections due to ionospheric and tropospheric delays and GPS satellite orbit errors. There is an increasing demand for detecting millimeter to sub-millimeter level ground displacement signals in order to further understand regional scale geodetic phenomena, which requires further improvements in the sensitivity of the GPS solutions. This paper provides a review spanning over 25 years of advances in processing strategies, error mitigation methods and noise modeling for the processing and analysis of GPS daily position time series. The processing of the observations is described step-by-step, mainly with three different strategies, in order to explain the weaknesses and strengths of the existing methodologies. In particular, we focus on the choice of the stochastic model in the GPS time series, which directly affects the estimation of the functional model including, for example, tectonic rates, seasonal signals and co-seismic offsets. Moreover, the geodetic community continues to develop computational methods to fully automate all phases of the analysis of GPS time series.
This idea is greatly motivated by the large number of GPS receivers installed around the world for diverse applications ranging from surveying small deformations of civil engineering structures (e.g., subsidence of the highway bridge) to the detection of particular geophysical signals.
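    The functional model mentioned above (tectonic rate, seasonal signal, co-seismic offset) is typically estimated by least squares; a minimal sketch, assuming white noise and a single known offset epoch (real analyses use a coloured-noise stochastic model, which is exactly the choice the review focuses on):

```python
import numpy as np

def fit_functional_model(t, y, t_offset):
    """Least-squares fit of intercept, rate, annual signal and a co-seismic
    offset to a position time series (t in years, y in mm)."""
    w = 2 * np.pi                         # annual angular frequency
    A = np.column_stack([np.ones_like(t), t,
                         np.sin(w * t), np.cos(w * t),
                         (t >= t_offset).astype(float)])
    params, *_ = np.linalg.lstsq(A, y, rcond=None)
    return params                         # [intercept, rate, sin_amp, cos_amp, offset]
```

Under coloured noise the parameter estimates stay unbiased but their uncertainties can be badly underestimated, which is why the stochastic model matters for the tectonic rate.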

  11. Root gravitropism: an experimental tool to investigate basic cellular and molecular processes underlying mechanosensing and signal transmission in plants

    NASA Technical Reports Server (NTRS)

    Boonsirichai, K.; Guan, C.; Chen, R.; Masson, P. H.

    2002-01-01

    The ability of plant organs to use gravity as a guide for growth, named gravitropism, has been recognized for over two centuries. This growth response to the environment contributes significantly to the upward growth of shoots and the downward growth of roots commonly observed throughout the plant kingdom. Root gravitropism has received a great deal of attention because there is a physical separation between the primary site for gravity sensing, located in the root cap, and the site of differential growth response, located in the elongation zones (EZs). Hence, this system allows identification and characterization of different phases of gravitropism, including gravity perception, signal transduction, signal transmission, and curvature response. Recent studies support some aspects of an old model for gravity sensing, which postulates that root-cap columellar amyloplasts constitute the susceptors for gravity perception. Such studies have also allowed the identification of several molecules that appear to function as second messengers in gravity signal transduction and of potential signal transducers. Auxin has been implicated as a probable component of the signal that carries the gravitropic information between the gravity-sensing cap and the gravity-responding EZs. This has allowed the identification and characterization of important molecular processes underlying auxin transport and response in plants. New molecular models can be elaborated to explain how the gravity signal transduction pathway might regulate the polarity of auxin transport in roots. Further studies are required to test these models, as well as to study the molecular mechanisms underlying a poorly characterized phase of gravitropism that is independent of an auxin gradient.

  12. Measuring the RC time constant with Arduino

    NASA Astrophysics Data System (ADS)

    Pereira, N. S. A.

    2016-11-01

    In this work we use the Arduino UNO R3 open source hardware platform to assemble an experimental apparatus for the measurement of the time constant of an RC circuit. With adequate programming, the Arduino is used as a signal generator, a data acquisition system and a basic signal visualisation tool. Theoretical calculations are compared with direct observations from an analogue oscilloscope. Data processing and curve fitting are performed on a spreadsheet. The results obtained for the six RC test circuits are within the expected interval of values defined by the tolerance of the components. The hardware and software prove to be adequate for the proposed measurements and are therefore adaptable to a laboratory teaching and learning context.
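    For the discharge branch of such an experiment, the time constant can be extracted by linearizing V(t) = V0·exp(−t/τ) and fitting a straight line, which mirrors what the article's spreadsheet fit does (this sketch and its sample values are illustrative, not the article's actual data):

```python
import math

def rc_time_constant(times, volts):
    """Fit ln V = ln V0 - t/tau to a capacitor discharge curve; return tau = RC."""
    y = [math.log(v) for v in volts]
    n = len(times)
    mx, my = sum(times) / n, sum(y) / n
    slope = (sum((t - mx) * (yi - my) for t, yi in zip(times, y))
             / sum((t - mx) ** 2 for t in times))
    return -1.0 / slope
```

For example, samples generated with R = 10 kΩ and C = 100 nF should recover τ = RC = 1 ms.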

  13. OpenMS: a flexible open-source software platform for mass spectrometry data analysis.

    PubMed

    Röst, Hannes L; Sachsenberg, Timo; Aiche, Stephan; Bielow, Chris; Weisser, Hendrik; Aicheler, Fabian; Andreotti, Sandro; Ehrlich, Hans-Christian; Gutenbrunner, Petra; Kenar, Erhan; Liang, Xiao; Nahnsen, Sven; Nilse, Lars; Pfeuffer, Julianus; Rosenberger, George; Rurik, Marc; Schmitt, Uwe; Veit, Johannes; Walzer, Mathias; Wojnar, David; Wolski, Witold E; Schilling, Oliver; Choudhary, Jyoti S; Malmström, Lars; Aebersold, Ruedi; Reinert, Knut; Kohlbacher, Oliver

    2016-08-30

    High-resolution mass spectrometry (MS) has become an important tool in the life sciences, contributing to the diagnosis and understanding of human diseases, elucidating biomolecular structural information and characterizing cellular signaling networks. However, the rapid growth in the volume and complexity of MS data makes transparent, accurate and reproducible analysis difficult. We present OpenMS 2.0 (http://www.openms.de), a robust, open-source, cross-platform software specifically designed for the flexible and reproducible analysis of high-throughput MS data. The extensible OpenMS software implements common mass spectrometric data processing tasks through a well-defined application programming interface in C++ and Python and through standardized open data formats. OpenMS additionally provides a set of 185 tools and ready-made workflows for common mass spectrometric data processing tasks, which enable users to perform complex quantitative mass spectrometric analyses with ease.

  14. Development of flank wear model of cutting tool by using adaptive feedback linear control system on machining AISI D2 steel and AISI 4340 steel

    NASA Astrophysics Data System (ADS)

    Orra, Kashfull; Choudhury, Sounak K.

    2016-12-01

    The purpose of this paper is to build an adaptive feedback linear control system that monitors the variation of the cutting force signal to improve tool life. The paper discusses the use of a transfer function approach to improve the mathematical modelling and adaptively control the process dynamics of the turning operation. The experimental results are in agreement with the simulation model, with an error of less than 3%. The state space model used in this paper verifies the adequacy of the control system through the controllability and observability test matrices, and the system can be transferred from one state to another by an appropriate control input in finite time. The proposed system can be applied to other machining processes under a wide range of cutting conditions to improve the efficiency and observability of the system.
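    The controllability and observability checks mentioned above are the standard Kalman rank tests, sketched here for a generic state-space model (not the authors' specific turning-process model; the example matrices are illustrative):

```python
import numpy as np

def is_controllable(A, B):
    """Kalman rank test: (A, B) is controllable iff [B, AB, ..., A^(n-1)B] has rank n."""
    n = A.shape[0]
    M = np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])
    return np.linalg.matrix_rank(M) == n

def is_observable(A, C):
    """Observability by duality: (A, C) is observable iff (A^T, C^T) is controllable."""
    return is_controllable(A.T, C.T)
```

Full rank of the controllability matrix is what guarantees the property cited in the abstract: any state can be reached from any other by a suitable input in finite time.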

  15. Postnatal Ablation of Synaptic Retinoic Acid Signaling Impairs Cortical Information Processing and Sensory Discrimination in Mice.

    PubMed

    Park, Esther; Tjia, Michelle; Zuo, Yi; Chen, Lu

    2018-06-06

    Retinoic acid (RA) and its receptors (RARs) are well established essential transcriptional regulators during embryonic development. Recent findings in cultured neurons identified an independent and critical post-transcriptional role of RA and RARα in the homeostatic regulation of excitatory and inhibitory synaptic transmission in mature neurons. However, the functional relevance of synaptic RA signaling in vivo has not been established. Here, using somatosensory cortex as a model system and the RARα conditional knock-out mouse as a tool, we applied multiple genetic manipulations to delete RARα postnatally in specific populations of cortical neurons, and asked whether synaptic RA signaling observed in cultured neurons is involved in cortical information processing in vivo. Indeed, conditional ablation of RARα in mice via a CaMKIIα-Cre or a layer 5-Cre driver line or via somatosensory cortex-specific viral expression of Cre-recombinase impaired whisker-dependent texture discrimination, suggesting a critical requirement of RARα expression in L5 pyramidal neurons of somatosensory cortex for normal tactile sensory processing. Transcranial two-photon imaging revealed a significant increase in dendritic spine elimination on apical dendrites of somatosensory cortical layer 5 pyramidal neurons in these mice. Interestingly, the enhancement of spine elimination is whisker experience-dependent, as whisker trimming rescued the spine elimination phenotype. Additionally, experiencing an enriched environment improved texture discrimination in RARα-deficient mice and reduced excessive spine pruning. Thus, RA signaling is essential for normal experience-dependent cortical circuit remodeling and sensory processing. SIGNIFICANCE STATEMENT The importance of synaptic RA signaling has been demonstrated in in vitro studies. However, whether RA signaling mediated by RARα contributes to neural circuit functions in vivo remains largely unknown. 
In this study, using a RARα conditional knock-out mouse, we performed multiple regional/cell-type-specific manipulation of RARα expression in the postnatal brain, and show that RARα signaling contributes to normal whisker-dependent texture discrimination as well as regulating spine dynamics of apical dendrites from layer (L5) pyramidal neurons in S1. Deletion of RARα in excitatory neurons in the forebrain induces elevated spine elimination and impaired sensory discrimination. Our study provides novel insights into the role of RARα signaling in cortical processing and experience-dependent spine maturation. Copyright © 2018 the authors 0270-6474/18/385277-12$15.00/0.

  16. Fpga based L-band pulse doppler radar design and implementation

    NASA Astrophysics Data System (ADS)

    Savci, Kubilay

    As its name implies, RADAR (Radio Detection and Ranging) is an electromagnetic sensor used for detecting and locating targets from their return signals. Radar systems propagate electromagnetic energy from the antenna, which is in part intercepted by an object. Objects reradiate a portion of this energy, which is captured by the radar receiver. The received signal is then processed for information extraction. Radar systems are widely used for surveillance, air security, navigation, and weather hazard detection, as well as remote sensing applications. In this work, an FPGA-based L-band pulse Doppler radar prototype, used for target detection, localization and velocity calculation, has been built, and a general-purpose pulse Doppler radar processor has been developed. This radar is a ground-based stationary monopulse radar, which transmits a short pulse with a certain pulse repetition frequency (PRF). Return signals from the target are processed and information about their location and velocity is extracted. Discrete components are used for the transmitter and receiver chains. The hardware solution is based on the Xilinx Virtex-6 ML605 FPGA board, which is responsible for the control of the radar system and the digital signal processing of the received signal, including Constant False Alarm Rate (CFAR) detection and pulse Doppler processing. The algorithm is implemented in MATLAB/SIMULINK using the Xilinx System Generator for DSP tool. The field-programmable gate array (FPGA) implementation of the radar system provides the flexibility of changing parameters such as the PRF and pulse length; therefore, it can be used with different radar configurations as well. A VHDL design has been developed for a 1 Gbit Ethernet connection to transfer the digitized return signal and detection results to a PC. An A-Scope application has been developed in the C# programming language to display time-domain radar signals and detection results on the PC. Data are processed both in the FPGA chip and on the PC. 
The FPGA uses fixed-point arithmetic because it is fast and reduces resource requirements, consuming less hardware than floating-point arithmetic. The software uses floating-point arithmetic, which ensures precision in processing at the expense of speed. The functionality of the radar system has been tested for experimental validation in the field with a moving car, and the submodules were validated with synthetic data simulated in MATLAB.
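    The CFAR stage mentioned above can be illustrated with a minimal cell-averaging CFAR sketch in floating point (the window sizes and threshold factor are illustrative; the actual design runs in fixed point on the FPGA):

```python
def ca_cfar(power, guard=2, train=8, scale=4.0):
    """Cell-averaging CFAR: a range cell is a detection if its power exceeds
    `scale` times the mean power of the surrounding training cells
    (guard cells next to the cell under test are excluded)."""
    n = len(power)
    margin = guard + train
    hits = [False] * n
    for i in range(margin, n - margin):
        ref = power[i - margin:i - guard] + power[i + guard + 1:i + margin + 1]
        noise = sum(ref) / len(ref)
        hits[i] = power[i] > scale * noise
    return hits
```

Because the threshold adapts to the locally estimated noise floor, the false-alarm rate stays roughly constant even as clutter levels vary across range, which is the property CFAR is named for.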

  17. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig

    PubMed Central

    Sahoo, Satya S.; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A.; Lhatoo, Samden D.

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to rapid increase in the volume, rate of data generation, and variety of neuroscience data. This “neuroscience Big data” represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volume of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability—the ability to efficiently process increasing volumes of data; (b) Adaptability—the toolkit can be deployed across different computing configurations; and (c) Ease of programming—the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 Data nodes. 
The evaluation results demonstrate that the toolkit is highly scalable and adaptable, which makes it suitable for use in neuroscience applications as a scalable data processing toolkit. As part of the ongoing extension of NeuroPigPen, we are developing new modules to support statistical functions to analyze signal data for brain connectivity research. In addition, the toolkit is being extended to allow integration with scientific workflow systems. NeuroPigPen is released under BSD license at: https://sites.google.com/a/case.edu/neuropigpen/. PMID:27375472
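    The partition-friendly storage idea behind CSF can be illustrated generically (this is not the actual Cloudwave Signal Format, just a sketch of fixed-duration epoch segmentation, which is what makes signal data easy to distribute across data nodes):

```python
def partition_signal(samples, sampling_rate, epoch_seconds):
    """Split one channel into fixed-duration epochs so that segments can be
    stored and processed independently on different data nodes."""
    size = int(sampling_rate * epoch_seconds)
    return [samples[i:i + size] for i in range(0, len(samples), size)]
```

Each epoch carries no dependency on its neighbours, so a data-flow job can map over epochs in parallel and reduce the per-epoch results afterwards.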

  18. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig.

    PubMed

    Sahoo, Satya S; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A; Lhatoo, Samden D

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to rapid increase in the volume, rate of data generation, and variety of neuroscience data. This "neuroscience Big data" represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volume of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability-the ability to efficiently process increasing volumes of data; (b) Adaptability-the toolkit can be deployed across different computing configurations; and (c) Ease of programming-the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 Data nodes. 
The evaluation results demonstrate that the toolkit is highly scalable and adaptable, which makes it suitable for use in neuroscience applications as a scalable data processing toolkit. As part of the ongoing extension of NeuroPigPen, we are developing new modules to support statistical functions to analyze signal data for brain connectivity research. In addition, the toolkit is being extended to allow integration with scientific workflow systems. NeuroPigPen is released under BSD license at: https://sites.google.com/a/case.edu/neuropigpen/.
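
    The chunked-storage idea behind a partitionable signal format can be illustrated with a minimal, self-contained sketch. This is not the actual CSF schema or NeuroPigPen code; the field names and chunk layout below are hypothetical:

```python
import json

def partition_signal(samples, rate_hz, chunk_seconds, channel):
    """Split a long recording into fixed-duration, self-describing chunks
    so they can be spread across data nodes and processed independently."""
    size = int(rate_hz * chunk_seconds)
    chunks = []
    for i in range(0, len(samples), size):
        chunks.append(json.dumps({
            "channel": channel,          # hypothetical metadata fields
            "rate_hz": rate_hz,
            "start_s": i / rate_hz,
            "samples": samples[i:i + size],
        }))
    return chunks

# toy 4-second, 250 Hz EEG-like trace -> four 1-second chunks
sig = [0.001 * n for n in range(1000)]
chunks = partition_signal(sig, 250, 1.0, "EEG-C3")
```

    Because each chunk carries its own metadata, chunks can be processed in parallel with no shared state, which is what makes an embarrassingly parallel Pig/Hadoop pipeline over signal data straightforward.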

  19. Natural Resources for Optogenetic Tools.

    PubMed

    Mathes, Tilo

    2016-01-01

    Photoreceptors are found in all kingdoms of life and mediate crucial responses to environmental challenges. Nature has evolved various types of photoresponsive protein structures with different chromophores and signaling concepts for their given purpose. The abundance of these signaling proteins revealed by (meta-)genomic screens has significantly enriched the palette of optogenetic tools. In addition, molecular insights into signal transduction mechanisms and design principles from biophysical studies, and from structural and mechanistic comparison of homologous proteins, have opened seemingly unlimited possibilities for customizing the naturally occurring proteins for a given optogenetic task. Here, a brief overview is given of the photoreceptor concepts already established as optogenetic tools in natural or engineered form, their photochemistry, and their signaling/design principles. Finally, photosensitive modules and protein architectures that have so far received little attention but hold potential for optogenetic application are described.

  20. WebBioBank: a new platform for integrating clinical forms and shared neurosignal analyses to support multi-centre studies in Parkinson's Disease.

    PubMed

    Rossi, Elena; Rosa, Manuela; Rossi, Lorenzo; Priori, Alberto; Marceglia, Sara

    2014-12-01

    The web-based systems available for multi-centre clinical trials do not combine clinical data collection (Electronic Health Records, EHRs) with signal processing storage and analysis tools. However, in pathophysiological research, the correlation between clinical data and signals is crucial for uncovering the underlying neurophysiological mechanisms. A specific example is the investigation of the mechanisms of action of Deep Brain Stimulation (DBS) used for Parkinson's Disease (PD); the neurosignals recorded from the DBS target structure must be investigated together with clinical data. The aim of this study is the development and testing of a new system dedicated to a multi-centre study of Parkinson's Disease that integrates biosignal analysis tools and data collection in a shared and secure environment. We designed a web-based platform (WebBioBank) for managing the clinical data and biosignals of PD patients treated with DBS in different clinical research centres. Homogeneous data collection was ensured across the different centres (Operative Units, OUs). The anonymity of the data was preserved using unique identifiers associated with patients (ID BACs). The patients' personal details and their equivalent ID BACs were archived inside the corresponding OU and were not uploaded to the web-based platform; data sharing occurred using the ID BACs. The system allowed researchers to upload different signal processing functions (as .dll files) onto the web-based platform and to combine them to define dedicated algorithms. Four clinical research centres used WebBioBank for 1 year. The clinical data of 58 patients treated using DBS were managed, and 186 biosignals were uploaded and classified into 4 categories based on the treatment (pharmacological and/or electrical). The users' mean satisfaction score exceeded the satisfaction threshold. 
WebBioBank enabled anonymous data sharing for a clinical study conducted at multiple centres and demonstrated the capabilities of the signal processing chain configuration as well as its effectiveness and efficiency for integrating the neurophysiological results with clinical data in multi-centre studies, which will allow the future collection of homogeneous data in large cohorts of patients. Copyright © 2014 Elsevier Inc. All rights reserved.
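
    The pseudonymization scheme described above (personal details archived inside the OU, only ID BACs shared) can be sketched as follows; the class and identifier format are hypothetical, not WebBioBank's implementation:

```python
import uuid

class LocalRegistry:
    """Kept inside an Operative Unit: maps a patient's personal details to a
    stable pseudonymous identifier (ID BAC). Only the identifier and the
    associated clinical/signal data are shared; this mapping never leaves
    the unit."""

    def __init__(self):
        self._ids = {}

    def id_bac(self, patient_key):
        # issue a new random identifier on first sight, reuse it afterwards
        if patient_key not in self._ids:
            self._ids[patient_key] = uuid.uuid4().hex
        return self._ids[patient_key]

registry = LocalRegistry()
a = registry.id_bac("Rossi, Mario, 1950-01-01")
b = registry.id_bac("Bianchi, Anna, 1948-05-12")
```

    Re-querying the same patient returns the same identifier, so longitudinal records uploaded at different visits stay linked without exposing identity.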

  1. Prony Ringdown GUI (CERTS Prony Ringdown, part of the DSI Tool Box)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuffner, Francis; Marinovici, PNNL Laurentiu; Hauer, PNNL John

    2014-02-21

    The PNNL Prony Ringdown graphical user interface is one analysis tool included in the Dynamic System Identification toolbox (DSI Toolbox). The DSI Toolbox is a MATLAB-based collection of tools for parsing and analyzing phasor measurement unit (PMU) data, especially with regard to small signal stability. It includes tools to read the data, preprocess it, and perform small signal analysis. The toolbox is designed to provide a research environment for examining PMU data and performing small-signal-stability analysis. The software uses a series of text-driven menus to help guide users and organize the toolbox features. Methods for reading in and populating PMU data are provided, with appropriate preprocessing options for small-signal-stability analysis. The toolbox includes the Prony Ringdown GUI and basic algorithms to estimate information on oscillatory modes of the system, such as modal frequency and damping ratio.
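
    Prony analysis fits sums of damped sinusoids to the ringdown; for a single dominant mode, the two quantities named above (modal frequency and damping ratio) can be illustrated with the simpler log-decrement method. A sketch on synthetic data, not DSI Toolbox code:

```python
import math

def ringdown_mode(samples, dt):
    """Estimate modal frequency (Hz) and damping ratio of a single-mode
    ringdown from its successive positive peaks (log-decrement method)."""
    peaks = [(i, samples[i]) for i in range(1, len(samples) - 1)
             if samples[i - 1] < samples[i] > samples[i + 1] and samples[i] > 0]
    (i0, y0), (i1, y1) = peaks[0], peaks[1]
    freq = 1.0 / ((i1 - i0) * dt)                  # damped modal frequency
    delta = math.log(y0 / y1)                      # logarithmic decrement
    zeta = delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)
    return freq, zeta

# synthetic 0.5 Hz inter-area-style mode with 5% damping, sampled at 50 Hz
dt, f0, z0 = 0.02, 0.5, 0.05
wn = 2 * math.pi * f0 / math.sqrt(1 - z0 ** 2)     # undamped natural frequency
y = [math.exp(-z0 * wn * n * dt) * math.cos(2 * math.pi * f0 * n * dt)
     for n in range(400)]
f_est, z_est = ringdown_mode(y, dt)
```

    A full Prony fit generalizes this to multiple simultaneous modes by solving a linear-prediction problem and extracting the roots of its characteristic polynomial.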

  2. Quantitative Förster resonance energy transfer analysis for kinetic determinations of SUMO-specific protease.

    PubMed

    Liu, Yan; Song, Yang; Madahar, Vipul; Liao, Jiayu

    2012-03-01

    Förster resonance energy transfer (FRET) technology has been widely used in biological and biomedical research, and it is a very powerful tool for elucidating protein interactions in either dynamic or steady states. SUMOylation (the process of SUMO [small ubiquitin-like modifier] conjugation to substrates) is an important posttranslational protein modification with critical roles in multiple biological processes. Conjugating SUMO to substrates requires an enzymatic cascade. Sentrin/SUMO-specific proteases (SENPs) act as endopeptidases to process pre-SUMO or as isopeptidases to deconjugate SUMO from its substrate. To fully understand the roles of SENPs in the SUMOylation cycle, it is critical to understand their kinetics. Here, we report the development of a novel quantitative FRET-based protease assay for determining the kinetic parameters of SENP1. The assay is based on quantitative analysis of the FRET signal within the total fluorescent signal at the acceptor emission wavelength, which consists of three components during the digestion process: donor (CyPet-SUMO1) emission, acceptor (YPet) emission, and the FRET signal. Subsequently, we developed novel theoretical and experimental procedures to determine the kinetic parameters k(cat) and K(M), and the catalytic efficiency (k(cat)/K(M)), of the SENP1 catalytic domain toward pre-SUMO1. Importantly, the general principles of this quantitative FRET-based protease kinetic determination can be applied to other proteases. Copyright © 2011 Elsevier Inc. All rights reserved.
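
    The kinetic parameters named above follow the standard Michaelis-Menten treatment. As a sketch of that final step only (not of the FRET quantification itself), the classic Lineweaver-Burk linearization recovers K(M), k(cat), and k(cat)/K(M) from hypothetical initial-rate data:

```python
def michaelis_menten_fit(S, v, E0):
    """Estimate KM, kcat and kcat/KM from substrate concentrations S and
    initial rates v via Lineweaver-Burk: 1/v = (KM/Vmax)*(1/S) + 1/Vmax."""
    xs = [1.0 / s for s in S]
    ys = [1.0 / r for r in v]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    vmax = 1.0 / (my - slope * mx)                 # intercept = 1/Vmax
    km = slope * vmax                              # slope = KM/Vmax
    kcat = vmax / E0                               # turnover number
    return km, kcat, kcat / km

# hypothetical data generated from KM = 2 uM, Vmax = 10 uM/s, E0 = 0.1 uM
S = [0.5, 1.0, 2.0, 4.0, 8.0]                      # uM
v = [10.0 * s / (2.0 + s) for s in S]              # uM/s
km, kcat, eff = michaelis_menten_fit(S, v, 0.1)
```

    In practice a nonlinear fit of v = Vmax*S/(KM+S) is preferred over the linearization, since taking reciprocals amplifies measurement noise at low substrate concentrations.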

  3. An integrated tool for the diagnosis of voice disorders.

    PubMed

    Godino-Llorente, Juan I; Sáenz-Lechón, Nicolás; Osma-Ruiz, Víctor; Aguilera-Navarro, Santiago; Gómez-Vilda, Pedro

    2006-04-01

    A PC-based integrated aid tool has been developed for the analysis and screening of pathological voices. With it, the user can simultaneously record speech, electroglottographic (EGG) and videoendoscopic signals, and synchronously edit them to select the most significant segments. These multimedia data are stored in a relational database, together with the patient's personal information, anamnesis, diagnosis, visits, explorations and any other comments the specialist may wish to include. The speech and EGG waveforms are analysed by means of temporal representations, and quantitative measurements such as spectrograms, frequency and amplitude perturbation measures, harmonic energy and noise are calculated using digital signal processing techniques, giving an idea of the degree of hoarseness and the quality of the voice register. Within this framework, the system uses a standard protocol to evaluate and build complete databases of voice disorders. The target users of this system are speech and language therapists and ear, nose and throat (ENT) clinicians. The application can be easily configured to cover the needs of both groups of professionals. The software has a user-friendly Windows-style interface. The PC should be equipped with standard sound and video capture cards. Signals are captured using common transducers: a microphone, an electroglottograph and a fiberscope or telelaryngoscope. The clinical usefulness of the system is addressed in a comprehensive evaluation section.
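
    The frequency and amplitude perturbation measurements mentioned above are conventionally reported as jitter and shimmer. A sketch of the standard "local" definitions, computed on hypothetical cycle-by-cycle data:

```python
def jitter_local(periods):
    """Local jitter (%): mean absolute difference between consecutive
    pitch periods, relative to the mean period."""
    diffs = [abs(a - b) for a, b in zip(periods, periods[1:])]
    return 100.0 * (sum(diffs) / len(diffs)) / (sum(periods) / len(periods))

def shimmer_local(amplitudes):
    """Local shimmer (%): the same measure applied to peak amplitudes."""
    diffs = [abs(a - b) for a, b in zip(amplitudes, amplitudes[1:])]
    return 100.0 * (sum(diffs) / len(diffs)) / (sum(amplitudes) / len(amplitudes))

periods = [7.5, 7.6, 7.4, 7.7, 7.5, 7.6]        # ms, hypothetical cycles
amps = [1.00, 0.97, 1.02, 0.98, 1.01, 0.99]     # arbitrary units
jit = jitter_local(periods)
shim = shimmer_local(amps)
```

    Elevated jitter and shimmer values relative to normative thresholds are among the cues such screening tools use to flag hoarse or pathological voices.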

  4. Optical feedback-induced light modulation for fiber-based laser ablation.

    PubMed

    Kang, Hyun Wook

    2014-11-01

    Optical fibers have been used as a minimally invasive tool in various medical fields. However, due to excessive heat accumulation, the distal end of a fiber often suffers severe melting or devitrification, leading to eventual fiber failure during laser treatment. In order to minimize thermal damage at the fiber tip, an optical feedback sensor was developed and tested ex vivo. Porcine kidney tissue was used to evaluate the feasibility of optical feedback in terms of signal activation, ablation performance, and light transmission. Testing various signal thresholds demonstrated that 3 V was appropriate to trigger the feedback sensor and to prevent fiber deterioration during kidney tissue ablation. Based upon the development of temporal signal signatures, full contact mode rapidly activated the optical feedback sensor, possibly due to heat accumulation. Modulated light delivery induced by optical feedback diminished ablation efficiency by 30% in comparison with the no-feedback case. However, long-term transmission results validated that laser ablation assisted by optical feedback was able to sustain light delivery to the tissue, as well as ablation efficiency, almost consistently. Therefore, an optical feedback sensor can be a feasible tool to protect optical fiber tips, by minimizing debris contamination and delaying the thermal damage process, and to ensure more efficient and safer laser-induced tissue ablation.

  5. Nucleic acid tool enzymes-aided signal amplification strategy for biochemical analysis: status and challenges.

    PubMed

    Qing, Taiping; He, Dinggeng; He, Xiaoxiao; Wang, Kemin; Xu, Fengzhou; Wen, Li; Shangguan, Jingfang; Mao, Zhengui; Lei, Yanli

    2016-04-01

    Owing to their highly efficient catalytic effects and substrate specificity, nucleic acid tool enzymes are applied as 'nano-tools' for manipulating different nucleic acid substrates, both in the test tube and in living organisms. In addition to functioning as molecular scissors and molecular glue in genetic engineering, the application of nucleic acid tool enzymes in biochemical analysis has also been extensively developed in the past few decades. Used as amplifying labels for biorecognition events, nucleic acid tool enzymes are mainly applied in nucleic acid amplification sensing, as well as amplification sensing of biorelated variations of nucleic acids. With the introduction of aptamers, which can bind different target molecules, nucleic acid tool enzymes-aided signal amplification strategies can also be used to sense non-nucleic-acid targets (e.g., ions, small molecules, proteins, and cells). This review describes and discusses the amplification strategies of nucleic acid tool enzymes-aided biosensors for biochemical analysis applications. Various analytes, including nucleic acids, ions, small molecules, proteins, and cells, are reviewed briefly. This work also addresses future trends and outlooks for signal amplification in nucleic acid tool enzymes-aided biosensors.

  6. Fringe pattern information retrieval using wavelets

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.; Patimo, Caterina; Manicone, Pasquale D.; Lamberti, Luciano

    2005-08-01

    Two-dimensional phase modulation is currently the basic model used in the interpretation of fringe patterns that contain displacement information: moire, holographic interferometry, and speckle techniques. Another way to look at these two-dimensional signals is to consider them as frequency-modulated signals. This alternative interpretation has practical implications similar to those that exist in radio engineering for handling frequency-modulated signals. Utilizing this model, it is possible to obtain frequency information by using the energy approach introduced by Ville in 1944. A natural complementary tool in this process is the wavelet methodology. The use of wavelets makes it possible to obtain the local values of the frequency in a one- or two-dimensional domain without the need for previous phase retrieval and differentiation. Furthermore, from the properties of wavelets it is also possible to obtain at the same time the phase of the signal, with the advantages of better noise removal capabilities and the possibility of developing simpler algorithms for phase unwrapping due to the availability of the derivative of the phase.
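
    The idea of reading local frequency directly from wavelet response magnitude, without prior phase retrieval, can be sketched with a Morlet wavelet scanned over candidate frequencies. All parameters below (the chirp, the center frequency omega0, the frequency grid) are illustrative:

```python
import cmath, math

def local_frequency(signal, dt, center, freqs, omega0=6.0):
    """Return the candidate frequency whose Morlet wavelet gives the largest
    response magnitude at sample index `center`."""
    best_mag, best_f = -1.0, None
    for f in freqs:
        s = omega0 / (2 * math.pi * f)          # scale matching frequency f
        half = int(4 * s / dt)                  # truncate Gaussian at 4 scales
        acc = 0.0 + 0.0j
        for k in range(-half, half + 1):
            i = center + k
            if 0 <= i < len(signal):
                t = k * dt
                acc += (signal[i] * cmath.exp(-1j * omega0 * t / s)
                        * math.exp(-t * t / (2 * s * s)) * dt)
        mag = abs(acc) / math.sqrt(s)           # L2-normalized response
        if mag > best_mag:
            best_mag, best_f = mag, f
    return best_f

# linear chirp: instantaneous frequency 2 + 3*t Hz, so 5 Hz at t = 1 s
dt = 0.002
sig = [math.cos(2 * math.pi * (2 * n * dt + 1.5 * (n * dt) ** 2))
       for n in range(1000)]
f_loc = local_frequency(sig, dt, 500, [3, 4, 5, 6, 7])
```

    The complex wavelet coefficient also carries the local phase (its argument), which is what enables simultaneous phase extraction with good noise rejection.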

  7. The Influence of Gravito-Inertial Force on Sensorimotor Integration and Reflexive Responses

    NASA Technical Reports Server (NTRS)

    Curthoys, Ian S.; Guedry, Fred E.; Merfeld, Daniel M.; Watt, Doug G. D.; Tomko, David L.; Wade, Charles E. (Technical Monitor)

    1994-01-01

    Sensorimotor responses (e.g., eye movements, spinal reflexes, etc.) depend upon the interpretation of the neural signals from the sensory systems. Since neural signals from the otoliths may represent either tilt (gravity) or translation (linear inertial force), sensory signals from the otolith organs are necessarily somewhat ambiguous. Therefore, the neural responses to changing otolith signals depend upon the context of the stimulation (e.g., active vs. passive, relative orientation of gravity, etc.) as well as upon other sensory signals (e.g., vision, canals, etc.). This session will focus upon the role played by the sensory signals from the otolith organs in producing efficient sensorimotor and behavioral responses. Curthoys will show the influence of the peripheral anatomy and physiology. Tomko will discuss the influence of tilt and translational otolith signals on eye movements. Merfeld will demonstrate the role the otolith organs play during the interaction of sensory signals from the canals and otoliths. Watt will show the influence of the otoliths on spinal/postural responses. Guedry will discuss the contribution of vestibular information to "path of movement" perception and to the development of a stable vertical reference. Sensorimotor responses to ambiguous inertial force stimulation provide an important tool to investigate how the nervous system processes patterns of sensory information and yields functional sensorimotor responses.

  8. A novel approach for the elimination of artefacts from EEG signals employing an improved Artificial Immune System algorithm

    NASA Astrophysics Data System (ADS)

    Suja Priyadharsini, S.; Edward Rajan, S.; Femilin Sheniha, S.

    2016-03-01

    Electroencephalogram (EEG) is the recording of the electrical activity of the brain. It is contaminated by other biological signals, such as the cardiac signal (electrocardiogram), signals generated by eye movements/eye blinks (electrooculogram) and muscular activity (electromyogram), called artefacts. Optimisation is an important tool for solving many real-world problems. In the proposed work, artefact removal based on the adaptive neuro-fuzzy inference system (ANFIS) is employed, with the parameters of ANFIS optimised. The Artificial Immune System (AIS) algorithm is used to optimise the parameters of ANFIS (ANFIS-AIS). Implementation results show that ANFIS-AIS is more effective in removing artefacts from the EEG signal than ANFIS alone. Furthermore, in the proposed work, an improved AIS (IAIS) is developed by including suitable selection processes in the AIS algorithm. The performance of the proposed IAIS method is compared with AIS and with a genetic algorithm (GA). Measures such as the signal-to-noise ratio, mean square error (MSE), correlation coefficient, power spectral density plot and convergence time are used for analysing the performance of the proposed method. From the results, it is found that the IAIS algorithm converges faster than AIS and performs better than AIS and GA. Hence, IAIS-tuned ANFIS (ANFIS-IAIS) is effective in removing artefacts from EEG signals.
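
    The performance measures listed above have standard definitions; a minimal sketch on a synthetic clean/denoised pair (the signals are illustrative, not EEG data):

```python
import math

def mse(ref, est):
    """Mean square error between reference and processed signal."""
    return sum((a - b) ** 2 for a, b in zip(ref, est)) / len(ref)

def snr_db(ref, est):
    """Output SNR (dB): reference power over residual-error power."""
    p_sig = sum(a * a for a in ref)
    p_err = sum((a - b) ** 2 for a, b in zip(ref, est))
    return 10.0 * math.log10(p_sig / p_err)

def corrcoef(x, y):
    """Pearson correlation coefficient between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

clean = [math.sin(0.1 * n) for n in range(500)]         # reference signal
denoised = [c + 0.01 * math.sin(7.0 * n) for n, c in enumerate(clean)]
err = mse(clean, denoised)
snr = snr_db(clean, denoised)
rho = corrcoef(clean, denoised)
```

    On real data the artefact-free reference is unavailable, which is why such studies typically evaluate the metrics on semi-simulated recordings where clean EEG and artefacts are mixed deliberately.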

  9. Distributed digital signal processors for multi-body flexible structures

    NASA Technical Reports Server (NTRS)

    Lee, Gordon K. F.

    1992-01-01

    Multi-body flexible structures, such as those currently under investigation in spacecraft design, are large scale (high-order) dimensional systems. Controlling and filtering such structures is a computationally complex problem. This is particularly important when many sensors and actuators are located along the structure and need to be processed in real time. This report summarizes research activity focused on solving the signal processing (that is, information processing) issues of multi-body structures. A distributed architecture is developed in which single loop processors are employed for local filtering and control. By implementing such a philosophy with an embedded controller configuration, a supervising controller may be used to process global data and make global decisions as the local devices are processing local information. A hardware testbed, a position controller system for a servo motor, is employed to illustrate the capabilities of the embedded controller structure. Several filtering and control structures which can be modeled as rational functions can be implemented on the system developed in this research effort. Thus the results of the study provide a support tool for many Control/Structure Interaction (CSI) NASA testbeds such as the Evolutionary model and the nine-bay truss structure.

  10. Intersymbol Interference Investigations Using a 3D Time-Dependent Traveling Wave Tube Model

    NASA Technical Reports Server (NTRS)

    Kory, Carol L.; Andro, Monty; Downey, Alan (Technical Monitor)

    2001-01-01

    For the first time, a physics based computational model has been used to provide a direct description of the effects of the TWT (Traveling Wave Tube) on modulated digital signals. The TWT model comprehensively takes into account the effects of frequency dependent AM/AM and AM/PM conversion; gain and phase ripple; drive-induced oscillations; harmonic generation; intermodulation products; and backward waves. Thus, signal integrity can be investigated in the presence of these sources of potential distortion as a function of the physical geometry of the high power amplifier and the operational digital signal. This method promises superior predictive fidelity compared to methods using TWT models based on swept amplitude and/or swept frequency data. The fully three-dimensional (3D), time-dependent, TWT interaction model using the electromagnetic code MAFIA is presented. This model is used to investigate assumptions made in TWT black box models used in communication system level simulations. In addition, digital signal performance, including intersymbol interference (ISI), is compared using direct data input into the MAFIA model and using the system level analysis tool, SPW (Signal Processing Worksystem).
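
    A common scalar summary of ISI, far simpler than the physics-based MAFIA/SPW analysis described above, is the peak-distortion criterion on the symbol-spaced channel response; the response taps below are hypothetical:

```python
def isi_peak_distortion(h, main):
    """Worst-case (peak) ISI: summed magnitudes of all non-main taps of the
    symbol-spaced response, relative to the main tap."""
    return sum(abs(c) for i, c in enumerate(h) if i != main) / abs(h[main])

def eye_opening(h, main):
    """Worst-case binary eye opening: 1 = ISI-free, <= 0 = eye closed."""
    return 1.0 - isi_peak_distortion(h, main)

# hypothetical symbol-spaced impulse response of an amplifier/filter chain
h = [0.05, 0.10, 1.00, 0.20, 0.05]
isi = isi_peak_distortion(h, 2)
eye = eye_opening(h, 2)
```

    A positive eye opening means the worst-case superposition of neighboring symbols still leaves the binary decision threshold uncrossed; eye-diagram simulations measure the same quantity statistically.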

  11. Cross-correlation between EMG and center of gravity during quiet stance: theory and simulations.

    PubMed

    Kohn, André Fabio

    2005-11-01

    Several signal processing tools have been employed in the experimental study of the postural control system in humans. Among them, the cross-correlation function has been used to analyze the time relationship between signals such as the electromyogram and the horizontal projection of the center of gravity. The common finding is that the electromyogram precedes the biomechanical signal, a result that has been interpreted in different ways, for example, the existence of feedforward control or the preponderance of a velocity feedback. It is shown here, analytically and by simulation, that the cross-correlation function is dependent in a complicated way on system parameters and on noise spectra. Results similar to those found experimentally, e.g., electromyogram preceding the biomechanical signal may be obtained in a postural control model without any feedforward control and without any velocity feedback. Therefore, correct interpretations of experimentally obtained cross-correlation functions may require additional information about the system. The results extend to other biomedical applications where two signals from a closed loop system are cross-correlated.
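
    The timing relationship discussed above comes from locating the peak of the cross-correlation function; a minimal sketch on synthetic signals (sign convention: a positive lag means the first signal leads):

```python
import math

def xcorr_peak_lag(x, y, max_lag):
    """Lag (in samples) maximizing the cross-correlation of x and y;
    a positive lag means x leads y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    xc = [a - mx for a in x]
    yc = [b - my for b in y]
    def r(lag):
        return sum(xc[i] * yc[i + lag] for i in range(n) if 0 <= i + lag < n)
    return max(range(-max_lag, max_lag + 1), key=r)

# toy stand-in signals: "EMG" leads the "center of gravity" by 12 samples
emg = [math.sin(0.05 * n) for n in range(400)]
cog = [math.sin(0.05 * (n - 12)) for n in range(400)]
lag = xcorr_peak_lag(emg, cog, 30)
```

    The paper's central caution applies here: recovering the correct lag from two signals says nothing by itself about the causal structure of the closed loop that produced them.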

  12. Design of a portable noninvasive photoacoustic glucose monitoring system integrated laser diode excitation with annular array detection

    NASA Astrophysics Data System (ADS)

    Zeng, Lvming; Liu, Guodong; Yang, Diwu; Ren, Zhong; Huang, Zhen

    2008-12-01

    A near-infrared photoacoustic glucose monitoring system, which integrates dual-wavelength pulsed laser diode excitation with an eight-element planar annular array detection technique, is designed and fabricated in this study. It is noninvasive, inexpensive, and portable, with accurate localization and a high signal-to-noise ratio. In the system, the exciting source is based on two laser diodes with wavelengths of 905 nm and 1550 nm and optical pulse energies of 20 μJ and 6 μJ, respectively. The laser beam is optically focused and jointly projected to a confocal point with a diameter of approximately 0.7 mm. A 7.5 MHz eight-element annular array transducer with a hollow structure is machined to capture the photoacoustic signal in backward mode. The captured signals excited from blood glucose are processed with a synthetic focusing algorithm to obtain a high signal-to-noise ratio and accurate localization over a range of axial detection depths. The custom-made transducer with equal-area elements is coaxially collimated with the laser source to improve the photoacoustic excitation/reception efficiency. In the paper, we introduce the photoacoustic theory, the reception/processing technique, and the design method of the portable noninvasive photoacoustic glucose monitoring system, which can potentially be developed into a powerful diagnosis and treatment tool for diabetes mellitus.
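
    Synthetic focusing of an annular array is commonly implemented as delay-and-sum: each element's trace is shifted by its extra path length to the focus before coherent summation. A geometric sketch with hypothetical element radii and impulse-like echoes (not the authors' algorithm):

```python
import math

C = 1.5  # assumed speed of sound in soft tissue, mm/us

def delay_and_sum(traces, dt, radii, depth):
    """Shift each annular element's trace by its extra path length to an
    on-axis focus at `depth`, then average the aligned traces coherently."""
    delays = [int(round((math.sqrt(depth ** 2 + r ** 2) - depth) / C / dt))
              for r in radii]
    n = len(traces[0])
    out = []
    for i in range(n):
        acc = 0.0
        for trace, d in zip(traces, delays):
            if i + d < n:
                acc += trace[i + d]
        out.append(acc / len(traces))
    return out

# hypothetical geometry: 8 rings, echo from a 20 mm deep on-axis absorber
dt, depth, i0 = 0.01, 20.0, 100                 # us/sample, mm, pulse index
radii = [0.5 * k for k in range(1, 9)]          # element radii, mm
traces = []
for r in radii:
    d = int(round((math.sqrt(depth ** 2 + r ** 2) - depth) / C / dt))
    trace = [0.0] * 600
    trace[i0 + d] = 1.0                         # outer rings see the echo later
    traces.append(trace)
focused = delay_and_sum(traces, dt, radii, depth)
```

    Signals from the focal depth add in phase while off-focus contributions stay misaligned, which is how the summation raises SNR and sharpens axial localization.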

  13. Monitoring fetal heart rate during pregnancy: contributions from advanced signal processing and wearable technology.

    PubMed

    Signorini, Maria G; Fanelli, Andrea; Magenes, Giovanni

    2014-01-01

    Monitoring procedures are the basis for evaluating the clinical state of patients and for assessing changes in their conditions, thus providing necessary interventions in time. Both of these objectives can be achieved by integrating technological development with methodological tools, allowing accurate classification and extraction of useful diagnostic information. The paper is focused on monitoring procedures applied to fetal heart rate variability (FHRV) signals, collected during pregnancy, in order to assess fetal well-being. The use of linear time- and frequency-domain techniques as well as the computation of nonlinear indices can contribute to enhancing the diagnostic power and reliability of fetal monitoring. The paper shows how advanced signal processing approaches can contribute to developing new diagnostic and classification indices. Their usefulness is evaluated by comparing two selected populations: normal fetuses and intrauterine growth-restricted (IUGR) fetuses. Results show that the computation of different indices on FHRV signals, both linear and nonlinear, gives helpful indications for describing the pathophysiological mechanisms involved in the cardiovascular and neural systems controlling the fetal heart. As a further contribution, the paper briefly describes how the introduction of wearable systems for fetal ECG recording could provide new technological solutions improving the quality and usability of prenatal monitoring.
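
    As an illustration of linear time-domain variability indices of the kind discussed above, two standard heart rate variability measures can be computed from beat-to-beat intervals (the interval values are hypothetical, not from the study's populations):

```python
import math

def sdnn(intervals):
    """Standard deviation of all beat-to-beat intervals: overall variability."""
    m = sum(intervals) / len(intervals)
    return math.sqrt(sum((x - m) ** 2 for x in intervals) / len(intervals))

def rmssd(intervals):
    """Root mean square of successive differences: short-term variability."""
    d = [b - a for a, b in zip(intervals, intervals[1:])]
    return math.sqrt(sum(x * x for x in d) / len(d))

rr = [430, 445, 438, 452, 441, 435, 448]   # hypothetical beat intervals, ms
overall = sdnn(rr)
short_term = rmssd(rr)
```

    Reduced variability on such indices is one of the patterns reported for compromised fetuses, which is why they are combined with frequency-domain and nonlinear measures for classification.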

  14. A novel real-time data acquisition using an Excel spreadsheet in pendulum experiment tool with light-based timer

    NASA Astrophysics Data System (ADS)

    Adhitama, Egy; Fauzi, Ahmad

    2018-05-01

    In this study, a pendulum experimental tool with a light-based timer has been developed to measure the period of a simple pendulum. The obtained data are automatically recorded in an Excel spreadsheet. The intensity of monochromatic light, sensed by a 3DU5C phototransistor, changes dynamically as the pendulum swings. The changing intensity varies the resistance value, which is processed by an ATmega328 microcontroller to obtain the signal period as a function of time and brightness as the pendulum crosses the light beam. Through the experiment, using the calculated average periods, the gravitational acceleration value has been determined accurately and precisely.
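
    The final step, recovering g from the averaged period, uses the small-angle pendulum relation T = 2*pi*sqrt(L/g). A sketch with hypothetical readings (the length and periods below are illustrative, not the paper's measurements):

```python
import math

def gravity_from_periods(periods, length_m):
    """g = 4*pi^2*L / T^2, with T taken as the mean of the measured periods
    (small-angle approximation)."""
    t = sum(periods) / len(periods)
    return 4.0 * math.pi ** 2 * length_m / t ** 2

periods = [1.795, 1.802, 1.798, 1.800, 1.797]   # s, hypothetical timer readings
g = gravity_from_periods(periods, 0.803)        # pendulum length in metres
```

    Averaging over many timed crossings is what lets a simple photogate-style timer reach good precision despite per-swing quantization of the period.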

  15. Structure-guided design and functional characterization of an artificial red light-regulated guanylate/adenylate cyclase for optogenetic applications.

    PubMed

    Etzl, Stefan; Lindner, Robert; Nelson, Matthew D; Winkler, Andreas

    2018-06-08

    Genetically targeting biological systems to control cellular processes with light is the concept of optogenetics. Despite impressive developments in this field, underlying molecular mechanisms of signal transduction of the employed photoreceptor modules are frequently not sufficiently understood to rationally design new optogenetic tools. Here, we investigate the requirements for functional coupling of red light-sensing phytochromes with non-natural enzymatic effectors by creating a series of constructs featuring the Deinococcus radiodurans bacteriophytochrome linked to a Synechocystis guanylate/adenylate cyclase. Incorporating characteristic structural elements important for cyclase regulation in our designs, we identified several red light-regulated fusions with promising properties. We provide details of one light-activated construct with low dark-state activity and high dynamic range that outperforms previous optogenetic tools in vitro and expands our in vivo toolkit, as demonstrated by manipulation of Caenorhabditis elegans locomotor activity. The full-length crystal structure of this phytochrome-linked cyclase revealed molecular details of photoreceptor-effector coupling, highlighting the importance of the regulatory cyclase element. Analysis of conformational dynamics by hydrogen-deuterium exchange in different functional states enriched our understanding of phytochrome signaling and signal integration by effectors. We found that light-induced conformational changes in the phytochrome destabilize the coiled-coil sensor-effector linker, which releases the cyclase regulatory element from an inhibited conformation, increasing cyclase activity of this artificial system. Future designs of optogenetic functionalities may benefit from our work, indicating that rational considerations for the effector improve the rate of success of initial designs to obtain optogenetic tools with superior properties. © 2018 Etzl et al.

  16. Wet tropospheric delays forecast based on Vienna Mapping Function time series analysis

    NASA Astrophysics Data System (ADS)

    Rzepecka, Zofia; Kalita, Jakub

    2016-04-01

    It is well known that the dry part of the zenith tropospheric delay (ZTD) is much easier to model than the wet part (ZTW). The aim of the research is to apply stochastic modeling and prediction of ZTW using time series analysis tools. Application of time series analysis enables a closer understanding of ZTW behavior as well as short-term prediction of future ZTW values. The ZTW data used for the studies were obtained from the GGOS service hosted by the Vienna University of Technology. The resolution of the data is six hours. ZTW values for the years 2010-2013 were adopted for the study. The International GNSS Service (IGS) permanent stations LAMA and GOPE, located at mid-latitudes, were selected for the investigations. Initially the seasonal part was separated and modeled using periodic signals and frequency analysis. The prominent annual and semi-annual signals were removed using sine and cosine functions. The autocorrelation of the resulting signal is significant for several days (20-30 samples). The residuals of this fitting were further analyzed and modeled with ARIMA processes. For both stations, optimal ARMA processes were obtained based on several criteria. On this basis, predicted ZTW values were computed one day ahead, leaving white-noise residuals. The accuracy of the prediction can be estimated at about 3 cm.
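
    The processing chain described above (remove annual and semi-annual harmonics, then model the residual autocorrelation for short-term prediction) can be sketched with an AR(1) residual model standing in for the full ARIMA fit; all values below are simulated, not GGOS data:

```python
import math, random

def fit_harmonics(y, periods):
    """Fit mean plus cosine/sine pairs at the given periods (in samples);
    valid when the record spans whole numbers of each period, so the
    sampled harmonics are essentially orthogonal."""
    n = len(y)
    mean = sum(y) / n
    coeffs = []
    for p in periods:
        w = 2.0 * math.pi / p
        a = 2.0 / n * sum((v - mean) * math.cos(w * t) for t, v in enumerate(y))
        b = 2.0 / n * sum((v - mean) * math.sin(w * t) for t, v in enumerate(y))
        coeffs.append((a, b))
    return mean, coeffs

def seasonal(t, mean, coeffs, periods):
    """Evaluate the fitted seasonal model at sample index t."""
    return mean + sum(a * math.cos(2.0 * math.pi / p * t)
                      + b * math.sin(2.0 * math.pi / p * t)
                      for (a, b), p in zip(coeffs, periods))

def ar1_coef(r):
    """Least-squares AR(1) coefficient of a zero-mean residual series."""
    return sum(a * b for a, b in zip(r, r[1:])) / sum(a * a for a in r[:-1])

# simulate 4 years of 6-hourly ZTW-like data: annual + semi-annual + AR(1)
random.seed(1)
periods = [1461.0, 730.5]                 # samples per year and per half-year
n = 4 * 1461
noise, y = 0.0, []
for t in range(n):
    noise = 0.8 * noise + random.gauss(0.0, 0.005)
    y.append(0.15 + 0.05 * math.cos(2.0 * math.pi * t / 1461.0)
                  + 0.02 * math.sin(2.0 * math.pi * t / 730.5) + noise)

mean, coeffs = fit_harmonics(y, periods)
resid = [v - seasonal(t, mean, coeffs, periods) for t, v in enumerate(y)]
phi = ar1_coef(resid)
forecast = seasonal(n, mean, coeffs, periods) + phi * resid[-1]  # one step ahead
```

    The one-step forecast combines the deterministic seasonal part with a geometrically damped contribution from the last residual; a higher-order ARMA model extends the same idea with more memory.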

  17. Development of a Real-Time General-Purpose Digital Signal Processing Laboratory System.

    DTIC Science & Technology

    1983-12-01

    should serve several important purposes: to familiarize students with the use of common DSP tools in an instructional environment, to serve as a research ...of Dayton Research Institute researchers for DSP software and DSP system design insight. 3. Formulation of statement of requirements for development...Neither the University of Dayton nor its Research Institute have a DSP computer system. While UD offered no software or DSP system design information

  18. Excited-state vibronic wave-packet dynamics in H2 probed by XUV transient four-wave mixing

    NASA Astrophysics Data System (ADS)

    Cao, Wei; Warrick, Erika R.; Fidler, Ashley; Leone, Stephen R.; Neumark, Daniel M.

    2018-02-01

    The complex behavior of a molecular wave packet initiated by an extreme ultraviolet (XUV) pulse is investigated with noncollinear wave mixing spectroscopy. A broadband XUV pulse spanning 12-16 eV launches a wave packet in H2 comprising a coherent superposition of multiple electronic and vibrational levels. The molecular wave packet evolves freely until a delayed few-cycle optical laser pulse arrives to induce nonlinear signals in the XUV via four-wave mixing (FWM). The angularly resolved FWM signals encode rich energy exchange processes between the optical laser field and the XUV-excited molecule. The noncollinear geometry enables spatial separation of ladder and V- or Λ-type transitions induced by the optical field. Ladder transitions, in which the energy exchange with the optical field is around 3 eV, appear off axis from the incident XUV beam. Each vibrationally resolved FWM line probes a different part of the wave packet in energy, serving as a promising tool for energetic tomography of molecular wave packets. V- or Λ-type transitions, in which the energy exchange is well under 1 eV, result in on-axis nonlinear signals. The first-order versus third-order interference of the on-axis signal serves as a mapping tool of the energy flow pathways. Intra- and interelectronic potential energy curve transitions are decisively identified. The current study opens possibilities for accessing complete dynamic information in XUV-excited complex systems.

  19. Prospects of in vivo singlet oxygen luminescence monitoring: Kinetics at different locations on living mice.

    PubMed

    Pfitzner, Michael; Schlothauer, Jan C; Bastien, Estelle; Hackbarth, Steffen; Bezdetnaya, Lina; Lassalle, Henri-Pierre; Röder, Beate

    2016-06-01

    Singlet oxygen observation is considered a valuable tool to assess and optimize PDT treatment. In complex systems, such as tumors in vivo, only direct, time-resolved singlet oxygen luminescence detection can give reliable information about the generation and interaction of singlet oxygen. Up to now, evaluation of the kinetics was not possible due to insufficient signal-to-noise ratio. Here we present high signal-to-noise ratio singlet oxygen luminescence kinetics obtained in a mouse tumor model under PDT-relevant conditions. A highly optimized system based on a custom-made laser diode excitation source and a high-aperture multi-furcated fiber, utilizing a photomultiplier tube with a multi-photon counting device, was used. Luminescence kinetics with unsurpassed signal-to-noise ratio were obtained from tumor-bearing nude mice in vivo upon topical application, subcutaneous injection as well as intravenous injection of different photosensitizers (chlorin e6 and dendrimer formulations of chlorin e6). Singlet oxygen kinetics in appropriate model systems are discussed to facilitate the interpretation of the complex kinetics obtained from in vivo tumor tissue. This is the first study addressing the complexity of singlet oxygen luminescence kinetics in tumor tissue. At present, further investigations are needed to fully explain the processes involved. Nevertheless, the high signal-to-noise ratio proves the applicability of direct time-resolved singlet oxygen luminescence detection as a prospective tool for monitoring photodynamic therapy. Copyright © 2016 Elsevier B.V. All rights reserved.
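
    In simple (homogeneous) model systems, such luminescence kinetics are commonly approximated by a two-exponential rise-and-decay curve (population fed from the sensitizer triplet state, then decaying); the time constants below are illustrative, not the measured in vivo values:

```python
import math

def luminescence(t, tau_rise, tau_decay):
    """Rise-and-decay model: the emitting state is populated with time
    constant tau_rise and decays with time constant tau_decay."""
    return math.exp(-t / tau_decay) - math.exp(-t / tau_rise)

def peak_time(tau_rise, tau_decay):
    """Analytic position of the signal maximum for the model above."""
    return (math.log(tau_decay / tau_rise) * tau_rise * tau_decay
            / (tau_decay - tau_rise))

tau_r, tau_d = 2.0, 14.0                       # illustrative values, us
grid = [0.01 * k for k in range(3000)]         # scan 0-30 us
t_max = max(grid, key=lambda t: luminescence(t, tau_r, tau_d))
```

    In heterogeneous tumor tissue the observed kinetics deviate from this simple form, which is precisely why high signal-to-noise curves are needed before the underlying processes can be disentangled.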

  20. Localized nuclear and perinuclear Ca(2+) signals in intact mouse skeletal muscle fibers.

    PubMed

    Georgiev, Tihomir; Svirin, Mikhail; Jaimovich, Enrique; Fink, Rainer H A

    2015-01-01

    Nuclear Ca(2+) is important for the regulation of several nuclear processes such as gene expression. Localized Ca(2+) signals (LCSs) in skeletal muscle fibers of mice have been mainly studied as Ca(2+) release events from the sarcoplasmic reticulum. Their location with regard to cell nuclei has not been investigated. Our study is based on the hypothesis that LCSs associated with nuclei are present in skeletal muscle fibers of adult mice. Therefore, we carried out experiments addressing this question and we found novel Ca(2+) signals associated with nuclei of skeletal muscle fibers (with possibly attached satellite cells). We measured localized nuclear and perinuclear Ca(2+) signals (NLCSs and PLCSs) alongside cytosolic localized Ca(2+) signals (CLCSs) during a hypertonic treatment. We also observed NLCSs under isotonic conditions. The NLCSs and PLCSs are Ca(2+) signals in the micrometer range [FWHM (full width at half maximum): 2.75 ± 0.27 μm (NLCSs) and 2.55 ± 0.17 μm (PLCSs), S.E.M.]. Additionally, global nuclear Ca(2+) signals (NGCSs) were observed. To investigate which type of Ca(2+) channels contribute to the Ca(2+) signals associated with nuclei in skeletal muscle fibers, we performed measurements with the RyR blocker dantrolene, the DHPR blocker nifedipine or the IP3R blocker Xestospongin C. We observed Ca(2+) signals associated with nuclei in the presence of each blocker. Nifedipine and dantrolene had an inhibitory effect on the fraction of fibers with PLCSs. The situation for the fraction of fibers with NLCSs is more complex, indicating that RyR is less important for the generation of NLCSs compared to the generation of PLCSs. The fraction of fibers with NLCSs and PLCSs is not reduced in the presence of Xestospongin C. The localized perinuclear and intranuclear Ca(2+) signals may be a powerful tool for the cell to regulate adaptive processes such as gene expression. The intranuclear Ca(2+) signals may be particularly interesting in this respect.

  1. Cloudwave: distributed processing of "big data" from electrophysiological recordings for epilepsy clinical research using Hadoop.

    PubMed

    Jayapandian, Catherine P; Chen, Chien-Hung; Bozorgi, Alireza; Lhatoo, Samden D; Zhang, Guo-Qiang; Sahoo, Satya S

    2013-01-01

    Epilepsy is the most common serious neurological disorder, affecting 50-60 million persons worldwide. Multi-modal electrophysiological data, such as electroencephalography (EEG) and electrocardiography (EKG), are central to effective patient care and clinical research in epilepsy. Electrophysiological data is an example of clinical "big data" consisting of more than 100 multi-channel signals with recordings from each patient generating 5-10GB of data. Current approaches to store and analyze signal data using standalone tools, such as Nihon Kohden neurology software, are inadequate to meet the growing volume of data and the need for supporting multi-center collaborative studies with real-time, interactive access. We introduce the Cloudwave platform in this paper that features a Web-based intuitive signal analysis interface integrated with a Hadoop-based data processing module implemented on clinical data stored in a "private cloud". Cloudwave has been developed as part of the National Institute of Neurological Disorders and Stroke (NINDS) funded multi-center Prevention and Risk Identification of SUDEP Mortality (PRISM) project. The Cloudwave visualization interface provides real-time rendering of multi-modal signals with "montages" for EEG feature characterization over 2TB of patient data generated at the Case University Hospital Epilepsy Monitoring Unit. Results from performance evaluation of the Cloudwave Hadoop data processing module demonstrate one order of magnitude improvement in performance over 77GB of patient data. (Cloudwave project: http://prism.case.edu/prism/index.php/Cloudwave).
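
The map-reduce pattern that Hadoop applies to such multi-channel recordings can be shown in miniature. The pure-Python sketch below (no Hadoop involved, and not Cloudwave's code) partitions a multi-channel signal into time chunks, maps each chunk to per-channel sums of squares, and reduces the pairs into per-channel mean power.

```python
from functools import reduce
import numpy as np

def map_chunk(chunk):
    # emit (channel, (sum_of_squares, n_samples)) pairs for one time chunk
    return [(ch, (float(np.sum(sig ** 2)), sig.size))
            for ch, sig in enumerate(chunk)]

def reduce_pairs(acc, pair):
    # combine partial statistics per channel, as a reducer would
    ch, (ss, n) = pair
    s0, n0 = acc.get(ch, (0.0, 0))
    acc[ch] = (s0 + ss, n0 + n)
    return acc

rng = np.random.default_rng(0)
data = rng.standard_normal((4, 10000))      # 4 channels of unit-variance noise
chunks = np.split(data, 10, axis=1)         # time-partitioned "splits"
mapped = [p for c in chunks for p in map_chunk(c)]
totals = reduce(reduce_pairs, mapped, {})
mean_power = {ch: ss / n for ch, (ss, n) in totals.items()}
```

Because each chunk is processed independently, the map step parallelizes across cluster nodes; only the small per-channel statistics travel to the reducer.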

  3. Anti-correlated Networks, Global Signal Regression, and the Effects of Caffeine in Resting-State Functional MRI

    PubMed Central

    Wong, Chi Wah; Olafsson, Valur; Tal, Omer; Liu, Thomas T.

    2012-01-01

    Resting-state functional connectivity magnetic resonance imaging is proving to be an essential tool for the characterization of functional networks in the brain. Two of the major networks that have been identified are the default mode network (DMN) and the task positive network (TPN). Although prior work indicates that these two networks are anti-correlated, the findings are controversial because the anti-correlations are often found only after the application of a pre-processing step, known as global signal regression, that can produce artifactual anti-correlations. In this paper, we show that, for subjects studied in an eyes-closed rest state, caffeine can significantly enhance the detection of anti-correlations between the DMN and TPN without the need for global signal regression. In line with these findings, we find that caffeine also leads to widespread decreases in connectivity and global signal amplitude. Using a recently introduced geometric model of global signal effects, we demonstrate that these decreases are consistent with the removal of an additive global signal confound. In contrast to the effects observed in the eyes-closed rest state, caffeine did not lead to significant changes in global functional connectivity in the eyes-open rest state. PMID:22743194
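
The "additive global signal confound" discussed above is conventionally removed by regressing the global mean signal out of every voxel time series. The sketch below shows this standard regression on synthetic data (it is not the paper's geometric model); residuals are orthogonal to the fitted global regressor by construction.

```python
import numpy as np

rng = np.random.default_rng(1)
T, V = 200, 50                                  # time points, voxels
confound = rng.standard_normal(T)               # additive global confound
voxels = rng.standard_normal((T, V)) + confound[:, None]

g = voxels.mean(axis=1)                         # estimated global signal
g = g - g.mean()                                # mean-center the regressor
X = np.column_stack([np.ones(T), g])            # intercept + global signal
beta, *_ = np.linalg.lstsq(X, voxels, rcond=None)
residuals = voxels - X @ beta                   # "global signal regressed" data
```

Note the controversy the abstract alludes to: because every voxel is projected onto the complement of g, this step can by itself induce negative correlations between networks.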

  4. Real-time Graphics Processing Unit Based Fourier Domain Optical Coherence Tomography and Surgical Applications

    NASA Astrophysics Data System (ADS)

    Zhang, Kang

    2011-12-01

    In this dissertation, real-time Fourier domain optical coherence tomography (FD-OCT) capable of multi-dimensional micrometer-resolution imaging, targeted specifically at microsurgical intervention applications, was developed and studied. As part of this work several ultra-high speed real-time FD-OCT imaging and sensing systems were proposed and developed. A real-time 4D (3D+time) OCT system platform using the graphics processing unit (GPU) to accelerate OCT signal processing, image reconstruction, visualization, and volume rendering was developed. Several GPU-based algorithms such as non-uniform fast Fourier transform (NUFFT), numerical dispersion compensation, and multi-GPU implementation were developed to improve the impulse response, SNR roll-off and stability of the system. Full-range complex-conjugate-free FD-OCT was also implemented on the GPU architecture to achieve doubled image range and improved SNR. These technologies overcome the image reconstruction and visualization bottlenecks that widely exist in current ultra-high speed FD-OCT systems and open the way to interventional OCT imaging for applications in guided microsurgery. A hand-held common-path optical coherence tomography (CP-OCT) distance-sensor based microsurgical tool was developed and validated. Through real-time signal processing, edge detection and feedback control, the tool was shown to be capable of tracking the target surface and compensating for motion. A micro-incision test was performed on a phantom using the CP-OCT-sensor integrated hand-held tool, which showed an incision error of less than +/-5 microns, compared to errors of more than 100 microns for free-hand incision. The CP-OCT distance sensor has also been utilized to enhance the accuracy and safety of optical nerve stimulation. Finally, several experiments were conducted to validate the system for surgical applications. One of them involved 4D OCT guided micro-manipulation using a phantom. Multiple volume renderings of one 3D data set were performed at different view angles, allowing the user to accurately monitor the micro-manipulation and the tool-to-target spatial relation in real time. The system was also validated by imaging multiple biological samples, such as human fingerprints, a human cadaver head and small animals. Compared to conventional surgical microscopes, GPU-based real-time FD-OCT can provide surgeons with a real-time comprehensive spatial view of the microsurgical region and accurate depth perception.
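
The core of FD-OCT reconstruction, turning a spectral interferogram into a depth profile (A-scan) by Fourier transform, can be sketched without any GPU machinery. The toy example below simulates a single reflector; all parameters are hypothetical, and the FFT stands in for the NUFFT/dispersion-compensated pipeline described above.

```python
import numpy as np

N = 2048
k = np.linspace(0.0, 2 * np.pi, N, endpoint=False)  # wavenumber axis (a.u.)
depth_bin = 150                                     # simulated reflector depth (FFT bin)
fringe = 1.0 + 0.5 * np.cos(depth_bin * k)          # spectral interferogram

# window to reduce sidelobes, transform, and suppress the DC term
a_scan = np.abs(np.fft.fft(fringe * np.hanning(N)))
a_scan[:5] = 0.0
peak = int(np.argmax(a_scan[: N // 2]))             # recovered reflector depth
```

On a GPU the same transform is simply batched over thousands of A-lines per frame, which is where the reported real-time 4D rates come from.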

  5. Metabolic Control of Redox and Redox Control of Metabolism in Plants

    PubMed Central

    Fernie, Alisdair R.

    2014-01-01

    Abstract Significance: Reduction-oxidation (Redox) status operates as a major integrator of subcellular and extracellular metabolism and is simultaneously itself regulated by metabolic processes. Redox status not only dominates cellular metabolism due to the prominence of NAD(H) and NADP(H) couples in myriad metabolic reactions but also acts as an effective signal that informs the cell of the prevailing environmental conditions. After relay of this information, the cell is able to appropriately respond via a range of mechanisms, including directly affecting cellular functioning and reprogramming nuclear gene expression. Recent Advances: The facile accession of Arabidopsis knockout mutants alongside the adoption of broad-scale post-genomic approaches, which are able to provide transcriptomic-, proteomic-, and metabolomic-level information alongside traditional biochemical and emerging cell biological techniques, has dramatically advanced our understanding of redox status control. This review summarizes redox status control of metabolism and the metabolic control of redox status at both cellular and subcellular levels. Critical Issues: It is becoming apparent that plastid, mitochondria, and peroxisome functions influence a wide range of processes outside of the organelles themselves. While knowledge of the network of metabolic pathways and their intraorganellar redox status regulation has increased in recent years, little is known about the interorganellar redox signals coordinating these networks. A current challenge is, therefore, synthesizing our knowledge and planning experiments that tackle redox status regulation at both inter- and intracellular levels. Future Directions: Emerging tools are enabling ever-increasing spatiotemporal resolution of metabolism and imaging of redox status components. 
Broader application of these tools will likely greatly enhance our understanding of the interplay of redox status and metabolism as well as elucidating and characterizing signaling features thereof. We propose that such information will enable us to dissect the regulatory hierarchies that mediate the strict coupling of metabolism and redox status which, ultimately, determine plant growth and development. Antioxid. Redox Signal. 21, 1389–1421. PMID:24960279

  6. EEGNET: An Open Source Tool for Analyzing and Visualizing M/EEG Connectome.

    PubMed

    Hassan, Mahmoud; Shamas, Mohamad; Khalil, Mohamad; El Falou, Wassim; Wendling, Fabrice

    2015-01-01

    The brain is a large-scale complex network often referred to as the "connectome". Exploring the dynamic behavior of the connectome is a challenging issue as both excellent time and space resolution are required. In this context Magneto/Electroencephalography (M/EEG) are effective neuroimaging techniques allowing for analysis of the dynamics of functional brain networks at scalp level and/or at reconstructed sources. However, a tool that can cover all the processing steps of identifying brain networks from M/EEG data is still missing. In this paper, we report a novel software package, called EEGNET, running under MATLAB (MathWorks, Inc.), and allowing for analysis and visualization of functional brain networks from M/EEG recordings. EEGNET is developed to analyze networks either at the level of scalp electrodes or at the level of reconstructed cortical sources. It includes i) basic steps in preprocessing M/EEG signals, ii) the solution of the inverse problem to localize / reconstruct the cortical sources, iii) the computation of functional connectivity among signals collected at surface electrodes and/or time courses of reconstructed sources and iv) the computation of network measures based on graph theory analysis. EEGNET is unique in combining M/EEG functional connectivity analysis with the computation of network measures derived from graph theory. The first version of EEGNET is easy to use, flexible and user friendly. EEGNET is an open source tool and can be freely downloaded from this webpage: https://sites.google.com/site/eegnetworks/.
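
Step iv) above, deriving graph-theoretic measures from a functional connectivity matrix, can be illustrated in a few lines of NumPy. This is generic graph analysis, not EEGNET's implementation; the connectivity values and threshold are hypothetical.

```python
import numpy as np

def graph_measures(conn, threshold=0.5):
    """Binarize a functional connectivity matrix, then compute
    node degree and overall network density."""
    adj = (np.abs(conn) >= threshold).astype(int)
    np.fill_diagonal(adj, 0)                 # no self-connections
    degree = adj.sum(axis=1)                 # edges per node
    n = adj.shape[0]
    density = adj.sum() / (n * (n - 1))      # fraction of possible edges
    return adj, degree, density

# toy 3-node connectivity matrix (e.g., coherence values)
conn = np.array([[1.0, 0.8, 0.2],
                 [0.8, 1.0, 0.6],
                 [0.2, 0.6, 1.0]])
adj, degree, density = graph_measures(conn)
```

Real pipelines typically sweep the threshold (or use weighted measures) because the resulting graph metrics depend strongly on it.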

  8. Recordings of mucociliary activity in vivo: benefit of fast Fourier transformation of the photoelectric signal.

    PubMed

    Lindberg, S; Cervin, A; Runer, T; Thomasson, L

    1996-09-01

    Investigations of mucociliary activity in vivo are based on photoelectric recordings of light reflections from the mucosa. The alterations in light intensity produced by the beating cilia are picked up by a photodetector and converted to photoelectric signals. The optimal processing of these signals is not known, but in vitro recordings have been reported to benefit from fast Fourier transformation (FFT) of the signal. The aim of the investigation was to study the effect of FFT for frequency analysis of photoelectric signals originating from an artificial light source simulating mucociliary activity or from sinus or nasal mucosa in vivo, as compared to a conventional method of calculating mucociliary wave frequency, in which each peak in the signal is interpreted as a beat (old method). In the experiments with the artificial light source, the FFT system was superior to the conventional method by a factor of 50 in detecting weak signals. By using FFT signal processing, frequency could be correctly calculated in experiments with a compound signal. In experiments in the rabbit maxillary sinus, the spontaneous variations were greater when signals were processed by FFT. The correlation between the two methods was excellent: r = .92. The increase in mucociliary activity in response to the ciliary stimulant methacholine at a dosage of 0.5 microgram/kg was greater measured with the FFT than with the old method (55.3% +/- 8.3% versus 43.0% +/- 8.2%, p < .05, N = 8), and only with the FFT system could a significant effect of a threshold dose (0.05 microgram/kg) of methacholine be detected. In the human nose, recordings from aluminum foil placed on the nasal dorsum and from the nasal septal mucosa displayed some similarities in the lower frequency spectrum (< 5 Hz) attributable to artifacts. 
The predominant cause of these artifacts was the pulse beat, whereas in the frequency spectrum above 5 Hz, results differed for the two sources of reflected light, the mean frequency in seven healthy volunteers being 7.8 +/- 1.6 Hz for the human nasal mucosa. It is concluded that the FFT system has greater sensitivity in detecting photoelectric signals derived from the mucociliary system, and that it is also a useful tool for analyzing the contributions of artifacts to the signal.
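
The advantage of FFT-based estimation over peak counting is easiest to see on a synthetic weak oscillation buried in noise, analogous to the artificial-light-source experiments above. The sampling rate and signal-to-noise ratio below are hypothetical; the band limit mirrors the paper's observation that artifacts dominate below 5 Hz.

```python
import numpy as np

fs = 200.0                            # sampling rate (Hz), hypothetical
t = np.arange(0.0, 10.0, 1.0 / fs)
f_cilia = 7.8                         # beat frequency reported for nasal mucosa
rng = np.random.default_rng(2)
# weak oscillation drowned in noise: peak counting on this trace would fail
signal = 0.2 * np.sin(2 * np.pi * f_cilia * t) + rng.standard_normal(t.size)

spec = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
band = (freqs > 5.0) & (freqs < 15.0)   # exclude the sub-5 Hz artifact band
f_est = freqs[band][np.argmax(spec[band])]
```

The spectral peak concentrates the signal energy into one bin, while the noise spreads over all bins, which is why the FFT detects signals far below the peak-counting threshold.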

  9. Processing Functional Near Infrared Spectroscopy Signal with a Kalman Filter to Assess Working Memory during Simulated Flight.

    PubMed

    Durantin, Gautier; Scannella, Sébastien; Gateau, Thibault; Delorme, Arnaud; Dehais, Frédéric

    2015-01-01

    Working memory (WM) is a key executive function for operating aircraft, especially when pilots have to recall series of air traffic control instructions. There is a need to implement tools to monitor WM as its limitation may jeopardize flight safety. An innovative way to address this issue is to adopt a Neuroergonomics approach that merges knowledge and methods from Human Factors, System Engineering, and Neuroscience. A challenge of great importance for Neuroergonomics is to implement efficient brain imaging techniques to measure the brain at work and to design Brain Computer Interfaces (BCI). We used functional near infrared spectroscopy as it has already been successfully tested to measure WM capacity in complex environments with air traffic controllers (ATC), pilots, or unmanned vehicle operators. However, the extraction of relevant features from the raw signal in ecological environments is still a critical issue due to the complexity of implementing real-time signal processing techniques without a priori knowledge. We proposed to implement the Kalman filtering approach, a signal processing technique that is efficient when the dynamics of the signal can be modeled. We based our approach on the Boynton model of hemodynamic response. We conducted a first experiment with nine participants involving a basic WM task to estimate the noise covariances of the Kalman filter. We then conducted a more ecological experiment in our flight simulator with 18 pilots who interacted with ATC instructions (two levels of difficulty). The data was processed with the same Kalman filter settings implemented in the first experiment. This filter was benchmarked with a classical pass-band IIR filter and a Moving Average Convergence Divergence (MACD) filter. Statistical analysis revealed that the Kalman filter was the most efficient to separate the two levels of load, by increasing the observed effect size in prefrontal areas involved in WM. In addition, the use of a Kalman filter increased the performance of the classification of WM levels based on brain signal. The results suggest that the Kalman filter is a suitable approach for real-time improvement of near infrared spectroscopy signal in ecological situations and the development of BCI.
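
The smoothing principle behind this approach can be sketched with a scalar Kalman filter using a simple random-walk state model. Note the study itself used a Boynton hemodynamic-response model with experimentally estimated noise covariances; the state model, q, and r below are hypothetical stand-ins.

```python
import numpy as np

def kalman_smooth(z, q=1e-3, r=0.25):
    """Scalar Kalman filter with a random-walk state model:
    x_k = x_{k-1} + w (var q),  z_k = x_k + v (var r)."""
    x, p = z[0], 1.0
    out = np.empty_like(z)
    for k, zk in enumerate(z):
        p = p + q                      # predict: state uncertainty grows
        kgain = p / (p + r)            # update: blend prediction and sample
        x = x + kgain * (zk - x)
        p = (1.0 - kgain) * p
        out[k] = x
    return out

rng = np.random.default_rng(3)
t = np.linspace(0.0, 10.0, 1000)
truth = np.sin(0.5 * t)                       # slow hemodynamic-like trend
z = truth + 0.5 * rng.standard_normal(t.size)
est = kalman_smooth(z)
mse_raw = float(np.mean((z - truth) ** 2))
mse_kf = float(np.mean((est - truth) ** 2))
```

Because each step uses only the previous estimate and the current sample, the filter runs in real time, which is the property that matters for the BCI application described above.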


  11. The research on visual industrial robot which adopts fuzzy PID control algorithm

    NASA Astrophysics Data System (ADS)

    Feng, Yifei; Lu, Guoping; Yue, Lulin; Jiang, Weifeng; Zhang, Ye

    2017-03-01

    The control system of a six-degrees-of-freedom visual industrial robot, based on multi-axis motion control cards and a PC, was researched. To handle the variable, non-linear characteristics of the industrial robot's servo system, an adaptive fuzzy PID controller was adopted, achieving better control performance. In the vision system, a CCD camera acquires signals and sends them to a video processing card. After processing, the PC controls the motion of the six joints through the motion control cards. In experiments, the manipulator operated together with a machine tool and the vision system to grasp, process and verify workpieces. This work has implications for the manufacturing of industrial robots.
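
The idea of fuzzy-adaptive PID, rules that reshape the gains as functions of the error and its rate of change, can be sketched on a first-order plant. The rule surface, gains, and plant below are illustrative assumptions, not the paper's controller.

```python
def fuzzy_gains(e, de):
    # illustrative rule surface: larger error -> stronger proportional action,
    # faster error change -> slightly more derivative damping
    kp = 2.0 + 3.0 * min(abs(e), 1.0)
    ki = 1.0
    kd = 0.02 + 0.03 * min(abs(de) / 50.0, 1.0)
    return kp, ki, kd

def simulate(setpoint=1.0, steps=2000, dt=0.01, tau=0.2):
    """Fuzzy-scheduled PID driving a first-order plant dy/dt = (u - y)/tau."""
    y, integ, prev_e = 0.0, 0.0, setpoint
    for _ in range(steps):
        e = setpoint - y
        de = (e - prev_e) / dt
        kp, ki, kd = fuzzy_gains(e, de)
        integ += e * dt
        u = kp * e + ki * integ + kd * de
        y += dt * (-y + u) / tau       # Euler step of the plant
        prev_e = e
    return y

final = simulate()
```

A full fuzzy PID would replace `fuzzy_gains` with membership functions and a rule base, but the structure (gains recomputed each cycle from e and de) is the same.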

  12. Imaging calcium sparks in cardiac myocytes.

    PubMed

    Guatimosim, Silvia; Guatimosim, Cristina; Song, Long-Sheng

    2011-01-01

    Calcium ions play fundamental roles in many cellular processes in virtually all types of cells. The use of Ca(2+) sensitive fluorescent indicators has proven to be an indispensable tool for studying the spatio-temporal dynamics of intracellular calcium ([Ca(2+)](i)). With the aid of laser scanning confocal microscopy and a new generation of Ca(2+) indicators, highly localized, short-lived Ca(2+) signals, namely Ca(2+) sparks, were revealed as elementary Ca(2+) release events during excitation-contraction coupling in cardiomyocytes. Since the discovery of Ca(2+) sparks in 1993, the demonstration of dynamic Ca(2+) micro-domains in living cardiomyocytes has revolutionized our understanding of Ca(2+)-mediated signal transduction in normal and diseased hearts. In this chapter, we have described a commonly used method for recording local and global Ca(2+) signals in cardiomyocytes using the fluorescent indicator fluo-4 acetoxymethyl (AM) and laser scanning confocal microscopy.
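
Spark parameters are typically quantified from normalized fluorescence profiles (ΔF/F0), e.g. peak amplitude and spatial FWHM. The sketch below does this on a synthetic Gaussian spark profile; the resting fluorescence, amplitude, and width are hypothetical, and for a Gaussian the measured FWHM should match 2.355·σ.

```python
import numpy as np

x = np.linspace(0.0, 20.0, 2001)             # scan position (um)
f0 = 100.0                                   # resting fluorescence (a.u.)
sigma = 0.9                                  # spark spatial width (um)
spark = 80.0 * np.exp(-((x - 8.0) ** 2) / (2 * sigma ** 2))
profile = f0 + spark                         # line-scan fluorescence

dff = (profile - f0) / f0                    # delta-F/F0 normalization
peak = float(dff.max())
above = x[dff >= peak / 2]                   # region above half-maximum
fwhm = float(above[-1] - above[0])           # compare with 2.355 * sigma
```

On real line-scan images, F0 is estimated from quiescent regions and the same half-maximum measurement is applied in both space and time.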

  13. Direct digital conversion detector technology

    NASA Astrophysics Data System (ADS)

    Mandl, William J.; Fedors, Richard

    1995-06-01

    Future imaging sensors for the aerospace and commercial video markets will depend on low cost, high speed analog-to-digital (A/D) conversion to efficiently process optical detector signals. Current A/D methods place a heavy burden on system resources, increase noise, and limit the throughput. This paper describes a unique method for incorporating A/D conversion right on the focal plane array. This concept is based on Sigma-Delta sampling, and makes optimum use of the active detector real estate. Combined with modern digital signal processors, such devices will significantly increase data rates off the focal plane. Early conversion to digital format will also decrease the signal susceptibility to noise, lowering the communications bit error rate. Computer modeling of this concept is described, along with results from several simulation runs. A potential application for direct digital conversion is also reviewed. Future uses for this technology could range from scientific instruments to remote sensors, telecommunications gear, medical diagnostic tools, and consumer products.
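
A first-order Sigma-Delta modulator of the kind described can be simulated in a few lines: the loop integrates the error between the input and a 1-bit feedback, and low-pass filtering of the bitstream (here a plain mean) recovers the input level. The DC input value is arbitrary.

```python
import numpy as np

def sigma_delta(x):
    """First-order Sigma-Delta modulator: integrate the error between
    the input and the 1-bit feedback, output the comparator decision."""
    integ, fb = 0.0, -1.0
    out = np.empty(x.size)
    for i, xi in enumerate(x):
        integ += xi - fb                   # accumulate quantization error
        fb = 1.0 if integ >= 0.0 else -1.0 # 1-bit quantizer
        out[i] = fb
    return out

x = np.full(4096, 0.3)          # DC input within the [-1, 1] full scale
bits = sigma_delta(x)           # oversampled 1-bit stream
recovered = float(bits.mean())  # crude decimation / low-pass filter
```

The duty cycle of the bitstream encodes the input amplitude, which is what makes the scheme attractive for per-pixel conversion on a focal plane: each pixel needs only an integrator and a comparator.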

  14. A Novel and Simple Spike Sorting Implementation.

    PubMed

    Petrantonakis, Panagiotis C; Poirazi, Panayiota

    2017-04-01

    Monitoring the activity of multiple, individual neurons that fire spikes in the vicinity of an electrode, namely performing a Spike Sorting (SS) procedure, comprises one of the most important tools for contemporary neuroscience in order to reverse-engineer the brain. As recording electrode technology rapidly evolves by integrating thousands of electrodes in a confined spatial setting, the algorithms that are used to monitor individual neurons from recorded signals have to become even more reliable and computationally efficient. In this work, we propose a novel framework of the SS approach in which a single-step processing of the raw (unfiltered) extracellular signal is sufficient for both the detection and sorting of the activity of individual neurons. Despite its simplicity, the proposed approach exhibits comparable performance with state-of-the-art approaches, especially for spike detection in noisy signals, and paves the way for a new family of SS algorithms with the potential for multi-recording, fast, on-chip implementations.
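
A common baseline for the detection stage is amplitude thresholding with a robust, median-based noise estimate. The sketch below is such a baseline on synthetic data, not the authors' single-step algorithm; the spike shape, amplitude, and timings are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10000
trace = rng.standard_normal(n)                  # background noise, sigma = 1
spike_shape = 10.0 * np.hanning(60)             # stereotyped spike template
spike_starts = [2000, 5000, 8000]
for s in spike_starts:
    trace[s:s + 60] += spike_shape

# robust noise estimate (median-based), insensitive to the spikes themselves
sigma = np.median(np.abs(trace)) / 0.6745
thresh = 5.0 * sigma
above = trace > thresh
onsets = np.flatnonzero(above & ~np.roll(above, 1))
# merge crossings closer than one spike width into single events
events = onsets[np.insert(np.diff(onsets) > 60, 0, True)]
```

Sorting would then cluster the waveform snippets around each event; the median/0.6745 estimator is the standard trick for getting a noise floor that spikes do not inflate.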

  15. Automatic detection of slight parameter changes associated to complex biomedical signals using multiresolution q-entropy1.

    PubMed

    Torres, M E; Añino, M M; Schlotthauer, G

    2003-12-01

    It is well known that, from a dynamical point of view, sudden variations in the physiological parameters which govern certain diseases can cause qualitative changes in the dynamics of the corresponding physiological process. The purpose of this paper is to introduce a technique that allows the automated temporal localization of slight changes in a parameter of the law that governs the nonlinear dynamics of a given signal. This tool exploits the ability of the multiresolution entropies to reveal such changes as statistical variations at each scale. These variations are captured in the corresponding principal component. Appropriately combining these techniques with a statistical change detector, a complexity change detection algorithm is obtained. The relevance of the approach, together with its robustness in the presence of moderate noise, is discussed in numerical simulations, and the automatic detector is applied to real and simulated biological signals.
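
The core idea, a slight parameter change in the generating law showing up as a shift in an entropy measure, can be demonstrated on a logistic map whose parameter switches mid-stream from a periodic to a chaotic regime. This sketch uses a plain Shannon entropy over a sliding window rather than the paper's multiresolution q-entropy, and the parameters are illustrative.

```python
import numpy as np

def window_entropy(x, bins=16):
    """Shannon entropy of the amplitude histogram of one window."""
    counts, _ = np.histogram(x, bins=bins, range=(0.0, 1.0))
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log(p)))

# logistic map whose parameter switches mid-stream (period-4 -> chaos)
n, change, r = 4000, 2000, 3.5
x0 = 0.4
for _ in range(500):                     # discard the initial transient
    x0 = r * x0 * (1.0 - x0)
x = np.empty(n)
x[0] = x0
for i in range(1, n):
    if i == change:
        r = 3.9                          # the "slight" parameter change
    x[i] = r * x[i - 1] * (1.0 - x[i - 1])

w = 200
ent = np.array([window_entropy(x[i:i + w]) for i in range(n - w)])
lo, hi = ent[:1000].mean(), ent[-1000:].mean()
cross = int(np.argmax(ent > (lo + hi) / 2))  # first window past the midpoint
detected = cross + w // 2                    # crude change-time estimate
```

The window length trades detection delay against statistical stability, the same trade-off the multiresolution formulation manages across scales.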

  16. Five Years of SETI with the Allen Telescope Array: Lessons Learned

    NASA Astrophysics Data System (ADS)

    Harp, Gerald

    2016-01-01

    We discuss recent observations at the Allen Telescope Array (ATA) supporting a wide-ranging Search for Extraterrestrial Intelligence (SETI). The ATA supports observations over the frequency range 1-10 GHz with three simultaneous phased array beams used in an anticoincidence detector for false positive rejection. Here we summarize observational results over the years 2011-2015 covering multiple campaigns of exoplanet stars, the galactic plane, infrared excess targets, etc. Approximately 2 × 10^8 signals were identified and classified over more than 5000 hours of observation. From these results we consider various approaches to the rapid identification of human generated interference in the process of the search for a signal with origins outside the radius of the Moon's orbit. We conclude that the multi-beam technique is a superb tool for answering the very difficult question of the direction of origin of signals. Data-based simulations of future instruments with more than 3 beams are compared.
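
The anticoincidence logic of the multi-beam technique reduces to a simple rule: a genuine sky source should appear only in the beam pointed at its target, while locally generated interference leaks into all beams. A schematic sketch of that decision rule (the beam names and boolean detections are purely illustrative):

```python
def classify(detections):
    """Anticoincidence veto over simultaneous phased-array beams:
    one-beam hits are candidates, multi-beam hits are interference."""
    hits = sum(detections.values())
    if hits == 0:
        return "no signal"
    return "candidate" if hits == 1 else "interference"

on_target_only = classify({"beam1": True, "beam2": False, "beam3": False})
all_beams = classify({"beam1": True, "beam2": True, "beam3": True})
```

With more than three beams the same rule generalizes, and the false-positive rate of the veto drops as interference must evade every off-target beam simultaneously.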

  17. Chromatibody, a novel non-invasive molecular tool to explore and manipulate chromatin in living cells

    PubMed Central

    Jullien, Denis; Vignard, Julien; Fedor, Yoann; Béry, Nicolas; Olichon, Aurélien; Crozatier, Michèle; Erard, Monique; Cassard, Hervé; Ducommun, Bernard; Salles, Bernard

    2016-01-01

    ABSTRACT Chromatin function is involved in many cellular processes, its visualization or modification being essential in many developmental or cellular studies. Here, we present the characterization of chromatibody, a chromatin-binding single-domain, and explore its use in living cells. This non-intercalating tool specifically binds the heterodimer of H2A–H2B histones and displays a versatile reactivity, specifically labeling chromatin from yeast to mammals. We show that this genetically encoded probe, when fused to fluorescent proteins, allows non-invasive real-time chromatin imaging. Chromatibody is a dynamic chromatin probe that can be modulated. Finally, chromatibody is an efficient tool to target an enzymatic activity to the nucleosome, such as the DNA damage-dependent H2A ubiquitylation, which can modify this epigenetic mark at the scale of the genome and result in DNA damage signaling and repair defects. Taken together, these results identify chromatibody as a universal non-invasive tool for either in vivo chromatin imaging or to manipulate the chromatin landscape. PMID:27206857

  18. Benefits of object-oriented models and ModeliChart: modern tools and methods for the interdisciplinary research on smart biomedical technology.

    PubMed

    Gesenhues, Jonas; Hein, Marc; Ketelhut, Maike; Habigt, Moriz; Rüschen, Daniel; Mechelinck, Mare; Albin, Thivaharan; Leonhardt, Steffen; Schmitz-Rode, Thomas; Rossaint, Rolf; Autschbach, Rüdiger; Abel, Dirk

    2017-04-01

    Computational models of biophysical systems generally constitute an essential component in the realization of smart biomedical technological applications. Typically, the development of such models involves extensive collaboration between different interdisciplinary parties. Furthermore, because many underlying mechanisms and the necessary degree of abstraction of biophysical system models are unknown beforehand, the steps of the development process are iteratively repeated as the model is refined. This paper presents methods and tools to facilitate this development process. First, the principle of object-oriented (OO) modeling is presented and its advantages over classical signal-oriented modeling are emphasized. Second, our self-developed simulation tool ModeliChart is presented. ModeliChart was designed specifically for clinical users and allows them to independently perform in silico studies in real time, including intuitive interaction with the model. Furthermore, ModeliChart is capable of interacting with hardware such as sensors and actuators. Finally, it is shown how optimal control methods in combination with OO models can be used to realize clinically motivated control applications. All methods presented are illustrated on an exemplary clinically oriented use case: the artificial perfusion of the systemic circulation.

  19. Development of an Acoustic Signal Analysis Tool “Auto-F” Based on the Temperament Scale

    NASA Astrophysics Data System (ADS)

    Modegi, Toshio

    The MIDI interface was originally designed for electronic musical instruments, but we consider that this music-note-based coding concept can be extended to general acoustic signal description. We proposed applying MIDI technology to the coding of bio-medical auscultation sound signals, such as heart sounds, for retrieving medical records and performing telemedicine. We have since extended our encoding targets to include vocal sounds, natural sounds and electronic bio-signals such as ECG, using the Generalized Harmonic Analysis method. Currently, we are trying to separate vocal sounds included in popular songs and to encode the vocal sounds and the background instrumental sounds into separate MIDI channels. We are also trying to extract articulation parameters, such as MIDI pitch-bend parameters, in order to reproduce natural acoustic sounds using a GM-standard MIDI tone generator. In this paper, we present the overall algorithm of our acoustic signal analysis tool, based on these research works, which can analyze given time-based signals on the musical temperament scale. The prominent features of this tool are that it produces high-precision MIDI codes, which reproduce signals similar to the given source signal on a GM-standard MIDI tone generator, and that it provides the analysis results as text in XML format.
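
The mapping from a measured frequency onto the temperament scale that such a tool relies on is compact: the equal-temperament MIDI note number follows from 69 + 12·log2(f/440), and the residual in cents is what a pitch-bend parameter can carry. A hedged sketch (the function name is ours, not Auto-F's):

```python
import math

def freq_to_midi(freq_hz):
    """Map a frequency to (nearest MIDI note number, deviation in cents)."""
    semitones = 69 + 12 * math.log2(freq_hz / 440.0)  # A4 = MIDI note 69
    note = round(semitones)
    cents = 100 * (semitones - note)                  # pitch-bend residual
    return note, cents

note, cents = freq_to_midi(440.0)   # A4 -> (69, 0.0)
```

Frequencies that fall between temperament-scale pitches yield a nonzero cents value, which is exactly the articulation information the abstract describes capturing with pitch-bend parameters.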

  20. Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.

    PubMed

    Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N

    2009-10-27

    The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined, rational algorithmic analysis workflows or standardized batch processing incorporating all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided heavily depend on the programming skills of the user, whereas GUI-embedded solutions do not provide direct support for various raw image analysis formats or a versatile yet flexible combination of signal processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB-implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis, including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports two-colour cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma-separated values, or tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools, plus customizable export data formats for seamless integration with other analysis tools or MATLAB for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with the MATLAB Component Runtime.
Gene ARMADA provides a highly adaptable, integrative, yet flexible tool which can be used for automated quality control, analysis, annotation and visualization of microarray data, constituting a starting point for further data interpretation and integration with numerous other tools.

  1. Friction Stir Welding of Metal Matrix Composites for use in aerospace structures

    NASA Astrophysics Data System (ADS)

    Prater, Tracie

    2014-01-01

    Friction Stir Welding (FSW) is a relatively young solid-state joining technique developed at The Welding Institute (TWI) in 1991. The process was first used at NASA to weld the super lightweight external tank for the Space Shuttle. Today FSW is used to join structural components of the Delta IV, Atlas V, and Falcon 9 rockets as well as the Orion Crew Exploration Vehicle. A current focus of FSW research is to extend the process to new materials which are difficult to weld using conventional fusion techniques. Metal Matrix Composites (MMCs) consist of a metal alloy reinforced with ceramics and have a very high strength-to-weight ratio, a property which makes them attractive for use in aerospace and defense applications. MMCs have found use in the space shuttle orbiter's structural tubing, the Hubble Space Telescope's antenna mast, control surfaces and propulsion systems for aircraft, and tank armors. The size of MMC components is severely limited by difficulties encountered in joining these materials using fusion welding. Melting of the material results in the formation of an undesirable phase (formed when molten aluminum reacts with the reinforcement) which leaves a strength-depleted region along the joint line. Since FSW occurs below the melting point of the workpiece material, this deleterious phase is absent in FSW-ed MMC joints. FSW of MMCs is, however, plagued by rapid wear of the welding tool, a consequence of the large discrepancy in hardness between the steel tool and the reinforcement material. This work characterizes the effect of process parameters (spindle speed, traverse rate, and length of joint) on the wear process. Based on the results of these experiments, a phenomenological model of the wear process was constructed based on the rotating plug model for FSW. The effectiveness of harder tool materials (such as tungsten carbide, high-speed steel, and tools with diamond coatings) in combating abrasive wear is explored. 
In-process force, torque, and vibration signals are analyzed to assess the feasibility of on-line monitoring of tool shape changes as a result of wear (an advancement which would eliminate the need for off-line evaluation of tool condition during joining). Monitoring, controlling, and reducing tool wear in FSW of MMCs is essential to the implementation of these materials in structures (such as launch vehicles) where they would be of maximum benefit.

  2. Beam position monitor engineering

    NASA Astrophysics Data System (ADS)

    Smith, Stephen R.

    1997-01-01

    The design of beam position monitors often involves challenging system design choices. Position transducers must be robust, accurate, and generate adequate position signal without unduly disturbing the beam. Electronics must be reliable and affordable, usually while meeting tough requirements on precision, accuracy, and dynamic range. These requirements may be difficult to achieve simultaneously, leading the designer into interesting opportunities for optimization or compromise. Some useful techniques and tools are shown. Both finite element analysis and analytic techniques will be used to investigate quasi-static aspects of electromagnetic fields such as the impedance of and the coupling of beam to striplines or buttons. Finite-element tools will be used to understand dynamic aspects of the electromagnetic fields of beams, such as wake fields and transmission-line and cavity effects in vacuum-to-air feedthroughs. Mathematical modeling of electrical signals through a processing chain will be demonstrated, in particular to illuminate areas where neither a pure time-domain nor a pure frequency-domain analysis is obviously advantageous. Emphasis will be on calculational techniques, in particular on using both time domain and frequency domain approaches to the applicable parts of interesting problems.
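
The closing point about combining time-domain and frequency-domain approaches rests on the convolution theorem: filtering a signal by convolution in time is identical to multiplying spectra in frequency, so the designer can pick whichever view illuminates the problem. A small numerical check with a toy signal and filter (stand-ins for a beam-induced pulse and a processing chain, not a real BPM design):

```python
import numpy as np

# The same filtering step computed two ways: time-domain convolution
# versus multiplication of zero-padded spectra. Both routes agree to
# machine precision, illustrating why either domain may be used.

rng = np.random.default_rng(0)
x = rng.standard_normal(64)          # toy "pickup" signal
h = np.exp(-np.arange(16) / 4.0)     # simple low-pass impulse response

y_time = np.convolve(x, h)           # time-domain linear convolution

n = len(x) + len(h) - 1              # zero-pad to avoid circular wrap-around
y_freq = np.real(np.fft.ifft(np.fft.fft(x, n) * np.fft.fft(h, n)))

assert np.allclose(y_time, y_freq)   # identical results from both domains
```

The zero-padding to length `len(x) + len(h) - 1` is what turns the FFT's inherently circular convolution into the linear convolution the time-domain route computes.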

  3. Contrast in Terahertz Images of Archival Documents—Part II: Influence of Topographic Features

    NASA Astrophysics Data System (ADS)

    Bardon, Tiphaine; May, Robert K.; Taday, Philip F.; Strlič, Matija

    2017-04-01

    We investigate the potential of terahertz time-domain imaging in reflection mode to reveal archival information in documents in a non-invasive way. In particular, this study explores the parameters and signal processing tools that can be used to produce well-contrasted terahertz images of topographic features commonly found in archival documents, such as indentations left by a writing tool, as well as sieve lines. While the amplitude of the waveforms at a specific time delay can provide the most contrasted and legible images of topographic features on flat paper or parchment sheets, this parameter may not be suitable for documents that have a highly irregular surface, such as water- or fire-damaged documents. For analysis of such documents, cross-correlation of the time-domain signals can instead yield images with good contrast. Analysis of the frequency-domain representation of terahertz waveforms can also provide well-contrasted images of topographic features, with improved spatial resolution when utilising high-frequency content. Finally, we point out some of the limitations of these means of analysis for extracting information relating to topographic features of interest from documents.
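
The cross-correlation scoring mentioned for irregular surfaces can be illustrated per pixel: each pixel's waveform is compared against a reference waveform, and the peak of the normalized cross-correlation, which is insensitive to the time shift introduced by surface height, becomes the pixel value. A sketch under assumed waveforms (Gaussian toy echoes, not real terahertz data):

```python
import numpy as np

# Sketch of pixel-wise cross-correlation contrast (assumed scheme, not the
# authors' exact pipeline): a delayed copy of the reference echo still
# scores near 1 because the correlation peak finds the alignment, while
# a pure-noise waveform scores far lower.

def xcorr_peak(waveform, reference):
    w = (waveform - waveform.mean()) / (waveform.std() * len(waveform))
    r = (reference - reference.mean()) / reference.std()
    return np.max(np.correlate(w, r, mode="full"))

t = np.linspace(0, 1, 200)
reference = np.exp(-((t - 0.5) ** 2) / 0.002)   # reference echo pulse
shifted = np.exp(-((t - 0.6) ** 2) / 0.002)     # same pulse, delayed
noise = np.random.default_rng(1).standard_normal(200)
```

Here `xcorr_peak(shifted, reference)` stays high despite the delay, which is the property that makes this parameter usable on water- or fire-damaged documents with highly irregular surfaces.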

  4. Visual-haptic integration with pliers and tongs: signal “weights” take account of changes in haptic sensitivity caused by different tools

    PubMed Central

    Takahashi, Chie; Watt, Simon J.

    2014-01-01

    When we hold an object while looking at it, estimates from visual and haptic cues to size are combined in a statistically optimal fashion, whereby the “weight” given to each signal reflects their relative reliabilities. This allows object properties to be estimated more precisely than would otherwise be possible. Tools such as pliers and tongs systematically perturb the mapping between object size and the hand opening. This could complicate visual-haptic integration because it may alter the reliability of the haptic signal, thereby disrupting the determination of appropriate signal weights. To investigate this we first measured the reliability of haptic size estimates made with virtual pliers-like tools (created using a stereoscopic display and force-feedback robots) with different “gains” between hand opening and object size. Haptic reliability in tool use was straightforwardly determined by a combination of sensitivity to changes in hand opening and the effects of tool geometry. The precise pattern of sensitivity to hand opening, which violated Weber's law, meant that haptic reliability changed with tool gain. We then examined whether the visuo-motor system accounts for these reliability changes. We measured the weight given to visual and haptic stimuli when both were available, again with different tool gains, by measuring the perceived size of stimuli in which visual and haptic sizes were varied independently. The weight given to each sensory cue changed with tool gain in a manner that closely resembled the predictions of optimal sensory integration. The results are consistent with the idea that different tool geometries are modeled by the brain, allowing it to calculate not only the distal properties of objects felt with tools, but also the certainty with which those properties are known. 
These findings highlight the flexibility of human sensory integration and tool-use, and potentially provide an approach for optimizing the design of visual-haptic devices. PMID:24592245
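
The optimal-integration rule the study tests has a simple closed form: with cue reliabilities defined as inverse variances, the visual weight is r_v/(r_v + r_h) and the fused estimate has variance 1/(r_v + r_h). A sketch with invented numbers (not the paper's data):

```python
# Minimum-variance (statistically optimal) cue combination: each cue is
# weighted by its relative reliability (inverse variance), so the fused
# estimate is never less precise than the best single cue.

def fuse(visual_size, visual_sd, haptic_size, haptic_sd):
    r_v = 1.0 / visual_sd ** 2            # reliability = 1 / variance
    r_h = 1.0 / haptic_sd ** 2
    w_v = r_v / (r_v + r_h)               # weight tracks relative reliability
    estimate = w_v * visual_size + (1 - w_v) * haptic_size
    combined_var = 1.0 / (r_v + r_h)      # smaller than either cue's variance
    return estimate, w_v, combined_var

est, w_v, var = fuse(10.0, 1.0, 12.0, 2.0)
# w_v = 0.8, so the fused estimate (10.4) sits closer to the more
# reliable visual cue, and the combined variance (0.8) beats both cues
```

A tool that changes the gain between hand opening and object size changes `haptic_sd`, which is exactly why the weights must be recomputed per tool geometry, as the study reports.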

  5. Deep sub-wavelength metrology for advanced defect classification

    NASA Astrophysics Data System (ADS)

    van der Walle, P.; Kramer, E.; van der Donck, J. C. J.; Mulckhuyse, W.; Nijsten, L.; Bernal Arango, F. A.; de Jong, A.; van Zeijl, E.; Spruit, H. E. T.; van den Berg, J. H.; Nanda, G.; van Langen-Suurling, A. K.; Alkemade, P. F. A.; Pereira, S. F.; Maas, D. J.

    2017-06-01

    Particle defects are important contributors to yield loss in semi-conductor manufacturing. Particles need to be detected and characterized in order to determine and eliminate their root cause. We have conceived a process flow for advanced defect classification (ADC) that distinguishes three consecutive steps; detection, review and classification. For defect detection, TNO has developed the Rapid Nano (RN3) particle scanner, which illuminates the sample from nine azimuth angles. The RN3 is capable of detecting 42 nm Latex Sphere Equivalent (LSE) particles on XXX-flat Silicon wafers. For each sample, the lower detection limit (LDL) can be verified by an analysis of the speckle signal, which originates from the surface roughness of the substrate. In detection-mode (RN3.1), the signal from all illumination angles is added. In review-mode (RN3.9), the signals from all nine arms are recorded individually and analyzed in order to retrieve additional information on the shape and size of deep sub-wavelength defects. This paper presents experimental and modelling results on the extraction of shape information from the RN3.9 multi-azimuth signal such as aspect ratio, skewness, and orientation of test defects. Both modeling and experimental work confirm that the RN3.9 signal contains detailed defect shape information. After review by RN3.9, defects are coarsely classified, yielding a purified Defect-of-Interest (DoI) list for further analysis on slower metrology tools, such as SEM, AFM or HIM, that provide more detailed review data and further classification. Purifying the DoI list via optical metrology with RN3.9 will make inspection time on slower review tools more efficient.

  6. myBrain: a novel EEG embedded system for epilepsy monitoring.

    PubMed

    Pinho, Francisco; Cerqueira, João; Correia, José; Sousa, Nuno; Dias, Nuno

    2017-10-01

    The World Health Organisation has pointed out that successful health care delivery requires effective medical devices as tools for prevention, diagnosis, treatment and rehabilitation. Several studies have concluded that longer monitoring periods and outpatient settings can increase diagnosis accuracy and the success rate of treatment selection. Long-term monitoring of epileptic patients through electroencephalography (EEG) has been considered a powerful tool to improve the diagnosis, disease classification, and treatment of patients with this condition. This work presents the development of a wireless and wearable EEG acquisition platform suitable for both long-term and short-term monitoring in inpatient and outpatient settings. The developed platform features 32 passive dry electrodes and analogue-to-digital signal conversion with 24-bit resolution and a sampling frequency variable from 250 Hz to 1000 Hz per channel, embedded in a stand-alone module. A computer-on-module embedded system runs a Linux® operating system that manages the interface between two software frameworks, which interact to satisfy the real-time constraints of signal acquisition as well as parallel recording, processing and wireless data transmission. A textile structure was developed to accommodate all components. Platform performance was evaluated in terms of hardware, software and signal quality. The electrodes were characterised through electrochemical impedance spectroscopy, and the operating system performance was evaluated while running an epileptic discrimination algorithm. Signal quality was thoroughly assessed in two different approaches: playback of EEG reference signals and benchmarking against a clinical-grade EEG system in alpha-wave replacement and steady-state visual evoked potential paradigms. 
The proposed platform seems to efficiently monitor epileptic patients in both inpatient and outpatient settings and paves the way to new ambulatory clinical regimens as well as non-clinical EEG applications.

  7. Torsional vibration signal analysis as a diagnostic tool for planetary gear fault detection

    NASA Astrophysics Data System (ADS)

    Xue, Song; Howard, Ian

    2018-02-01

    This paper investigates the effectiveness of the torsional vibration signal as a diagnostic tool for planetary gearbox fault detection. The traditional approach for condition monitoring of planetary gears uses a stationary transducer mounted on the ring gear casing to measure the vibration as the planet gears pass by with the rotation of the carrier arm. However, the time-variant vibration transfer paths between the stationary transducer and the rotating planet gears modulate the resultant vibration spectra and make them complex. Torsional vibration signals are theoretically free from this modulation effect and, therefore, it is expected to be much easier and more effective to diagnose planetary gear faults using fault diagnostic information extracted from the torsional vibration. In this paper, a 20-degree-of-freedom planetary gear lumped-parameter model was developed to obtain the gear dynamic response. In the model, the gear mesh stiffness variations are the main internal vibration generation mechanism, and finite element models were developed to calculate the sun-planet and ring-planet gear mesh stiffnesses. Gear faults on different components were created in the finite element models to calculate the resultant gear mesh stiffnesses, which were then incorporated into the planetary gear model to obtain the faulted vibration signal. Advanced signal processing techniques were utilized to analyse the fault diagnostic results from the torsional vibration. It was found that the planetary gear torsional vibration not only successfully detected the gear fault, but also had the potential to indicate the location of the gear fault. As a result, planetary gear torsional vibration can be considered an effective alternative approach for planetary gear condition monitoring.

  8. The 21 cm signal and the interplay between dark matter annihilations and astrophysical processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lopez-Honorez, Laura; Mena, Olga; Moliné, Ángeles

    2016-08-01

    Future dedicated radio interferometers, including HERA and SKA, are very promising tools that aim to study the epoch of reionization and beyond via measurements of the 21 cm signal from neutral hydrogen. Dark matter (DM) annihilations into charged particles change the thermal history of the Universe and, as a consequence, affect the 21 cm signal. Accurately predicting the effect of DM strongly relies on the modeling of annihilations inside halos. In this work, we use up-to-date computations of the energy deposition rates by the products from DM annihilations, a proper treatment of the contribution from DM annihilations in halos, as well as values of the annihilation cross section allowed by the most recent cosmological measurements from the Planck satellite. Given current uncertainties on the description of the astrophysical processes driving the epochs of reionization, X-ray heating and Lyman-α pumping, we find that disentangling DM signatures from purely astrophysical effects, related to early-time star formation processes or late-time galaxy X-ray emissions, will be a challenging task. We conclude that only annihilations of DM particles with masses of ∼100 MeV could leave an unambiguous imprint on the 21 cm signal and, in particular, on the 21 cm power spectrum. This is in contrast to previous, more optimistic results in the literature, which have claimed that strong signatures might also be present even for much higher DM masses. Additional measurements of the 21 cm signal at different cosmic epochs will be crucial in order to break the strong parameter degeneracies between DM annihilations and astrophysical effects and undoubtedly single out a DM imprint for masses different from ∼100 MeV.

  9. Completed Ensemble Empirical Mode Decomposition: a Robust Signal Processing Tool to Identify Sequence Strata

    NASA Astrophysics Data System (ADS)

    Purba, H.; Musu, J. T.; Diria, S. A.; Permono, W.; Sadjati, O.; Sopandi, I.; Ruzi, F.

    2018-03-01

    Well log data carry much geological information, and their trends resemble nonlinear or non-stationary signals. Over long recorded intervals, external factors can interfere with or degrade the signal resolution, so a sensitive signal analysis method is required to improve the accuracy of log interpretation, which is essential for determining sequence stratigraphy. Complete Ensemble Empirical Mode Decomposition (CEEMD) is a nonlinear, non-stationary signal analysis method which decomposes a complex signal into a series of intrinsic mode functions (IMFs). Gamma Ray and Spontaneous Potential well log parameters were decomposed into IMF-1 through IMF-10, and combinations and correlations of these IMFs allow their physical meaning to be identified. The method identifies stratigraphic cycles and sequences and provides an effective signal treatment for locating sequence interfaces. It was applied to BRK-30 and BRK-13 well logging data. The result shows that the combined patterns of IMF-5, IMF-6, and IMF-7 represent short-term and middle-term sedimentation, while IMF-9 and IMF-10 represent long-term sedimentation; these describe distal front and delta front facies, and inter-distributary mouth bar facies, respectively. Thus, CEEMD can clearly delineate the interfaces between different sedimentary layers and better identify cycles of the stratigraphic base level.
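
The decomposition step can be sketched compactly. The sketch below sifts a toy signal into IMFs using linear envelopes via `np.interp` in place of the cubic-spline envelopes (and added-noise ensembles) that a real CEEMD implementation uses; it illustrates only the core idea that a signal splits into oscillatory modes plus a residual trend, with exact reconstruction by construction:

```python
import numpy as np

# Minimal EMD sketch (illustration only, not CEEMD): repeatedly estimate
# upper/lower envelopes through local extrema, subtract their mean
# ("sifting"), and peel off intrinsic mode functions until the residual
# is a trend with too few extrema to continue.

def local_extrema(x):
    i = np.arange(1, len(x) - 1)
    maxima = i[(x[i] > x[i - 1]) & (x[i] > x[i + 1])]
    minima = i[(x[i] < x[i - 1]) & (x[i] < x[i + 1])]
    return maxima, minima

def emd(signal, max_imfs=3, sifts=10):
    residual = signal.astype(float).copy()
    t = np.arange(len(signal))
    imfs = []
    for _ in range(max_imfs):
        maxima, minima = local_extrema(residual)
        if len(maxima) < 4 or len(minima) < 4:
            break                           # residual is a trend: stop
        h = residual.copy()
        for _ in range(sifts):
            mx, mn = local_extrema(h)
            if len(mx) < 2 or len(mn) < 2:
                break
            upper = np.interp(t, mx, h[mx])  # linear envelope (sketch)
            lower = np.interp(t, mn, h[mn])
            h = h - (upper + lower) / 2.0    # remove the local mean
        imfs.append(h)
        residual = residual - h
    return imfs, residual

t = np.linspace(0, 1, 500)
x = np.sin(2 * np.pi * 25 * t) + 0.5 * np.sin(2 * np.pi * 2 * t)
imfs, res = emd(x)
# By construction the IMFs and residual reconstruct the input exactly:
# x == sum(imfs) + res
```

The fast 25 Hz oscillation lands in the first IMF while the slow component remains in later modes and the residual, which mirrors how the abstract's short-, middle-, and long-term sedimentation patterns separate across IMF-5 through IMF-10.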

  10. A novel analysis method for near infrared spectroscopy based on Hilbert-Huang transform

    NASA Astrophysics Data System (ADS)

    Zhou, Zhenyu; Yang, Hongyu; Liu, Yun; Ruan, Zongcai; Luo, Qingming; Gong, Hui; Lu, Zuhong

    2007-05-01

    Near infrared imaging (NIRI) has been widely used to assess brain functional activity non-invasively. We use a portable, multi-channel, continuous-wave NIR topography instrument to measure the concentration changes of each hemoglobin species and to map cerebral cortex functional activation. By extracting essential features from the BOLD signals, optical tomography can offer a new approach to neuropsychological studies. Fourier spectral analysis provides a common framework for examining the distribution of global energy in the frequency domain. However, this method assumes that the signal is stationary, which limits its application to non-stationary systems; hemoglobin concentration changes are of exactly this kind. In this work we develop a new signal processing method using the Hilbert-Huang transform to perform spectral analysis of functional NIRI signals. Compared with wavelet-based multi-resolution analysis (MRA), we demonstrate the extraction of a task-related signal for observing activation in the prefrontal cortex (PFC) in a visual stimulation experiment. This method provides a new analysis tool for functional NIRI signals. Our experimental results show that the proposed approach reconstructs the target signal without losing original information and enables us to understand functional NIRI episodes more precisely.
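
The Hilbert half of the Hilbert-Huang transform can be sketched with SciPy: the analytic signal yields an instantaneous amplitude envelope and an instantaneous frequency without any stationarity assumption. A toy amplitude-modulated signal stands in here for a hemoglobin time course (illustrative only):

```python
import numpy as np
from scipy.signal import hilbert

# Sketch of Hilbert spectral analysis: the analytic signal's magnitude
# gives the instantaneous amplitude, and the derivative of its unwrapped
# phase gives the instantaneous frequency, sample by sample.

fs = 500.0
t = np.arange(0, 2.0, 1.0 / fs)
# 10 Hz carrier slowly amplitude-modulated at 0.5 Hz (toy signal)
x = (1.0 + 0.5 * np.sin(2 * np.pi * 0.5 * t)) * np.sin(2 * np.pi * 10 * t)

analytic = hilbert(x)
envelope = np.abs(analytic)                        # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)      # instantaneous frequency (Hz)

# Away from the edges, inst_freq hovers near the 10 Hz carrier and the
# envelope tracks the slow 0.5 Hz modulation.
```

In the full Hilbert-Huang scheme this step is applied to each IMF from the empirical mode decomposition, so each mode gets its own time-varying amplitude and frequency.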

  11. Standards for data acquisition and software-based analysis of in vivo electroencephalography recordings from animals. A TASK1-WG5 report of the AES/ILAE Translational Task Force of the ILAE.

    PubMed

    Moyer, Jason T; Gnatkovsky, Vadym; Ono, Tomonori; Otáhal, Jakub; Wagenaar, Joost; Stacey, William C; Noebels, Jeffrey; Ikeda, Akio; Staley, Kevin; de Curtis, Marco; Litt, Brian; Galanopoulou, Aristea S

    2017-11-01

    Electroencephalography (EEG)-the direct recording of the electrical activity of populations of neurons-is a tremendously important tool for diagnosing, treating, and researching epilepsy. Although standard procedures for recording and analyzing human EEG exist and are broadly accepted, there are no such standards for research in animal models of seizures and epilepsy-recording montages, acquisition systems, and processing algorithms may differ substantially among investigators and laboratories. The lack of standard procedures for acquiring and analyzing EEG from animal models of epilepsy hinders the interpretation of experimental results and reduces the ability of the scientific community to efficiently translate new experimental findings into clinical practice. Accordingly, the intention of this report is twofold: (1) to review current techniques for the collection and software-based analysis of neural field recordings in animal models of epilepsy, and (2) to offer pertinent standards and reporting guidelines for this research. Specifically, we review current techniques for signal acquisition, signal conditioning, signal processing, data storage, and data sharing, and include applicable recommendations to standardize collection and reporting. We close with a discussion of challenges and future opportunities, and include a supplemental report of currently available acquisition systems and analysis tools. This work represents a collaboration on behalf of the American Epilepsy Society/International League Against Epilepsy (AES/ILAE) Translational Task Force (TASK1-Workgroup 5), and is part of a larger effort to harmonize video-EEG interpretation and analysis methods across studies using in vivo and in vitro seizure and epilepsy models. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.

  12. Application of power spectrum, cepstrum, higher order spectrum and neural network analyses for induction motor fault diagnosis

    NASA Astrophysics Data System (ADS)

    Liang, B.; Iwnicki, S. D.; Zhao, Y.

    2013-08-01

    The power spectrum is defined as the square of the magnitude of the Fourier transform (FT) of a signal. The advantage of FT analysis is that it allows the decomposition of a signal into individual periodic frequency components and establishes the relative intensity of each component. It is the most commonly used signal processing technique today. If the same principle is applied to detect periodic components within a Fourier spectrum itself, the process is called cepstrum analysis. Cepstrum analysis is a very useful tool for detecting families of harmonics with uniform spacing, or the families of sidebands commonly found in gearbox, bearing and engine vibration fault spectra. Higher order spectra (HOS) (also known as polyspectra) are higher-order moments of spectra, which are able to detect non-linear interactions between frequency components. Of the HOS, the most commonly used is the bispectrum. The bispectrum is a third-order frequency-domain measure which contains information that standard power spectral analysis techniques cannot provide. It is well known that neural networks can represent complex non-linear relationships, and therefore they are extremely useful for fault identification and classification. This paper presents an application of the power spectrum, cepstrum, bispectrum and neural networks for fault pattern extraction in induction motors. The potential for using the power spectrum, cepstrum, bispectrum and neural network as a means of differentiating between healthy and faulty induction motor operation is examined. A series of experiments was carried out and the relative advantages and disadvantages are discussed. It has been found that a combination of power spectrum, cepstrum and bispectrum analyses plus a neural network could be a very useful tool for condition monitoring and fault diagnosis of induction motors.
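
The first two techniques can be demonstrated in a few lines of NumPy: the power spectrum as the squared FT magnitude, and the real cepstrum as the inverse FT of the log power spectrum, which collapses a family of uniformly spaced harmonics into one quefrency peak. A sketch with a synthetic 50 Hz harmonic family (not motor data):

```python
import numpy as np

# Power spectrum = |FFT|^2; real cepstrum = IFFT of the log power
# spectrum. A harmonic family spaced every 50 Hz shows up as a cepstrum
# peak at the 20 ms quefrency (1/50 Hz), exactly the "spectrum of a
# spectrum" idea used for gearbox/bearing sideband families.

fs, n = 1000, 1000
t = np.arange(n) / fs
f0 = 50.0
x = sum(np.sin(2 * np.pi * f0 * k * t) for k in range(1, 5))  # 4 harmonics

spectrum = np.fft.fft(x)
power = np.abs(spectrum) ** 2                    # power spectrum
cepstrum = np.real(np.fft.ifft(np.log(power + 1e-12)))  # small floor for log

q = np.argmax(cepstrum[5:n // 2]) + 5            # skip the zero-quefrency peak
# q lands on a multiple of fs/f0 = 20 samples, i.e. the 20 ms spacing
# shared by all the harmonics.
```

A single spectral line would leave the cepstrum featureless at that quefrency; it is the uniform spacing of the whole family that produces the peak, which is why the cepstrum excels at detecting harmonic and sideband families.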

  13. Metabolic control of redox and redox control of metabolism in plants.

    PubMed

    Geigenberger, Peter; Fernie, Alisdair R

    2014-09-20

    Reduction-oxidation (redox) status operates as a major integrator of subcellular and extracellular metabolism and is simultaneously itself regulated by metabolic processes. Redox status not only dominates cellular metabolism, due to the prominence of the NAD(H) and NADP(H) couples in myriad metabolic reactions, but also acts as an effective signal that informs the cell of the prevailing environmental conditions. After relay of this information, the cell is able to respond appropriately via a range of mechanisms, including directly affecting cellular functioning and reprogramming nuclear gene expression. The ready availability of Arabidopsis knockout mutants, alongside the adoption of broad-scale post-genomic approaches able to provide transcriptomic-, proteomic-, and metabolomic-level information, together with traditional biochemical and emerging cell biological techniques, has dramatically advanced our understanding of redox status control. This review summarizes redox status control of metabolism and the metabolic control of redox status at both cellular and subcellular levels. It is becoming apparent that plastid, mitochondrial, and peroxisomal functions influence a wide range of processes outside the organelles themselves. While knowledge of the network of metabolic pathways and their intraorganellar redox status regulation has increased in recent years, little is known about the interorganellar redox signals coordinating these networks. A current challenge is, therefore, synthesizing our knowledge and planning experiments that tackle redox status regulation at both the inter- and intracellular levels. Emerging tools are enabling ever-increasing spatiotemporal resolution of metabolism and imaging of redox status components. Broader application of these tools will greatly enhance our understanding of the interplay of redox status and metabolism, as well as elucidating and characterizing its signaling features. 
We propose that such information will enable us to dissect the regulatory hierarchies that mediate the strict coupling of metabolism and redox status which, ultimately, determine plant growth and development.

  14. Flanking signal and mature peptide residues influence signal peptide cleavage

    PubMed Central

    Choo, Khar Heng; Ranganathan, Shoba

    2008-01-01

    Background Signal peptides (SPs) mediate the targeting of secretory precursor proteins to the correct subcellular compartments in prokaryotes and eukaryotes. Identifying these transient peptides is crucial to the medical, food and beverage and biotechnology industries yet our understanding of these peptides remains limited. This paper examines the most common type of signal peptides cleavable by the endoprotease signal peptidase I (SPase I), and the residues flanking the cleavage sites of three groups of signal peptide sequences, namely (i) eukaryotes (Euk) (ii) Gram-positive (Gram+) bacteria, and (iii) Gram-negative (Gram-) bacteria. Results In this study, 2352 secretory peptide sequences from a variety of organisms with amino-terminal SPs are extracted from the manually curated SPdb database for analysis based on physicochemical properties such as pI, aliphatic index, GRAVY score, hydrophobicity, net charge and position-specific residue preferences. Our findings show that the three groups share several similarities in general, but they display distinctive features upon examination in terms of their amino acid compositions and frequencies, and various physico-chemical properties. Thus, analysis or prediction of their sequences should be separated and treated as distinct groups. Conclusion We conclude that the peptide segment recognized by SPase I extends to the start of the mature protein to a limited extent, upon our survey of the amino acid residues surrounding the cleavage processing site. These flanking residues possibly influence the cleavage processing and contribute to non-canonical cleavage sites. Our findings are applicable in defining more accurate prediction tools for recognition and identification of cleavage site of SPs. PMID:19091014
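
One of the physicochemical properties listed, the GRAVY score, is simply the mean Kyte-Doolittle hydropathy over the sequence, and net charge can be approximated by counting basic and acidic residues. A sketch (the example sequence is the commonly cited E. coli OmpA signal peptide; treat it as illustrative rather than as SPdb data):

```python
# GRAVY (grand average of hydropathy) on the Kyte-Doolittle scale, plus a
# crude net-charge count at neutral pH (K/R positive, D/E negative).

KYTE_DOOLITTLE = {
    "A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5,
    "Q": -3.5, "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5,
    "L": 3.8, "K": -3.9, "M": 1.9, "F": 2.8, "P": -1.6,
    "S": -0.8, "T": -0.7, "W": -0.9, "Y": -1.3, "V": 4.2,
}

def gravy(seq):
    """Mean hydropathy: positive = hydrophobic, negative = hydrophilic."""
    return sum(KYTE_DOOLITTLE[a] for a in seq) / len(seq)

def net_charge(seq):
    """Crude net charge: count of K/R minus count of D/E."""
    return sum(seq.count(a) for a in "KR") - sum(seq.count(a) for a in "DE")

signal_peptide = "MKKTAIAIAVALAGFATVAQA"  # illustrative OmpA-type SP
# Its hydrophobic h-region drives a clearly positive GRAVY score, and the
# n-region lysines give the positive net charge typical of SPs.
```

These two descriptors capture the classic tripartite signal-peptide architecture (positively charged n-region, hydrophobic h-region, polar c-region) that group-specific analyses like the one above quantify.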

  15. Internal robustness: systematic search for systematic bias in SN Ia data

    NASA Astrophysics Data System (ADS)

    Amendola, Luca; Marra, Valerio; Quartin, Miguel

    2013-04-01

    A great deal of effort is currently being devoted to understanding, estimating and removing systematic errors in cosmological data. In the particular case of Type Ia supernovae, systematics are starting to dominate the error budget. Here we propose a Bayesian tool for carrying out a systematic search for systematic contamination. This serves as an extension to standard goodness-of-fit tests and allows one not only to cross-check raw or processed data for the presence of systematics but also to pinpoint the data points that are most likely contaminated. We successfully test our tool with mock catalogues and conclude that the Union2.1 data do not possess a significant amount of systematics. Finally, we show that if one includes in Union2.1 the supernovae that originally failed the quality cuts, our tool signals the presence of systematics at a confidence level of over 3.8σ.
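
    The standard goodness-of-fit baseline that the proposed Bayesian tool extends can be sketched as follows; this toy version (not the authors' estimator) simply flags excess scatter in normalized residuals against the chi-squared expectation:

```python
import math

def chi2_gof(residuals, sigmas):
    """Standard goodness-of-fit: sum of squared normalized residuals."""
    return sum((r / s) ** 2 for r, s in zip(residuals, sigmas))

def excess_scatter_zscore(residuals, sigmas):
    """Rough z-score of chi2 against its expectation: for Gaussian
    errors chi2 has mean n and variance 2n (n = number of points)."""
    n = len(residuals)
    return (chi2_gof(residuals, sigmas) - n) / math.sqrt(2 * n)

# Clean mock data: residuals at about the 1-sigma level
clean = [0.5, -1.0, 0.3, 1.2, -0.7, 0.9, -0.2, 0.4]
# Contaminated mock data: a few outliers mimic unremoved systematics
dirty = clean[:4] + [4.0, -3.5, 5.0, -0.2]
sig = [1.0] * 8
z_clean = excess_scatter_zscore(clean, sig)
z_dirty = excess_scatter_zscore(dirty, sig)
```

The contaminated mock yields a large positive z-score while the clean mock stays near zero, which is the behaviour a systematics search builds on.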

  16. Electromagnetic nondestructive evaluation of tempering process in AISI D2 tool steel

    NASA Astrophysics Data System (ADS)

    Kahrobaee, Saeed; Kashefi, Mehrdad

    2015-05-01

    The present paper investigates the potential of using eddy current technique as a reliable nondestructive tool to detect microstructural changes during the different stages of tempering treatment in AISI D2 tool steel. Five stages occur in tempering of the steel: precipitation of ɛ carbides, formation of cementite, retained austenite decomposition, secondary hardening effect and spheroidization of carbides. These stages were characterized by destructive methods, including dilatometry, differential scanning calorimetry, X-ray diffraction, scanning electron microscopic observations, and hardness measurements. The microstructural changes alter the electrical resistivity/magnetic saturation, which, in turn, influence the eddy current signals. Two EC parameters, induced voltage sensed by pickup coil and impedance point detected by excitation coil, were evaluated as a function of tempering temperature to characterize the microstructural features, nondestructively. The study revealed that a good correlation exists between the EC parameters and the microstructural changes.

  17. Rapid Prototyping of Application Specific Signal Processors (RASSP)

    DTIC Science & Technology

    1993-12-23

    Excerpts list the tools evaluated: compilers; Cadre Teamwork; CodeCenter (Centerline); dbx/dbxtool (UNIX C debuggers); Falcon (Mentor ECAD framework); FrameMaker (Frame Tech, word processing); gcc (GNU C/C++ compiler); and gprof (GNU software profiling tool). An organization can put its own documentation on-line using the BOLD Composer for FrameMaker. The AMPLE programming language is a C-like language used for …

  18. Design and implementation of highly parallel pipelined VLSI systems

    NASA Astrophysics Data System (ADS)

    Delange, Alphonsus Anthonius Jozef

    A methodology and its realization as a prototype CAD (Computer Aided Design) system for the design and analysis of complex multiprocessor systems is presented. The design is an iterative process in which the behavioral specifications of the system components are refined into structural descriptions consisting of interconnections and lower-level components. A model for the representation and analysis of multiprocessor systems at several levels of abstraction, and an implementation of a CAD system based on this model, are described. A high-level design language, an object-oriented development kit for tool design, a design data management system, and design and analysis tools, such as a high-level simulator and a graphics design interface, all of which are integrated into the prototype system, are described. Procedures are described for the synthesis of semiregular processor arrays and for computing the switching of input/output signals, the memory management and control of the processor array, and the sequencing and segmentation of input/output data streams that result from partitioning and clustering of the processor array during the subsequent synthesis steps. The architecture and control of a parallel system are designed, and each component is mapped to a module or module generator in a symbolic layout library and compacted for the design rules of VLSI (Very Large Scale Integration) technology. An example is given of the design of a processor that is a useful building block for highly parallel pipelined systems in the signal/image processing domains.

  19. Finite-difference time-domain modelling of through-the-Earth radio signal propagation

    NASA Astrophysics Data System (ADS)

    Ralchenko, M.; Svilans, M.; Samson, C.; Roper, M.

    2015-12-01

    This research seeks to extend the knowledge of how a very low frequency (VLF) through-the-Earth (TTE) radio signal behaves as it propagates underground, by calculating and visualizing the strength of the electric and magnetic fields for an arbitrary geology through numeric modelling. To achieve this objective, a new software tool has been developed using the finite-difference time-domain method. This technique is particularly well suited to visualizing the distribution of electromagnetic fields in an arbitrary geology. The frequency range of TTE radio (400-9000 Hz) and geometrical scales involved (1 m resolution for domains a few hundred metres in size) involves processing a grid composed of millions of cells for thousands of time steps, which is computationally expensive. Graphics processing unit acceleration was used to reduce execution time from days and weeks, to minutes and hours. Results from the new modelling tool were compared to three cases for which an analytic solution is known. Two more case studies were done featuring complex geologic environments relevant to TTE communications that cannot be solved analytically. There was good agreement between numeric and analytic results. Deviations were likely caused by numeric artifacts from the model boundaries; however, in a TTE application in field conditions, the uncertainty in the conductivity of the various geologic formations will greatly outweigh these small numeric errors.
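
    The core of the finite-difference time-domain method is a leapfrog update of interleaved electric and magnetic fields; a minimal one-dimensional, free-space sketch in normalized units (the TTE model described above adds conductive media, three dimensions and GPU acceleration):

```python
import math

# Minimal 1-D FDTD sketch: leapfrog E/H updates with a Courant
# number of 0.5 and a Gaussian hard source injected at one cell.
def fdtd_1d(nx=200, steps=300, src=100):
    ez = [0.0] * nx   # electric field
    hy = [0.0] * nx   # magnetic field
    for t in range(steps):
        # Update H from the spatial derivative of E
        for i in range(nx - 1):
            hy[i] += 0.5 * (ez[i + 1] - ez[i])
        # Update E from the spatial derivative of H
        for i in range(1, nx):
            ez[i] += 0.5 * (hy[i] - hy[i - 1])
        # Additive source: a Gaussian pulse centered at step 30
        ez[src] += math.exp(-((t - 30) ** 2) / 100.0)
    return ez

field = fdtd_1d()
```

With the Courant number at 0.5 the scheme is stable, so the field stays bounded even after reflections from the grid edges; lossy ground would add a conductivity term to the E update.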

  20. CellNOptR: a flexible toolkit to train protein signaling networks to data using multiple logic formalisms.

    PubMed

    Terfve, Camille; Cokelaer, Thomas; Henriques, David; MacNamara, Aidan; Goncalves, Emanuel; Morris, Melody K; van Iersel, Martijn; Lauffenburger, Douglas A; Saez-Rodriguez, Julio

    2012-10-18

    Cells process signals using complex and dynamic networks. Studying how this is performed in a context and cell type specific way is essential to understand signaling both in physiological and diseased situations. Context-specific medium/high throughput proteomic data measured upon perturbation is now relatively easy to obtain but formalisms that can take advantage of these features to build models of signaling are still comparatively scarce. Here we present CellNOptR, an open-source R software package for building predictive logic models of signaling networks by training networks derived from prior knowledge to signaling (typically phosphoproteomic) data. CellNOptR features different logic formalisms, from Boolean models to differential equations, in a common framework. These different logic model representations accommodate state and time values with increasing levels of detail. We provide in addition an interface via Cytoscape (CytoCopteR) to facilitate use and integration with Cytoscape network-based capabilities. Models generated with this pipeline have two key features. First, they are constrained by prior knowledge about the network but trained to data. They are therefore context and cell line specific, which results in enhanced predictive and mechanistic insights. Second, they can be built using different logic formalisms depending on the richness of the available data. Models built with CellNOptR are useful tools to understand how signals are processed by cells and how this is altered in disease. They can be used to predict the effect of perturbations (individual or in combinations), and potentially to engineer therapies that have differential effects/side effects depending on the cell type or context.
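
    CellNOptR itself is an R package; the idea of training a logic model against data can be sketched in Python with a toy Boolean network (node names hypothetical, not the CellNOptR API) scored by mean squared mismatch against measurements:

```python
# Toy illustration of logic-model fitting: gate functions over node
# states, simulated to a logical steady state and scored against data.
def simulate(model, inputs, steps=10):
    state = dict(inputs)
    for node in model:
        state.setdefault(node, 0)
    for _ in range(steps):  # synchronous updates toward steady state
        state.update({node: int(rule(state)) for node, rule in model.items()})
    return state

# Hypothetical mini-pathway: EGF -> Ras -> Erk, TNFa -> NFkB
model = {
    "Ras":  lambda s: s["EGF"],
    "Erk":  lambda s: s["Ras"],
    "NFkB": lambda s: s["TNFa"],
}

def score(model, experiments):
    """Mean squared error between simulated and measured readouts."""
    errs = []
    for inputs, measured in experiments:
        sim = simulate(model, inputs)
        errs += [(sim[k] - v) ** 2 for k, v in measured.items()]
    return sum(errs) / len(errs)

experiments = [
    ({"EGF": 1, "TNFa": 0}, {"Erk": 1, "NFkB": 0}),
    ({"EGF": 0, "TNFa": 1}, {"Erk": 0, "NFkB": 1}),
]
fit = score(model, experiments)
```

Training then means searching over candidate gate structures from the prior-knowledge network for the one minimizing this score, which is the optimization CellNOptR performs.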

  1. CellNOptR: a flexible toolkit to train protein signaling networks to data using multiple logic formalisms

    PubMed Central

    2012-01-01

    Background Cells process signals using complex and dynamic networks. Studying how this is performed in a context and cell type specific way is essential to understand signaling both in physiological and diseased situations. Context-specific medium/high throughput proteomic data measured upon perturbation is now relatively easy to obtain but formalisms that can take advantage of these features to build models of signaling are still comparatively scarce. Results Here we present CellNOptR, an open-source R software package for building predictive logic models of signaling networks by training networks derived from prior knowledge to signaling (typically phosphoproteomic) data. CellNOptR features different logic formalisms, from Boolean models to differential equations, in a common framework. These different logic model representations accommodate state and time values with increasing levels of detail. We provide in addition an interface via Cytoscape (CytoCopteR) to facilitate use and integration with Cytoscape network-based capabilities. Conclusions Models generated with this pipeline have two key features. First, they are constrained by prior knowledge about the network but trained to data. They are therefore context and cell line specific, which results in enhanced predictive and mechanistic insights. Second, they can be built using different logic formalisms depending on the richness of the available data. Models built with CellNOptR are useful tools to understand how signals are processed by cells and how this is altered in disease. They can be used to predict the effect of perturbations (individual or in combinations), and potentially to engineer therapies that have differential effects/side effects depending on the cell type or context. PMID:23079107

  2. A permanent seismic station beneath the Ocean Bottom

    NASA Astrophysics Data System (ADS)

    Harris, David; Cessaro, Robert K.; Duennebier, Fred K.; Byrne, David A.

    1987-03-01

    The Hawaii Institute of Geophysics began development of the Ocean Subbottom Seismometer (OSS) system in 1978, and OSS systems were installed in four locations between 1979 and 1982. The OSS system is a permanent, deep-ocean borehole seismic recording system composed of a borehole sensor package (tool), an electromechanical cable, a recorder package, and a recovery system. Installed near the bottom of a borehole (drilled by the D/V Glomar Challenger), the tool contains three orthogonal 4.5-Hz geophones, two orthogonal tiltmeters, and a temperature sensor. Signals from these sensors are multiplexed, digitized (with a floating-point technique), and telemetered through approximately 10 km of electromechanical cable to a recorder package located near the ocean bottom. Electrical power for the tool is supplied from the recorder package. The digital seismic signals are demultiplexed, converted back to analog form, processed through an automatic gain control (AGC) circuit, and recorded along with a time code on magnetic tape cassettes in the recorder package. Data may be recorded continuously for up to two months in the self-contained recorder package. Data may also be recorded in real time (digital format) during the installation and subsequent recorder package servicing. The recorder package is connected to a submerged recovery buoy by a length of buoyant polypropylene rope. The anchor on the recovery buoy is released by activating either of the acoustical command releases. The polypropylene rope may also be seized with a grappling hook to effect recovery. The recorder package may be repeatedly serviced as long as the tool remains functional. A wide range of data has been recovered from the OSS system. Recovered analog records include signals from natural seismic sources such as earthquakes (teleseismic and local), man-made seismic sources such as refraction seismic shooting (explosives and air cannons), and nuclear tests.
Lengthy continuous recording has permitted analysis of wideband noise levels, and the slowly varying parameters, temperature and tilt.
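
    The AGC stage keeps widely varying seismic amplitudes on tape; a much-simplified block-wise sketch of the idea (the real OSS circuit is an analog, continuous feedback loop, so this only conveys the normalization principle):

```python
import math

def agc(samples, block=50, target=1.0):
    """Block AGC sketch: scale each block so its RMS matches the target."""
    out = []
    for i in range(0, len(samples), block):
        seg = samples[i:i + block]
        rms = math.sqrt(sum(x * x for x in seg) / len(seg))
        gain = target / rms if rms > 1e-12 else 1.0
        out.extend(x * gain for x in seg)
    return out

# A segment 500x stronger than the background is leveled to the
# same recording amplitude (synthetic alternating-sign samples):
quiet = [0.01 * ((-1) ** n) for n in range(100)]
loud = [5.0 * ((-1) ** n) for n in range(100)]
leveled = agc(quiet + loud)
```

The cost of AGC is that absolute amplitude information is lost, which is why the gain history matters when interpreting recovered records.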

  3. Multiscale modeling of mucosal immune responses

    PubMed Central

    2015-01-01

    Computational modeling techniques are playing increasingly important roles in advancing a systems-level mechanistic understanding of biological processes. Computer simulations guide and underpin experimental and clinical efforts. This study presents the ENteric Immune Simulator (ENISI), a multiscale modeling tool for modeling mucosal immune responses. ENISI's modeling environment can simulate in silico experiments from molecular signaling pathways to tissue-level events such as tissue lesion formation. ENISI's architecture integrates multiple modeling technologies including ABM (agent-based modeling), ODE (ordinary differential equations), SDE (stochastic differential equations), and PDE (partial differential equations). This paper focuses on the implementation and developmental challenges of ENISI. A multiscale model of mucosal immune responses during colonic inflammation, including CD4+ T cell differentiation and tissue-level cell-cell interactions, was developed to illustrate the capabilities, power and scope of ENISI MSM. Background Computational techniques are becoming increasingly powerful, and modeling tools for biological systems are increasingly needed. Biological systems are inherently multiscale, from molecules to tissues and from nanoseconds to a lifespan of several years or decades. ENISI MSM integrates multiple modeling technologies to understand immunological processes from signaling pathways within cells to lesion formation at the tissue level. This paper examines and summarizes the technical details of ENISI, from its initial version to its latest cutting-edge implementation. Implementation An object-oriented programming approach is adopted to develop a suite of tools based on ENISI. Multiple modeling technologies are integrated to visualize tissues, cells as well as proteins; furthermore, performance matching between the scales is addressed.
Conclusion We used ENISI MSM for developing predictive multiscale models of the mucosal immune system during gut inflammation. Our modeling predictions dissect the mechanisms by which effector CD4+ T cell responses contribute to tissue damage in the gut mucosa following immune dysregulation. PMID:26329787

  4. Sex steroid hormones and brain function: PET imaging as a tool for research.

    PubMed

    Moraga-Amaro, R; van Waarde, A; Doorduin, J; de Vries, E F J

    2018-02-01

    Sex steroid hormones are major regulators of sexual characteristics among species. These hormones, however, are also produced in the brain. Steroidal hormone-mediated signalling via the corresponding hormone receptors can influence brain function at the cellular level and thus affect behaviour and higher brain functions. Altered steroid hormone signalling has been associated with psychiatric disorders, such as anxiety and depression. Neurosteroids are also considered to have a neuroprotective effect in neurodegenerative diseases. So far, the role of steroid hormone receptors in physiological and pathological conditions has mainly been investigated post mortem on animal or human brain tissues. To study the dynamic interplay between sex steroids, their receptors, brain function and behaviour in psychiatric and neurological disorders in a longitudinal manner, however, non-invasive techniques are needed. Positron emission tomography (PET) is a non-invasive imaging tool that is used to quantitatively investigate a variety of physiological and biochemical parameters in vivo. PET uses radiotracers aimed at a specific target (e.g., receptor, enzyme, transporter) to visualise the processes of interest. In this review, we discuss the current status of the use of PET imaging for studying sex steroid hormones in the brain. So far, PET has mainly been investigated as a tool to measure (changes in) sex hormone receptor expression in the brain, to measure a key enzyme in the steroid synthesis pathway (aromatase) and to evaluate the effects of hormonal treatment by imaging specific downstream processes in the brain. Although validated radiotracers for a number of targets are still warranted, PET can already be a useful technique for steroid hormone research and facilitate the translation of interesting findings in animal studies to clinical trials in patients. © 2017 The Authors. 
Journal of Neuroendocrinology published by John Wiley & Sons Ltd on behalf of British Society for Neuroendocrinology.

  5. Multiscale modeling of mucosal immune responses.

    PubMed

    Mei, Yongguo; Abedi, Vida; Carbo, Adria; Zhang, Xiaoying; Lu, Pinyi; Philipson, Casandra; Hontecillas, Raquel; Hoops, Stefan; Liles, Nathan; Bassaganya-Riera, Josep

    2015-01-01

    Computational techniques are becoming increasingly powerful, and modeling tools for biological systems are increasingly needed. Biological systems are inherently multiscale, from molecules to tissues and from nanoseconds to a lifespan of several years or decades. ENISI MSM integrates multiple modeling technologies to understand immunological processes from signaling pathways within cells to lesion formation at the tissue level. This paper examines and summarizes the technical details of ENISI, from its initial version to its latest cutting-edge implementation. An object-oriented programming approach is adopted to develop a suite of tools based on ENISI. Multiple modeling technologies are integrated to visualize tissues, cells as well as proteins; furthermore, performance matching between the scales is addressed. We used ENISI MSM for developing predictive multiscale models of the mucosal immune system during gut inflammation. Our modeling predictions dissect the mechanisms by which effector CD4+ T cell responses contribute to tissue damage in the gut mucosa following immune dysregulation. Computational modeling techniques are playing increasingly important roles in advancing a systems-level mechanistic understanding of biological processes. Computer simulations guide and underpin experimental and clinical efforts. This study presents the ENteric Immune Simulator (ENISI), a multiscale modeling tool for modeling mucosal immune responses. ENISI's modeling environment can simulate in silico experiments from molecular signaling pathways to tissue-level events such as tissue lesion formation. ENISI's architecture integrates multiple modeling technologies including ABM (agent-based modeling), ODE (ordinary differential equations), SDE (stochastic differential equations), and PDE (partial differential equations). This paper focuses on the implementation and developmental challenges of ENISI. 
A multiscale model of mucosal immune responses during colonic inflammation, including CD4+ T cell differentiation and tissue level cell-cell interactions was developed to illustrate the capabilities, power and scope of ENISI MSM.
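
    As a minimal illustration of the ODE layer in such a multiscale model (hypothetical equation, parameters and names, not ENISI code), signal-gated logistic expansion of an effector T cell population can be integrated with forward Euler:

```python
# Hypothetical sketch of one ODE-layer component: an effector T cell
# population n grows logistically toward carrying capacity k, gated by
# an inflammatory signal; forward Euler integration. Parameters are
# illustrative only.
def simulate_tcells(signal=1.0, r=0.5, k=100.0, n0=1.0, dt=0.01, steps=1000):
    n = n0
    for _ in range(steps):
        dn = signal * r * n * (1 - n / k)  # logistic growth, signal-gated
        n += dt * dn
    return n

expanded = simulate_tcells(signal=1.0)   # inflamed: population expands
resting = simulate_tcells(signal=0.0)    # no signal: no expansion
```

In a multiscale setting, the `signal` input would itself come from an agent-based or PDE layer, which is the cross-scale coupling the simulator manages.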

  6. A Study of Morrison's Iterative Noise Removal Method. Final Report M. S. Thesis

    NASA Technical Reports Server (NTRS)

    Ioup, G. E.; Wright, K. A. R.

    1985-01-01

    Morrison's iterative noise removal method is studied by characterizing its effect upon systems of differing noise level and response function. The nature of data acquired from a linear shift-invariant instrument is discussed so as to define the relationship between the input signal, the instrument response function, and the output signal. Fourier analysis is introduced, along with several pertinent theorems, as a tool for a more thorough understanding of the nature of, and difficulties with, deconvolution. In relation to such difficulties, the necessity of a noise removal process is discussed. Morrison's iterative noise removal method and the restrictions upon its application are developed. The nature of permissible response functions is discussed, as is the choice of the response functions used.
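
    Morrison's method belongs to the same iterative family as Van Cittert deconvolution; as an illustration of that family (the basic Van Cittert iteration, not the thesis's exact scheme), each pass adds back the residual between the data and the reconvolved estimate:

```python
def convolve_same(x, h):
    """Same-length linear convolution with a centered kernel h."""
    m = len(h) // 2
    return [sum(x[i - j + m] * h[j]
                for j in range(len(h)) if 0 <= i - j + m < len(x))
            for i in range(len(x))]

def van_cittert(g, h, n_iter=50):
    """Iterate f_{k+1} = f_k + (g - h * f_k), starting from f_0 = g."""
    f = list(g)
    for _ in range(n_iter):
        hf = convolve_same(f, h)
        f = [fi + (gi - hi) for fi, gi, hi in zip(f, g, hf)]
    return f

h = [0.25, 0.5, 0.25]           # normalized instrument response
true = [0, 0, 0, 1.0, 0, 0, 0]  # an impulse input
g = convolve_same(true, h)      # blurred (noise-free) observation
est = van_cittert(g, h)
```

The iteration only converges where the response's transfer function is positive, which is exactly the restriction on permissible response functions the thesis examines; frequencies the instrument suppresses cannot be recovered and, in noisy data, must be removed first.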

  7. A comparison of high-throughput techniques for assaying circadian rhythms in plants.

    PubMed

    Tindall, Andrew J; Waller, Jade; Greenwood, Mark; Gould, Peter D; Hartwell, James; Hall, Anthony

    2015-01-01

    Over the last two decades, the development of high-throughput techniques has enabled us to probe the plant circadian clock, a key coordinator of vital biological processes, in ways previously impossible. With the circadian clock increasingly implicated in key fitness and signalling pathways, this has opened up new avenues for understanding plant development and signalling. Our tool-kit has been constantly improving through continual development and novel techniques that increase throughput, reduce costs and allow higher resolution on the cellular and subcellular levels. With circadian assays becoming more accessible and relevant than ever to researchers, in this paper we offer a review of the techniques currently available before considering the horizons in circadian investigation at ever higher throughputs and resolutions.

  8. Computer model for harmonic ultrasound imaging.

    PubMed

    Li, Y; Zagzebski, J A

    2000-01-01

    Harmonic ultrasound imaging has received great attention from ultrasound scanner manufacturers and researchers. In this paper, we present a computer model that can generate realistic harmonic images. In this model, the incident ultrasound is modeled using the Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation, and the echo signal is modeled using linear propagation theory because the echo signal is much weaker than the incident pulse. Both time domain and frequency domain numerical solutions to the KZK equation were studied. Realistic harmonic images of spherical lesion phantoms were generated for scans by a circular transducer. This model can be a very useful tool for studying the harmonic buildup and dissipation processes in a nonlinear medium, and it can be used to investigate a wide variety of topics related to B-mode harmonic imaging.

  10. Scaling up digital circuit computation with DNA strand displacement cascades.

    PubMed

    Qian, Lulu; Winfree, Erik

    2011-06-03

    To construct sophisticated biochemical circuits from scratch, one needs to understand how simple the building blocks can be and how robustly such circuits can scale up. Using a simple DNA reaction mechanism based on a reversible strand displacement process, we experimentally demonstrated several digital logic circuits, culminating in a four-bit square-root circuit that comprises 130 DNA strands. These multilayer circuits include thresholding and catalysis within every logical operation to perform digital signal restoration, which enables fast and reliable function in large circuits with roughly constant switching time and linear signal propagation delays. The design naturally incorporates other crucial elements for large-scale circuitry, such as general debugging tools, parallel circuit preparation, and an abstraction hierarchy supported by an automated circuit compiler.
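
    The logic computed by the four-bit square-root circuit can be sketched with ordinary AND/OR/NOT operations (this shows only the Boolean function, floor of the square root of a 4-bit input, not the DNA strand-displacement implementation):

```python
# floor(sqrt(n)) for a 4-bit input n = 8*b3 + 4*b2 + 2*b1 + b0,
# expressed as a two-bit output (y1, y0) using AND/OR/NOT gates only.
def floor_sqrt_4bit(b3, b2, b1, b0):
    y1 = b3 | b2                      # result is >= 2 exactly when n >= 4
    y0 = ((1 - b3) & (1 - b2) & (b1 | b0)) | (b3 & (b2 | b1 | b0))
    return y1, y0

# Exhaustive check against arithmetic floor-sqrt over all 16 inputs:
for n in range(16):
    bits = [(n >> k) & 1 for k in (3, 2, 1, 0)]
    y1, y0 = floor_sqrt_4bit(*bits)
    assert 2 * y1 + y0 == int(n ** 0.5)
```

In the reported DNA circuit each such gate is realized chemically, with thresholding and catalysis restoring the 0/1 signal levels between layers.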

  11. Logical Modeling and Dynamical Analysis of Cellular Networks

    PubMed Central

    Abou-Jaoudé, Wassim; Traynard, Pauline; Monteiro, Pedro T.; Saez-Rodriguez, Julio; Helikar, Tomáš; Thieffry, Denis; Chaouiya, Claudine

    2016-01-01

    The logical (or logic) formalism is increasingly used to model regulatory and signaling networks. Complementing these applications, several groups contributed various methods and tools to support the definition and analysis of logical models. After an introduction to the logical modeling framework and to several of its variants, we review here a number of recent methodological advances to ease the analysis of large and intricate networks. In particular, we survey approaches to determine model attractors and their reachability properties, to assess the dynamical impact of variations of external signals, and to consistently reduce large models. To illustrate these developments, we further consider several published logical models for two important biological processes, namely the differentiation of T helper cells and the control of mammalian cell cycle. PMID:27303434
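
    A brute-force attractor search of the kind these tools automate can be sketched for a toy synchronous Boolean network; the hypothetical two-node mutual-repression pair below (a toggle-switch motif) has two fixed points and one 2-cycle:

```python
from itertools import product

# Exhaustive attractor search for a small synchronous Boolean network:
# follow every state until it revisits itself, keep the periodic part.
def attractors(rules, n):
    def step(state):
        return tuple(int(rule(state)) for rule in rules)
    found = set()
    for state in product((0, 1), repeat=n):
        seen = []
        while state not in seen:
            seen.append(state)
            state = step(state)
        cycle = seen[seen.index(state):]   # the periodic part
        i = cycle.index(min(cycle))        # canonical rotation
        found.add(tuple(cycle[i:] + cycle[:i]))
    return found

# Hypothetical toggle switch: each node represses the other
rules = [lambda s: 1 - s[1], lambda s: 1 - s[0]]
atts = attractors(rules, 2)
```

Exhaustive enumeration scales as 2^n, which is why the methods surveyed above rely on symbolic and reduction techniques for large networks.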

  12. Two protein kinase C isoforms, δ and ε, regulate energy homeostasis in mitochondria by transmitting opposing signals to the pyruvate dehydrogenase complex.

    PubMed

    Gong, Jianli; Hoyos, Beatrice; Acin-Perez, Rebeca; Vinogradov, Valerie; Shabrova, Elena; Zhao, Feng; Leitges, Michael; Fischman, Donald; Manfredi, Giovanni; Hammerling, Ulrich

    2012-08-01

    Energy production in mitochondria is a multistep process that requires coordination of several subsystems. While reversible phosphorylation is emerging as the principal tool, it is still unclear how this signal network senses the workloads of processes as different as fuel procurement, catabolism in the Krebs cycle, and stepwise oxidation of reducing equivalents in the electron transfer chain (ETC). We previously proposed that mitochondria use oxidized cytochrome c in concert with retinol to activate protein kinase Cδ, thereby linking a prominent kinase network to the redox balance of the ETC. Here, we show that activation of PKCε in mitochondria also requires retinol as a cofactor, implying a redox mechanism. Whereas activated PKCδ transmits a stimulatory signal to the pyruvate dehydrogenase complex (PDHC), PKCε opposes this signal and inhibits the PDHC. Our results suggest that the balance between PKCδ and ε is of paramount importance not only for the flux of fuel entering the Krebs cycle but for overall energy homeostasis. We observed that the synthetic retinoid fenretinide substituted for the retinol cofactor function but, on chronic use, distorted this signal balance, leading to predominance of PKCε over PKCδ. The suppression of the PDHC might explain the proapoptotic effect of fenretinide on tumor cells, as well as the diminished adiposity observed in experimental animals and humans. Furthermore, a disturbed balance between PKCδ and PKCε might underlie the injury inflicted on the ischemic myocardium during reperfusion.

  13. Research to Operations of Ionospheric Scintillation Detection and Forecasting

    NASA Astrophysics Data System (ADS)

    Jones, J.; Scro, K.; Payne, D.; Ruhge, R.; Erickson, B.; Andorka, S.; Ludwig, C.; Karmann, J.; Ebelhar, D.

    Ionospheric scintillation refers to random fluctuations in phase and amplitude of electromagnetic waves caused by a rapidly varying refractive index due to turbulent features in the ionosphere. Scintillation of transionospheric UHF and L-band radio frequency signals is particularly troublesome since this phenomenon can lead to degradation of signal strength and integrity that can negatively impact satellite communications and navigation, radar, or radio signals from other systems that traverse or interact with the ionosphere. Although ionospheric scintillation occurs in both the equatorial and polar regions of the Earth, the focus of this modeling effort is on equatorial scintillation. The ionospheric scintillation model is data-driven in the sense that scintillation observations are used to perform detection and characterization of scintillation structures. These structures are then propagated to future times using drift and decay models to represent the natural evolution of ionospheric scintillation. The impact on radio signals is also determined by the model and represented in graphical format to the user. A frequency scaling algorithm allows for impact analysis on frequencies other than the observation frequencies. The project began with lab-grade software and, through a tailored Agile development process, deployed operational-grade code to a DoD operational center. The Agile development process promotes adaptive planning, evolutionary development, early delivery, continuous improvement, and regular collaboration with the customer, and encourages rapid and flexible responses to customer-driven changes. The Agile philosophy values individuals and interactions over processes and tools, working software over comprehensive documentation, customer collaboration over contract negotiation, and responding to change over following a rigid plan. The end result was an operational capability that met customer expectations. 
Details of the model and the process of operational integration are discussed as well as lessons learned to improve performance on future projects.
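
    As one illustration of the frequency-scaling step, a commonly used weak-scatter approximation takes the S4 scintillation index to vary roughly as frequency to the power -1.5; the exponent here is an assumption, and operational systems fit it from data:

```python
# Sketch of frequency scaling for the S4 scintillation index under the
# weak-scatter approximation S4 ~ f**-1.5 (spectral index assumed).
def scale_s4(s4_obs, f_obs_hz, f_target_hz, spectral_index=1.5):
    """Map an S4 index observed at one frequency to another frequency.
    Values scaled above ~1 indicate saturation, where this power law
    no longer holds."""
    return s4_obs * (f_obs_hz / f_target_hz) ** spectral_index

# Map a moderate L-band (GPS L1) observation down to a UHF link,
# where the same irregularities produce much stronger scintillation:
s4_uhf = scale_s4(0.3, 1.57542e9, 250e6)
```

The same irregularity structure thus degrades UHF links far more severely than L-band, which is why impact analysis must be done per frequency.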

  14. Downhole tool adapted for telemetry

    DOEpatents

    Hall, David R.; Fox, Joe

    2010-12-14

    A cycleable downhole tool, such as a jar, a hydraulic hammer, or a shock absorber, adapted for telemetry. This invention applies to other tools where the active components of the tool are displaced when the tool is rotationally or translationally cycled. The invention consists of inductive or contact transmission rings that are connected by an extensible conductor. The extensible conductor permits the transmission of the signal before, after, and during the cycling of the tool. The signal may be continuous or intermittent during cycling. The invention also applies to downhole tools that do not cycle, but in operation are under such stress that an extensible conductor is beneficial. The extensible conductor may also consist of an extensible portion and a fixed portion. The extensible conductor also features clamps that maintain the conductor under stresses greater than those seen by the tool, and seals that are capable of protecting against downhole pressure and contamination.

  15. Production and validation of recombinant adeno-associated virus for channelrhodopsin expression in neurons.

    PubMed

    Lin, John Y

    2013-01-01

    Recent discovery of the light-activated ion channel, channelrhodopsin (ChR), has provided researchers a powerful and convenient tool to manipulate the membrane potential of specific cells with light. With genetic targeting of these channels and illumination of light to a specific location, the experimenter can selectively activate the voltage-gated ion channels (VGICs) of ChR-expressing cells, initiating electrical signaling in temporally and spatially precise manners. In neuroscience research, this can be used to study electrical signal processing within one neuron at the cellular level, or the synaptic connectivity between neurons at the circuitry level. To conduct experiments with ChRs, these exogenous channels need to be introduced into the cells of interest, commonly through a viral approach. This chapter provides an overview of the design, production, and validation of recombinant adeno-associated virus (rAAV) for ChR expression that can be used in vitro or in vivo to infect neurons. The virus produced can be used to conduct "optogenetic" experiments in behaving animals, in vitro preparations and cultured cells, and can be used to study signal transduction and processing at a cellular or circuitry level.

  16. Role of EEG as Biomarker in the Early Detection and Classification of Dementia

    PubMed Central

    Al-Qazzaz, Noor Kamal; Ali, Sawal Hamid Bin MD.; Ahmad, Siti Anom; Chellappan, Kalaivani; Islam, Md. Shabiul; Escudero, Javier

    2014-01-01

    The early detection and classification of dementia are important clinical support tasks for medical practitioners in customizing patient treatment programs to better manage the development and progression of these diseases. Efforts are being made to diagnose these neurodegenerative disorders in their early stages; indeed, early diagnosis helps patients obtain the maximum treatment benefit before significant mental decline occurs. The electroencephalogram (EEG) is becoming increasingly popular as a tool for detecting changes in brain activity and for clinical diagnosis, owing to its ability to quantify brain degeneration in dementia. This paper reviews the role of the EEG as a signal-processing-based biomarker for detecting dementia in its early stages and classifying its severity. The review starts with a discussion of the types of dementia and the cognitive spectrum, followed by a presentation of effective preprocessing and denoising methods for eliminating possible artifacts. It continues with a description of feature extraction using linear and nonlinear techniques, and it ends with a brief survey of the wide variety of classification techniques for EEG signals. The paper also summarizes the most popular studies, which may help in diagnosing dementia early and classifying its severity through EEG signal processing and analysis. PMID:25093211
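
    The pipeline this review describes — denoise, extract spectral features, classify — can be illustrated by a minimal band-power computation. This is an illustrative sketch on a synthetic signal, not the paper's method; all parameters and band edges below are assumptions chosen for the example:

```python
import numpy as np

def band_power(x, fs, f_lo, f_hi):
    """Signal power within the band [f_lo, f_hi] Hz, from the periodogram."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return float(psd[band].sum())

fs = 256                                 # sampling rate, Hz
t = np.arange(0, 4, 1.0 / fs)            # 4 s of signal
rng = np.random.default_rng(0)
# Synthetic "EEG": a dominant 10 Hz (alpha) rhythm plus broadband noise
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)

alpha = band_power(eeg, fs, 8, 13)       # alpha band
delta = band_power(eeg, fs, 0.5, 4)      # delta band
print(alpha > delta)                     # the alpha rhythm dominates here
```

    Band powers such as these are typical linear features; a classifier would then operate on a vector of them per EEG epoch.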

  17. Imaging intraflagellar transport in mammalian primary cilia.

    PubMed

    Besschetnova, Tatiana Y; Roy, Barnali; Shah, Jagesh V

    2009-01-01

    The primary cilium is a specialized organelle that projects from the surface of many cell types. Unlike its motile counterpart, it does not beat, but it does transduce extracellular stimuli into intracellular signals and acts as a specialized subcellular compartment. The cilium is built and maintained by the transport of proteins and other biomolecules into and out of this compartment. The trafficking machinery of the cilium is referred to as intraflagellar transport (IFT). Originally identified in the green alga Chlamydomonas, it has since been found throughout the evolutionary tree. The IFT machinery is widely conserved and acts to establish, maintain, and disassemble cilia and flagella. Understanding the role of IFT in cilium signaling and regulation requires a methodology for observing it directly. Here we describe current methods for observing the IFT process in mammalian primary cilia through the generation of fluorescent protein fusions and their expression in ciliated cell lines. The observation protocol uses high-resolution time-lapse microscopy to provide detailed quantitative measurements of IFT particle velocities in wild-type cells or in the context of genetic or other perturbations. Direct observation of IFT trafficking provides a unique tool for dissecting the processes that govern cilium regulation and signaling.
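
    The velocity measurement described above can be sketched as a linear fit of particle position along the cilium against time. The track values and frame interval below are hypothetical, purely to illustrate the calculation:

```python
import numpy as np

def ift_velocity(positions_um, frame_interval_s):
    """Estimate IFT particle speed (um/s) as the slope of a linear
    least-squares fit of position along the cilium vs. time."""
    t = np.arange(len(positions_um)) * frame_interval_s
    slope, _intercept = np.polyfit(t, positions_um, 1)
    return float(slope)

# Hypothetical anterograde track: 0.7 um per frame, imaged at 2 frames/s
track_um = [0.0, 0.7, 1.4, 2.1, 2.8, 3.5]
v = ift_velocity(track_um, frame_interval_s=0.5)
print(round(v, 2))  # → 1.4
```

    In practice the positions would come from particle tracking or kymograph analysis of the time-lapse movies, and the fit residuals give a sense of measurement noise.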

  18. On the retrieval of crystallographic information from atom probe microscopy data via signal mapping from the detector coordinate space.

    PubMed

    Wallace, Nathan D; Ceguerra, Anna V; Breen, Andrew J; Ringer, Simon P

    2018-06-01

    Atom probe tomography is a powerful microscopy technique capable of reconstructing, at the atomic level, the 3D position and chemical identity of millions of atoms within engineering materials. Crystallographic information contained within the data is particularly valuable for reconstruction calibration and grain boundary analysis. Typically, analysing this data is a manual, time-consuming, and error-prone process, and in many cases the crystallographic signal is so weak that it is difficult to detect at all. In this study, a new automated signal processing methodology is demonstrated. We use the affine properties of the detector coordinate space, or the 'detector stack', as the basis for our calculations. The methodological framework and visualisation tools are shown to be superior to the standard method of visualising crystallographic poles directly from field evaporation images, and there is no requirement to iterate between a full real-space initial tomographic reconstruction and the detector stack. The mapping approaches are demonstrated for aluminium, tungsten, magnesium, and molybdenum. Implications for reconstruction calibration, the accuracy of crystallographic measurements, reliability, and repeatability are discussed.
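
    As a rough illustration of working in detector coordinates, hit positions can be binned into a 2D density map in which crystallographic poles appear as local density anomalies. This is a generic sketch on synthetic data, not the paper's methodology; the depleted "pole" region and all parameters are invented for the example:

```python
import numpy as np

def detector_hit_map(x, y, bins=64, extent=1.0):
    """2D histogram of detector hit coordinates; crystallographic poles
    show up as local density anomalies in maps like this."""
    hist, _, _ = np.histogram2d(x, y, bins=bins,
                                range=[[-extent, extent], [-extent, extent]])
    return hist

rng = np.random.default_rng(1)
# Synthetic hits: uniform background with a depleted "pole" at the centre
x = rng.uniform(-1, 1, 20000)
y = rng.uniform(-1, 1, 20000)
keep = (x**2 + y**2 > 0.05) | (rng.random(x.size) < 0.3)
hit_map = detector_hit_map(x[keep], y[keep])
# The central (depleted) region is visibly sparser than the corners
print(hit_map[30:34, 30:34].sum() < hit_map[0:4, 0:4].sum())
```

    Automated pole detection would then search such maps for density anomalies rather than relying on manual inspection of field evaporation images.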

  19. Role of EEG as biomarker in the early detection and classification of dementia.

    PubMed

    Al-Qazzaz, Noor Kamal; Ali, Sawal Hamid Bin Md; Ahmad, Siti Anom; Chellappan, Kalaivani; Islam, Md Shabiul; Escudero, Javier

    2014-01-01

    The early detection and classification of dementia are important clinical support tasks for medical practitioners in customizing patient treatment programs to better manage the development and progression of these diseases. Efforts are being made to diagnose these neurodegenerative disorders in their early stages; indeed, early diagnosis helps patients obtain the maximum treatment benefit before significant mental decline occurs. The electroencephalogram (EEG) is becoming increasingly popular as a tool for detecting changes in brain activity and for clinical diagnosis, owing to its ability to quantify brain degeneration in dementia. This paper reviews the role of the EEG as a signal-processing-based biomarker for detecting dementia in its early stages and classifying its severity. The review starts with a discussion of the types of dementia and the cognitive spectrum, followed by a presentation of effective preprocessing and denoising methods for eliminating possible artifacts. It continues with a description of feature extraction using linear and nonlinear techniques, and it ends with a brief survey of the wide variety of classification techniques for EEG signals. The paper also summarizes the most popular studies, which may help in diagnosing dementia early and classifying its severity through EEG signal processing and analysis.

  20. Digital signal processing at Bell Labs-Foundations for speech and acoustics research

    NASA Astrophysics Data System (ADS)

    Rabiner, Lawrence R.

    2004-05-01

    Digital signal processing (DSP) is a fundamental tool for much of the research that has been carried out at Bell Labs in the areas of speech and acoustics. The foundations of DSP include the sampling theorem of Nyquist, the method for digitization of analog signals by Shannon et al., the spectral analysis methods of Tukey, the cepstrum of Bogert et al., and the FFT of Tukey (and Cooley of IBM). Essentially all of these early foundations of DSP came out of the Bell Labs Research Lab in the 1930s through the 1960s. This fundamental research was motivated by practical applications (mainly in speech, sonar, and acoustics) that led to novel design methods for digital filters (Kaiser, Golden, Rabiner, Schafer), spectrum analysis methods (Rabiner, Schafer, Allen, Crochiere), fast convolution methods based on the FFT (Helms, Bergland), and advanced digital systems used to implement telephony channel banks (Jackson, McDonald, Freeny, Tewksbury). This talk summarizes the key contributions to DSP made at Bell Labs and illustrates how DSP was used in speech and acoustics research. It also shows the vast worldwide impact of this DSP research on modern consumer electronics.
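
    Of the foundations named above, the cepstrum lends itself to a compact sketch: Bogert et al. introduced it to detect echoes, which appear as a peak in the inverse FFT of the log magnitude spectrum at the echo delay. The signal below is synthetic and the construction is a modern illustration under assumed parameters, not their original formulation:

```python
import numpy as np

def real_cepstrum(x):
    """Real cepstrum: inverse FFT of the log magnitude spectrum."""
    log_mag = np.log(np.abs(np.fft.fft(x)) + 1e-12)  # small floor avoids log(0)
    return np.fft.ifft(log_mag).real

rng = np.random.default_rng(0)
n, delay = 1024, 100
s = rng.standard_normal(n)
# Signal plus a half-amplitude (circular) echo 100 samples later
x = s + 0.5 * np.roll(s, delay)
c = real_cepstrum(x)
# The echo produces a cepstral peak at quefrency 100 (height ~0.25),
# well above the noise floor; skip the low-quefrency region when searching
q = int(np.argmax(c[10:n // 2])) + 10
print(q)  # → 100
```

    The same peak-at-quefrency idea underlies cepstral pitch detection in speech, where the peak sits at the pitch period of the voiced excitation.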
