Influence of atmospheric properties on detection of wood-warbler nocturnal flight calls
NASA Astrophysics Data System (ADS)
Horton, Kyle G.; Stepanian, Phillip M.; Wainwright, Charlotte E.; Tegeler, Amy K.
2015-10-01
Avian migration monitoring can take on many forms; however, monitoring active nocturnal migration of land birds is limited to a few techniques. Avian nocturnal flight calls are currently the only method for describing migrant composition at the species level. However, as this method develops, more information is needed to understand the sources of variation in call detection. Additionally, few studies examine how detection probabilities differ under varying atmospheric conditions. We use nocturnal flight call recordings from captive individuals to explore the dependence of flight call detection on atmospheric temperature and humidity. Height or distance from origin had the largest influence on call detection, while temperature and humidity also influenced detectability at higher altitudes. Because flight call detection varies with both atmospheric conditions and flight height, improved monitoring across time and space will require correction for these factors to generate standardized metrics of songbird migration.
2004-01-01
[Abstract fragment] ... login identity to the one under which the system call is executed, the parameters of the system call execution (file names including the full path) ... A comparison table is partially recoverable: COAST-EIMDT uses anomaly detection and is distributed on target hosts; EMERALD is distributed on target hosts and security servers and combines signature recognition with anomaly detection ... one system uses a centralized architecture and employs an anomaly-detection technique for intrusion detection; the EMERALD project [80] proposes a ...
Evolutionary neural networks for anomaly detection based on the behavior of a program.
Han, Sang-Jun; Cho, Sung-Bae
2006-06-01
Learning the behavior of a given program with machine-learning techniques, based on system-call audit data, is effective for detecting intrusions. Rule learning, neural networks, statistics, and hidden Markov models (HMMs) are representative methods for intrusion detection. Among them, neural networks are known to perform well in learning system-call sequences. To apply this knowledge to real-world problems successfully, it is important to determine appropriate network structures and weights. However, finding appropriate structures takes a very long time because there are no suitable analytical solutions. In this paper, a novel intrusion-detection technique based on evolutionary neural networks (ENNs) is proposed. One advantage of ENNs is that they take less time to obtain superior neural networks than conventional approaches, because they discover the structures and weights of the neural networks simultaneously. Experimental results with the 1999 Defense Advanced Research Projects Agency (DARPA) Intrusion Detection Evaluation (IDEVAL) data confirm that ENNs are promising tools for intrusion detection.
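As a toy illustration of the idea in the abstract above (evolving network structure and weights together, and keeping a candidate only when its fitness does not drop), the following sketch evolves a binary connection mask alongside the weights of a tiny one-hidden-layer network. The data, the mask-based "topology", and the fitness function are illustrative assumptions, not the paper's DARPA/IDEVAL setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "system-call sequence" features: label 1 (intrusion) when the sum of
# the first two features exceeds 1 (purely illustrative).
X = rng.random((200, 4))
y = (X[:, 0] + X[:, 1] > 1.0).astype(float)

def predict(genome, X):
    """Single hidden layer; genome = (mask, W1, W2). The mask plays the
    role of the evolved structure: masked-out weights are absent edges."""
    mask, W1, W2 = genome
    h = np.tanh(X @ (W1 * mask))
    return (h @ W2 > 0).astype(float)

def fitness(genome):
    return (predict(genome, X) == y).mean()

def random_genome():
    return (rng.integers(0, 2, (4, 6)).astype(float),
            rng.normal(0.0, 1.0, (4, 6)),
            rng.normal(0.0, 1.0, 6))

def mutate(genome):
    mask, W1, W2 = genome
    mask = mask.copy()
    i, j = rng.integers(0, 4), rng.integers(0, 6)
    mask[i, j] = 1 - mask[i, j]              # structural mutation: flip one edge
    return (mask,
            W1 + rng.normal(0.0, 0.1, W1.shape),  # weight mutation
            W2 + rng.normal(0.0, 0.1, W2.shape))

# Simple (1+1)-style hill climb starting from the best of 20 random genomes.
best = max((random_genome() for _ in range(20)), key=fitness)
for _ in range(200):
    child = mutate(best)
    if fitness(child) >= fitness(best):
        best = child

print(round(fitness(best), 2))
```

Real ENN work uses richer genetic operators (crossover, speciation) and evaluates fitness on audit-trail data; the loop above only shows the structure-plus-weights search in miniature.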
2012-09-30
[Publication list fragment] ... generalized power-law detection algorithm for humpback whale vocalizations. J. Acoust. Soc. Am. 131(4), 2682-2699. Roch, M. A., H. Klinck, S. ... Heaney (2012b). Site-specific probability of passive acoustic detection of humpback whale calls from single fixed hydrophones. J. Acoust. Soc. Am. ... monitoring: correcting humpback call detections for site-specific and time-dependent environmental characteristics. JASA Express Letters, submitted October 2012, 5 pp. plus 3 figures.
Liu, Yang; Gu, Ming; Alocilja, Evangelyn C; Chakrabartty, Shantanu
2010-11-15
An ultra-reliable technique for detecting trace quantities of biomolecules is reported. The technique, called "co-detection", exploits the non-linear redundancy among synthetically patterned biomolecular logic circuits to decipher the presence or absence of target biomolecules in a sample. In this paper, we verify the co-detection principle on gold-nanoparticle-based conductimetric soft-logic circuits, which use a silver-enhancement technique for signal amplification. Using co-detection, we demonstrate a substantial improvement in the reliability of detecting mouse IgG at concentration levels 10^5 times lower than the concentration of rabbit IgG serving as background interference. Copyright © 2010 Elsevier B.V. All rights reserved.
The development of a super-fine-grained nuclear emulsion
NASA Astrophysics Data System (ADS)
Asada, Takashi; Naka, Tatsuhiro; Kuwabara, Ken-ichi; Yoshimoto, Masahiro
2017-06-01
A nuclear emulsion with micronized crystals is required for tracking submicron-range ionizing particles, which are among the targets of dark-matter detection and other techniques. We found that a new production method, called the PVA-gelatin mixing method (PGMM), can effectively control crystal size from 20 nm to 50 nm. We call the two types of emulsion produced with this method the nano imaging tracker and the ultra-nano imaging tracker. Their composition and spatial resolution were measured, and the results indicate that these emulsions can detect extremely short tracks.
DROP: Detecting Return-Oriented Programming Malicious Code
NASA Astrophysics Data System (ADS)
Chen, Ping; Xiao, Hai; Shen, Xiaobin; Yin, Xinchun; Mao, Bing; Xie, Li
Return-Oriented Programming (ROP) is a new technique that lets an attacker construct malicious code from x86/SPARC executables without any function calls at all. As a result, ROP malicious code injects no instructions of its own, which distinguishes it from existing attacks. Moreover, it hides the malicious code within benign code. It thereby circumvents approaches that prevent control-flow diversion outside legitimate regions (such as W ⊕ X) and most malicious-code scanning techniques (such as anti-virus scanners). However, ROP has intrinsic features that differ from normal program design: (1) it uses short instruction sequences ending in "ret", called gadgets, and (2) it executes these gadgets contiguously in specific memory regions, such as the standard GNU libc. Based on these features of ROP malicious code, we present in this paper a tool, DROP, focused on dynamically detecting ROP malicious code. Preliminary experimental results show that DROP can efficiently detect ROP malicious code with no false positives or negatives.
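A minimal sketch of the gadget-chain heuristic described above: flag a dynamic instruction trace when several consecutive short sequences, each ending in "ret" inside a library address range, are executed. The trace format, address range, and thresholds are illustrative assumptions, not DROP's actual implementation.

```python
# Each trace entry is (instruction address, mnemonic); a hypothetical
# address range stands in for the mapped GNU libc region.
LIBC = range(0x7f000000, 0x7fffffff)

def detect_rop(trace, max_gadget_len=5, min_chain=3):
    """Flag a trace when >= min_chain consecutive short instruction
    sequences, each ending in 'ret' inside the library region, run."""
    chain = 0      # consecutive gadget-like sequences seen so far
    seq_len = 0    # instructions since the last 'ret'
    for addr, insn in trace:
        seq_len += 1
        if insn == "ret":
            if seq_len <= max_gadget_len and addr in LIBC:
                chain += 1
                if chain >= min_chain:
                    return True
            else:
                chain = 0  # a long sequence breaks the gadget chain
            seq_len = 0
    return False

# Benign trace: one long basic block ending in a single ret.
benign = [(0x400000 + i, "mov") for i in range(50)] + [(0x400050, "ret")]

# ROP-like trace: four 3-instruction gadgets ending in ret, inside libc.
rop = []
for g in range(4):
    rop += [(0x7f001000 + g * 16 + k, "pop") for k in range(2)]
    rop += [(0x7f001000 + g * 16 + 2, "ret")]

print(detect_rop(benign), detect_rop(rop))  # prints: False True
```

The real tool instruments execution (e.g. via binary translation) and tunes the thresholds empirically; this sketch only shows the two ROP features the abstract names, short ret-terminated sequences and their contiguous execution.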
The 'sniffer-patch' technique for detection of neurotransmitter release.
Allen, T G
1997-05-01
A wide variety of techniques have been employed for the detection and measurement of neurotransmitter release from biological preparations. Whilst many of these methods offer impressive levels of sensitivity, few are able to combine sensitivity with the necessary temporal and spatial resolution required to study quantal release from single cells. One detection method that is seeing a revival of interest and has the potential to fill this niche is the so-called 'sniffer-patch' technique. In this article, specific examples of the practical aspects of using this technique are discussed along with the procedures involved in calibrating these biosensors to extend their applications to provide quantitative, in addition to simple qualitative, measurements of quantal transmitter release.
Ultrasound Activated Contrast Imaging for Prostate Cancer Detection
2007-03-01
Grant DAMD17-03-1-0119. [Abstract fragment] The current project proposes to develop a novel ultrasound contrast imaging technique (called EEI) for better visualization of the ...
Raman scattering spectroscopy for explosives identification
NASA Astrophysics Data System (ADS)
Nagli, L.; Gaft, M.
2007-04-01
Real-time detection and identification of explosives at a standoff distance is a major issue in efforts to develop defenses against so-called Improvised Explosive Devices (IEDs). It is recognized that the only technique potentially capable of standoff detection of minimal amounts of explosives is laser-based spectroscopy. The LDS technique belongs to trace detection, specifically its micro-particle variety. We applied gated Raman and time-resolved luminescence spectroscopy to the detection of the main explosive materials, both factory-made and homemade. A Raman system was developed and tested by LDS for field remote detection and identification of minimal amounts of explosives on relevant surfaces at distances of up to 30 meters.
Reflectometric measurement of plasma imaging and applications
NASA Astrophysics Data System (ADS)
Mase, A.; Ito, N.; Oda, M.; Komada, Y.; Nagae, D.; Zhang, D.; Kogi, Y.; Tobimatsu, S.; Maruyama, T.; Shimazu, H.; Sakata, E.; Sakai, F.; Kuwahara, D.; Yoshinaga, T.; Tokuzawa, T.; Nagayama, Y.; Kawahata, K.; Yamaguchi, S.; Tsuji-Iio, S.; Domier, C. W.; Luhmann, N. C., Jr.; Park, H. K.; Yun, G.; Lee, W.; Padhi, S.; Kim, K. W.
2012-01-01
Progress in microwave and millimeter-wave technologies has enabled advanced diagnostics in various fields, such as plasma diagnostics, radio astronomy, foreign-substance detection, airborne and spaceborne imaging radars known as synthetic aperture radars, and living-body measurements. Transmission, reflection, scattering, and radiation of electromagnetic waves are utilized as diagnostic tools. In this report we focus on reflectometric measurements and their applications to biological signals (vital-sign detection and breast cancer detection) as well as plasma diagnostics, specifically using imaging and ultra-wideband radar techniques.
Standoff laser-based spectroscopy for explosives detection
NASA Astrophysics Data System (ADS)
Gaft, M.; Nagli, L.
2007-10-01
Real-time detection and identification of explosives at a standoff distance is a major issue in efforts to develop defenses against so-called Improvised Explosive Devices (IEDs). It is recognized that the only technique potentially capable of standoff detection of minimal amounts of explosives is laser-based spectroscopy. LDS activity is based on a combination of laser-based spectroscopic methods with orthogonal capabilities. Our technique belongs to trace detection, specifically its micro-particle variety. It is based on the commonly held belief that surface contamination is very difficult to avoid and can therefore be exploited for standoff detection. We applied optical techniques, including gated Raman and time-resolved luminescence spectroscopy, to the detection of the main explosive materials, both factory-made and homemade. We developed and tested a Raman system for field remote detection and identification of minimal amounts of explosives on relevant surfaces at distances of up to 30 meters.
Greene, Charles R; McLennan, Miles Wm; Norman, Robert G; McDonald, Trent L; Jakubczak, Ray S; Richardson, W John
2004-08-01
Bowhead whales, Balaena mysticetus, migrate west during fall approximately 10-75 km off the north coast of Alaska, passing the petroleum developments around Prudhoe Bay. Oil production operations on an artificial island 5 km offshore create sounds heard by some whales. As part of an effort to assess whether migrating whales deflect farther offshore at times with high industrial noise, an acoustical approach was selected for localizing calling whales. The technique incorporated DIFAR (directional frequency analysis and recording) sonobuoy techniques. An array of 11 DASARs (directional autonomous seafloor acoustic recorders) was built and installed with unit-to-unit separation of 5 km. When two or more DASARs detected the same call, the whale location was determined from the bearing intersections. This article describes the acoustic methods used to determine the locations of the calling bowhead whales and shows the types and precision of the data acquired. Calibration transmissions at GPS-measured times and locations provided measures of the individual DASAR clock drift and directional orientation. The standard error of the bearing measurements at distances of 3-4 km was approximately 1.35 degrees after corrections for gain imbalance in the two directional sensors. During 23 days in 2002, 10,587 bowhead calls were detected and 8383 were localized.
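The localization step described above, intersecting the bearings reported by two directional recorders, can be sketched as plane geometry. The sensor positions, the whale position, and the bearing convention (degrees clockwise from north) are illustrative assumptions.

```python
import math

def localize(s1, b1, s2, b2):
    """Intersect two bearing lines; bearings in degrees clockwise from north.
    A bearing b maps to the direction vector (sin b, cos b)."""
    d1 = (math.sin(math.radians(b1)), math.cos(math.radians(b1)))
    d2 = (math.sin(math.radians(b2)), math.cos(math.radians(b2)))
    # Solve s1 + t1*d1 = s2 + t2*d2 for t1 by Cramer's rule.
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    rx, ry = s2[0] - s1[0], s2[1] - s1[1]
    t1 = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (s1[0] + t1 * d1[0], s1[1] + t1 * d1[1])

# Hypothetical whale at (3, 4) km; sensors at the origin and at (5, 0) km.
whale = (3.0, 4.0)
b1 = math.degrees(math.atan2(whale[0] - 0.0, whale[1] - 0.0))  # bearing seen by sensor 1
b2 = math.degrees(math.atan2(whale[0] - 5.0, whale[1] - 0.0))  # bearing seen by sensor 2
x, y = localize((0.0, 0.0), b1, (5.0, 0.0), b2)
print(round(x, 6), round(y, 6))  # prints: 3.0 4.0
```

With more than two bearings, a least-squares intersection (and the clock-drift and orientation corrections the article describes) would replace this exact two-line solve.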
Dynamic malware analysis using IntroVirt: a modified hypervisor-based system
NASA Astrophysics Data System (ADS)
White, Joshua S.; Pape, Stephen R.; Meily, Adam T.; Gloo, Richard M.
2013-05-01
In this paper, we present a system for Dynamic Malware Analysis which incorporates the use of IntroVirt™. IntroVirt is an introspective hypervisor architecture and infrastructure that supports advanced techniques for stealth-malware analysis. This system allows complete guest monitoring and interaction, including the manipulation and blocking of system calls. IntroVirt can bypass the virtual-machine detection capabilities of even the most sophisticated malware by spoofing system-call responses. Additional fuzzing capabilities can be employed to detect both malware vulnerabilities and polymorphism.
Experiments on Adaptive Techniques for Host-Based Intrusion Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
DRAELOS, TIMOTHY J.; COLLINS, MICHAEL J.; DUGGAN, DAVID P.
2001-09-01
This research explores four experiments of adaptive host-based intrusion detection (ID) techniques in an attempt to develop systems that can detect novel exploits. The technique considered to have the most potential is adaptive critic designs (ACDs) because of their utilization of reinforcement learning, which allows learning exploits that are difficult to pinpoint in sensor data. Preliminary results of ID using an ACD, an Elman recurrent neural network, and a statistical anomaly detection technique demonstrate an ability to learn to distinguish between clean and exploit data. We used the Solaris Basic Security Module (BSM) as a data source and performed considerable preprocessing on the raw data. A detection approach called generalized signature-based ID is recommended as a middle ground between signature-based ID, which has an inability to detect novel exploits, and anomaly detection, which detects too many events including events that are not exploits. The primary results of the ID experiments demonstrate the use of custom data for generalized signature-based intrusion detection and the ability of neural network-based systems to learn in this application environment.
McLeod, M.A.; Andersen, D.E.
1998-01-01
Forest-nesting raptors are often difficult to detect and monitor because they can be secretive, and their nests can be difficult to locate. Some species, however, respond to broadcasts of taped calls, and these responses may be useful both in monitoring population trends and in locating nests. We conducted broadcast surveys on roads and at active red-shouldered hawk (Buteo lineatus) nests in northcentral Minnesota to determine effects of type of call (conspecific or great horned owl [Bubo virginianus]), time of day, and phase of the breeding cycle on red-shouldered hawk response behavior and to evaluate usefulness of broadcasts as a population monitoring tool using area occupied-probability-of-detection techniques. During the breeding seasons of 1994 and 1995, we surveyed 4 10-station road transects 59 times and conducted 76 surveys at 24 active nests. Results of these surveys indicated conspecific calls broadcast prior to hatch and early in the day were the most effective method of detecting red-shouldered hawks. Probability of detection via conspecific calls averaged 0.25, and area occupied was 100%. Computer simulations using these field data indicated broadcast surveys have the potential to be used as a population monitoring tool.
A generalized baleen whale call detection and classification system.
Baumgartner, Mark F; Mussoline, Sarah E
2011-05-01
Passive acoustic monitoring allows the assessment of marine mammal occurrence and distribution at greater temporal and spatial scales than is now possible with traditional visual surveys. However, the large volume of acoustic data and the lengthy, laborious task of manually analyzing these data have hindered broad application of this technique. To overcome these limitations, a generalized automated detection and classification system (DCS) was developed to efficiently and accurately identify low-frequency baleen whale calls. The DCS (1) accounts for persistent narrowband and transient broadband noise, (2) characterizes the temporal variation of dominant call frequencies via pitch-tracking, and (3) classifies calls based on attributes of the resulting pitch tracks using quadratic discriminant function analysis (QDFA). Automated detections of sei whale (Balaenoptera borealis) downsweep calls and North Atlantic right whale (Eubalaena glacialis) upcalls were evaluated using recordings collected in the southwestern Gulf of Maine during the spring seasons of 2006 and 2007. The accuracy of the DCS was similar to that of a human analyst: variability in differences between the DCS and an analyst was similar to that between independent analysts, and temporal variability in call rates was similar among the DCS and several analysts.
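A toy sketch of the classification step: represent each call by pitch-track attributes and score it against per-class Gaussian models, as in quadratic discriminant analysis. The two-feature representation (start frequency, slope) and the synthetic "downsweep"/"upcall" data are illustrative assumptions, not the DCS's actual feature set.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic pitch-track features: (start frequency in Hz, slope in Hz/s).
# Downsweeps start higher with negative slope; upcalls the opposite.
down = np.column_stack([rng.normal(80, 5, 100), rng.normal(-40, 8, 100)])
up = np.column_stack([rng.normal(60, 5, 100), rng.normal(60, 10, 100)])

def qda_fit(groups):
    """Per-class mean and covariance estimated from training pitch tracks."""
    return [(g.mean(axis=0), np.cov(g.T)) for g in groups]

def qda_predict(model, x):
    """Quadratic discriminant score (equal priors, constants dropped)."""
    scores = []
    for mu, cov in model:
        d = x - mu
        scores.append(-0.5 * np.log(np.linalg.det(cov))
                      - 0.5 * d @ np.linalg.inv(cov) @ d)
    return int(np.argmax(scores))

model = qda_fit([down, up])
print(qda_predict(model, np.array([82.0, -35.0])),   # downsweep-like call
      qda_predict(model, np.array([58.0, 55.0])))    # upcall-like call
```

The published DCS extracts its features from pitch tracks produced on real spectrograms and handles noise explicitly; only the discriminant-function step is shown here.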
A New Forensic Picture Polygraph Technique for Terrorist and Crime Deception System
ERIC Educational Resources Information Center
Costello, R. H. Brian; Axton, JoAnn; Gold, Karen L.
2006-01-01
The Forensic Terrorist Detection System called the Pinocchio Assessment Profile (PAP) employs standard-issue polygraphs for a non-verbal picture technique that originated as a biofeedback careers-interest instrument. The system can be integrated readily into airport screening protocols. However, the method does not rely on questioning or foreign language…
2015-06-01
[Report fragment] The anomaly recognition and detection (AnRAD) system was built as a cogent confabulation network and was generalized, via a self-structuring technique, for the additional application of network intrusion detection ... it represented road ... square-kilometer areas. Cited: "... to Host-based Intrusion Detection Systems using Contiguous and Discontiguous System Call Patterns," IEEE Transactions on Computers, 63(4), pp. 807-...
Wang, Shi-ping; He, Xin; Zhou, Yun-fei
2015-12-01
Schistosomiasis is a zoonotic parasitosis that severely impairs human health. Rapid detection of infection sources is key to the control of schistosomiasis. With the effective control of schistosomiasis in China, detection techniques for infection sources have also developed. The rate and intensity of infection among humans and livestock have decreased significantly in China, as the control program has entered the transmission-control stage in most endemic areas. Under this situation, traditional etiological diagnostic techniques and common immunological methods cannot provide rapid detection of schistosomiasis infection sources. Instead, detection methods are needed that offer higher sensitivity, specificity and stability while being less time-consuming, more convenient and less costly. In recent years, many improved or novel detection methods have been applied to the epidemiological surveillance of schistosomiasis, such as the automatic scanning microscopic image acquisition system, PCR-ELISA, immunosensors, and loop-mediated isothermal amplification. The development of new monitoring techniques can facilitate rapid detection of schistosome infection sources in endemic areas.
NASA Astrophysics Data System (ADS)
Helble, Tyler Adam
Passive acoustic monitoring of marine mammal calls is an increasingly important method for assessing population numbers, distribution, and behavior. Automated methods are needed to aid in the analyses of the recorded data. When a mammal vocalizes in the marine environment, the received signal is a filtered version of the original waveform emitted by the marine mammal. The waveform is reduced in amplitude and distorted due to propagation effects that are influenced by the bathymetry and environment. It is important to account for these effects to determine a site-specific probability of detection for marine mammal calls in a given study area. A knowledge of that probability function over a range of environmental and ocean noise conditions allows vocalization statistics from recordings of single, fixed, omnidirectional sensors to be compared across sensors and at the same sensor over time with less bias and uncertainty in the results than direct comparison of the raw statistics. This dissertation focuses on both the development of new tools needed to automatically detect humpback whale vocalizations from single-fixed omnidirectional sensors as well as the determination of the site-specific probability of detection for monitoring sites off the coast of California. Using these tools, detected humpback calls are "calibrated" for environmental properties using the site-specific probability of detection values, and presented as call densities (calls per square kilometer per time). A two-year monitoring effort using these calibrated call densities reveals important biological and ecological information on migrating humpback whales off the coast of California. Call density trends are compared between the monitoring sites and at the same monitoring site over time. Call densities also are compared to several natural and human-influenced variables including season, time of day, lunar illumination, and ocean noise. 
The results reveal substantial differences in call densities between the two sites which were not noticeable using uncorrected (raw) call counts. Additionally, a Lombard effect was observed for humpback whale vocalizations in response to increasing ocean noise. The results presented in this thesis develop techniques to accurately measure marine mammal abundances from passive acoustic sensors.
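The calibration idea in the dissertation abstract above, converting raw detections into call densities by dividing by a site-specific probability of detection, can be sketched as follows; the numbers and the `call_density` helper are hypothetical.

```python
def call_density(n_detected, p_detection, area_km2, hours):
    """Correct raw call counts by the site-specific probability of
    detection, yielding calls per square kilometer per hour."""
    return n_detected / (p_detection * area_km2 * hours)

# Two hypothetical sites with equal raw counts but different detection
# probabilities (e.g. due to bathymetry and ocean noise).
site_a = call_density(120, 0.8, 1000.0, 24.0)
site_b = call_density(120, 0.4, 1000.0, 24.0)
print(site_a < site_b)  # prints: True
```

The raw counts alone would suggest identical activity at the two sites; the corrected densities show twice the calling at the site where detection is harder, which is exactly the kind of bias the dissertation's site-specific calibration removes.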
Cell culture-based biosensing techniques for detecting toxicity in water.
Tan, Lu; Schirmer, Kristin
2017-06-01
The significant increase of contaminants entering fresh water bodies calls for the development of rapid and reliable methods to monitor the aquatic environment and to detect water toxicity. Cell culture-based biosensing techniques utilise the overall cytotoxic response to external stimuli, mediated by a transduced signal, to specify the toxicity of aqueous samples. These biosensing techniques can effectively indicate water toxicity for human safety and aquatic organism health. In this review we account for the recent developments of the mainstream cell culture-based biosensing techniques for water quality evaluation, discuss their key features, potentials and limitations, and outline the future prospects of their development. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Chen, Wen-Yuan; Wang, Mei; Fu, Zhou-Xing
2014-06-16
Most railway accidents happen at railway crossings. Therefore, detecting humans or objects present in the risk area of a railway crossing, and thus preventing accidents, is an important task. In this paper, three strategies are used to detect the risk area of a railway crossing: (1) a terrain drop compensation (TDC) technique solves the problem of the concavity of railway crossings; (2) a linear regression technique predicts the position and length of an object from image processing; (3) a novel strategy called calculating local maximum Y-coordinate object points (CLMYOP) obtains the ground points of the object. In addition, image preprocessing is applied to filter out noise and improve object detection. The experimental results demonstrate that our scheme is an effective method for detecting railway-crossing risk areas.
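One plausible reading of the CLMYOP strategy, sketched on a toy binary object mask: for each image column containing object pixels, take the largest y coordinate (the lowest pixel in image coordinates) as a candidate ground point. The mask and the per-column rule are illustrative assumptions about the method, not the paper's implementation.

```python
import numpy as np

def clmyop(mask):
    """For each column with object pixels, return (x, max y): the point
    where the object meets the ground in image coordinates (y grows
    downward, so the maximum y is the lowest object pixel)."""
    points = []
    for x in range(mask.shape[1]):
        ys = np.flatnonzero(mask[:, x])
        if ys.size:
            points.append((x, int(ys.max())))
    return points

# Toy 6x6 mask: a 2-pixel-wide "object" occupying rows 1-4 in columns 2-3.
mask = np.zeros((6, 6), dtype=bool)
mask[1:5, 2:4] = True
print(clmyop(mask))  # prints: [(2, 4), (3, 4)]
```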
Pulse-compression ghost imaging lidar via coherent detection.
Deng, Chenjin; Gong, Wenlin; Han, Shensheng
2016-11-14
Ghost imaging (GI) lidar, a novel remote sensing technique, has received increasing interest in recent years. By combining the pulse-compression technique and coherent detection with GI, we propose a new lidar system called pulse-compression GI lidar. Our analytical results, backed by numerical simulations, demonstrate that pulse-compression GI lidar can obtain a target's spatial intensity distribution, range, and moving velocity. Compared with a conventional pulsed GI lidar system, pulse-compression GI lidar can achieve high single-pulse energy with a long pulse without sacrificing range resolution, and its coherent-detection mechanism eliminates the influence of stray light, which helps improve detection sensitivity and detection range.
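The pulse-compression idea above, transmit a long chirp and recover range resolution with a matched filter, can be sketched in one dimension; the sample rate, bandwidth, and echo delay are illustrative assumptions.

```python
import numpy as np

fs = 1e6                                      # sample rate (Hz), assumed
T, B = 1e-3, 100e3                            # pulse length (s) and chirp bandwidth (Hz)
t = np.arange(int(T * fs)) / fs
chirp = np.exp(1j * np.pi * (B / T) * t**2)   # long linear-FM transmit pulse

rx = np.zeros(2000, dtype=complex)            # receive window
rx[200:200 + chirp.size] = chirp              # echo delayed by 200 samples

# Pulse compression: matched filter = correlation with the transmit chirp
# (numpy conjugates the second argument, as a matched filter requires).
mf = np.abs(np.correlate(rx, chirp, mode="valid"))
peak = int(np.argmax(mf))
print(peak, round(float(mf[peak])))           # prints: 200 1000
```

The compressed peak sits at the true delay, and its width is set by 1/B rather than by the pulse length T, so the long pulse carries high energy while the time-bandwidth product (here T*B = 100) supplies the resolution gain.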
Results from the MACHO Galactic Pixel Lensing Search
NASA Astrophysics Data System (ADS)
Drake, Andrew J.; Minniti, Dante; Alcock, Charles; Allsman, Robyn A.; Alves, David; Axelrod, Tim S.; Becker, Andrew C.; Bennett, David; Cook, Kem H.; Freeman, Ken C.; Griest, Kim; Lehner, Matt; Marshall, Stuart; Peterson, Bruce; Pratt, Mark; Quinn, Peter; Rodgers, Alex; Stubbs, Chris; Sutherland, Will; Tomaney, Austin; Vandehei, Thor; Welch, Doug L.
The MACHO, EROS, OGLE and AGAPE collaborations have been studying the nature of the Galactic halo for a number of years using microlensing events. The MACHO group observes the LMC, SMC and Galactic Bulge, monitoring the light curves of millions of stars to detect microlensing. Most of these fields are crowded to the extent that all the monitored stars are blended, which makes accurate photometry difficult. We apply the new technique of Difference Image Analysis (DIA) to archival data to improve the photometry and increase both the detection sensitivity and the effective search area. This technique also allows us to detect so-called `pixel lensing' events: microlensing events in which the source star is detectable only during lensing. Detecting these events will allow a large increase in the number of detected microlensing events. We present a light curve demonstrating the detection of a pixel lensing event with this technique.
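A stripped-down sketch of difference imaging: subtract a reference frame from a target frame so that constant (even blended) stars cancel and only the variable source remains. Real DIA also convolves the reference with a kernel to match the point-spread function between frames; this toy, with its made-up star field, omits that step.

```python
import numpy as np

rng = np.random.default_rng(2)

def star(shape, x0, y0, flux, sigma=1.5):
    """Gaussian stellar profile on a pixel grid (toy PSF)."""
    y, x = np.indices(shape)
    return flux * np.exp(-((x - x0)**2 + (y - y0)**2) / (2 * sigma**2))

shape = (32, 32)
# Crowded reference frame: two constant, blended stars.
ref = star(shape, 10, 10, 100) + star(shape, 12, 11, 80)
# Target frame: same field, plus a faint source brightening at (20, 20),
# invisible against the blend in a raw frame but not in the difference.
target = ref + star(shape, 20, 20, 30) + rng.normal(0, 0.5, shape)

diff = target - ref                  # constant stars cancel in the difference
yb, xb = np.unravel_index(np.argmax(diff), shape)
print(int(xb), int(yb))              # prints: 20 20
```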
Skeletal Scintigraphy (Bone Scan)
... The special camera and imaging techniques used in nuclear medicine include the gamma camera and single-photon emission-computed tomography (SPECT). The gamma camera, also called a scintillation camera, detects radioactive energy that is emitted from the patient's body and ...
Patra, Sarbani; Keshavamurthy, Srihari
2018-02-14
It has been known for some time that isomerization reactions, classically, are mediated by phase-space structures called reactive islands (RIs). RIs provide one possible route to correcting for nonstatistical effects in reaction dynamics. In this work, we map out the reactive islands for the two-dimensional Müller-Brown model potential and show that they are intimately linked to the issue of rare-event sampling. In particular, we establish the sensitivity of the so-called committor probabilities, useful quantities in the transition path sampling technique, to the hierarchical RI structures. Mapping out the RI structure of high-dimensional systems, however, is a challenging task. Here, we show that the technique of Lagrangian descriptors can effectively identify the RI hierarchy in the model system. Based on our results, we suggest that Lagrangian descriptors can be useful for detecting RIs in high-dimensional systems.
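A sketch of a Lagrangian-descriptor computation on the Müller-Brown potential: integrate a trajectory forward and backward in time and accumulate its arc length. The parameter values are the standard Müller-Brown constants; the initial conditions, integration times, and the simple symplectic-Euler integrator are illustrative choices, not the paper's setup.

```python
import numpy as np

# Standard Müller-Brown potential parameters.
A = np.array([-200.0, -100.0, -170.0, 15.0])
a = np.array([-1.0, -1.0, -6.5, 0.7])
b = np.array([0.0, 0.0, 11.0, 0.6])
c = np.array([-10.0, -10.0, -6.5, 0.7])
X = np.array([1.0, 0.0, -0.5, -1.0])
Y = np.array([0.0, 0.5, 1.5, 1.0])

def grad(x, y):
    """Gradient of the Müller-Brown potential at (x, y)."""
    e = A * np.exp(a * (x - X)**2 + b * (x - X) * (y - Y) + c * (y - Y)**2)
    gx = np.sum(e * (2 * a * (x - X) + b * (y - Y)))
    gy = np.sum(e * (b * (x - X) + 2 * c * (y - Y)))
    return gx, gy

def lagrangian_descriptor(x, y, vx, vy, tau=0.1, dt=1e-4):
    """Configuration-space arc length accumulated over [-tau, tau]
    (symplectic Euler, unit mass): one common variant of the descriptor."""
    total = 0.0
    for sign in (+1.0, -1.0):                 # forward, then backward in time
        px, py, ux, uy = x, y, vx, vy
        for _ in range(int(tau / dt)):
            gx, gy = grad(px, py)
            ux -= sign * dt * gx
            uy -= sign * dt * gy
            px += sign * dt * ux
            py += sign * dt * uy
            total += dt * np.hypot(ux, uy)
    return total

# Nearby initial conditions with opposite momenta; contrasting descriptor
# values are the kind of signal used to delineate reactive-island boundaries.
m1 = lagrangian_descriptor(-0.5, 0.6, 1.0, 0.0)
m2 = lagrangian_descriptor(-0.5, 0.6, -1.0, 0.0)
print(m1 > 0 and m2 > 0)  # prints: True
```

In practice the descriptor is evaluated on a grid of initial conditions at fixed energy, and abrupt changes in its value trace out the RI hierarchy the abstract refers to.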
Wesch, Charlotte; Barthel, Anne-Kathrin; Braun, Ulrike; Klein, Roland; Paulus, Martin
2016-07-01
Monitoring the ingestion of microplastics is challenging, and suitable detection techniques are insufficiently used; thus, misidentifying natural microfibres as synthetic cannot be avoided. As part of a framework to monitor the ingestion of microplastics in eelpout, this short report addresses the accurate identification of microfibres. We show that putatively synthetic microfibres identified by visual inspection are in fact of natural origin, as ascertained by spectrometric analyses. Consequently, we call for the inclusion of spectroscopic techniques in standardized microplastic monitoring schemes. Copyright © 2016 Elsevier Inc. All rights reserved.
Infrared Imaging and Characterization of Exoplanets: Can we Detect Earth-Twins on a Budget?
NASA Technical Reports Server (NTRS)
Danchi, William
2010-01-01
During the past decade considerable progress has been made in developing techniques to detect and characterize Earth twins in the mid-infrared (7-20 microns). The principal technique, called nulling interferometry, was invented by Bracewell in the late 1970s. The nulling technique is an interferometric equivalent of an optical coronagraph. At present, most of the technological hurdles have been overcome, so a space mission could begin Phase A early in the next decade, and it is possible to detect and characterize Earth twins on a mid-sized strategic mission budget ($600-800 million). I will review progress on this exciting method of planet detection in the context of recent work by the Exoplanet Community Forum and the US Decadal Survey (Astro2010), including biomarkers, technological progress, mission concepts, the theory of these instruments, and a comparison of the discovery space of this technique with others under consideration.
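The nulling idea can be sketched with the ideal two-telescope Bracewell response, whose transmission is sin^2(pi * B * theta / lambda): the on-axis star sits in the null while an off-axis planet can fall near a transmission maximum. The wavelength, baseline, and planet separation below are illustrative assumptions.

```python
import numpy as np

lam = 10e-6        # observing wavelength: 10 microns (mid-infrared), assumed
B = 10.0           # interferometer baseline in meters, assumed

def transmission(theta):
    """Ideal two-telescope Bracewell nuller: T = sin^2(pi * B * theta / lam),
    theta being the angular offset from the optical axis in radians."""
    return np.sin(np.pi * B * theta / lam) ** 2

mas = np.radians(1 / 3.6e6)            # one milliarcsecond in radians
star_offset = 0.0                      # the target star is on-axis
planet_offset = 100 * mas              # hypothetical planet 100 mas off-axis

print(transmission(star_offset),
      round(float(transmission(planet_offset)), 3))  # prints: 0.0 0.998
```

The starlight is cancelled exactly at the null while this particular planet separation transmits almost fully; real designs also rotate the array and chop to separate the planet signal from the exozodiacal background.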
Baeten; Bruggeman; Paepen; Carchon
2000-03-01
The non-destructive quantification of transuranic elements in nuclear waste management or in safeguards verifications is commonly performed by passive neutron assay techniques. To minimise the number of unknown sample-dependent parameters, Neutron Multiplicity Counting (NMC) is applied. We developed a new NMC-technique, called Time Interval Correlation Spectroscopy (TICS), which is based on the measurement of Rossi-alpha time interval distributions. Compared to other NMC-techniques, TICS offers several advantages.
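A toy sketch of the Rossi-alpha time-interval distribution the TICS technique is built on: histogram the intervals from each detection to all later detections within a gate; correlated (paired) events produce an excess at short intervals above the flat accidental background. The rates and the simple trigger-plus-partner pairing model are illustrative assumptions, not a fission-chain model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Uncorrelated background: Poisson detections at rate r (counts/s).
r = 100.0
bg = np.cumsum(rng.exponential(1 / r, 5000))
# Correlated events: each of 500 triggers spawns a partner ~100 us later.
trig = np.sort(rng.uniform(0, bg[-1], 500))
pairs = trig + rng.exponential(1e-4, 500)
events = np.sort(np.concatenate([bg, trig, pairs]))

def rossi_alpha(times, gate=2e-3, bins=20):
    """Histogram of intervals from each event to all later events in the gate."""
    hist = np.zeros(bins)
    edges = np.linspace(0, gate, bins + 1)
    for i, t in enumerate(times):
        j = i + 1
        while j < times.size and times[j] - t < gate:
            hist[np.searchsorted(edges, times[j] - t) - 1] += 1
            j += 1
    return hist

h = rossi_alpha(events)
print(h[0] > h[-1])  # prints: True (short-interval excess from correlated pairs)
```

In a real TICS measurement the decaying short-interval excess carries the multiplication information, while the flat tail measures the accidental rate.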
Moon or Planet? The Exomoon Hunt Continues Artist Concept
2014-04-10
Researchers have detected the first exomoon candidate -- a moon orbiting a planet that lies outside our solar system. Using a technique called microlensing, they observed what could be either a moon and a planet -- or a planet and a star.
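The microlensing technique mentioned above rests on the standard point-source, point-lens magnification A(u) = (u^2 + 2) / (u * sqrt(u^2 + 4)), where u is the lens-source separation in Einstein radii; a toy light curve with illustrative event parameters:

```python
import math

def magnification(u):
    """Point-source point-lens magnification A(u)."""
    return (u * u + 2) / (u * math.sqrt(u * u + 4))

def u_of_t(t, t0, u0, tE):
    """Separation vs time for a source crossing the Einstein ring:
    t0 = time of closest approach, u0 = impact parameter, tE = crossing time."""
    return math.hypot(u0, (t - t0) / tE)

# Hypothetical event: u0 = 0.1, tE = 20 days, sampled every 10 days.
curve = [magnification(u_of_t(t, t0=0.0, u0=0.1, tE=20.0))
         for t in range(-50, 51, 10)]
peak = max(curve)
print(round(peak, 3))  # prints: 10.037
```

A moon or planet around the lens would add a short-lived deviation on top of this smooth single-lens curve; distinguishing the two interpretations (moon-plus-planet versus planet-plus-star) is what makes the candidate ambiguous.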
NASA Astrophysics Data System (ADS)
Pal, Siddharth; Basak, Aniruddha; Das, Swagatam
In many manufacturing areas the detection of surface defects is one of the most important processes in quality control. Currently, in order to detect small scratches on solid surfaces, most industries working on material manufacturing rely primarily on visual inspection. In this article we propose a hybrid computational intelligence technique to automatically detect a linear scratch on a solid surface and simultaneously estimate its length (in pixel units). The approach is based on a swarm intelligence algorithm called Ant Colony Optimization (ACO) and image preprocessing with Wiener and Sobel filters as well as the Canny edge detector. The ACO algorithm is mostly used to compensate for the broken parts of the scratch. Our experimental results confirm that the proposed technique can be used for detecting scratches in noisy and degraded images, even when it is very difficult for conventional image processing to distinguish the scratch area from its background.
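The preprocessing stage mentioned above can be illustrated with one of its steps, the Sobel gradient. The sketch below is a minimal, self-contained version on a tiny synthetic image; the image values are invented for illustration, and a real pipeline would also include Wiener denoising, Canny hysteresis thresholding, and the ACO linking stage.

```python
# Hedged sketch: Sobel gradient magnitude, one preprocessing step the
# abstract mentions ahead of ACO-based scratch linking. The 5x5 image
# below is synthetic; a real pipeline would also apply Wiener denoising
# and Canny edge detection.
def sobel_magnitude(img):
    """Return the gradient-magnitude map (interior pixels only)."""
    h, w = len(img), len(img[0])
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A faint vertical scratch (bright column) on a flat background:
img = [[10, 10, 50, 10, 10] for _ in range(5)]
mag = sobel_magnitude(img)
```

The gradient magnitude peaks on the two edges of the scratch column, which is the raw material the edge detector and the ACO gap-filling stage would work from.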
Activateable Imaging Probes Light Up Inside Cancer Cells | Center for Cancer Research
Imaging can be used to help diagnose cancer as well as monitor tumor progression and response to treatment. The field of molecular imaging focuses on techniques capable of detecting specific molecular targets associated with cancer; the agents used for molecular imaging—often called probes—are multifunctional, with components that allow them to both interact with their molecular target and emit a detectable signal.
Coherent-Phase Monitoring Of Cavitation In Turbomachines
NASA Technical Reports Server (NTRS)
Jong, Jen-Yi
1996-01-01
Digital electronic signal-processing system analyzes outputs of accelerometers mounted on turbomachine to detect vibrations characteristic of cavitation. Designed to overcome limitation imposed by interference from discrete components. System digitally implements technique called "coherent-phase wide-band demodulation" (CPWBD), using phase-only (PO) filtering along with envelope detection to search for unique coherent-phase relationship associated with cavitation and to minimize influence of large-amplitude discrete components.
Ricci, Shannon W.; Bohnenstiehl, DelWayne R.; Eggleston, David B.; Kellogg, M. Lisa; Lyon, R. Patrick
2017-01-01
During May 2015, passive acoustic recorders were deployed at eight subtidal oyster reefs within Harris Creek Oyster Sanctuary in Chesapeake Bay, Maryland USA. These sites were selected to represent both restored and unrestored habitats having a range of oyster densities. Throughout the survey, the soundscape within Harris Creek was dominated by the boatwhistle calls of the oyster toadfish, Opsanus tau. A novel, multi-kernel spectral correlation approach was developed to automatically detect these boatwhistle calls using their two lowest harmonic bands. The results provided quantitative information on how call rate and call frequency varied in space and time. Toadfish boatwhistle fundamental frequency ranged from 140 Hz to 260 Hz and was well correlated (r = 0.94) with changes in water temperature, with the fundamental frequency increasing by ~11 Hz for every 1°C increase in temperature. The boatwhistle call rate increased from just a few calls per minute at the start of monitoring on May 7th to ~100 calls/min on May 10th and remained elevated throughout the survey. As male toadfish are known to generate boatwhistles to attract mates, this rapid increase in call rate was interpreted to mark the onset of spring spawning behavior. Call rate was not modulated by water temperature, but showed a consistent diurnal pattern, with a sharp decrease in rate just before sunrise and a peak just after sunset. There was a significant difference in call rate between restored and unrestored reefs, with restored sites having nearly twice the call rate as unrestored sites. This work highlights the benefits of using automated detection techniques that provide quantitative information on species-specific call characteristics and patterns. This type of non-invasive acoustic monitoring provides long-term, semi-continuous information on animal behavior and abundance, and operates effectively in settings that are otherwise difficult to sample. PMID:28792543
The ground truth about metadata and community detection in networks.
Peel, Leto; Larremore, Daniel B; Clauset, Aaron
2017-05-01
Across many scientific domains, there is a common need to automatically extract a simplified view or coarse-graining of how a complex system's components interact. This general task is called community detection in networks and is analogous to searching for clusters in independent vector data. It is common to evaluate the performance of community detection algorithms by their ability to find so-called ground truth communities. This works well in synthetic networks with planted communities because these networks' links are formed explicitly based on those known communities. However, there are no planted communities in real-world networks. Instead, it is standard practice to treat some observed discrete-valued node attributes, or metadata, as ground truth. We show that metadata are not the same as ground truth and that treating them as such induces severe theoretical and practical problems. We prove that no algorithm can uniquely solve community detection, and we prove a general No Free Lunch theorem for community detection, which implies that there can be no algorithm that is optimal for all possible community detection tasks. However, community detection remains a powerful tool and node metadata still have value, so a careful exploration of their relationship with network structure can yield insights of genuine worth. We illustrate this point by introducing two statistical techniques that can quantify the relationship between metadata and community structure for a broad class of models. We demonstrate these techniques using both synthetic and real-world networks, and for multiple types of metadata and community structures.
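One simple way to quantify how well node metadata line up with a detected community partition is normalized mutual information (NMI). The sketch below is a standard agreement score, not the specific statistical techniques the paper introduces; the two labelings are invented for illustration.

```python
# Hedged sketch: normalized mutual information (NMI) between a metadata
# labelling and a detected community partition. This is a generic
# agreement score, not the paper's own statistical tests; the labels
# below are illustrative.
from math import log
from collections import Counter

def nmi(labels_a, labels_b):
    n = len(labels_a)
    ca, cb = Counter(labels_a), Counter(labels_b)
    joint = Counter(zip(labels_a, labels_b))
    mi = sum((nij / n) * log(n * nij / (ca[a] * cb[b]))
             for (a, b), nij in joint.items())
    ha = -sum((c / n) * log(c / n) for c in ca.values())
    hb = -sum((c / n) * log(c / n) for c in cb.values())
    if ha == 0 or hb == 0:      # a constant labelling carries no information
        return 0.0
    return mi / ((ha * hb) ** 0.5)

metadata    = [0, 0, 0, 1, 1, 1]    # e.g. an observed node attribute
communities = [0, 0, 1, 1, 1, 1]    # e.g. output of a detection algorithm
score = nmi(metadata, communities)  # 1.0 = identical partitions, ~0 = unrelated
```

A score well below 1 for real data is exactly the situation the paper warns about: the metadata are correlated with, but not identical to, the structural communities.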
Comparison of VRX CT scanners geometries
NASA Astrophysics Data System (ADS)
DiBianca, Frank A.; Melnyk, Roman; Duckworth, Christopher N.; Russ, Stephan; Jordan, Lawrence M.; Laughter, Joseph S.
2001-06-01
A technique called Variable-Resolution X-ray (VRX) detection greatly increases the spatial resolution in computed tomography (CT) and digital radiography (DR) as the field size decreases. The technique is based on a principle called 'projective compression' that allows both the resolution element and the sampling distance of a CT detector to scale with the subject or field size. For very large (40-50 cm) field sizes, resolution exceeding 2 cy/mm is possible, and for very small fields, microscopy is attainable with resolution exceeding 100 cy/mm. This paper compares the benefits obtainable with two different VRX detector geometries: the single-arm geometry and the dual-arm geometry. The analysis is based on Monte Carlo simulations and direct calculations. The results of this study indicate that the dual-arm system appears to have more advantages than the single-arm technique.
CONFU: Configuration Fuzzing Testing Framework for Software Vulnerability Detection
Dai, Huning; Murphy, Christian; Kaiser, Gail
2010-01-01
Many software security vulnerabilities only reveal themselves under certain conditions, i.e., particular configurations and inputs together with a certain runtime environment. One approach to detecting these vulnerabilities is fuzz testing. However, typical fuzz testing makes no guarantees regarding the syntactic and semantic validity of the input, or of how much of the input space will be explored. To address these problems, we present a new testing methodology called Configuration Fuzzing. Configuration Fuzzing is a technique whereby the configuration of the running application is mutated at certain execution points, in order to check for vulnerabilities that only arise in certain conditions. As the application runs in the deployment environment, this testing technique continuously fuzzes the configuration and checks “security invariants” that, if violated, indicate a vulnerability. We discuss the approach and introduce a prototype framework called ConFu (CONfiguration FUzzing testing framework) for implementation. We also present the results of case studies that demonstrate the approach’s feasibility and evaluate its performance. PMID:21037923
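The core loop of the approach can be sketched in a few lines: mutate one configuration option at an execution point, then check a security invariant. Everything below is hypothetical, not ConFu's actual API; the configuration keys and the invariant are invented to show the shape of the technique.

```python
# Hedged sketch of the Configuration Fuzzing idea: at an execution
# point, flip one configuration option and check a "security
# invariant". The config keys and the invariant are hypothetical,
# not ConFu's actual interface.
import random

CONFIG = {"allow_anonymous": False, "debug_endpoints": False, "max_upload_mb": 10}

def fuzz_config(config, rng):
    """Mutation step: flip one randomly chosen boolean option."""
    mutated = dict(config)
    key = rng.choice([k for k, v in config.items() if isinstance(v, bool)])
    mutated[key] = not mutated[key]
    return mutated

def security_invariant(config, authenticated):
    """Invariant: unauthenticated users must never see debug endpoints."""
    return authenticated or not config["debug_endpoints"]

rng = random.Random(42)
violations = []
for _ in range(100):                      # fuzz at simulated execution points
    cfg = fuzz_config(CONFIG, rng)
    if not security_invariant(cfg, authenticated=False):
        violations.append(cfg)            # a potential vulnerability surfaced
```

Each recorded violation is a configuration under which the invariant broke, which is the signal the framework would report as a candidate vulnerability.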
Self-checking self-repairing computer nodes using the mirror processor
NASA Technical Reports Server (NTRS)
Tamir, Yuval
1992-01-01
Circuitry added to fault-tolerant systems for concurrent error detection usually reduces performance. Using a technique called micro rollback, it is possible to eliminate most of the performance penalty of concurrent error detection. Error detection is performed in parallel with intermodule communication, and erroneous state changes are later undone. The author reports on the design and implementation of a VLSI RISC microprocessor, called the Mirror Processor (MP), which is capable of micro rollback. In order to achieve concurrent error detection, two MP chips operate in lockstep, comparing external signals and a signature of internal signals every clock cycle. If a mismatch is detected, both processors roll back to the beginning of the cycle when the error occurred. In some cases the erroneous state is corrected by copying a value from the fault-free processor to the faulty processor. The architecture, microarchitecture, and VLSI implementation of the MP, emphasizing its error-detection, error-recovery, and self-diagnosis capabilities, are described.
Early driver fatigue detection from electroencephalography signals using artificial neural networks.
King, L M; Nguyen, H T; Lal, S K L
2006-01-01
This paper describes a driver fatigue detection system using an artificial neural network (ANN). Using electroencephalogram (EEG) data sampled from 20 professional truck drivers and 35 non-professional drivers, the time-domain data are processed into alpha, beta, delta and theta bands and then presented to the neural network to detect the onset of driver fatigue. The neural network uses a training optimization technique called the magnified gradient function (MGF). This technique reduces the time required for training by modifying the standard back-propagation (SBP) algorithm. The MGF is shown to classify professional driver fatigue with 81.49% accuracy (80.53% sensitivity, 82.44% specificity) and non-professional driver fatigue with 83.06% accuracy (84.04% sensitivity and 82.08% specificity).
Penalty dynamic programming algorithm for dim targets detection in sensor systems.
Huang, Dayu; Xue, Anke; Guo, Yunfei
2012-01-01
In order to detect and track multiple maneuvering dim targets in sensor systems, an improved dynamic programming track-before-detect algorithm (DP-TBD) called penalty DP-TBD (PDP-TBD) is proposed. The performance of tracking techniques is used as feedback to the detection stage. The feedback is constructed as a penalty term in the merit function, and the penalty term is a function of the possible target state estimate, which can be obtained by the tracking methods. With this feedback, the algorithm combines traditional tracking techniques with DP-TBD, and it can be applied to simultaneously detect and track maneuvering dim targets. Meanwhile, a reasonable constraint that a sensor measurement can originate from one target or clutter is proposed to minimize track separation. Thus, the algorithm can be used in multi-target situations with unknown target numbers. The efficiency and advantages of PDP-TBD compared with two existing methods are demonstrated by several simulations.
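The DP-TBD idea can be sketched on a one-dimensional measurement grid: accumulate a merit score across frames, allowing only small state transitions, and subtract a penalty for each transition. The simple distance penalty below stands in for the paper's tracking-derived feedback term; the frames, step limit, and penalty weight are all illustrative.

```python
# Hedged sketch of dynamic-programming track-before-detect (DP-TBD) on a
# 1-D measurement grid, with a plain distance penalty standing in for the
# paper's feedback penalty term. Frames and parameters are illustrative.
def dp_tbd(frames, max_step=1, penalty=0.5):
    """Return the best accumulated merit and the recovered cell track."""
    n = len(frames[0])
    merit = list(frames[0])                      # stage 0: raw intensities
    back = []
    for frame in frames[1:]:
        prev, merit, links = merit, [0.0] * n, [0] * n
        for c in range(n):
            # best predecessor within max_step cells, minus a move penalty
            cand = [(prev[p] - penalty * abs(p - c), p)
                    for p in range(max(0, c - max_step), min(n, c + max_step + 1))]
            best, links[c] = max(cand)
            merit[c] = frame[c] + best
        back.append(links)
    end = max(range(n), key=merit.__getitem__)   # a detection threshold test
    track = [end]                                # would be applied here
    for links in reversed(back):
        track.append(links[track[-1]])
    return merit[end], track[::-1]

# A dim target drifting right one cell per frame, in noise-free data:
frames = [[0, 3, 0, 0], [0, 0, 3, 0], [0, 0, 0, 3]]
score, track = dp_tbd(frames)
```

Because the merit is integrated over frames before any threshold is applied, a target too dim to detect in any single frame can still exceed the threshold over the batch, which is the point of TBD methods.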
Glass transition temperatures of liquid prepolymers obtained by thermal penetrometry
NASA Technical Reports Server (NTRS)
Potts, J. E., Jr.; Ashcraft, A. C.
1973-01-01
Thermal penetrometry is an experimental technique for detecting the temperature at which a frozen prepolymer becomes soft enough to be pierced by a weighted penetrometer needle; the temperature at which this occurs is called the penetration temperature. The apparatus used to obtain penetration temperatures can be set up largely from standard parts.
How nonlinear optics can merge interferometry for high resolution imaging
NASA Astrophysics Data System (ADS)
Ceus, D.; Reynaud, F.; Tonello, A.; Delage, L.; Grossard, L.
2017-11-01
High resolution stellar interferometers are powerful and efficient instruments for improving our knowledge of the Universe through the spatial coherence analysis of light. For this purpose, the optical fields collected by each telescope Ti are mixed together. From the interferometric pattern, two quantities are extracted: the contrast Cij and the phase φij. These yield the complex visibility Vij, with Vij=Cijexp(jφij). For each telescope doublet TiTj, it is possible to obtain a complex visibility Vij. The Van Cittert-Zernike theorem gives a relationship between the intensity distribution of the observed object and the complex visibility. Combining the acquired complex visibilities with a reconstruction algorithm allows image reconstruction. To avoid many technical difficulties related to infrared optics (component transmission, thermal noise, thermal cooling…), our team proposes to explore the use of nonlinear optical techniques, a promising alternative for detecting infrared optical signals. We experimentally demonstrate that frequency conversion does not introduce additional bias into the interferometric data supplied by a stellar interferometer. In this presentation, we report on wavelength conversion of the light collected by each telescope from the infrared domain to the visible. The interferometric pattern is observed in the visible domain with our so-called upconversion interferometer. Thereby, one can benefit from mature optical components mainly used in optical telecommunications (waveguides, couplers, multiplexers…) and efficient low-noise detection schemes up to the single-photon counting level.
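The relation Vij = Cij exp(jφij) quoted in the abstract is easy to make concrete: the contrast is the modulus of the complex visibility and the phase is its argument. The numbers below are illustrative, not measured values.

```python
# Hedged illustration of the complex visibility from the abstract,
# V_ij = C_ij * exp(j * phi_ij), for one hypothetical telescope pair.
import cmath

def complex_visibility(contrast, phase):
    """Combine fringe contrast C_ij and phase phi_ij (radians) into V_ij."""
    return contrast * cmath.exp(1j * phase)

v = complex_visibility(0.8, cmath.pi / 4)
# Contrast and phase are recovered as modulus and argument:
recovered_contrast = abs(v)
recovered_phase = cmath.phase(v)
```

This round trip is why the two measured quantities per baseline are exactly the two degrees of freedom of one complex number.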
Detection of Spoofed MAC Addresses in 802.11 Wireless Networks
NASA Astrophysics Data System (ADS)
Tao, Kai; Li, Jing; Sampalli, Srinivas
Medium Access Control (MAC) address spoofing is considered as an important first step in a hacker's attempt to launch a variety of attacks on 802.11 wireless networks. Unfortunately, MAC address spoofing is hard to detect. Most current spoofing detection systems mainly use the sequence number (SN) tracking technique, which has drawbacks. Firstly, it may lead to an increase in the number of false positives. Secondly, such techniques cannot be used in systems with wireless cards that do not follow standard 802.11 sequence number patterns. Thirdly, attackers can forge sequence numbers, thereby causing the attacks to go undetected. We present a new architecture called WISE GUARD (Wireless Security Guard) for detection of MAC address spoofing on 802.11 wireless LANs. It integrates three detection techniques - SN tracking, Operating System (OS) fingerprinting & tracking and Received Signal Strength (RSS) fingerprinting & tracking. It also includes the fingerprinting of Access Point (AP) parameters as an extension to the OS fingerprinting for detection of AP address spoofing. We have implemented WISE GUARD on a test bed using off-the-shelf wireless devices and open source drivers. Experimental results show that the new design enhances the detection effectiveness and reduces the number of false positives in comparison with current approaches.
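The SN tracking component described above can be sketched as a gap check on per-station sequence numbers. The threshold and the traffic trace below are invented for illustration; only the 12-bit wraparound follows the 802.11 sequence counter.

```python
# Hedged sketch of sequence-number (SN) tracking: flag a frame whose SN
# gap from the previous frame of the same MAC address is anomalous.
# The modulus follows 802.11's 12-bit sequence counter; the threshold
# and the traffic below are fabricated for illustration.
SN_MODULO = 4096          # 802.11 sequence numbers wrap at 2^12
GAP_THRESHOLD = 3         # gaps larger than this are treated as suspicious

def detect_spoofing(frames, threshold=GAP_THRESHOLD):
    """frames: list of (mac, sequence_number). Returns flagged indices."""
    last_sn, flagged = {}, []
    for i, (mac, sn) in enumerate(frames):
        if mac in last_sn:
            gap = (sn - last_sn[mac]) % SN_MODULO
            if gap == 0 or gap > threshold:    # duplicate or large jump
                flagged.append(i)
        last_sn[mac] = sn
    return flagged

# Legitimate frames count up steadily; a spoofer injects out-of-step SNs:
traffic = [("aa:bb", 100), ("aa:bb", 101), ("aa:bb", 2500), ("aa:bb", 102)]
suspicious = detect_spoofing(traffic)
```

Note that the legitimate frame following the injected one is also flagged, a small illustration of the false-positive problem that motivates WISE GUARD's combination of SN tracking with OS and RSS fingerprinting.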
Dudzik, Grzegorz; Rzepka, Janusz; Abramski, Krzysztof M
2015-04-01
We present a concept of the polarization switching detection method implemented for frequency-stabilized lasers, called the polarization switching dichroic atomic vapor laser lock (PSDAVLL) technique. It is a combination of the well-known dichroic atomic vapor laser lock method for laser frequency stabilization with a synchronous detection system based on the surface-stabilized ferroelectric liquid crystal (SSFLC). The SSFLC is a polarization switch and quarter wave-plate component. This technique provides a 9.6 dB better dynamic range ratio (DNR) than the well-known two-photodiode detection configuration known as the balanced polarimeter. This paper describes the proposed method used practically in the VCSEL laser frequency stabilization system. The applied PSDAVLL method has allowed us to obtain a frequency stability of 2.7×10⁻⁹ and a reproducibility of 1.2×10⁻⁸, with a DNR of detected signals of around 81 dB. It has been shown that PSDAVLL might be successfully used as a method for spectrally stable laser sources.
Charged-particle emission tomography
Ding, Yijun; Caucci, Luca; Barrett, Harrison H.
2018-01-01
Purpose: Conventional charged-particle imaging techniques, such as autoradiography, provide only two-dimensional (2D) ex vivo images of thin tissue slices. In order to get volumetric information, images of multiple thin slices are stacked. This process is time consuming and prone to distortions, as registration of 2D images is required. We propose a direct three-dimensional (3D) autoradiography technique, which we call charged-particle emission tomography (CPET). This 3D imaging technique enables imaging of thick tissue sections, thus increasing laboratory throughput and eliminating distortions due to registration. CPET also has the potential to enable in vivo charged-particle imaging with a window chamber or an endoscope. Methods: Our approach to charged-particle emission tomography uses particle-processing detectors (PPDs) to estimate attributes of each detected particle. The attributes we estimate include location, direction of propagation, and/or the energy deposited in the detector. Estimated attributes are then fed into a reconstruction algorithm to reconstruct the 3D distribution of charged-particle-emitting radionuclides. Several setups to realize PPDs are designed. Reconstruction algorithms for CPET are developed. Results: Reconstruction results from simulated data showed that a PPD enables CPET if the PPD measures more attributes than just the position from each detected particle. Experiments showed that a two-foil charged-particle detector is able to measure the position and direction of incident alpha particles. Conclusions: We proposed a new volumetric imaging technique for charged-particle-emitting radionuclides, which we have called charged-particle emission tomography (CPET). We also proposed a new class of charged-particle detectors, which we have called particle-processing detectors (PPDs). When a PPD is used to measure the direction and/or energy attributes along with the position attributes, CPET is feasible. PMID:28370094
Object Recognition and Random Image Structure Evolution
ERIC Educational Resources Information Center
Sadr, Jvid; Sinha, Pawan
2004-01-01
We present a technique called Random Image Structure Evolution (RISE) for use in experimental investigations of high-level visual perception. Potential applications of RISE include the quantitative measurement of perceptual hysteresis and priming, the study of the neural substrates of object perception, and the assessment and detection of subtle…
Occurrence of Double Monoclonal Bands on Protein Electrophoresis: An Unusual Finding.
Srinivasan, Vishrut K; Bhagat, Priyanka; Bansal, Frainey; Chhabra, Seema
2016-06-01
Various techniques of protein electrophoresis are used for detection of monoclonal proteins/paraproteins in serum and/or urine of patients with monoclonal gammopathies. These are detected as the so-called 'M' bands (monoclonal bands) on serum protein electrophoresis and/or immunofixation electrophoresis. In most cases, a single M-band is detected. However, more than one M-band can be detected in the samples of a minor proportion of patients. This condition is termed 'double gammopathy' or 'biclonal gammopathy'. Knowledge of such an unusual occurrence is essential for recognition and appropriate interpretation of this entity.
Considerations in detecting CDC select agents under field conditions
NASA Astrophysics Data System (ADS)
Spinelli, Charles; Soelberg, Scott; Swanson, Nathaneal; Furlong, Clement; Baker, Paul
2008-04-01
Surface Plasmon Resonance (SPR) has become a widely accepted technique for real-time detection of interactions between receptor molecules and ligands. Antibody may serve as receptor and can be attached to the gold surface of the SPR device, while candidate analyte fluids contact the detecting antibody. Minute, but detectable, changes in refractive indices (RI) indicate that analyte has bound to the antibody. A decade ago, an inexpensive, robust, miniature and fully integrated SPR chip, called SPREETA, was developed. University of Washington (UW) researchers subsequently developed a portable, temperature-regulated instrument, called SPIRIT, to simultaneously use eight of these three-channel SPREETA chips. A SPIRIT prototype instrument was tested in the field, coupled to a remote reporting system on a surrogate unmanned aerial vehicle (UAV). Two target protein analytes were released sequentially as aerosols with low analyte concentration during each of three flights and were successfully detected and verified. Laboratory experimentation with a more advanced SPIRIT instrument demonstrated detection of very low levels of several select biological agents that might be employed by bioterrorists. Agent detection under field-like conditions is more challenging, especially as analyte concentrations are reduced and complex matrices are introduced. Two different sample preconditioning protocols have been developed for select agents in complex matrices. Use of these preconditioning techniques has allowed laboratory detection in spiked heavy mud of Francisella tularensis at 10^3 CFU/ml, Bacillus anthracis spores at 10^3 CFU/ml, Staphylococcal enterotoxin B (SEB) at 1 ng/ml, and Vaccinia virus (a smallpox simulant) at 10^5 PFU/ml. Ongoing experiments are aimed at simultaneous detection of multiple agents in spiked heavy mud, using a multiplex preconditioning protocol.
Protein blotting protocol for beginners.
Petrasovits, Lars A
2014-01-01
The transfer and immobilization of biological macromolecules onto solid nitrocellulose or nylon (polyvinylidene difluoride (PVDF)) membranes subsequently followed by specific detection is referred to as blotting. DNA blots are called Southerns after the inventor of the technique, Edwin Southern. By analogy, RNA blots are referred to as northerns and protein blots as westerns (Burnette, Anal Biochem 112:195-203, 1981). With few exceptions, western blotting involves five steps, namely, sample collection, preparation, separation, immobilization, and detection. In this chapter, protocols for the entire process from sample collection to detection are described.
Red-shouldered hawk occupancy surveys in central Minnesota, USA
Henneman, C.; McLeod, M.A.; Andersen, D.E.
2007-01-01
Forest-dwelling raptors are often difficult to detect because many species occur at low density or are secretive. Broadcasting conspecific vocalizations can increase the probability of detecting forest-dwelling raptors and has been shown to be an effective method for locating raptors and assessing their relative abundance. Recent advances in statistical techniques based on presence-absence data use probabilistic arguments to derive probability of detection when it is <1 and to provide a model and likelihood-based method for estimating proportion of sites occupied. We used these maximum-likelihood models with data from red-shouldered hawk (Buteo lineatus) call-broadcast surveys conducted in central Minnesota, USA, in 1994-1995 and 2004-2005. Our objectives were to obtain estimates of occupancy and detection probability 1) over multiple sampling seasons (yr), 2) incorporating within-season time-specific detection probabilities, 3) with call type and breeding stage included as covariates in models of probability of detection, and 4) with different sampling strategies. We visited individual survey locations 2-9 times per year, and estimates of both probability of detection (range = 0.28-0.54) and site occupancy (range = 0.81-0.97) varied among years. Detection probability was affected by inclusion of a within-season time-specific covariate, call type, and breeding stage. In 2004 and 2005 we used survey results to assess the effect that number of sample locations, double sampling, and discontinued sampling had on parameter estimates. We found that estimates of probability of detection and proportion of sites occupied were similar across different sampling strategies, and we suggest ways to reduce sampling effort in a monitoring program.
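The occupancy model behind such surveys has a compact likelihood: a site is occupied with probability psi, and each visit to an occupied site yields a detection with probability p, so an all-zero history can arise either from missed detections or from an unoccupied site. The sketch below is the standard single-season form (MacKenzie-style), not the authors' covariate models, and the detection histories are invented.

```python
# Hedged sketch of the single-season occupancy likelihood underlying
# call-broadcast surveys: psi = probability a site is occupied, p =
# per-visit detection probability. Histories below are illustrative.
def site_likelihood(history, psi, p):
    """history: list of 0/1 detections over repeat visits to one site."""
    k = len(history)
    d = sum(history)
    occupied_term = psi * (p ** d) * ((1 - p) ** (k - d))
    if d == 0:
        # an all-zero history can also come from an unoccupied site
        return occupied_term + (1 - psi)
    return occupied_term

histories = [[1, 0, 1], [0, 0, 0], [1, 1, 1]]   # three sites, three visits each
psi, p = 0.9, 0.4
likelihood = 1.0
for h in histories:
    likelihood *= site_likelihood(h, psi, p)     # product over independent sites
```

Maximizing this product over psi and p is what yields the detection-corrected occupancy estimates reported in the abstract.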
2016-12-01
Atlantic recordings is centered around 20 Hz, is the most often reported fin whale sound worldwide (Watkins 1982; Edds 1988; Thompson et al. 1990... Watkins et al. 2000; Clark et al. 2002; Nieukirk et al. 2004; Širović et al. 2004; Castellote et al. 2012). However, only males have been found to produce... in feeding contexts without gender exception (Watkins 1982). The frequency band of 40-Hz calls is generally 30-100 Hz, more often 40-75 Hz with
NASA Astrophysics Data System (ADS)
Hsu, David K.; Barnard, Daniel J.
1998-03-01
The galvanic action between steel fasteners and aluminum wing skins of aircraft often leads to hidden exfoliation corrosion around the countersink surface of the fastener heads. To detect and evaluate the severity of such corrosion defects, the Dripless Bubbler ultrasonic scanner was applied. This technique uses a focused beam of high frequency ultrasound in a closed-cycle, water-coupled scan of wing skin test panels containing corroded and uncorroded fasteners. With full waveform acquisition, not only the lateral extent but also the depth profile of the corrosion around the fastener heads was mapped out, subject to shadowing of defects at different depths. The technique is capable of providing quantitative assessment of the severity of the corrosion. In tests conducted to evaluate different techniques, the Dripless Bubbler has shown high probability of detection and low false call rate. The presence of paint on the surface did not degrade the performance of the technique. In addition, the Dripless Bubbler was also used on wing skin panels containing repair 'blend-out' regions that had 0.020" to 0.100" of metal removed from the surface by grinding. Corrosion around fasteners in the blend-out regions was also detected.
RIDES: Robust Intrusion Detection System for IP-Based Ubiquitous Sensor Networks
Amin, Syed Obaid; Siddiqui, Muhammad Shoaib; Hong, Choong Seon; Lee, Sungwon
2009-01-01
The IP-based Ubiquitous Sensor Network (IP-USN) is an effort to build the “Internet of things”. By utilizing IP for low-power networks, we can benefit from the existing, well-established tools and technologies of IP networks. Along with many other unresolved issues, securing IP-USN is of great concern for researchers so that future market satisfaction and demands can be met. Without proper security measures, both reactive and proactive, it is hard to envisage an IP-USN realm. In this paper we present the design of an IDS (Intrusion Detection System) called RIDES (Robust Intrusion DEtection System) for IP-USN. RIDES is a hybrid intrusion detection system, which incorporates both signature- and anomaly-based intrusion detection components. For signature-based intrusion detection, this paper discusses only the implementation of a distributed pattern-matching algorithm with the help of signature-code, a dynamically created attack-signature identifier. Other aspects, such as the creation of rules, are not discussed. On the other hand, for anomaly-based detection we propose a scoring classifier based on an SPC (Statistical Process Control) technique called CUSUM charts. We also investigate the settings of the related parameters and their effects on the performance of both components. PMID:22412321
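A one-sided CUSUM chart of the kind the RIDES anomaly component builds on can be sketched in a few lines. The parameter names (slack k, decision threshold h) follow standard SPC usage; the reset-on-alarm behavior is an assumption of this sketch, not necessarily RIDES' policy:

```python
def cusum_scores(samples, mu, k, h):
    """One-sided CUSUM chart: accumulate positive deviations from the
    in-control mean mu. k is the slack (allowance) that absorbs normal
    noise; an alarm is raised when the cumulative score exceeds h."""
    s, scores, alarms = 0.0, [], []
    for i, x in enumerate(samples):
        s = max(0.0, s + (x - mu) - k)
        scores.append(s)
        if s > h:
            alarms.append(i)
            s = 0.0  # reset after an alarm (sketch assumption)
    return scores, alarms
```

With `mu=0, k=0.5, h=8`, a stream that jumps from 0 to 5 triggers an alarm after the sustained deviation accumulates past h, while isolated small deviations are absorbed by the slack term.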
Remane, Daniela; Wissenbach, Dirk K; Peters, Frank T
2016-09-01
Liquid chromatography (LC) coupled to mass spectrometry (MS) or tandem mass spectrometry (MS/MS) is a well-established and widely used technique in clinical and forensic toxicology as well as doping control especially for quantitative analysis. In recent years, many applications for so-called multi-target screening and/or quantification of drugs, poisons, and or their metabolites in biological matrices have been developed. Such methods have proven particularly useful for analysis of so-called new psychoactive substances that have appeared on recreational drug markets throughout the world. Moreover, the evolvement of high resolution MS techniques and the development of data-independent detection modes have opened new possibilities for applications of LC-(MS/MS) in systematic toxicological screening analysis in the so called general unknown setting. The present paper will provide an overview and discuss these recent developments focusing on the literature published after 2010. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
An automated device for provoking and capturing wildlife calls
Ausband, David E.; Skrivseth, Jesse; Mitchell, Michael S.
2011-01-01
Some animals exhibit call-and-response behaviors that can be exploited to facilitate detection. Traditionally, acoustic surveys that use call-and-respond techniques have required an observer's presence to perform the broadcast, record the response, or both events. This can be labor-intensive and may influence animal behavior and, thus, survey results. We developed an automated acoustic survey device using commercially available hardware (e.g., laptop computer, speaker, microphone) and an author-created (JS) software program ("HOOT") that can be used to survey for any animal that calls. We tested this device to determine 1) deployment longevity, 2) effective sampling area, and 3) ability to detect known packs of gray wolves (Canis lupus) in Idaho, USA. Our device was able to broadcast and record twice daily for 6–7 days using the internal computer battery and surveyed an area of 3.3–17.5 km2 in relatively open habitat depending on the hardware components used. We surveyed for wolves at 2 active rendezvous sites used by closely monitored, radiocollared wolf packs and obtained 4 responses across both packs over 3 days of sampling. We confirmed reproduction in these 2 packs by detecting pup howls aurally from the resulting device recordings. Our device can broadcast and record animal calls and the computer software is freely downloadable. This automated survey device can be used to collect reliable data while reducing the labor costs traditionally associated with acoustic surveys.
Anomaly-based intrusion detection for SCADA systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, D.; Usynin, A.; Hines, J. W.
2006-07-01
Most critical infrastructure, such as chemical processing plants, electrical generation and distribution networks, and gas distribution, is monitored and controlled by Supervisory Control and Data Acquisition (SCADA) systems. These systems have been the focus of increased security, and there are concerns that they could be the target of international terrorists. With the constantly growing number of internet-related computer attacks, there is evidence that our critical infrastructure may also be vulnerable. Researchers estimate that malicious online actions may cause $75 billion in damages as of 2007. One of the interesting countermeasures for enhancing information system security is called intrusion detection. This paper briefly discusses the history of research in intrusion detection techniques and introduces the two basic detection approaches: signature detection and anomaly detection. Finally, it presents the application of techniques developed for monitoring critical process systems, such as nuclear power plants, to anomaly intrusion detection. The method uses an auto-associative kernel regression (AAKR) model coupled with the sequential probability ratio test (SPRT), applied to a simulated SCADA system. The results show that these methods can be generally used to detect a variety of common attacks. (authors)
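The SPRT half of this pipeline can be illustrated independently of the AAKR model. A minimal sketch, assuming the model residuals are Gaussian with a known mean shift between the normal (H0) and anomalous (H1) hypotheses; thresholds follow Wald's standard approximations:

```python
import math

def sprt(residuals, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test on model residuals:
    H0 (normal behaviour, mean mu0) vs H1 (anomaly/intrusion, mean mu1).
    alpha/beta are the target false-alarm and missed-detection rates.
    Returns ('H0' | 'H1' | 'continue', number of samples consumed)."""
    upper = math.log((1 - beta) / alpha)   # cross this: accept H1
    lower = math.log(beta / (1 - alpha))   # cross this: accept H0
    llr = 0.0
    for n, r in enumerate(residuals, start=1):
        # Gaussian log-likelihood ratio increment for one residual
        llr += ((r - mu0) ** 2 - (r - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "continue", len(residuals)
```

Residuals sitting near mu1 drive the test to an H1 decision after roughly `upper / 0.5` samples under these defaults, which is the sequential-decision behavior that makes SPRT attractive for online monitoring.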
2014-11-20
techniques to defend against stealthy malware, i.e., rootkits. For example, we have been developing new virtualization-based security service called AirBag ...for mobile devices. AirBag is a virtualization-based system that enables dynamic switching of (guest) Android im- ages in one VM, with one image
Jang, Jae-Wook; Yun, Jaesung; Mohaisen, Aziz; Woo, Jiyoung; Kim, Huy Kang
2016-01-01
Mass-market mobile security threats have increased recently due to the growth of mobile technologies and the popularity of mobile devices. Accordingly, techniques have been introduced for identifying, classifying, and defending against mobile threats utilizing static, dynamic, on-device, and off-device techniques. Static techniques are easy to evade, while dynamic techniques are expensive. On-device techniques are susceptible to evasion, while off-device techniques require an always-online connection. To address some of those shortcomings, we introduce Andro-profiler, a hybrid behavior-based analysis and classification system for mobile malware. Andro-profiler's main goals are efficiency, scalability, and accuracy. To that end, Andro-profiler classifies malware by exploiting behavior profiles extracted from integrated system logs, including system calls. Andro-profiler executes a malicious application on an emulator in order to generate the integrated system logs, and creates human-readable behavior profiles by analyzing them. By comparing the behavior profile of a malicious application with the representative behavior profile of each malware family using a weighted similarity matching technique, Andro-profiler detects and classifies it into malware families. The experimental results demonstrate that Andro-profiler is scalable, performs well in detecting and classifying malware with accuracy greater than 98%, outperforms the existing state-of-the-art work, and is capable of identifying 0-day mobile malware samples.
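The weighted similarity matching step can be sketched as a weighted cosine similarity over behavior-count profiles. The profile keys, weights, and acceptance threshold below are illustrative assumptions, not Andro-profiler's actual feature set or metric:

```python
import math

def weighted_similarity(profile, reference, weights):
    """Weighted cosine-style similarity between two behaviour profiles,
    each a dict mapping a behaviour category to an event count.
    Unlisted categories get weight 1.0."""
    keys = set(profile) | set(reference)
    dot = sum(weights.get(k, 1.0) * profile.get(k, 0) * reference.get(k, 0)
              for k in keys)
    na = math.sqrt(sum(weights.get(k, 1.0) * profile.get(k, 0) ** 2 for k in keys))
    nb = math.sqrt(sum(weights.get(k, 1.0) * reference.get(k, 0) ** 2 for k in keys))
    return dot / (na * nb) if na and nb else 0.0

def classify(profile, family_profiles, weights, threshold=0.8):
    """Assign the sample to the most similar family's representative
    profile, or to 'benign/unknown' if no family is similar enough."""
    best, score = max(((fam, weighted_similarity(profile, ref, weights))
                       for fam, ref in family_profiles.items()),
                      key=lambda t: t[1])
    return best if score >= threshold else "benign/unknown"
```

A sample whose system-call counts track a family's representative profile scores near 1.0 against that family and near 0.0 against families with disjoint behaviors.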
NASA Astrophysics Data System (ADS)
Rios, Heriberto; Andrade, Armando; Puente, Ernestina; Lizana, Pablo R.; Mendoza, Diego
2002-11-01
The high intra-uterine death rate is due to failure in appropriately diagnosing some problems in the cardiobreathing system of the fetus during pregnancy. The electrocardiograph is one apparatus which might detect problems at an early stage. With electrodes located near the womb and uterus, in a way similar to the normal technique, the detection of so-called biopotential differences, caused by concentrations of ions, can be achieved. The fetal electrocardiograph is based on an ultrasound technique aimed at detecting intrauterine problems in pregnant women, because it is a noninvasive technique due to the very low level of ultrasound power used. With this system, the following tests can be done: Heart movements from the ninth week onwards; Rapid and safe diagnosis of intrauterine fetal death; Location and size of the placenta. The construction of the fetal electrocardiograph requires instrument level components directly mounted on the printed circuit board, in order to avoid stray capacitance in the cabling which prevents the detection of the E.C.G. activity. The low cost of the system makes it affordable to low budget institutions; in contrast, available commercial systems are priced in U.S. Dollars. (To be presented in Spanish.)
Severity Summarization and Just in Time Alert Computation in mHealth Monitoring.
Pathinarupothi, Rahul Krishnan; Alangot, Bithin; Rangan, Ekanath
2017-01-01
Mobile health is fast evolving into a practical solution to remotely monitor high-risk patients and deliver timely intervention in case of emergencies. Building upon our previous work on a fast and power efficient summarization framework for remote health monitoring applications, called RASPRO (Rapid Alerts Summarization for Effective Prognosis), we have developed a real-time criticality detection technique, which ensures meeting physician defined interventional time. We also present the results from initial testing of this technique.
Damage Detection Using Holography and Interferometry
NASA Technical Reports Server (NTRS)
Decker, Arthur J.
2003-01-01
This paper reviews classical approaches to damage detection using laser holography and interferometry. The paper then details the modern uses of electronic holography and neural-net-processed characteristic patterns to detect structural damage. The design of the neural networks and the preparation of the training sets are discussed. The use of a technique to optimize the training sets, called folding, is explained. Then a training procedure is detailed that uses the holography-measured vibration modes of the undamaged structures to impart damage-detection sensitivity to the neural networks. The inspections of an optical strain gauge mounting plate and an International Space Station cold plate are presented as examples.
RR-Interval variance of electrocardiogram for atrial fibrillation detection
NASA Astrophysics Data System (ADS)
Nuryani, N.; Solikhah, M.; Nugoho, A. S.; Afdala, A.; Anzihory, E.
2016-11-01
Atrial fibrillation is a serious heart problem originating in the upper chambers of the heart. The common indication of atrial fibrillation is irregularity of the R-peak-to-R-peak time interval, known for short as the RR interval. The irregularity can be represented using the variance, or spread, of RR intervals. This article presents a system to detect atrial fibrillation using variances. Using clinical data from patients with atrial fibrillation attacks, it is shown that the variances of electrocardiographic RR intervals are higher during atrial fibrillation than during normal rhythm. Utilizing a simple detection technique and the variances of RR intervals, we find good atrial fibrillation detection performance.
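The scheme this abstract describes reduces to computing RR intervals from R-peak timestamps and thresholding their windowed variance. A minimal sketch; the window length and threshold are illustrative choices, not values from the paper:

```python
def rr_intervals(r_peak_times):
    """Successive R-to-R intervals (seconds) from R-peak timestamps."""
    return [t2 - t1 for t1, t2 in zip(r_peak_times, r_peak_times[1:])]

def af_flags(rr, window=8, threshold=0.01):
    """Flag each sliding window of RR intervals whose variance exceeds
    a threshold: a high spread of intervals suggests the irregular
    rhythm typical of atrial fibrillation."""
    flags = []
    for i in range(len(rr) - window + 1):
        w = rr[i:i + window]
        mean = sum(w) / window
        var = sum((x - mean) ** 2 for x in w) / window
        flags.append(var > threshold)
    return flags
```

A steady 75-bpm rhythm (RR constant at 0.8 s) yields zero variance and no flags, while intervals alternating between 0.5 s and 1.1 s give a variance of 0.09 s² and flag every window.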
Penalty Dynamic Programming Algorithm for Dim Targets Detection in Sensor Systems
Huang, Dayu; Xue, Anke; Guo, Yunfei
2012-01-01
In order to detect and track multiple maneuvering dim targets in sensor systems, an improved dynamic programming track-before-detect algorithm (DP-TBD) called penalty DP-TBD (PDP-TBD) is proposed. The performance of tracking techniques is used as feedback to the detection part. The feedback is constructed as a penalty term in the merit function; the penalty term is a function of the possible target state estimate, which can be obtained by the tracking methods. With this feedback, the algorithm combines traditional tracking techniques with DP-TBD and can be applied to simultaneously detect and track maneuvering dim targets. Meanwhile, a reasonable constraint, that a sensor measurement can originate from only one target or from clutter, is proposed to minimize track separation. Thus, the algorithm can be used in multi-target situations with unknown target numbers. The efficiency and advantages of PDP-TBD compared with two existing methods are demonstrated by several simulations. PMID:22666074
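The core idea, accumulating a merit function over frames with a penalty tied to the predicted state transition, can be sketched for a 1-D sensor. The jump-proportional penalty below is a crude illustrative stand-in for the paper's tracking-feedback term:

```python
def pdp_tbd(frames, max_step=1, penalty=0.5):
    """1-D dynamic-programming track-before-detect sketch. frames is a
    list of per-cell measurement intensities, one list per scan. The
    merit of ending in cell x accumulates the best predecessor merit
    minus a penalty that grows with the size of the state jump, so
    physically implausible tracks are discounted before thresholding."""
    merit = list(frames[0])
    for frame in frames[1:]:
        new = []
        for x, z in enumerate(frame):
            best = max(merit[xp] - penalty * abs(x - xp)
                       for xp in range(max(0, x - max_step),
                                       min(len(merit), x + max_step + 1)))
            new.append(z + best)
        merit = new
    return merit  # threshold max(merit) to declare a detection
```

For two frames where a unit-intensity target drifts one cell to the right, the final merit peaks at the target's last position, integrating energy along the consistent path rather than in any single frame.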
On effectiveness of network sensor-based defense framework
NASA Astrophysics Data System (ADS)
Zhang, Difan; Zhang, Hanlin; Ge, Linqiang; Yu, Wei; Lu, Chao; Chen, Genshe; Pham, Khanh
2012-06-01
Cyber attacks are increasing in frequency, impact, and complexity, which demonstrate extensive network vulnerabilities with the potential for serious damage. Defending against cyber attacks calls for the distributed collaborative monitoring, detection, and mitigation. To this end, we develop a network sensor-based defense framework, with the aim of handling network security awareness, mitigation, and prediction. We implement the prototypical system and show its effectiveness on detecting known attacks, such as port-scanning and distributed denial-of-service (DDoS). Based on this framework, we also implement the statistical-based detection and sequential testing-based detection techniques and compare their respective detection performance. The future implementation of defensive algorithms can be provisioned in our proposed framework for combating cyber attacks.
NASA Astrophysics Data System (ADS)
Rand, Danielle; Derdak, Zoltan; Carlson, Rolf; Wands, Jack R.; Rose-Petruck, Christoph
2015-10-01
Hepatocellular carcinoma (HCC) is one of the most common malignant tumors worldwide and is almost uniformly fatal. Current methods of detection include ultrasound examination and imaging by CT scan or MRI; however, these techniques are problematic in terms of sensitivity and specificity, and the detection of early tumors (<1 cm diameter) has proven elusive. Better, more specific, and more sensitive detection methods are therefore urgently needed. Here we discuss the application of a newly developed x-ray imaging technique called Spatial Frequency Heterodyne Imaging (SFHI) for the early detection of HCC. SFHI uses x-rays scattered by an object to form an image and is more sensitive than conventional absorption-based x-radiography. We show that tissues labeled in vivo with gold nanoparticle contrast agents can be detected using SFHI. We also demonstrate that directed targeting and SFHI of HCC tumors in a mouse model is possible through the use of HCC-specific antibodies. The enhanced sensitivity of SFHI relative to currently available techniques enables the x-ray imaging of tumors that are just a few millimeters in diameter and substantially reduces the amount of nanoparticle contrast agent required for intravenous injection relative to absorption-based x-ray imaging.
Do alien particles exist, and can they be detected?
NASA Astrophysics Data System (ADS)
Gasperini, M.
2016-07-01
We may call “alien particles” those particles belonging to the matter/field content of a d-dimensional brane other than the 3-brane (or stack of branes) sweeping the spacetime in which we live. They can appear in our spacetime at the regions of intersection between our and their brane. They can be identified (or not) as alien matter depending on their properties, on the physical laws governing their evolution in the “homeland” brane, and on the details of our detection techniques.
Single-molecule fluorescence microscopy review: shedding new light on old problems
Shashkova, Sviatlana
2017-01-01
Fluorescence microscopy is an invaluable tool in the biosciences, a genuine workhorse technique offering exceptional contrast in conjunction with high specificity of labelling with relatively minimal perturbation to biological samples compared with many competing biophysical techniques. Improvements in detector and dye technologies coupled to advances in image analysis methods have fuelled recent development towards single-molecule fluorescence microscopy, which can utilize light microscopy tools to enable the faithful detection and analysis of single fluorescent molecules used as reporter tags in biological samples. For example, the discovery of GFP, initiating the so-called ‘green revolution’, has pushed experimental tools in the biosciences to a completely new level of functional imaging of living samples, culminating in single fluorescent protein molecule detection. Today, fluorescence microscopy is an indispensable tool in single-molecule investigations, providing a high signal-to-noise ratio for visualization while still retaining the key features in the physiological context of native biological systems. In this review, we discuss some of the recent discoveries in the life sciences which have been enabled using single-molecule fluorescence microscopy, paying particular attention to the so-called ‘super-resolution’ fluorescence microscopy techniques in live cells, which are at the cutting-edge of these methods. In particular, how these tools can reveal new insights into long-standing puzzles in biology: old problems, which have been impossible to tackle using other more traditional tools until the emergence of new single-molecule fluorescence microscopy techniques. PMID:28694303
NASA Astrophysics Data System (ADS)
DiBianca, Frank A.; Melnyk, Roman; Sambari, Aniket; Jordan, Lawrence M.; Laughter, Joseph S.; Zou, Ping
2000-04-01
A technique called Variable-Resolution X-ray (VRX) detection that greatly increases the spatial resolution in computed tomography (CT) and digital radiography (DR) is presented. The technique is based on a principle called 'projective compression' that allows the resolution element of a CT detector to scale with the subject or field size. For very large (40 - 50 cm) field sizes, resolution exceeding 2 cy/mm is possible, and for very small fields, microscopy is attainable with resolution exceeding 100 cy/mm. Preliminary results from a 576-channel solid-state detector are presented. The detector has a dual-arm geometry and comprises CdWO4 scintillator crystals arranged in 24 modules of 24 channels/module. The scintillators are 0.85 mm wide and placed on 1 mm centers. Measurements of signal level, MTF and SNR, all versus detector angle, are presented.
An Ultrasonographic Periodontal Probe
NASA Astrophysics Data System (ADS)
Bertoncini, C. A.; Hinders, M. K.
2010-02-01
Periodontal disease, commonly known as gum disease, affects millions of people. The current method of detecting periodontal pocket depth is painful, invasive, and inaccurate. As an alternative to manual probing, an ultrasonographic periodontal probe is being developed to use ultrasound echo waveforms to measure periodontal pocket depth, which is the main measure of periodontal disease. Wavelet transforms and pattern classification techniques are implemented in artificial intelligence routines that can automatically detect pocket depth. The main pattern classification technique used here, called a binary classification algorithm, compares test objects with only two possible pocket depth measurements at a time and relies on dimensionality reduction for the final determination. This method correctly identifies up to 90% of the ultrasonographic probe measurements within the manual probe's tolerance.
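The binary classification algorithm, which compares a test measurement against only two candidate pocket depths at a time, can be sketched as a pairwise tournament with majority voting. The centroid-distance comparison here is an illustrative stand-in for the paper's wavelet-feature comparison after dimensionality reduction:

```python
def pairwise_depth_vote(features, centroids):
    """Pairwise binary classification sketch: for every pair of
    candidate pocket depths, vote for whichever depth's reference
    centroid the test feature vector is closer to; the depth with the
    most pairwise wins is the final call."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    votes = {d: 0 for d in centroids}
    depths = sorted(centroids)
    for i in range(len(depths)):
        for j in range(i + 1, len(depths)):
            d1, d2 = depths[i], depths[j]
            winner = d1 if sq_dist(features, centroids[d1]) <= sq_dist(features, centroids[d2]) else d2
            votes[winner] += 1
    return max(votes, key=votes.get)
```

Reducing a multi-way depth decision to many two-way decisions is the design choice the abstract highlights: each binary comparison is an easier, better-conditioned problem than discriminating all depths at once.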
DiBianca, F A; Gupta, V; Zeman, H D
2000-08-01
A computed tomography imaging technique called variable resolution x-ray (VRX) detection provides detector resolution ranging from that of clinical body scanning to that of microscopy (1 cy/mm to 100 cy/mm). The VRX detection technique is based on a new principle denoted as "projective compression" that allows the detector resolution element to scale proportionally to the image field size. Two classes of VRX detector geometry are considered. Theoretical aspects related to x-ray physics and data sampling are presented. Measured resolution parameters (line-spread function and modulation-transfer function) are presented and discussed. A VRX image that resolves a pair of 50 micron tungsten hairs spaced 30 microns apart is shown.
UV gated Raman spectroscopy for standoff detection of explosives
NASA Astrophysics Data System (ADS)
Gaft, M.; Nagli, L.
2008-07-01
Real-time detection and identification of explosives at a standoff distance is a major issue in efforts to develop defenses against so-called improvised explosive devices (IEDs). It is recognized that the only method potentially capable of standoff detection of minimal amounts of explosives is laser-based spectroscopy. The LDS technique belongs to trace detection, specifically its micro-particle variety. It is based on the commonly held belief that surface contamination is very difficult to avoid and can be exploited for standoff detection. We have applied gated Raman spectroscopy for detection of the main explosive materials, both factory-made and homemade. We developed and tested a Raman system for the remote field detection and identification of minimal amounts of explosives on relevant surfaces at a distance of up to 30 m.
SIRE: a MIMO radar for landmine/IED detection
NASA Astrophysics Data System (ADS)
Ojowu, Ode; Wu, Yue; Li, Jian; Nguyen, Lam
2013-05-01
Multiple-input multiple-output (MIMO) radar systems have been shown to have significant performance improvements over their single-input multiple-output (SIMO) counterparts. For transmit and receive elements that are collocated, the waveform diversity afforded by this radar is exploited for performance improvements. These improvements include but are not limited to improved target detection, improved parameter identifiability and better resolvability. In this paper, we present the Synchronous Impulse Reconstruction Radar (SIRE) Ultra-wideband (UWB) radar designed by the Army Research Lab (ARL) for landmine and improvised explosive device (IED) detection as a 2 by 16 MIMO radar (with collocated antennas). Its improvement over its SIMO counterpart in terms of beampattern/cross range resolution are discussed and demonstrated using simulated data herein. The limitations of this radar for Radio Frequency Interference (RFI) suppression are also discussed in this paper. A relaxation method (RELAX) combined with averaging of multiple realizations of the measured data is presented for RFI suppression; results show no noticeable target signature distortion after suppression. In this paper, the back-projection (delay and sum) data independent method is used for generating SAR images. A side-lobe minimization technique called recursive side-lobe minimization (RSM) is also discussed for reducing side-lobes in this data independent approach. We introduce a data-dependent sparsity based spectral estimation technique called Sparse Learning via Iterative Minimization (SLIM) as well as a data-dependent CLEAN approach for generating SAR images for the SIRE radar. These data-adaptive techniques show improvement in side-lobe reduction and resolution for simulated data for the SIRE radar.
NEAT: a spatial telescope to detect nearby exoplanets using astrometry
NASA Astrophysics Data System (ADS)
Crouzier, Antoine
2015-01-01
With the present state of exoplanet detection techniques, none of the rocky planets of the Solar System would be discovered, yet their presence is a very strong constraint on the scenarios of formation of planetary systems. Astrometry, by measuring the reflex effect of planets on their central host stars, leads us to the mass of planets and to their orbit determination. This technique is used frequently and is very successful in determining the masses and orbits of binary stars. From space, it is possible to use differential astrometry around nearby Solar-type stars to detect exoplanets down to one Earth mass in the habitable zone, where the sensitivity of the technique is optimal. Finding habitable Earths in the Solar neighborhood would be a major step forward for exoplanet detection, and these planets would be prime targets for attempts to find life outside of the Solar System by searching for bio-markers in their atmospheres. A scientific consortium has formed to promote this kind of astrometric space mission. A mission called NEAT (Nearby Earth Astrometric Telescope) was proposed to ESA in 2010. A laboratory testbed called NEAT-demo was assembled at IPAG; its main goal is to demonstrate CCD detector calibration to the required accuracy. During my PhD, my activities were related to astrophysical as well as instrumental aspects of the mission. Regarding the scientific case, I compiled a catalog of mission target stars and reference stars (needed for the differential astrometric measurements), and I estimated the scientific return of NEAT-like missions in terms of the number of detected exoplanets and their parameter distributions. The second aspect of the PhD relates to the testbed, which mimics the NEAT telescope configuration. I present the testbed itself, the data analysis methods, and the results. An accuracy of 3e-4 pixel was obtained for the relative positions of artificial stars, and we have determined that measurement of pixel positions by the metrology is currently limited by stray light.
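The astrometric reflex signature that sets the detection floor for a mission like NEAT follows from alpha = (m_p / M_*) x (a / d), which yields arcseconds when the semi-major axis a is in AU and the distance d in parsecs. A quick sketch (the function name is ours):

```python
def astrometric_signature_uas(planet_mass_mearth, star_mass_msun, a_au, dist_pc):
    """Angular reflex signature of a star due to one planet, in
    micro-arcseconds: alpha = (m_p / M_*) * (a / d), with a in AU and
    d in pc giving alpha directly in arcseconds."""
    M_EARTH_PER_MSUN = 332946.0  # one solar mass in Earth masses
    alpha_arcsec = (planet_mass_mearth / (star_mass_msun * M_EARTH_PER_MSUN)) \
                   * (a_au / dist_pc)
    return alpha_arcsec * 1e6
```

An Earth-mass planet at 1 AU around a Sun-like star 10 pc away produces a signature of about 0.3 micro-arcseconds, which illustrates why sub-micro-arcsecond differential astrometry (and hence the CCD calibration demonstrated on the testbed) is required to reach one Earth mass in the habitable zone.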
Topics in the Detection of Gravitational Waves from Compact Binary Inspirals
NASA Astrophysics Data System (ADS)
Kapadia, Shasvath Jagat
Orbiting compact binaries - such as binary black holes, binary neutron stars, and neutron star-black hole binaries - are among the most promising sources of gravitational waves observable by ground-based interferometric detectors. Despite numerous sophisticated engineering techniques, the gravitational wave signals will be buried deep within noise generated by various instrumental and environmental processes, and need to be extracted via a signal processing technique referred to as matched filtering. Matched filtering requires large banks of signal templates that are faithful representations of the true gravitational waveforms produced by astrophysical binaries. The accurate and efficient production of templates is thus crucial to the success of signal processing and data analysis. To that end, the dissertation presents a numerical technique that calibrates existing analytical (Post-Newtonian) waveforms, which are relatively inexpensive, against more accurate fiducial waveforms that are computationally expensive to generate. The resulting waveform family is significantly more accurate than the analytical waveforms, without incurring additional production costs. Certain kinds of transient background noise artefacts, called "glitches", can masquerade as gravitational wave signals for short durations and throw off the matched-filter algorithm. Distinguishing glitches from true gravitational wave signals is a highly non-trivial exercise in data analysis which has been attempted with varying degrees of success. We present here a machine-learning based approach that exploits the various attributes of glitches and signals within detector data to provide a classification scheme that is a significant improvement over previous methods. The dissertation concludes by investigating the possibility of detecting a non-linear DC imprint, called the Christodoulou memory, produced in the arms of ground-based interferometers by the recently detected gravitational waves. The memory, which is even smaller in amplitude than the primary (detected) gravitational waves, will almost certainly not be seen in the current detection event. Nevertheless, future space-based detectors will likely be sensitive enough to observe the memory.
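The matched filtering step can be illustrated for the simplest white-noise case. This is a sketch of the underlying inner-product idea only, not the colored-noise, frequency-domain form actually used by detector pipelines:

```python
import math

def matched_filter_snr(data, template, noise_var):
    """White-noise matched filter: SNR = <d, h> / sqrt(<h, h>), where
    the inner product is the sample-wise product sum weighted by the
    (flat) noise variance. The template h is the expected waveform."""
    dh = sum(d * h for d, h in zip(data, template)) / noise_var
    hh = sum(h * h for h in template) / noise_var
    return dh / math.sqrt(hh)
```

When the data contain exactly the template in unit-variance noise-free form, the SNR reduces to sqrt(<h, h>), the template's own norm, which is why louder or longer signals are easier to detect; a glitch that merely resembles the template produces a spuriously high SNR, motivating the glitch classification work described above.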
Detecting Heap-Spraying Code Injection Attacks in Malicious Web Pages Using Runtime Execution
NASA Astrophysics Data System (ADS)
Choi, Younghan; Kim, Hyoungchun; Lee, Donghoon
The growing use of web services is increasing web browser attacks exponentially. Most attacks use a technique called heap spraying because of its high success rate. Heap spraying executes malicious code without knowing its exact address by copying the code into many heap objects. For this reason, the attack has a high probability of succeeding once a vulnerability is exploited, and attackers have recently favored the technique because JavaScript makes it easy to allocate heap memory. This paper proposes a novel technique that detects heap spraying attacks by executing a heap object in a real environment, irrespective of the version and patch status of the web browser. This runtime execution detects various forms of heap spraying attacks, including encoded and polymorphic payloads. To reduce the overhead of runtime execution, heap objects are first filtered using patterns of heap spraying attacks, derived from an analysis of how a web browser accesses benign web sites. The surviving heap objects are then executed forcibly by loading them into memory and setting the instruction register to their address, so the malicious code runs without regard to the version and patch status of the browser. An object is considered to contain malicious code if execution reaches a call instruction that accesses the API of system libraries such as kernel32.dll and ws2_32.dll. A debugger engine is used to change registers and monitor execution flow. A prototype, named HERAD (HEap spRAying Detector), is implemented and evaluated. In experiments, HERAD detects various forms of exploit code that emulation cannot detect, as well as some heap spraying attacks that NOZZLE misses. Although it incurs an execution overhead, HERAD produces few false alarms.
The processing time of several minutes is negligible because our research focuses on detecting heap spraying. This research can be applied to existing systems that collect malicious code, such as honeypots.
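The pre-filtering stage can be sketched with a simple heuristic. The paper does not publish its exact patterns, so this sketch assumes one common spray signature: a long run of a single repeated byte (a NOP-sled-like filler). The function name and threshold are hypothetical; objects passing such a filter would then go on to forced execution.

```python
def looks_like_spray(buf: bytes, min_run: int = 256) -> bool:
    """Heuristic pre-filter: flag buffers containing a long run of a
    single repeated byte, a common signature of sprayed NOP sleds.
    Only flagged objects would be handed to the costly runtime stage."""
    run, prev = 1, None
    for b in buf:
        if b == prev:
            run += 1
            if run >= min_run:
                return True
        else:
            run, prev = 1, b
    return False
```

A real filter would combine several such patterns; this one alone would miss polymorphic fillers, which is exactly why the paper falls back on runtime execution.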
Broad-Spectrum Molecular Detection of Fungal Nucleic Acids by PCR-Based Amplification Techniques.
Czurda, Stefan; Lion, Thomas
2017-01-01
Over the past decade, the incidence of life-threatening invasive fungal infections has dramatically increased. Infections caused by hitherto rare and emerging fungal pathogens are associated with significant morbidity and mortality among immunocompromised patients. These observations make coverage of a broad range of clinically relevant fungal pathogens highly important. The so-called panfungal or, perhaps more correctly, broad-range nucleic acid amplification techniques not only facilitate sensitive detection of all clinically relevant fungal species but are also rapid and applicable to any patient specimen. They have therefore become valuable diagnostic tools for sensitive screening of patients at risk of invasive fungal infections. This chapter summarizes the currently available molecular technologies employed in testing for a wide range of fungal pathogens, and provides a detailed workflow for patient screening by broad-spectrum nucleic acid amplification techniques.
Restoration of out-of-focus images based on circle of confusion estimate
NASA Astrophysics Data System (ADS)
Vivirito, Paolo; Battiato, Sebastiano; Curti, Salvatore; La Cascia, M.; Pirrone, Roberto
2002-11-01
In this paper a new method for fast out-of-focus blur estimation and restoration is proposed. It is suitable for CFA (Color Filter Array) images acquired by typical CCD/CMOS sensors. The method is based on the analysis of a single image and consists of two steps: 1) out-of-focus blur estimation via Bayer-pattern analysis; 2) image restoration. Blur estimation relies on a block-wise edge detection technique carried out on the green pixels of the CFA sensor image, also called the Bayer pattern. Once the blur level has been estimated, the image is restored through a new inverse filtering technique. This algorithm yields sharp images while reducing ringing and crisping artifacts over a wider frequency range. Experimental results show the effectiveness of the method, both subjectively and numerically, by comparison with other techniques from the literature.
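The block-wise idea can be illustrated on a single scan line of green-channel samples: measure how many pixels a strong intensity transition spans, which grows with the circle of confusion. This is a crude sketch of the principle, not the paper's actual estimator; the gradient threshold is an arbitrary assumption.

```python
import numpy as np

def edge_width(row: np.ndarray, thresh: float = 5.0) -> float:
    """Average width (in pixels) of monotonic transitions around strong
    gradients in a 1-D intensity profile; a sharp edge spans ~1 pixel,
    a defocused edge spans roughly the blur diameter."""
    g = np.diff(row.astype(float))
    widths = []
    i = 0
    while i < len(g):
        if abs(g[i]) > thresh:
            j = i
            # extend while the gradient keeps the same sign (one ramp)
            while j + 1 < len(g) and np.sign(g[j + 1]) == np.sign(g[i]):
                j += 1
            widths.append(j - i + 1)
            i = j + 1
        else:
            i += 1
    return float(np.mean(widths)) if widths else 0.0
```

Averaging this statistic over many blocks containing edges gives a single blur-level estimate that an inverse filter could then be tuned to.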
Visualization of delamination in composite materials utilizing advanced X-ray imaging techniques
NASA Astrophysics Data System (ADS)
Vavrik, D.; Jakubek, J.; Jandejsek, I.; Krejci, F.; Kumpova, I.; Zemlicka, J.
2015-04-01
This work is focused on the development of instrumental radiographic methods for the detection of delaminations in layered carbon fibre reinforced plastic composites used in the aerospace industry. The main limitation of current visualisation techniques is their very limited ability to image so-called closed delaminations, in which the delaminated layers are in contact with practically no physical gap. In this contribution we report the development of innovative methods for closed delamination detection using an X-ray phase contrast technique, for which the distance between delamination surfaces is not relevant. The approach is based on the energetic sensitivity of phase-enhanced radiography, and with it we can distinguish both closed and open delaminations. Further, we have demonstrated the possibility of visualising open delaminations characterised by a physical gap between delaminated layers; this delamination type was successfully identified and visualized using a high-resolution table-top computed tomography technique based on proper beam-hardening correction.
Causal Entropies – a measure for determining changes in the temporal organization of neural systems
Waddell, Jack; Dzakpasu, Rhonda; Booth, Victoria; Riley, Brett; Reasor, Jonathan; Poe, Gina; Zochowski, Michal
2009-01-01
We propose a novel measure to detect temporal ordering in the activity of individual neurons in a local network, which is thought to be a hallmark of activity-dependent synaptic modifications during learning. The measure, called Causal Entropy, is based on the time-adaptive detection of asymmetries in the relative temporal patterning between neuronal pairs. We characterize properties of the measure on both simulated data and experimental multiunit recordings of hippocampal neurons from the awake, behaving rat, and show that the metric can more readily detect those asymmetries than standard cross correlation-based techniques, especially since the temporal sensitivity of causal entropy can detect such changes rapidly and dynamically. PMID:17275095
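The core idea of detecting directional asymmetry in relative spike timing can be illustrated with a much simpler statistic than the actual causal-entropy estimator: count, among spike pairs falling within a short window, how often neuron A fires before neuron B versus the reverse. This is a plain stand-in for the measure, with arbitrary time units and window.

```python
def ordering_asymmetry(a, b, window=5.0):
    """Directional asymmetry in relative spike timing between spike
    trains `a` and `b` (lists of spike times). Returns a value in
    [-1, 1]: +1 if `a` always leads within the window, 0 if symmetric.
    A simplified stand-in for the causal-entropy idea, not the paper's
    time-adaptive estimator."""
    ab = sum(1 for ta in a for tb in b if 0 < tb - ta <= window)
    ba = sum(1 for ta in a for tb in b if 0 < ta - tb <= window)
    total = ab + ba
    return 0.0 if total == 0 else (ab - ba) / total
```

Cross-correlation would also reveal such a lead-lag bias, but only as an average over the whole recording; the paper's point is that a time-adaptive version of this asymmetry can track when the ordering emerges.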
Chroma intra prediction based on inter-channel correlation for HEVC.
Zhang, Xingyu; Gisquet, Christophe; François, Edouard; Zou, Feng; Au, Oscar C
2014-01-01
In this paper, we investigate a new inter-channel coding mode called LM mode proposed for the next generation video coding standard called high efficiency video coding. This mode exploits inter-channel correlation using reconstructed luma to predict chroma linearly with parameters derived from neighboring reconstructed luma and chroma pixels at both encoder and decoder to avoid overhead signaling. In this paper, we analyze the LM mode and prove that the LM parameters for predicting original chroma and reconstructed chroma are statistically the same. We also analyze the error sensitivity of the LM parameters. We identify some LM mode problematic situations and propose three novel LM-like modes called LMA, LML, and LMO to address the situations. To limit the increase in complexity due to the LM-like modes, we propose some fast algorithms with the help of some new cost functions. We further identify some potentially-problematic conditions in the parameter estimation (including regression dilution problem) and introduce a novel model correction technique to detect and correct those conditions. Simulation results suggest that considerable BD-rate reduction can be achieved by the proposed LM-like modes and model correction technique. In addition, the performance gain of the two techniques appears to be essentially additive when combined.
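The heart of the LM mode is a least-squares fit of chroma against collocated reconstructed luma over the neighbouring samples, so that the predictor chroma ≈ alpha·luma + beta needs no signalling. A floating-point sketch follows; the standard uses integer arithmetic and down-sampled luma, both omitted here.

```python
def lm_parameters(neigh_luma, neigh_chroma):
    """Least-squares (alpha, beta) such that chroma ~= alpha*luma + beta,
    computed from reconstructed neighbouring samples available at both
    encoder and decoder (floating-point sketch of the LM derivation)."""
    n = len(neigh_luma)
    sl = sum(neigh_luma)
    sc = sum(neigh_chroma)
    sll = sum(l * l for l in neigh_luma)
    slc = sum(l * c for l, c in zip(neigh_luma, neigh_chroma))
    denom = n * sll - sl * sl
    if denom == 0:                     # flat neighbourhood: fall back to DC
        return 0.0, sc / n
    alpha = (n * slc - sl * sc) / denom
    beta = (sc - alpha * sl) / n
    return alpha, beta
```

The denom == 0 branch is one example of the problematic situations the paper analyzes: when the neighbouring luma is flat or noisy, the regression is ill-conditioned (regression dilution), motivating the proposed model correction.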
NASA Astrophysics Data System (ADS)
van Es, Maarten H.; Mohtashami, Abbas; Piras, Daniele; Sadeghian, Hamed
2018-03-01
Nondestructive subsurface nanoimaging through optically opaque media is considered extremely challenging and is essential for several semiconductor metrology applications, including overlay and alignment and buried void and defect characterization. The current key challenge in overlay and alignment is the measurement of targets that are covered by optically opaque layers. Moreover, with device dimensions moving to smaller nodes and the so-called loading effect causing offsets between targets and product features, it is increasingly desirable to perform alignment and overlay on product features, so-called on-cell overlay, which requires higher lateral resolution than optical methods can provide. Our recently developed technique, SubSurface Ultrasonic Resonance Force Microscopy (SSURFM), has shown the capability for high-resolution imaging of structures below a surface based on the (visco-)elasticity of the constituent materials, and as such is a promising technique for performing overlay and alignment with high resolution in upcoming production nodes. In this paper, we describe the SSURFM technique and experimental results on imaging buried features through various layers, demonstrating the ability to detect objects with resolution below 10 nm. In summary, the experimental results show that SSURFM is a potential solution for on-cell overlay and alignment, as well as for detecting buried defects or voids and, more generally, for metrology through optically opaque layers.
Multirobot autonomous landmine detection using distributed multisensor information aggregation
NASA Astrophysics Data System (ADS)
Jumadinova, Janyl; Dasgupta, Prithviraj
2012-06-01
We consider the problem of distributed sensor information fusion by multiple autonomous robots within the context of landmine detection. We assume that different landmines can be composed of different types of material and that robots are equipped with different types of sensors, each robot carrying only one type of landmine detection sensor. We introduce a novel technique that uses a market-based information aggregation mechanism called a prediction market. Each robot is provided with a software agent that uses the robot's sensory input and performs the calculations of the prediction market technique. The result of the agent's calculations is a 'belief' representing the confidence of the agent in identifying the object as a landmine. The beliefs from different robots are aggregated by the market mechanism and passed on to a decision maker agent. The decision maker agent uses this aggregate belief about a potential landmine to decide which other robots should be deployed to its location, so that the landmine can be confirmed rapidly and accurately. Our experimental results show that, for identical data distributions and settings, our prediction market-based information aggregation technique improves the accuracy of object classification compared with two other commonly used techniques.
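The aggregation-and-decide loop can be sketched as follows. A plain confidence-weighted average stands in for the paper's actual prediction-market mechanism, and the decision thresholds are hypothetical; the point is only the flow of per-robot beliefs into one aggregate price and a deployment decision.

```python
def aggregate_beliefs(beliefs, weights=None):
    """Combine per-robot landmine beliefs in [0, 1] into one aggregate
    'price'. A weighted average is used here as an illustrative stand-in
    for the market-based aggregation described in the paper."""
    if weights is None:
        weights = [1.0] * len(beliefs)
    return sum(b * w for b, w in zip(beliefs, weights)) / sum(weights)

def deploy_more_robots(price, lower=0.3, upper=0.8):
    """Decision rule with hypothetical thresholds: above `upper` the
    mine is treated as confirmed, below `lower` as dismissed; in the
    ambiguous band, robots with other sensor types are dispatched."""
    return lower <= price <= upper
```

In the paper each robot's agent computes its belief from its single sensor modality, so the ambiguous band is exactly where heterogeneous sensors pay off.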
Early diagnosis of tongue malignancy using laser induced fluorescence spectroscopy technique
NASA Astrophysics Data System (ADS)
Patil, Ajeetkumar; Unnikrishnan V., K.; Ongole, Ravikiran; Pai, Keerthilatha M.; Kartha, V. B.; Chidangil, Santhosh
2015-07-01
Oral cancer together with pharyngeal cancer is the sixth most common malignancy reported worldwide and has one of the highest mortality ratios among all malignancies [1]. Worldwide, 450,000 new cases were estimated in 2014 [2]. About 90% are a type of cancer called squamous cell carcinoma (SCC). SCC of the tongue is the most common oral malignancy, accounting for approximately 40% of all oral carcinomas. One of the most important factors for successful therapy of any malignancy is early diagnosis. Although considerable progress has been made in understanding the cellular and molecular mechanisms of tumorigenesis, the lack of reliable diagnostic methods for early detection, leading to delayed therapy, is an important factor in the increased mortality rate of various cancers. Spectroscopic techniques are extremely sensitive for the analysis of biochemical changes in cellular systems and can provide valuable information on the alterations that occur during the development of cancer. This is especially important in oral cancer, where "tumor detection is complicated by a tendency towards field cancerization, leading to multi-centric lesions", "current techniques detect malignant change too late" [3], and "biopsies are not representative of the whole premalignant lesion" [4].
Pattern-histogram-based temporal change detection using personal chest radiographs
NASA Astrophysics Data System (ADS)
Ugurlu, Yucel; Obi, Takashi; Hasegawa, Akira; Yamaguchi, Masahiro; Ohyama, Nagaaki
1999-05-01
Accurate and reliable detection of temporal changes from a pair of images is of considerable interest in medical science. Traditional registration and subtraction techniques can extract temporal differences when the object is rigid or corresponding points are obvious. However, in radiological imaging, loss of depth information, the elasticity of the object, the absence of clearly defined landmarks, and three-dimensional positioning differences constrain the performance of conventional registration techniques. In this paper, we propose a new method to detect interval changes accurately without using an image registration technique. The method is based on the construction of a so-called pattern histogram and a comparison procedure. The pattern histogram is a graphic representation of the frequency counts of all allowable patterns in the multi-dimensional pattern vector space. The K-means algorithm is employed to partition the pattern vector space successively. Any difference between the pattern histograms implies that different patterns are involved in the scenes. In our experiment, a pair of chest radiographs of pneumoconiosis is employed and the changing histogram bins are visualized on both images. We found that the method can be used as an alternative way of detecting temporal change, particularly when precise image registration is not available.
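The histogram-comparison step can be sketched once centroids are available. Here the K-means centroids are supplied directly for brevity (the paper derives them by successive partitioning of the pattern-vector space); each pattern vector is assigned to its nearest centroid, and bins whose counts differ between the two radiographs flag candidate interval changes.

```python
import numpy as np

def pattern_histogram(patches: np.ndarray, centroids: np.ndarray) -> np.ndarray:
    """Count how many pattern vectors (rows of `patches`) fall into each
    centroid's cell of the pattern vector space."""
    d = np.linalg.norm(patches[:, None, :] - centroids[None, :, :], axis=2)
    return np.bincount(d.argmin(axis=1), minlength=len(centroids))

def changed_bins(h1: np.ndarray, h2: np.ndarray, tol: int = 0) -> list:
    """Bins whose counts differ by more than `tol`: candidate temporal
    changes between the two images, found without any registration."""
    return np.nonzero(np.abs(h1 - h2) > tol)[0].tolist()
```

Because only pattern frequencies are compared, small positioning differences between exposures do not create spurious changes the way pixel-wise subtraction would.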
Activateable Imaging Probes Light Up Inside Cancer Cells | Center for Cancer Research
Imaging can be used to help diagnose cancer as well as monitor tumor progression and response to treatment. The field of molecular imaging focuses on techniques capable of detecting specific molecular targets associated with cancer; the agents used for molecular imaging—often called probes—are multifunctional, with components that allow them to both interact with their
Automated detection of jet contrails using the AVHRR split window
NASA Technical Reports Server (NTRS)
Engelstad, M.; Sengupta, S. K.; Lee, T.; Welch, R. M.
1992-01-01
This paper investigates the automated detection of jet contrails using data from the Advanced Very High Resolution Radiometer. A preliminary algorithm subtracts the 11.8-micron image from the 10.8-micron image, creating a difference image on which contrails are enhanced. Then a three-stage algorithm searches the difference image for the nearly-straight line segments which characterize contrails. First, the algorithm searches for elevated, linear patterns called 'ridges'. Second, it applies a Hough transform to the detected ridges to locate nearly-straight lines. Third, the algorithm determines which of the nearly-straight lines are likely to be contrails. The paper applies this technique to several test scenes.
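The second and third stages can be sketched with a toy Hough transform over a binary ridge mask (the output of stage one). Sizes, the angular resolution, and the peak threshold are arbitrary assumptions; a real implementation would work on the 10.8 minus 11.8 micron difference image.

```python
import numpy as np

def hough_lines(mask, n_theta=90, peak_frac=0.5):
    """Tiny Hough transform over a binary ridge mask: returns (rho,
    theta) accumulator peaks, the nearly-straight line candidates that
    a later stage would vet as contrails."""
    ys, xs = np.nonzero(mask)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    diag = int(np.ceil(np.hypot(*mask.shape)))
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    for x, y in zip(xs, ys):
        # each ridge pixel votes for every line passing through it
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1
    peaks = np.argwhere(acc >= peak_frac * acc.max())
    return [(int(r) - diag, float(thetas[t])) for r, t in peaks]
```

The final stage of the paper then decides which peaks are plausible contrails (e.g. by length and brightness), which this sketch leaves out.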
Advanced sensor-simulation capability
NASA Astrophysics Data System (ADS)
Cota, Stephen A.; Kalman, Linda S.; Keller, Robert A.
1990-09-01
This paper provides an overview of an advanced simulation capability currently in use for analyzing visible and infrared sensor systems. The software system, called VISTAS (Visible/Infrared Sensor Trades, Analyses, and Simulations), combines classical image processing techniques with detailed sensor models to produce static and time-dependent simulations of a variety of sensor systems, including imaging, tracking, and point-target detection systems. Systems modelled to date include space-based scanning line-array sensors as well as staring two-dimensional array sensors, which can be used for either imaging or point-source detection.
Rand, Danielle; Derdak, Zoltan; Carlson, Rolf; ...
2015-10-29
Hepatocellular carcinoma (HCC) is one of the most common malignant tumors worldwide and is almost uniformly fatal. Current methods of detection include ultrasound examination and imaging by CT scan or MRI; however, these techniques are problematic in terms of sensitivity and specificity, and the detection of early tumors (<1 cm diameter) has proven elusive. Better, more specific, and more sensitive detection methods are therefore urgently needed. Here we discuss the application of a newly developed x-ray imaging technique called Spatial Frequency Heterodyne Imaging (SFHI) for the early detection of HCC. SFHI uses x-rays scattered by an object to form an image and is more sensitive than conventional absorption-based x-radiography. We show that tissues labeled in vivo with gold nanoparticle contrast agents can be detected using SFHI. We also demonstrate that directed targeting and SFHI of HCC tumors in a mouse model is possible through the use of HCC-specific antibodies. As a result, the enhanced sensitivity of SFHI relative to currently available techniques enables the x-ray imaging of tumors that are just a few millimeters in diameter and substantially reduces the amount of nanoparticle contrast agent required for intravenous injection relative to absorption-based x-ray imaging.
PRESAGE: Protecting Structured Address Generation against Soft Errors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram
Modern computer scaling trends in pursuit of larger component counts and power efficiency have, unfortunately, led to less reliable hardware and consequently to soft errors escaping into application data ("silent data corruptions"). Techniques to enhance system resilience hinge on the availability of efficient error detectors that have high detection rates, low false positive rates, and low computational overhead. Unfortunately, efficient detectors for faults during address generation (to index large arrays) have not been widely researched. We present a novel lightweight compiler-driven technique called PRESAGE for detecting bit-flips affecting structured address computations. A key insight underlying PRESAGE is that any address computation scheme that propagates an already incurred error is better than a scheme that corrupts one particular array access but otherwise (falsely) appears to compute perfectly. Enabling the flow of errors allows one to situate detectors at loop exit points, and helps turn silent corruptions into easily detectable error situations. Our experiments using the PolyBench benchmark suite indicate that PRESAGE-based error detectors have a high error-detection rate while incurring low overheads.
Clover: Compiler directed lightweight soft error resilience
Liu, Qingrui; Lee, Dongyoon; Jung, Changhee; ...
2015-05-01
This paper presents Clover, a compiler-directed soft error detection and recovery scheme for lightweight soft error resilience. The compiler carefully generates soft error tolerant code based on idempotent processing without explicit checkpointing. During program execution, Clover relies on a small number of acoustic wave detectors deployed in the processor to identify soft errors by sensing the wave made by a particle strike. To cope with DUEs (detected unrecoverable errors) caused by the sensing latency of error detection, Clover leverages a novel selective instruction duplication technique called tail-DMR (dual modular redundancy). Once a soft error is detected by either the sensor or the tail-DMR, Clover handles the error as in exception handling: to recover, it simply redirects program control to the beginning of the code region where the error was detected. The experimental results demonstrate that the average runtime overhead is only 26%, a 75% reduction compared to that of the state-of-the-art soft error resilience technique.
Mapping Base Modifications in DNA by Transverse-Current Sequencing
NASA Astrophysics Data System (ADS)
Alvarez, Jose R.; Skachkov, Dmitry; Massey, Steven E.; Kalitsov, Alan; Velev, Julian P.
2018-02-01
Sequencing DNA modifications and lesions, such as methylation of cytosine and oxidation of guanine, is even more important and challenging than sequencing the genome itself. The traditional methods for detecting DNA modifications are either insensitive to these modifications or require additional processing steps to identify a particular type of modification. Transverse-current sequencing in nanopores can potentially identify the canonical bases and base modifications in the same run. In this work, we demonstrate that the most common DNA epigenetic modifications and lesions can be detected with any predefined accuracy based on their tunneling current signature. Our results are based on simulations of the nanopore tunneling current through DNA molecules, calculated using nonequilibrium electron-transport methodology within an effective multiorbital model derived from first-principles calculations, followed by a base-calling algorithm accounting for neighbor current-current correlations. This methodology can be integrated with existing experimental techniques to improve base-calling fidelity.
Ge, Ji; Wang, YaoNan; Zhou, BoWen; Zhang, Hui
2009-01-01
A biologically inspired spiking neural network model, called pulse-coupled neural networks (PCNN), has been applied in an automatic inspection machine to detect visible foreign particles intermingled in glucose or sodium chloride injection liquids. Proper mechanisms and improved spin/stop techniques are proposed to avoid the appearance of air bubbles, which would otherwise increase the algorithm's complexity. A modified PCNN is adopted to segment the difference images, and the existence of foreign particles is judged according to the continuity and smoothness of their moving traces. Preliminary experimental results indicate that the inspection machine can detect visible foreign particles effectively, and that its detection speed, accuracy, and correct detection rate satisfy the needs of medicine preparation. PMID:22412318
Band-Limited Masks and Direct Imaging of Exoplanets
NASA Technical Reports Server (NTRS)
Kuchner, Marc J.
2009-01-01
Band-limited masks have become the baseline design for what is now called "classical TPF" and also for the NIRCam coronagraph on JWST. This technology remains one of the most promising paths for direct detection of exoplanets and disks. I'll describe some of the latest progress in the implementation of this technique and what we have learned about where it can and cannot be effectively applied.
NASA Technical Reports Server (NTRS)
Rendleman, R. A.; Champagne, E. B.; Ferris, J. E.; Liskow, C. L.; Marks, J. M.; Salmer, R. J.
1974-01-01
Development of a dual polarized L-band radar imaging system to be used in conjunction with the present dual polarized X-band radar is described. The technique used called for heterodyning the transmitted frequency from X-band to L-band and again heterodyning the received L-band signals back to X-band for amplification, detection, and recording.
Anomaly-Based Intrusion Detection Systems Utilizing System Call Data
2012-03-01
Functionality Description Persistence mechanism Mimicry technique Camouflage malware image: • renaming its image • appending its image to victim...particular industrial plant. Exactly which one was targeted still remains unknown; however, a majority of the attacks took place in Iran [24]. Due... plant to an unstable phase and eventually physical damage. It is interesting to note that a particular block of code - block DB8061 - is automatically
NASA Astrophysics Data System (ADS)
Benalcazar, Wladimir A.; Jiang, Zhi; Marks, Daniel L.; Geddes, Joseph B.; Boppart, Stephen A.
2009-02-01
We validate a molecular imaging technique called Nonlinear Interferometric Vibrational Imaging (NIVI) by comparing vibrational spectra with those acquired from Raman microscopy. This broadband coherent anti-Stokes Raman scattering (CARS) technique uses heterodyne detection and OCT acquisition and design principles to interfere a CARS signal generated by a sample with a local oscillator signal generated separately by a four-wave mixing process. These are mixed and demodulated by spectral interferometry. Its confocal configuration allows the acquisition of 3D images based on endogenous molecular signatures. Images from both phantom and mammary tissues have been acquired by this instrument and its spectrum is compared with its spontaneous Raman signatures.
NASA Technical Reports Server (NTRS)
Shekhar, R.; Cothren, R. M.; Vince, D. G.; Chandra, S.; Thomas, J. D.; Cornhill, J. F.
1999-01-01
Intravascular ultrasound (IVUS) provides exact anatomy of arteries, allowing accurate quantitative analysis. Automated segmentation of IVUS images is a prerequisite for routine quantitative analyses. We present a new three-dimensional (3D) segmentation technique, called active surface segmentation, which detects luminal and adventitial borders in IVUS pullback examinations of coronary arteries. The technique was validated against expert tracings by computing correlation coefficients (range 0.83-0.97) and William's index values (range 0.37-0.66). The technique was statistically accurate, robust to image artifacts, and capable of segmenting a large number of images rapidly. Active surface segmentation enabled geometrically accurate 3D reconstruction and visualization of coronary arteries and volumetric measurements.
Current trends in gamma radiation detection for radiological emergency response
NASA Astrophysics Data System (ADS)
Mukhopadhyay, Sanjoy; Guss, Paul; Maurer, Richard
2011-09-01
Passive and active detection of gamma rays from shielded radioactive materials, including special nuclear materials, is an important task for any radiological emergency response organization. This article reports on the current trends and status of gamma radiation detection objectives and measurement techniques as applied to nonproliferation and radiological emergencies. In recent years, since the establishment of the Domestic Nuclear Detection Office by the Department of Homeland Security, tremendous progress has been made in detection materials (scintillators, semiconductors); imaging techniques (Compton imaging, active masking, and hybrid imaging); data acquisition systems with digital signal processing, field programmable gate arrays, and embedded isotopic analysis software (viz. gamma detector response and analysis software [GADRAS]1); fast template matching; and data fusion (merging radiological data with geo-referenced maps and digital imagery to provide better situational awareness). Alongside this progress, a significant amount of interdisciplinary research and development has taken place, drawing on techniques and spin-offs from medical science (such as x-ray radiography and tomography) and materials engineering (systematic studies to optimize the qualities of a good scintillator, nanoparticle applications, quantum dots, and photonic crystals, to name a few). No trend analysis of radiation detection systems would be complete without mentioning the unprecedented strategic position taken by the National Nuclear Security Administration (NNSA) to deter, detect, and interdict illicit trafficking in nuclear and other radioactive materials across international borders and through global maritime transportation, the so-called second line of defense.
Early Breast Cancer Diagnosis Using Microwave Imaging via Space-Frequency Algorithm
NASA Astrophysics Data System (ADS)
Vemulapalli, Spandana
The conventional breast cancer detection methods have limitations ranging from ionizing radiation and low specificity to high cost. These limitations make way for a suitable alternative, microwave imaging, as a screening technique for the detection of breast cancer. The discernible differences between benign, malignant, and healthy breast tissues, together with the absence of harmful ionizing radiation, make microwave imaging a feasible breast cancer detection technique. Earlier studies have shown that the electrical properties of healthy and malignant tissues vary as a function of frequency, which motivates a high bandwidth requirement. Ultrawideband, wideband, and narrowband arrays have been designed, simulated, and optimized for high (44%), medium (33%), and low (7%) bandwidths, respectively, using the electromagnetic software FEKO. These arrays are used to illuminate the breast model (phantom), and the backscattered signals are obtained in the near field for each case. The Microwave Imaging via Space-Time (MIST) beamforming algorithm, formulated in the frequency domain, is then applied to these near-field monostatic frequency-response signals to reconstruct an image of the breast model. The main purpose of this investigation is to assess the impact of bandwidth and to implement a novel imaging technique for use in the early detection of breast cancer. Earlier studies implemented the MIST imaging algorithm on time-domain signals via a frequency-domain beamformer; here, the performance of the imaging algorithm on the frequency-response signals is evaluated directly in the frequency domain. The energy profile of the breast in the spatial domain is created via Parseval's theorem. The beamformer weights of the MIST algorithm (not including the effect of the skin) have been calculated for the ultrawideband, wideband, and narrowband arrays.
Quality metrics such as dynamic range and radiometric resolution are also evaluated for all three array types.
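In its simplest monostatic form, the frequency-domain energy computation described above reduces to summing beamformed power over frequency. This is a sketch under stated assumptions: array shapes and steering weights are placeholders, and the skin-artifact removal step of MIST is omitted.

```python
import numpy as np

def beamformer_energy(S: np.ndarray, w: np.ndarray) -> float:
    """Backscatter energy attributed to one candidate focal point.
    S[a, f]: monostatic frequency response of antenna a at frequency f;
    w[a, f]: steering weights focusing the array on that point.
    By Parseval's theorem, this frequency-domain sum equals the
    time-domain energy used to build the spatial energy profile."""
    y = np.sum(np.conj(w) * S, axis=0)    # beamformed spectrum
    return float(np.sum(np.abs(y) ** 2))  # energy across frequencies
```

Scanning the focal point over the breast volume and evaluating this energy at each location yields the reconstructed image; malignant tissue shows up as high-energy regions.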
NASA Technical Reports Server (NTRS)
Jackson, W. M.
1977-01-01
A tunable vacuum ultraviolet flash lamp was constructed. This unique flash lamp was coupled with a tunable dye laser detector, permitting the experimenter to measure the production rates of ground state radicals as a function of wavelength. A new technique for producing fluorescent radicals was discovered. This technique, called multiphoton ultraviolet photodissociation, is currently being applied to several problems of both cometary and stratospheric interest. It was demonstrated that NO2 will dissociate to produce an excited fragment, and that the resulting radiation can possibly be used for remote detection of this species.
Visual analytics of anomaly detection in large data streams
NASA Astrophysics Data System (ADS)
Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.; Sharma, Ratnesh K.; Mehta, Abhay
2009-01-01
Most data streams usually are multi-dimensional, high-speed, and contain massive volumes of continuous information. They are seen in daily applications, such as telephone calls, retail sales, data center performance, and oil production operations. Many analysts want insight into the behavior of this data. They want to catch the exceptions in flight to reveal the causes of the anomalies and to take immediate action. To guide the user in finding the anomalies in the large data stream quickly, we derive a new automated neighborhood threshold marking technique, called AnomalyMarker. This technique is built on cell-based data streams and user-defined thresholds. We extend the scope of the data points around the threshold to include the surrounding areas. The idea is to define a focus area (marked area) which enables users to (1) visually group the interesting data points related to the anomalies (i.e., problems that occur persistently or occasionally) for observing their behavior; (2) discover the factors related to the anomaly by visualizing the correlations between the problem attribute with the attributes of the nearby data items from the entire multi-dimensional data stream. Mining results are quickly presented in graphical representations (i.e., tooltip) for the user to zoom into the problem regions. Different algorithms are introduced which try to optimize the size and extent of the anomaly markers. We have successfully applied this technique to detect data stream anomalies in large real-world enterprise server performance and data center energy management.
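The neighborhood-marking idea can be sketched as follows. This is a simplification: the `margin` parameter stands in for the automatically optimized marker extent described in the abstract, and the labels are hypothetical.

```python
def mark_anomalies(stream, threshold, margin):
    """Cell-based marking over a stream of values: points over the
    user-defined threshold are anomalies; points within `margin` below
    it form the surrounding focus area whose behavior the analyst
    watches for emerging problems."""
    marks = []
    for i, v in enumerate(stream):
        if v > threshold:
            marks.append((i, "anomaly"))
        elif v > threshold - margin:
            marks.append((i, "focus"))
    return marks
```

Widening the focus area trades more context (points trending toward the threshold) against visual clutter, which is why the paper optimizes the marker size automatically.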
Spectral domain phase microscopy: a new tool for measuring cellular dynamics and cytoplasmic flow
NASA Astrophysics Data System (ADS)
McDowell, Emily J.; Choma, Michael A.; Ellerbee, Audrey K.; Izatt, Joseph A.
2005-03-01
Broadband interferometry is an attractive technique for the detection of cellular motions because it provides depth-resolved interferometric phase information via coherence gating. Here a phase sensitive technique called spectral domain phase microscopy (SDPM) is presented. SDPM is a functional extension of spectral domain optical coherence tomography that allows for the detection of cellular motions and dynamics with nanometer-scale sensitivity. This sensitivity is made possible by the inherent phase stability of spectral domain OCT combined with common-path interferometry. The theory that underlies this technique is presented, the sensitivity of the technique is demonstrated by the measurement of the thermal expansion coefficient of borosilicate glass, and the response of an Amoeba proteus to puncture of its cell membrane is measured. We also exploit the phase stability of SDPM to perform Doppler flow imaging of cytoplasmic streaming in A. proteus. We show reversal of cytoplasmic flow in response to stimuli, and we show that the cytoplasmic flow is laminar (i.e. parabolic) in nature. We are currently investigating the use of SDPM in a variety of different cell types.
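The nanometer-scale sensitivity follows from the commonly quoted phase-to-displacement relation for reflection-mode interferometry, dz = λΔφ/(4πn). A small worked example; the 840 nm center wavelength and 1 mrad phase fluctuation are illustrative values, not figures from the abstract:

```python
import math

def phase_to_displacement(dphi_rad, lam_m, n=1.0):
    """Convert a measured interferometric phase change to an axial
    displacement via dz = lam * dphi / (4 * pi * n) (reflection
    geometry, refractive index n)."""
    return lam_m * dphi_rad / (4 * math.pi * n)

# A 1 mrad phase fluctuation at an 840 nm center wavelength in air:
dz = phase_to_displacement(1e-3, 840e-9)
print(dz)  # ~6.7e-11 m, i.e. tens of picometers -- far below a nanometer
```

Milliradian-level phase stability thus translates directly into sub-nanometer displacement sensitivity, which is why the common-path phase stability of SDPM matters so much.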
Carter, Erik P; Seymour, Elif Ç; Scherr, Steven M; Daaboul, George G; Freedman, David S; Selim Ünlü, M; Connor, John H
2017-01-01
This chapter describes an approach for the label-free imaging and quantification of intact Ebola virus (EBOV) and EBOV viruslike particles (VLPs) using a light microscopy technique. In this technique, individual virus particles are captured onto a silicon chip that has been printed with spots of virus-specific capture antibodies. These captured virions are then detected using an optical approach called interference reflectance imaging. This approach allows for the detection of each virus particle that is captured on an antibody spot and can resolve the filamentous structure of EBOV VLPs without the need for electron microscopy. Capture of VLPs and virions can be done from a variety of sample types ranging from tissue culture medium to blood. The technique also allows automated quantitative analysis of the number of virions captured. This can be used to identify the virus concentration in an unknown sample. In addition, this technique offers the opportunity to easily image virions captured from native solutions without the need for additional labeling approaches while offering a means of assessing the range of particle sizes and morphologies in a quantitative manner.
A smart technique for attendance system to recognize faces through parallelism
NASA Astrophysics Data System (ADS)
Prabhavathi, B.; Tanuja, V.; Madhu Viswanatham, V.; Rajashekhara Babu, M.
2017-11-01
The face is the principal feature by which a person is recognised, and image processing techniques allow us to exploit its physical characteristics. In the traditional approach used in schools and colleges, the professor calls each student's name and marks attendance manually. In this paper we depart from that approach and instead apply image processing techniques to record student attendance automatically. First, an image of the classroom is captured and stored in a data record. To the images stored in the database we apply an algorithm comprising histogram classification, noise removal, face detection, and face recognition. Using these steps, faces are detected and compared against the database, and attendance is marked automatically when the system recognizes them.
Learning directed acyclic graphs from large-scale genomics data.
Nikolay, Fabio; Pesavento, Marius; Kritikos, George; Typas, Nassos
2017-09-20
In this paper, we consider the problem of learning the genetic interaction map, i.e., the topology of a directed acyclic graph (DAG) of genetic interactions from noisy double-knockout (DK) data. Based on a set of well-established biological interaction models, we detect and classify the interactions between genes. We propose a novel linear integer optimization program called the Genetic-Interactions-Detector (GENIE) to identify the complex biological dependencies among genes and to compute the DAG topology that matches the DK measurements best. Furthermore, we extend the GENIE program by incorporating genetic interaction profile (GI-profile) data to further enhance the detection performance. In addition, we propose a sequential scalability technique for large sets of genes under study, in order to provide statistically significant results for real measurement data. Finally, we show via numeric simulations that the GENIE program and the GI-profile data extended GENIE (GI-GENIE) program clearly outperform the conventional techniques and present real data results for our proposed sequential scalability technique.
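The "well-established biological interaction models" the abstract refers to include the multiplicative fitness model, under which a double-knockout interaction is scored as the deviation of observed double-mutant fitness from the product of the single-mutant fitnesses. A minimal sketch of that classification step; the epsilon cutoff is illustrative, and GENIE's actual integer-program formulation is not reproduced here:

```python
def interaction_score(w_a, w_b, w_ab, eps_cut=0.1):
    """Classify a double-knockout under the multiplicative fitness model:
    eps = W_ab - W_a * W_b. eps near zero means no interaction; the
    eps_cut threshold here is an illustrative choice."""
    eps = w_ab - w_a * w_b
    if eps > eps_cut:
        return "alleviating"
    if eps < -eps_cut:
        return "aggravating"
    return "neutral"

print(interaction_score(0.8, 0.9, 0.30))  # aggravating: far worse than expected
print(interaction_score(0.8, 0.9, 0.72))  # neutral: matches W_a * W_b
```

GENIE's contribution is to reconcile many such noisy pairwise scores into a single DAG topology; the score above is only the per-pair input to that optimization.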
Robust detection of chromosomal interactions from small numbers of cells using low-input Capture-C
Oudelaar, A. Marieke; Davies, James O.J.; Downes, Damien J.; Higgs, Douglas R.
2017-01-01
Abstract Chromosome conformation capture (3C) techniques are crucial to understanding tissue-specific regulation of gene expression, but current methods generally require large numbers of cells. This hampers the investigation of chromatin architecture in rare cell populations. We present a new low-input Capture-C approach that can generate high-quality 3C interaction profiles from 10 000–20 000 cells, depending on the resolution used for analysis. We also present a PCR-free, sequencing-free 3C technique based on NanoString technology called C-String. By comparing C-String and Capture-C interaction profiles we show that the latter are not skewed by PCR amplification. Furthermore, we demonstrate that chromatin interactions detected by Capture-C do not depend on the degree of cross-linking by performing experiments with varying formaldehyde concentrations. PMID:29186505
GNSS Signal Authentication Via Power and Distortion Monitoring
NASA Astrophysics Data System (ADS)
Wesson, Kyle D.; Gross, Jason N.; Humphreys, Todd E.; Evans, Brian L.
2018-04-01
We propose a simple low-cost technique that enables civil Global Positioning System (GPS) receivers and other civil global navigation satellite system (GNSS) receivers to reliably detect carry-off spoofing and jamming. The technique, which we call the Power-Distortion detector, classifies received signals as interference-free, multipath-afflicted, spoofed, or jammed according to observations of received power and correlation function distortion. It does not depend on external hardware or a network connection and can be readily implemented on many receivers via a firmware update. Crucially, the detector can with high probability distinguish low-power spoofing from ordinary multipath. In testing against over 25 high-quality empirical data sets yielding over 900,000 separate detection tests, the detector correctly alarms on all malicious spoofing or jamming attacks while maintaining a <0.6% single-channel false alarm rate.
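The four-way classification can be caricatured as a pair of threshold tests on received power and correlation distortion. This is a toy decision table, not the paper's calibrated detector: the scalar distortion metric, the thresholds, and the exact class boundaries are all invented for illustration:

```python
def classify_signal(power_db_above_nominal, distortion,
                    p_thresh=3.0, d_thresh=0.5):
    """Toy joint power/distortion decision logic (illustrative thresholds).
    The real detector works from calibrated received-power measurements
    and correlation-function shape statistics."""
    high_power = power_db_above_nominal > p_thresh
    distorted = distortion > d_thresh
    if high_power and distorted:
        return "spoofed"           # excess power plus a distorted peak
    if high_power:
        return "jammed"            # excess power, no plausible GNSS signal
    if distorted:
        return "multipath"         # distortion at ordinary received power
    return "interference-free"

print(classify_signal(5.0, 0.8))   # spoofed
```

The point the abstract stresses is the hard case on the multipath/spoofed boundary: low-power spoofing looks like multipath in either observable alone, and only their joint behavior separates the two with high probability.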
Combinatorial pulse position modulation for power-efficient free-space laser communications
NASA Technical Reports Server (NTRS)
Budinger, James M.; Vanderaar, M.; Wagner, P.; Bibyk, Steven
1993-01-01
A new modulation technique called combinatorial pulse position modulation (CPPM) is presented as a power-efficient alternative to quaternary pulse position modulation (QPPM) for direct-detection, free-space laser communications. The special case of 16C4PPM is compared to QPPM in terms of data throughput and bit error rate (BER) performance for similar laser power and pulse duty cycle requirements. The increased throughput from CPPM enables the use of forward error corrective (FEC) encoding for a net decrease in the amount of laser power required for a given data throughput compared to uncoded QPPM. A specific, practical case of coded CPPM is shown to reduce the amount of power required to transmit and receive a given data sequence by at least 4.7 dB. Hardware techniques for maximum likelihood detection and symbol timing recovery are presented.
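The throughput advantage behind CPPM is a line of combinatorics. The arithmetic below is implied by the abstract's comparison of 16C4PPM against QPPM at equal pulse duty cycle; the 4.7 dB coded-system figure is a separate measured result and is not reproduced here:

```python
import math

# QPPM: one pulse in 4 slots encodes log2(4) = 2 bits, so four symbols
# carry 8 bits in 16 slots. 16C4 CPPM: 4 pulses placed among 16 slots
# give C(16,4) = 1820 patterns, i.e. floor(log2(1820)) = 10 usable bits
# in the same 16 slots at the same 1-in-4 pulse duty cycle.
qppm_bits_per_16_slots = 4 * math.log2(4)
cppm_patterns = math.comb(16, 4)
cppm_bits_per_16_slots = math.floor(math.log2(cppm_patterns))
print(cppm_patterns, qppm_bits_per_16_slots, cppm_bits_per_16_slots)
# 1820 8.0 10 -> a 25% raw throughput gain over QPPM
```

That 25% of extra raw throughput is the margin the paper spends on FEC coding, yielding a net power saving at fixed information rate.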
Call progress time measurement in IP telephony
NASA Astrophysics Data System (ADS)
Khasnabish, Bhumip
1999-11-01
Usually a voice call is established through multiple stages in IP telephony. In the first stage, a phone number is dialed to reach a near-end or call-originating IP-telephony gateway. The next stages involve user identification through delivering an m-digit user-id to the authentication and/or billing server, and then user authentication by using an n-digit PIN. After that, provided that authentication is successful, the caller is allowed to dial a destination phone number (a last-stage dial tone is provided). In this paper, we present a very flexible method for measuring call progress time in IP telephony. The proposed technique can be used to measure the system response time at every stage. It is flexible in that it can be easily modified to include a new `tone' or set of tones, or to use `voice begin' events at every stage to detect the system's response. The proposed method has been implemented using scripts written in the Hammer visual basic language for testing with a few commercially available IP telephony gateways.
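The per-stage measurement reduces to starting a clock at each stimulus (digit string sent) and stopping it when the expected response event (a tone or `voice begin') is detected. A minimal sketch; `detect_response` is a hypothetical tone-detector callback standing in for the Hammer scripting hooks:

```python
import time

def measure_stage(detect_response, timeout_s=10.0, poll_s=0.01):
    """Time one call-setup stage: start a clock, poll for the expected
    tone/voice event, return elapsed seconds (None on timeout).
    `detect_response` is a hypothetical detector callback."""
    start = time.monotonic()
    while time.monotonic() - start < timeout_s:
        if detect_response():
            return time.monotonic() - start
        time.sleep(poll_s)
    return None

print(measure_stage(lambda: True))  # near zero: the "tone" is heard at once
```

Running one such measurement per stage (gateway dial tone, user-id accept, PIN accept, last-stage dial tone) yields the stage-by-stage response-time profile the paper describes.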
Phase-detected Brillouin optical correlation-domain reflectometry
NASA Astrophysics Data System (ADS)
Mizuno, Yosuke; Hayashi, Neisei; Fukuda, Hideyuki; Nakamura, Kentaro
2018-05-01
Optical fiber sensing techniques based on Brillouin scattering have been extensively studied for structural health monitoring owing to their capability of distributed strain and temperature measurement. Although a higher signal-to-noise ratio (leading to high spatial resolution and high-speed measurement) is generally obtained for two-end-access systems, they reduce the degree of freedom in embedding the sensors into structures, and render the measurement no longer feasible when extremely high loss or breakage occurs at a point of the sensing fiber. To overcome these drawbacks, a one-end-access sensing technique called Brillouin optical correlation-domain reflectometry (BOCDR) has been developed. BOCDR has a high spatial resolution and cost efficiency, but its conventional configuration suffered from relatively low-speed operation. In this paper, we review the recently developed high-speed configurations of BOCDR, including phase-detected BOCDR, with which we demonstrate real-time distributed measurement by tracking a propagating mechanical wave. We also demonstrate breakage detection with a wide strain dynamic range.
A modular positron camera for the study of industrial processes
NASA Astrophysics Data System (ADS)
Leadbeater, T. W.; Parker, D. J.
2011-10-01
Positron imaging techniques rely on the detection of the back-to-back annihilation photons arising from positron decay within the system under study. A standard technique, called positron emitting particle tracking (PEPT) [1], uses a number of these detected events to rapidly determine the position of a positron emitting tracer particle introduced into the system under study. Typical applications of PEPT are in the study of granular and multi-phase materials in the disciplines of engineering and the physical sciences. Using components from redundant medical PET scanners a modular positron camera has been developed. This camera consists of a number of small independent detector modules, which can be arranged in custom geometries tailored towards the application in question. The flexibility of the modular camera geometry allows for high photon detection efficiency within specific regions of interest, the ability to study large and bulky systems and the application of PEPT to difficult or remote processes as the camera is inherently transportable.
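At its core, a PEPT position fix treats each detected coincidence as a line of response through the tracer and finds the point closest to the bundle of lines. A minimal 2D least-squares version is sketched below; real PEPT additionally iterates, discarding corrupt (scattered) events far from the provisional fix:

```python
def locate_tracer(lines):
    """Estimate the 2D tracer position as the least-squares point closest
    to a set of lines of response, each given by its two detector hits.
    Solves the normal equations sum(I - d d^T) x = sum(I - d d^T) p."""
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (x1, y1), (x2, y2) in lines:
        dx, dy = x2 - x1, y2 - y1
        norm = (dx * dx + dy * dy) ** 0.5
        dx, dy = dx / norm, dy / norm          # unit direction of the line
        m11, m12, m22 = 1.0 - dx * dx, -dx * dy, 1.0 - dy * dy
        a11 += m11; a12 += m12; a22 += m22     # accumulate A = sum(I - d d^T)
        b1 += m11 * x1 + m12 * y1              # and b = sum((I - d d^T) p)
        b2 += m12 * x1 + m22 * y1
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Two perpendicular lines of response crossing at (2, 3):
x, y = locate_tracer([((0, 3), (5, 3)), ((2, 0), (2, 5))])
print(x, y)  # 2.0 3.0
```

With many events per time slice, repeating this fix rapidly gives the tracer trajectory; the modular geometry described above simply determines where the detector-hit endpoints can lie.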
Spatio-temporal Outlier Detection in Precipitation Data
NASA Astrophysics Data System (ADS)
Wu, Elizabeth; Liu, Wei; Chawla, Sanjay
The detection of outliers from spatio-temporal data is an important task due to the increasing amount of spatio-temporal data available and the need to understand and interpret it. Due to the limitations of current data mining techniques, new techniques to handle this data need to be developed. We propose a spatio-temporal outlier detection algorithm called Outstretch, which discovers the outlier movement patterns of the top-k spatial outliers over several time periods. The top-k spatial outliers are found using the Exact-Grid Top-k and Approx-Grid Top-k algorithms, which are an extension of algorithms developed by Agarwal et al. [1]. Since they use the Kulldorff spatial scan statistic, they are capable of discovering all outliers, unaffected by neighbouring regions that may contain missing values. After generating the outlier sequences, we show one way they can be interpreted, by comparing them to the phases of the El Niño Southern Oscillation (ENSO) weather phenomenon to provide a meaningful analysis of the results.
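The first step, ranking grid cells to obtain top-k spatial outliers, can be illustrated with a simple z-score stand-in. Note this is only a sketch of the ranking idea with invented data; the Exact-Grid and Approx-Grid algorithms score rectangular regions with the Kulldorff spatial scan statistic, not per-cell z-scores:

```python
from statistics import mean, pstdev

def top_k_grid_outliers(grid, k):
    """Rank grid cells by the z-score of their value against the whole
    grid and return the k most extreme cell coordinates. A simple
    stand-in for the Exact-Grid Top-k scan, for illustration only."""
    cells = [(v, (r, c)) for r, row in enumerate(grid) for c, v in enumerate(row)]
    vals = [v for v, _ in cells]
    mu, sd = mean(vals), pstdev(vals)
    cells.sort(key=lambda vc: abs(vc[0] - mu) / sd if sd else 0.0, reverse=True)
    return [pos for _, pos in cells[:k]]

precip = [[1, 1, 1],
          [1, 9, 1],
          [1, 1, 8]]
print(top_k_grid_outliers(precip, k=2))  # [(1, 1), (2, 2)]
```

Outstretch then links such per-period outliers across time to form the movement sequences that are compared against ENSO phases.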
An introduction to DARC technology.
Ahmad, Syed Shoeb
2017-01-01
Glaucoma is a multi-factorial neurodegenerative disorder. The common denominator in all types of glaucomas is retinal ganglion cell death through apoptosis. However, this cellular demise in glaucoma is detected late by structural or functional analyses. There can be a 10-year delay prior to the appearance of visual field defects and pre-perimetric glaucoma is an issue still being addressed. However, a new cutting-edge technology called detection of apoptosing retinal cells (DARC) is being developed. This technique is capable of non-invasive, real-time visualization of apoptotic changes at the cellular level. It can detect glaucomatous cell damage at a very early stage, at the moment apoptosis starts, and thus management can be initiated even prior to development of visual field changes. In future, this technique will also be able to provide conclusive evidence of the effectiveness of treatment protocol and the need for any modifications which may be required. This article aims to provide a concise review of DARC technology.
Weak signal detection: A discrete window of opportunity for achieving 'Vision 90:90:90'?
Burman, Christopher J; Aphane, Marota; Delobelle, Peter
2016-01-01
UNAIDS' Vision 90:90:90 is a call to 'end AIDS'. Developing predictive foresight of the unpredictable changes that this journey will entail could contribute to the ambition of 'ending AIDS'. There are few opportunities for managing unpredictable changes. We introduce 'weak signal detection' as a potential opportunity to fill this void. Combining futures and complexity theory, we reflect on two pilot case studies that involved the Archetype Extraction technique and the SenseMaker(®) Collector(™) tool. Both piloted techniques have the potential to surface weak signals, although there is room for improvement. A management response to a complex weak signal requires pattern management, rather than an exclusive focus on behaviour management. Weak signal detection is a window of opportunity to improve resilience to unpredictable changes in the HIV/AIDS landscape: it can reduce the risk that emerges from those changes and increase the visibility of opportunities to exploit them in ways that could contribute to 'ending AIDS'.
New fluorescence techniques for high-throughput drug discovery.
Jäger, S; Brand, L; Eggeling, C
2003-12-01
The rapid increase of compound libraries as well as new targets emerging from the Human Genome Project require constant progress in pharmaceutical research. An important tool is High-Throughput Screening (HTS), which has evolved into an indispensable instrument in the pre-clinical target-to-IND (Investigational New Drug) discovery process. HTS requires machinery that is able to test more than 100,000 potential drug candidates per day with respect to a specific biological activity. This imposes experimental demands, especially on sensitivity, speed, and statistical accuracy, which are met by fluorescence technology instrumentation. In particular the recently developed family of fluorescence techniques, FIDA (Fluorescence Intensity Distribution Analysis), which is based on confocal single-molecule detection, has opened up a new field of HTS applications. This report describes the application of these new techniques as well as of common fluorescence techniques--such as confocal fluorescence lifetime and anisotropy--to HTS. It gives experimental examples and presents advantages and disadvantages of each method. In addition, the most common artifacts arising in fluorescence detection (auto-fluorescence or quenching by the drug candidates) are highlighted, and correction methods for confocal fluorescence read-outs that circumvent this deficiency are presented.
NASA Astrophysics Data System (ADS)
Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.
2016-05-01
Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.
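The veracity-analysis step, deciding which reported events to believe when conflicted groups distort the signal, is commonly framed as iterative truth discovery: sources that back credible claims gain weight, and claims backed by weighty sources gain credibility. The sketch below is a TruthFinder-style toy with invented reports, not Apollo's actual algorithm:

```python
def truth_discovery(reports, iters=10):
    """Iteratively re-weight sources by the credibility of their claims.
    `reports` is a list of (source, claimed_event) pairs; returns the
    most credible claim after the weights converge."""
    sources = {s for s, _ in reports}
    weight = {s: 1.0 for s in sources}
    cred = {}
    for _ in range(iters):
        cred = {}
        for s, claim in reports:                 # claim credibility =
            cred[claim] = cred.get(claim, 0.0) + weight[s]   # weighted votes
        total = sum(cred.values())
        cred = {c: v / total for c, v in cred.items()}
        for s in sources:                        # source weight = mean
            own = [cred[c] for src, c in reports if src == s]   # credibility
            weight[s] = sum(own) / len(own)
    return max(cred, key=cred.get)

reports = [("a", "event-X"), ("b", "event-X"), ("c", "event-Y"), ("d", "event-X")]
print(truth_discovery(reports))  # event-X
```

The coupling between the two updates is what lets such schemes resist collective filtering better than naive majority voting, since a bloc of unreliable sources loses weight together.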
NASA Turbulence Technologies In-Service Evaluation: Delta Air Lines Report-Out
NASA Technical Reports Server (NTRS)
Amaral, Christian; Dickson, Steve; Watts, Bill
2007-01-01
Concluding an in-service evaluation of two new turbulence detection technologies developed in the Turbulence Prediction and Warning Systems (TPAWS) element of the NASA Aviation Safety and Security Program's Weather Accident Prevention Project (WxAP), this report documents Delta's experience working with the technologies, feedback gained from pilots and dispatchers concerning current turbulence techniques and procedures, and Delta's recommendations regarding directions for further efforts by the research community. Technologies evaluated included an automatic airborne turbulence encounter reporting technology called the Turbulence Auto PIREP System (TAPS), and a significant enhancement to the ability of modern airborne weather radars to predict and display turbulence of operational significance, called E-Turb radar.
Convolutional neural network for earthquake detection and location
Perol, Thibaut; Gharbi, Michaël; Denolle, Marine
2018-01-01
The recent evolution of induced seismicity in Central United States calls for exhaustive catalogs to improve seismic hazard assessment. Over the last decades, the volume of seismic data has increased exponentially, creating a need for efficient algorithms to reliably detect and locate earthquakes. Today’s most elaborate methods scan through the plethora of continuous seismic records, searching for repeating seismic signals. We leverage the recent advances in artificial intelligence and present ConvNetQuake, a highly scalable convolutional neural network for earthquake detection and location from a single waveform. We apply our technique to study the induced seismicity in Oklahoma, USA. We detect more than 17 times more earthquakes than previously cataloged by the Oklahoma Geological Survey. Our algorithm is orders of magnitude faster than established methods. PMID:29487899
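A ConvNetQuake-style detector scores fixed-length windows of the continuous record independently. The windowing step is sketched below with a synthetic trace; a trivial energy score stands in for the trained convolutional network, which is not reproduced here:

```python
def windows(waveform, size, step):
    """Slice a continuous record into fixed-length windows, the input
    unit that a ConvNetQuake-style classifier scores independently."""
    return [waveform[i:i + size] for i in range(0, len(waveform) - size + 1, step)]

def energy(w):
    """Trivial stand-in score for the trained network."""
    return sum(x * x for x in w)

# A synthetic trace: silence with a short event at sample offset 50.
trace = [0.0] * 50 + [1.0, -1.0, 1.0, -1.0] + [0.0] * 50
scores = [energy(w) for w in windows(trace, size=10, step=10)]
print(scores.index(max(scores)))  # 5: the window holding the synthetic event
```

Because each window is scored with a single forward pass rather than correlated against a template library, this architecture scans continuous records orders of magnitude faster than similarity-search methods.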
Fabrication and testing of a standoff trace explosives detection system
NASA Astrophysics Data System (ADS)
Waterbury, Robert; Rose, Jeremy; Vunck, Darius; Blank, Thomas; Pohl, Ken; Ford, Alan; McVay, Troy; Dottery, Ed
2011-05-01
In order to stop the transportation of materials used for IED manufacture, a standoff checkpoint explosives detection system (CPEDS) has recently been fabricated. The system incorporates multi-wavelength Raman spectroscopy and laser-induced breakdown spectroscopy (LIBS) modalities, with a LIBS enhancement technique called TEPS to be added later, in a single unit for trace detection of explosives at military checkpoints. Newly developed spectrometers and other required sensors, all integrated with a custom graphical user interface that produces simplified, real-time detection results, are also included in the system. All equipment is housed in a military ruggedized shelter for potential deployment in-theater for signature collection. Laboratory and performance data, as well as the construction of the CPEDS system and its potential deployment capabilities, are presented in the current work.
Point counts from clustered populations: Lessons from an experiment with Hawaiian crows
Hayward, G.D.; Kepler, C.B.; Scott, J.M.
1991-01-01
We designed an experiment to identify factors contributing most to error in counts of Hawaiian Crow or Alala (Corvus hawaiiensis) groups that are detected aurally. Seven observers failed to detect calling Alala on 197 of 361 3-min point counts on four transects extending from cages with captive Alala. A detection curve describing the relation between frequency of flock detection and distance typified the distribution expected in transect or point counts. Failure to detect calling Alala was affected most by distance, observer, and Alala calling frequency. The number of individual Alala calling was not important in detection rate. Estimates of the number of Alala calling (flock size) were biased and imprecise: average difference between number of Alala calling and number heard was 3.24 (±0.277). Distance, observer, number of Alala calling, and Alala calling frequency all contributed to errors in estimates of group size (P < 0.0001). Multiple regression suggested that number of Alala calling contributed most to errors. These results suggest that well-designed point counts may be used to estimate the number of Alala flocks but cast doubt on attempts to estimate flock size when individuals are counted aurally.
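The detection curve the experiment recovered is typically modeled with the half-normal detection function from distance sampling. A small sketch; the scale parameter sigma below is illustrative, not fitted to the Alala data:

```python
import math

def detection_prob(distance_m, sigma_m=150.0):
    """Half-normal detection function g(d) = exp(-d^2 / (2 sigma^2)),
    the standard distance-sampling shape for aural point counts.
    sigma controls how quickly detectability decays with distance."""
    return math.exp(-distance_m ** 2 / (2 * sigma_m ** 2))

for d in (0, 150, 300, 450):
    print(d, round(detection_prob(d), 3))  # 1.0, 0.607, 0.135, 0.011
```

Fitting such a curve per observer would capture the two dominant error sources the experiment identified (distance and observer) in a single correction model.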
Automatic welding detection by an intelligent tool pipe inspection
NASA Astrophysics Data System (ADS)
Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.
2015-07-01
This work provides a model based on machine learning techniques for weld recognition, using signals obtained through an in-line inspection tool called a "smart pig" in oil and gas pipelines. The model uses a signal noise reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets and the performance was measured with cross validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
Validation of a new technique to detect Cryptosporidium spp. oocysts in bovine feces.
Inácio, Sandra Valéria; Gomes, Jancarlo Ferreira; Oliveira, Bruno César Miranda; Falcão, Alexandre Xavier; Suzuki, Celso Tetsuo Nagase; Dos Santos, Bianca Martins; de Aquino, Monally Conceição Costa; de Paula Ribeiro, Rafaela Silva; de Assunção, Danilla Mendes; Casemiro, Pamella Almeida Freire; Meireles, Marcelo Vasconcelos; Bresciani, Katia Denise Saraiva
2016-11-01
Cryptosporidiosis, initially considered a rare and opportunistic disease, arouses strong interest in the scientific community due to its important zoonotic potential. The parasitological diagnosis of the causative agent of this disease, the protozoan Cryptosporidium spp., requires the use of specific techniques of concentration and permanent staining, which are laborious and costly, and are difficult to use in routine laboratory tests. In view of the above, we conducted the feasibility assessment, development, evaluation and intralaboratory validation of a new parasitological technique for optical-microscopy analysis of Cryptosporidium spp. oocysts, called TF-Test Coccidia, using fecal samples from calves from the city of Araçatuba, São Paulo. To confirm the aforementioned parasite and prove the diagnostic efficiency of the new technique, we used two methodologies established in the scientific literature: parasite concentration by centrifugal sedimentation and negative staining with malachite green (CSN-Malachite) and Nested-PCR. We observed good effectiveness of the TF-Test Coccidia technique, which was statistically equivalent to CSN-Malachite. Thus, we verified the effectiveness of the TF-Test Coccidia parasitological technique for the detection of Cryptosporidium spp. oocysts and observed good concentration and morphology of the parasite, with a low amount of debris in the fecal smear. Copyright © 2016 Elsevier B.V. All rights reserved.
Long-range acoustic detection and localization of blue whale calls in the northeast Pacific Ocean.
Stafford, K M; Fox, C G; Clark, D S
1998-12-01
Analysis of acoustic signals recorded from the U.S. Navy's SOund SUrveillance System (SOSUS) was used to detect and locate blue whale (Balaenoptera musculus) calls offshore in the northeast Pacific. The long, low-frequency components of these calls are characteristic of calls recorded in the presence of blue whales elsewhere in the world. Mean values for frequency and time characteristics from field-recorded blue whale calls were used to develop a simple matched filter for detecting such calls in noisy time series. The matched filter was applied to signals from three different SOSUS arrays off the coast of the Pacific Northwest to detect and associate individual calls from the same animal on the different arrays. A U.S. Navy maritime patrol aircraft was directed to an area where blue whale calls had been detected on SOSUS using these methods, and the presence of vocalizing blue whale was confirmed at the site with field recordings from sonobuoys.
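A matched filter of the kind described reduces to sliding a call template over the recorded time series and looking for correlation peaks. The sketch below uses a stylized template and an invented series; the real filter was built from mean frequency and duration statistics of field-recorded blue whale calls:

```python
def matched_filter(series, template):
    """Slide the template over the series and return correlation scores;
    the peak marks the most likely call onset."""
    n = len(template)
    return [sum(series[i + j] * template[j] for j in range(n))
            for i in range(len(series) - n + 1)]

template = [0.0, 1.0, 1.0, 0.0]                           # stylized call shape
series = [0.1] * 6 + [0.0, 1.0, 1.0, 0.0] + [0.1] * 6     # call buried at offset 6
scores = matched_filter(series, template)
print(scores.index(max(scores)))  # 6: the embedded call's onset
```

Running the same filter on three SOSUS arrays and associating coincident peaks is what allowed individual calls from one animal to be localized across arrays.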
Development of a Digital Microarray with Interferometric Reflectance Imaging
NASA Astrophysics Data System (ADS)
Sevenler, Derin
This dissertation describes a new type of molecular assay for nucleic acids and proteins. We call this technique a digital microarray since it is conceptually similar to conventional fluorescence microarrays, yet it performs enumerative ('digital') counting of the number captured molecules. Digital microarrays are approximately 10,000-fold more sensitive than fluorescence microarrays, yet maintain all of the strengths of the platform including low cost and high multiplexing (i.e., many different tests on the same sample simultaneously). Digital microarrays use gold nanorods to label the captured target molecules. Each gold nanorod on the array is individually detected based on its light scattering, with an interferometric microscopy technique called SP-IRIS. Our optimized high-throughput version of SP-IRIS is able to scan a typical array of 500 spots in less than 10 minutes. Digital DNA microarrays may have utility in applications where sequencing is prohibitively expensive or slow. As an example, we describe a digital microarray assay for gene expression markers of bacterial drug resistance.
Spotlight-Mode Synthetic Aperture Radar Processing for High-Resolution Lunar Mapping
NASA Technical Reports Server (NTRS)
Harcke, Leif; Weintraub, Lawrence; Yun, Sang-Ho; Dickinson, Richard; Gurrola, Eric; Hensley, Scott; Marechal, Nicholas
2010-01-01
During the 2008-2009 year, the Goldstone Solar System Radar was upgraded to support radar mapping of the lunar poles at 4 m resolution. The finer resolution of the new system and the accompanying migration through resolution cells called for spotlight, rather than delay-Doppler, imaging techniques. A new pre-processing system supports fast-time Doppler removal and motion compensation to a point. Two spotlight imaging techniques which compensate for phase errors due to i) out of focus-plane motion of the radar and ii) local topography, have been implemented and tested. One is based on the polar format algorithm followed by a unique autofocus technique, the other is a full bistatic time-domain backprojection technique. The processing system yields imagery of the specified resolution. Products enabled by this new system include topographic mapping through radar interferometry, and change detection techniques (amplitude and coherent change) for geolocation of the NASA LCROSS mission impact site.
GraphPrints: Towards a Graph Analytic Method for Network Anomaly Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harshaw, Chris R; Bridges, Robert A; Iannacone, Michael D
This paper introduces a novel graph-analytic approach for detecting anomalies in network flow data called GraphPrints. Building on foundational network-mining techniques, our method represents time slices of traffic as a graph, then counts graphlets: small induced subgraphs that describe local topology. By performing outlier detection on the sequence of graphlet counts, anomalous intervals of traffic are identified, and furthermore, individual IPs experiencing abnormal behavior are singled out. Initial testing of GraphPrints is performed on real network data with an implanted anomaly. Evaluation shows false positive rates bounded by 2.84% at the time-interval level and 0.05% at the IP level, with 100% true positive rates at both.
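The two stages, counting graphlets per time slice and flagging outlying counts, can be sketched with the simplest graphlet, the triangle. GraphPrints counts a richer family of small induced subgraphs, and the z-score rule below is an illustrative stand-in for its outlier detector:

```python
from itertools import combinations
from statistics import mean, pstdev

def triangle_count(edges):
    """Count triangles in an undirected graph given as (u, v) edge pairs;
    triangles are the simplest graphlet such methods tally."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    return sum(1 for a, b, c in combinations(sorted(adj), 3)
               if b in adj[a] and c in adj[a] and c in adj[b])

def flag_outliers(counts, z=2.0):
    """Flag time slices whose graphlet count deviates from the mean by
    more than z standard deviations (illustrative threshold)."""
    mu, sd = mean(counts), pstdev(counts)
    return [i for i, c in enumerate(counts) if sd and abs(c - mu) > z * sd]

k4 = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]  # complete graph on 4 nodes
print(triangle_count(k4))                  # 4
print(flag_outliers([1, 1, 1, 1, 1, 30]))  # [5]: the anomalous interval
```

Tracking such counts per IP as well as per interval is what lets the method single out the individual hosts driving an anomalous slice.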
Excess wing in glass-forming glycerol and LiCl-glycerol mixtures detected by neutron scattering
Gupta, S.; Arend, N.; Lunkenheimer, P.; ...
2015-01-22
The relaxational dynamics in glass-forming glycerol and glycerol mixed with LiCl is investigated using different neutron scattering techniques. The performed neutron spin echo experiments, which extend up to relatively long relaxation time scales of the order of 10 ns, should allow for the detection of contributions from the so-called excess wing. This phenomenon, whose microscopic origin is controversially discussed, arises in a variety of glass formers and, until now, was almost exclusively investigated by dielectric spectroscopy and light scattering. In conclusion, we show here that the relaxational process causing the excess wing can also be detected by neutron scattering, which directly couples to density fluctuations.
Integration of Audit Data Analysis and Mining Techniques into Aide
2006-07-01
results produced by the anomaly-detection subsystem. A successor system to NIDES, called EMERALD [35], currently under development at SRI, extends...to represent attack scenarios in a networked environment. eBayes of SRI’s Emerald uses Bayes net technology to analyze bursts of traffic [40...snmpget2, we have to resort to the TCP raw data (packets) to see what operations these connections performed. (E.g., long login trails in the PSSWD attack
Turnover of Lipidated LC3 and Autophagic Cargoes in Mammalian Cells.
Rodríguez-Arribas, M; Yakhine-Diop, S M S; González-Polo, R A; Niso-Santano, M; Fuentes, J M
2017-01-01
Macroautophagy (usually referred to as autophagy) is the most important degradation system in mammalian cells. It is responsible for the elimination of protein aggregates, organelles, and other cellular content. During autophagy, these materials (i.e., cargo) must be engulfed by a double-membrane structure called an autophagosome, which delivers the cargo to the lysosome to complete its degradation. Autophagy is a highly dynamic pathway, and its overall activity is termed autophagic flux, which encompasses all the steps from autophagosome formation to cargo degradation. There are several techniques to monitor autophagic flux. Among them, the method most used experimentally to assess autophagy is the detection of LC3 protein processing and p62 degradation by Western blotting. In this chapter, we provide a detailed and straightforward protocol for this purpose in cultured mammalian cells, including a brief set of notes concerning problems associated with the Western-blotting detection of LC3 and p62. © 2017 Elsevier Inc. All rights reserved.
Detection of Cutting Tool Wear using Statistical Analysis and Regression Model
NASA Astrophysics Data System (ADS)
Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin
2010-10-01
This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistics-based method, the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz), was used to develop a regression model and a 3D graphical presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a Colchester Master Tornado T4 CNC turning machine under dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, each with its own I-kaz 3D coefficient. This coefficient was examined and its relationship with flank wear land (VB) was determined. A regression model was developed from this relationship, and its results show that the I-kaz 3D coefficient value decreases as tool wear increases. The result is then used for real-time tool wear monitoring.
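The published I-kaz 3D coefficient has its own specific definition; as a generic stand-in for the workflow described (a kurtosis-based descriptor per force channel, regressed against flank wear), a minimal sketch:

```python
def kurtosis(x):
    """Fourth standardized moment of a signal segment (a generic
    kurtosis measure, not the exact I-kaz 3D coefficient)."""
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n
    m4 = sum((v - mean) ** 4 for v in x) / n
    return m4 / (m2 * m2) if m2 else 0.0

def linear_fit(xs, ys):
    """Ordinary least squares for y = a + b*x, returning (a, b);
    used here to relate a signal coefficient to measured wear."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return my - b * mx, b
```

In use, one would compute the coefficient for each machining pass and fit it against the corresponding VB measurements.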
X-ray ‘ghost images’ could cut radiation doses
NASA Astrophysics Data System (ADS)
Chen, Sophia
2018-03-01
On its own, a single-pixel camera captures pictures that are pretty dull: squares that are completely black, completely white, or some shade of gray in between. All it does, after all, is detect brightness. Yet by connecting a single-pixel camera to a patterned light source, a team of physicists in China has made detailed x-ray images using a statistical technique called ghost imaging, first pioneered 20 years ago in infrared and visible light. Researchers in the field say future versions of this system could take clear x-ray photographs with cheap cameras—no need for lenses and multipixel detectors—and less cancer-causing radiation than conventional techniques.
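The correlation form of computational ghost imaging can be sketched in a few lines. This toy 1-D version (illustrative only; random binary illumination patterns and a noiseless bucket detector are assumptions) recovers an object by correlating each pattern pixel with the single-pixel "bucket" signal:

```python
import random

def ghost_image(obj, n_patterns=2000, seed=0):
    """Reconstruct a 1-D object from bucket-detector measurements by
    correlating random illumination patterns with the bucket signal."""
    rng = random.Random(seed)
    n = len(obj)
    patterns, buckets = [], []
    for _ in range(n_patterns):
        p = [rng.randint(0, 1) for _ in range(n)]
        patterns.append(p)
        # bucket detector: total light transmitted through the object
        buckets.append(sum(pi * oi for pi, oi in zip(p, obj)))
    b_mean = sum(buckets) / n_patterns
    recon = []
    for x in range(n):
        p_mean = sum(p[x] for p in patterns) / n_patterns
        cov = sum((b - b_mean) * (p[x] - p_mean)
                  for b, p in zip(buckets, patterns)) / n_patterns
        recon.append(cov)  # covariance is proportional to obj[x]
    return recon
```

The reconstruction quality improves with the number of patterns, which is exactly the dose/quality trade-off the x-ray work exploits.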
Endoscopic ultrasound: Elastographic lymph node evaluation.
Dietrich, Christoph F; Jenssen, Christian; Arcidiacono, Paolo G; Cui, Xin-Wu; Giovannini, Marc; Hocke, Michael; Iglesias-Garcia, Julio; Saftoiu, Adrian; Sun, Siyu; Chiorean, Liliana
2015-01-01
Different imaging techniques provide different information that contributes to the final diagnosis and further management of patients. Even from the time of Hippocrates, palpation has been used to detect and characterize a body mass. So-called virtual palpation has now become a reality due to elastography, a recently developed technique. Elastography has already proven its added value as a complementary imaging method, helping to better characterize and differentiate between benign and malignant masses. The current applications of elastography in lymph node (LN) assessment by endoscopic ultrasonography are discussed in this paper, with a review of the literature and future perspectives.
Wavelength modulated surface enhanced (resonance) Raman scattering for background-free detection.
Praveen, Bavishna B; Steuwe, Christian; Mazilu, Michael; Dholakia, Kishan; Mahajan, Sumeet
2013-05-21
Spectra in surface-enhanced Raman scattering (SERS) are always accompanied by a continuum emission called the 'background' which complicates analysis and is especially problematic for quantification and automation. Here, we implement a wavelength modulation technique to eliminate the background in SERS and its resonant version, surface-enhanced resonance Raman scattering (SERRS). This is demonstrated on various nanostructured substrates used for SER(R)S. An enhancement in the signal to noise ratio for the Raman bands of the probe molecules is also observed. This technique helps to improve the analytical ability of SERS by alleviating the problem due to the accompanying background and thus making observations substrate independent.
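The principle, simplified: Raman bands shift with the excitation wavelength while the broad background continuum does not, so subtracting spectra recorded at two slightly different excitation wavelengths cancels the background and leaves derivative-like band features. A toy numeric illustration (synthetic spectra, not real SERS data):

```python
def synthetic_spectrum(peak_bin, n_bins=50, background=100.0, height=30.0):
    """Flat background plus a single one-bin 'Raman band'."""
    spec = [background] * n_bins
    spec[peak_bin] += height
    return spec

def difference_spectrum(spec_a, spec_b):
    """Pointwise difference of two spectra taken at slightly shifted
    excitation wavelengths; the static background cancels exactly here."""
    return [a - b for a, b in zip(spec_a, spec_b)]
```

Shifting the excitation moves the band by one bin, so the difference shows a positive/negative lobe pair at the band position and zero elsewhere.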
NASA Technical Reports Server (NTRS)
Sidery, T.; Aylott, B.; Christensen, N.; Farr, B.; Farr, W.; Feroz, F.; Gair, J.; Grover, K.; Graff, P.; Hanna, C.;
2014-01-01
The problem of reconstructing the sky position of compact binary coalescences detected via gravitational waves is a central one for future observations with the ground-based network of gravitational-wave laser interferometers, such as Advanced LIGO and Advanced Virgo. Different techniques for sky localization have been independently developed. They can be divided in two broad categories: fully coherent Bayesian techniques, which are high latency and aimed at in-depth studies of all the parameters of a source, including sky position, and "triangulation-based" techniques, which exploit the data products from the search stage of the analysis to provide an almost real-time approximation of the posterior probability density function of the sky location of a detection candidate. These techniques have previously been applied to data collected during the last science runs of gravitational-wave detectors operating in the so-called initial configuration. Here, we develop and analyze methods for assessing the self-consistency of parameter estimation methods and carrying out fair comparisons between different algorithms, addressing issues of efficiency and optimality. These methods are general, and can be applied to parameter estimation problems other than sky localization. We apply these methods to two existing sky localization techniques representing the two above-mentioned categories, using a set of simulated inspiral-only signals from compact binary systems with a total mass of 20 solar masses or less and nonspinning components. We compare the relative advantages and costs of the two techniques and show that sky location uncertainties are on average a factor of approximately 20 smaller for fully coherent techniques than for the specific variant of the triangulation-based technique used during the last science runs, at the expense of a factor of approximately 1000 longer processing time.
An EMAT-based shear horizontal (SH) wave technique for adhesive bond inspection
NASA Astrophysics Data System (ADS)
Arun, K.; Dhayalan, R.; Balasubramaniam, Krishnan; Maxfield, Bruce; Peres, Patrick; Barnoncel, David
2012-05-01
The evaluation of adhesively bonded structures has been a challenge over the several decades that these structures have been used. Applications within the aerospace industry often call for particularly high performance adhesive bonds. Several techniques have been proposed for the detection of disbonds and cohesive weakness but a reliable NDE method for detecting interfacial weakness (also sometimes called a kissing bond) has been elusive. Different techniques, including ultrasonic, thermal imaging and shearographic methods, have been proposed; all have had some degree of success. In particular, ultrasonic methods, including those based upon shear and guided waves, have been explored for the assessment of interfacial bond quality. Since 3-D guided shear horizontal (SH) waves in plates have predominantly shear displacement at the plate surfaces, we conjectured that SH guided waves should be influenced by interfacial conditions when they propagate between adhesively bonded plates of comparable thickness. This paper describes a new technique based on SH guided waves that propagate within and through a lap joint. Through mechanisms we have yet to fully understand, the propagation of an SH wave through a lap joint gives rise to a reverberation signal that is due to one or more reflections of an SH guided wave mode within that lap joint. Based upon a combination of numerical simulations and measurements, this method shows promise for detecting and classifying interfacial bonds. It is also apparent from our measurements that the SH wave modes can discriminate between adhesive and cohesive bond weakness in both Aluminum-Epoxy-Aluminum and Composite-Epoxy-Composite lap joints. All measurements reported here used periodic permanent magnet (PPM) Electro-Magnetic Acoustic Transducers (EMATs) to generate either or both of the two lowest order SH modes in the plates that comprise the lap joint. 
This exact configuration has been simulated using finite element (FE) models to describe the SH mode generation, propagation and reception. Of particular interest is that one SH guided wave mode (probably SH0) reverberates within the lap joint. Moreover, in both simulations and measurements, features of this so-called reverberation signal appear to be related to interfacial weakness between the plate (substrate) and the epoxy bond. The results of a hybrid numerical (FE) approach based on using COMSOL to calculate the driving forces within an elastic solid and ABAQUS to propagate the resulting elastic disturbances (waves) within the plates and lap joint are compared with measurements of SH wave generation and reception in lap joint specimens having different interfacial and cohesive bonding conditions.
Han, Joan C.; Elsea, Sarah H.; Pena, Heloísa B.; Pena, Sérgio Danilo Junho
2013-01-01
Detection of human microdeletion and microduplication syndromes poses a significant burden on public healthcare systems in developing countries. With genome-wide diagnostic assays frequently inaccessible, targeted low-cost PCR-based approaches are preferred. However, their reproducibility depends on equally efficient amplification using a number of target and control primers. To address this, the recently described technique called Microdeletion/Microduplication Quantitative Fluorescent PCR (MQF-PCR) was shown to reliably detect four human syndromes by quantifying DNA amplification in an internally controlled PCR reaction. Here, we confirm its utility in the detection of eight human microdeletion syndromes, including the more common WAGR, Smith-Magenis, and Potocki-Lupski syndromes, with 100% sensitivity and 100% specificity. We present the selection, design, and performance evaluation of detection primers using a variety of approaches. We conclude that MQF-PCR is an easily adaptable method for the detection of human pathological chromosomal aberrations. PMID:24288428
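The underlying readout reduces to a copy-number ratio between target and control amplicons. A schematic classifier (the tolerance and cut-offs here are illustrative assumptions, not the published thresholds):

```python
def classify_ratio(target, control, tol=0.15):
    """Classify a target/control amplification ratio: ~0.5 suggests a
    heterozygous microdeletion, ~1.0 a normal copy number, and ~1.5 a
    microduplication. `tol` is an assumed tolerance band."""
    ratio = target / control
    for label, expected in (("deletion", 0.5), ("normal", 1.0),
                            ("duplication", 1.5)):
        if abs(ratio - expected) <= tol:
            return label
    return "indeterminate"
```

In practice the ratio would come from fluorescence peak areas of co-amplified target and control products.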
Doppler imaging with dual-detection full-range frequency domain optical coherence tomography
Meemon, Panomsak; Lee, Kye-Sung; Rolland, Jannick P.
2010-01-01
Most full-range techniques for Frequency Domain Optical Coherence Tomography (FD-OCT) reported to date utilize the phase relation between consecutive axial lines to reconstruct a complex interference signal and hence may exhibit degradation in mirror image suppression performance, detectable velocity dynamic range, or both when monitoring a moving sample such as flow activity. We have previously reported a technique of mirror image removal by simultaneous detection of the quadrature components of a complex spectral interference, called Dual-Detection Frequency Domain OCT (DD-FD-OCT) [Opt. Lett. 35, 1058-1060 (2010)]. The technique enables full-range imaging without any loss of acquisition speed and is intrinsically less sensitive to phase errors generated by involuntary movements of the subject. In this paper, we demonstrate the application of DD-FD-OCT to phase-resolved Doppler imaging without the degradation in either mirror image suppression performance or detectable velocity dynamic range observed in other full-range Doppler methods. To accommodate Doppler imaging, we developed a fiber-based DD-FD-OCT that utilizes the source power more efficiently than the previous free-space DD-FD-OCT. In addition, the velocity sensitivity of phase-resolved DD-FD-OCT was investigated, and the relation between the measured Doppler phase shift and the set flow velocity of a flow phantom was verified. Finally, we demonstrate Doppler imaging using DD-FD-OCT in a biological sample. PMID:21258488
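For context, phase-resolved Doppler processing conventionally converts the measured phase shift between successive A-lines into axial flow velocity via the standard relation (a general Doppler OCT result, not specific to DD-FD-OCT):

```latex
v_z = \frac{\lambda_0 \, \Delta\phi}{4\pi n T}
```

where $\lambda_0$ is the source center wavelength, $n$ the sample refractive index, $T$ the time between successive A-lines, and $\Delta\phi$ the measured phase shift; the detectable velocity range is bounded by phase wrapping at $|\Delta\phi| = \pi$.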
Identification of PAH Isomeric Structure in Cosmic Dust Analogs: The AROMA Setup
NASA Astrophysics Data System (ADS)
Sabbah, Hassan; Bonnamy, Anthony; Papanastasiou, Dimitris; Cernicharo, Jose; Martín-Gago, Jose-Angel; Joblin, Christine
2017-07-01
We developed a new analytical experimental setup called AROMA (Astrochemistry Research of Organics with Molecular Analyzer) that combines laser desorption/ionization techniques with ion trap mass spectrometry. We report here on the ability of the apparatus to detect aromatic species in complex materials of astrophysical interest and characterize their structures. A limit of detection of 100 femtograms has been achieved using pure polycyclic aromatic hydrocarbon (PAH) samples, which corresponds to 2 × 10⁸ molecules in the case of coronene (C24H12). We detected the PAH distribution in the Murchison meteorite, which is made of a complex mixture of extraterrestrial organic compounds. In addition, collision induced dissociation experiments were performed on selected species detected in Murchison, which led to the first firm identification of pyrene and its methylated derivatives in this sample.
On splice site prediction using weight array models: a comparison of smoothing techniques
NASA Astrophysics Data System (ADS)
Taher, Leila; Meinicke, Peter; Morgenstern, Burkhard
2007-11-01
In most eukaryotic genes, protein-coding exons are separated by non-coding introns which are removed from the primary transcript by a process called "splicing". The positions where introns are cut and exons are spliced together are called "splice sites". Thus, computational prediction of splice sites is crucial for gene finding in eukaryotes. Weight array models are a powerful probabilistic approach to splice site detection. Parameters for these models are usually derived from m-tuple frequencies in trusted training data and subsequently smoothed to avoid zero probabilities. In this study we compare three different ways of parameter estimation for m-tuple frequencies, namely (a) non-smoothed probability estimation, (b) standard pseudo counts and (c) a Gaussian smoothing procedure that we recently developed.
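A minimal weight-array-model sketch with additive (pseudocount) smoothing, corresponding to option (b) above; parameter names and the DNA alphabet default are illustrative:

```python
from collections import Counter
from itertools import product

def wam_probs(sequences, m=2, alpha=1.0, alphabet="ACGT"):
    """Position-specific m-tuple probabilities for a weight array model.
    Additive (Laplace) smoothing with pseudocount weight `alpha` avoids
    zero probabilities for tuples unseen in the training data."""
    tuples = ["".join(p) for p in product(alphabet, repeat=m)]
    length = len(sequences[0])
    model = []
    for i in range(length - m + 1):
        counts = Counter(seq[i:i + m] for seq in sequences)
        total = len(sequences) + alpha * len(tuples)
        model.append({t: (counts[t] + alpha) / total for t in tuples})
    return model
```

Setting `alpha=0` recovers the non-smoothed estimate (a); the Gaussian procedure (c) would instead spread each count over similar tuples.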
A Comparison of Methods to Analyze Aquatic Heterotrophic Flagellates of Different Taxonomic Groups.
Jeuck, Alexandra; Nitsche, Frank; Wylezich, Claudia; Wirth, Olaf; Bergfeld, Tanja; Brutscher, Fabienne; Hennemann, Melanie; Monir, Shahla; Scherwaß, Anja; Troll, Nicole; Arndt, Hartmut
2017-08-01
Heterotrophic flagellates contribute significantly to the matter flux in aquatic and terrestrial ecosystems. Even today, their quantification and taxonomic classification pose several problems in field studies, though these methodological problems seem to be increasingly ignored in current ecological studies. Here we describe and test different methods: the live-counting technique, different fixation techniques, cultivation methods like the liquid aliquot method (LAM), and a molecular single-cell survey called aliquot PCR (aPCR). All these methods have been tested using either aquatic field samples or cultures of freshwater and marine taxa. Each of the described methods has its advantages and disadvantages, which have to be considered in each case. With the live-counting technique, detection of living cells to morphospecies level is possible. Fixation and staining methods are advantageous because they allow long-term storage and observation of samples. Cultivation methods (LAM) offer the possibility of subsequent molecular analyses, and aPCR tools can compensate for the inability of LAM to detect non-cultivable flagellates. In summary, we propose a combination of several investigation techniques to bridge the gaps between the different methodological approaches. Copyright © 2017 Elsevier GmbH. All rights reserved.
NanoFlares for the detection, isolation, and culture of live tumor cells from human blood.
Halo, Tiffany L; McMahon, Kaylin M; Angeloni, Nicholas L; Xu, Yilin; Wang, Wei; Chinen, Alyssa B; Malin, Dmitry; Strekalova, Elena; Cryns, Vincent L; Cheng, Chonghui; Mirkin, Chad A; Thaxton, C Shad
2014-12-02
Metastasis portends a poor prognosis for cancer patients. Primary tumor cells disseminate through the bloodstream before the appearance of detectable metastatic lesions. The analysis of cancer cells in blood—so-called circulating tumor cells (CTCs)—may provide unprecedented opportunities for metastatic risk assessment and investigation. NanoFlares are nanoconstructs that enable live-cell detection of intracellular mRNA. NanoFlares, when coupled with flow cytometry, can be used to fluorescently detect genetic markers of CTCs in the context of whole blood. They allow one to detect as few as 100 live cancer cells per mL of blood and subsequently culture those cells. This technique can also be used to detect CTCs in a murine model of metastatic breast cancer. As such, NanoFlares provide, to our knowledge, the first genetic-based approach for detecting, isolating, and characterizing live cancer cells from blood and may provide new opportunities for cancer diagnosis, prognosis, and personalized therapy.
NASA Astrophysics Data System (ADS)
Knicker, Heike
2016-04-01
In recent years, increasing evidence has been provided that the common view of charcoal as a polyaromatic network is oversimplified. Experiments with model compounds indicated that it represents a heterogeneous mixture of thermally altered biomacromolecules with N, O, and likely also S substitutions as common features. If produced from a N-rich feedstock, the so-called black nitrogen (BN) has to be considered an integral part of the aromatic charcoal network. To study this network, one-dimensional (1D) solid-state nuclear magnetic resonance (NMR) spectroscopy is often applied. However, this technique suffers from broad resonance lines and low resolution. Applying 2D techniques can help, but until recently this was unfeasible for natural organic matter (NOM) due to sensitivity problems and the high complexity of the material. On the other hand, during the last decade, the development of stronger magnetic field instruments and advanced pulse sequences has brought them within reach for NOM research. Although 2D NMR spectroscopy has many different applications, all pulse sequences are based on the introduction of a preparation time during which the magnetization of a spin system is adjusted into a state appropriate to whatever properties are to be detected in the indirect dimension. Then, the spins are allowed to evolve under the given conditions and, after their additional manipulation during a mixing period, the modulated magnetization is detected. Assembling several 1D spectra with incrementing evolution time creates a data set which is two-dimensional in time (t1, t2). Fourier transformation of both dimensions leads to a 2D contour plot correlating the interactions detected in the indirect dimension t1 with the signals detected in the direct dimension t2. So-called solid-state heteronuclear correlation (HETCOR) NMR spectroscopy is a 2D technique that allows determination of which protons interact with which carbons.
In the present work this technique was used for monitoring the chemical changes occurring during charring of biomass derived from model compounds, and in fire-affected and unaffected NOM. The 2D 13C HETCOR NMR spectrum of the fire-unaffected soils revealed that most of the carboxyl C occurs as esters or amides. Aside from cross peaks typically seen in spectra of NOM, the spectrum of the respective fire-affected counterpart shows additional signals assignable to PyOM.
White blood cell segmentation by circle detection using electromagnetism-like optimization.
Cuevas, Erik; Oliva, Diego; Díaz, Margarita; Zaldivar, Daniel; Pérez-Cisneros, Marco; Pajares, Gonzalo
2013-01-01
Medical imaging is a relevant field of application of image processing algorithms. In particular, the analysis of white blood cell (WBC) images has engaged researchers from fields of medicine and computer vision alike. Since WBCs can be approximated by a quasicircular form, a circular detector algorithm may be successfully applied. This paper presents an algorithm for the automatic detection of white blood cells embedded into complicated and cluttered smear images that considers the complete process as a circle detection problem. The approach is based on a nature-inspired technique called the electromagnetism-like optimization (EMO) algorithm which is a heuristic method that follows electromagnetism principles for solving complex optimization problems. The proposed approach uses an objective function which measures the resemblance of a candidate circle to an actual WBC. Guided by the values of such objective function, the set of encoded candidate circles are evolved by using EMO, so that they can fit into the actual blood cells contained in the edge map of the image. Experimental results from blood cell images with a varying range of complexity are included to validate the efficiency of the proposed technique regarding detection, robustness, and stability.
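The objective function can be illustrated independently of the EMO optimizer: score a candidate circle by the fraction of its sampled perimeter points that coincide with edge pixels. This is a simplified stand-in for the paper's objective, with the sample count chosen arbitrarily:

```python
import math

def circle_fitness(edge_pixels, cx, cy, r, samples=64):
    """Resemblance of a candidate circle (cx, cy, r) to an edge map:
    the fraction of points sampled on the circle perimeter that land
    on edge pixels. Higher is a better fit."""
    edges = set(edge_pixels)
    hits = 0
    for k in range(samples):
        theta = 2 * math.pi * k / samples
        x = round(cx + r * math.cos(theta))
        y = round(cy + r * math.sin(theta))
        if (x, y) in edges:
            hits += 1
    return hits / samples
```

An optimizer such as EMO would then evolve (cx, cy, r) triplets to maximize this score over the edge map of the smear image.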
Overlapping Community Detection based on Network Decomposition
NASA Astrophysics Data System (ADS)
Ding, Zhuanlian; Zhang, Xingyi; Sun, Dengdi; Luo, Bin
2016-04-01
Community detection in complex networks has become a vital step toward understanding the structure and dynamics of networks in various fields. However, traditional node clustering and more recently proposed link clustering methods have inherent drawbacks for discovering overlapping communities. Node clustering is inadequate to capture pervasive overlaps, while link clustering is often criticized for its high computational cost and ambiguous definition of communities. Overlapping community detection therefore remains a formidable challenge. In this work, we propose a new overlapping community detection algorithm based on network decomposition, called NDOCD. Specifically, NDOCD iteratively splits the network by removing all links in derived link communities, which are identified by utilizing a node clustering technique. The network decomposition contributes to reducing the computation time, and noise link elimination improves the quality of the obtained communities. Besides, we employ a node clustering technique rather than a link similarity measure to discover link communities, so NDOCD avoids an ambiguous definition of community and is less time-consuming. We test our approach on both synthetic and real-world networks. Results demonstrate the superior performance of our approach in both computation time and accuracy compared to state-of-the-art algorithms.
NASA Astrophysics Data System (ADS)
Sabol, Bruce M.
2005-09-01
There has been a longstanding need for an objective and cost-effective technique to detect, characterize, and quantify submersed aquatic vegetation at spatial scales between direct physical sampling and remote aerial-based imaging. Acoustic-based approaches for doing so are reviewed, and an explicit approach, using a narrow, single-beam echosounder, is described in detail. This heuristic algorithm is based on the spatial distribution of a thresholded signal generated from a high-frequency, narrow-beam echosounder operated in a vertical orientation from a survey boat. The physical basis, rationale, and implementation of this algorithm are described, and data documenting performance are presented. Using this technique, it is possible to generate orders of magnitude more data than would be available using previous techniques with a comparable level of effort. Thus, new analysis and interpretation approaches that can make full use of these data are called for. Several example analyses are shown from environmental-effects application studies. Current operational window and performance limitations are identified, and potential processing approaches to improve performance are discussed.
A deterministic compressive sensing model for bat biosonar.
Hague, David A; Buck, John R; Bilik, Igal
2012-12-01
The big brown bat (Eptesicus fuscus) uses frequency-modulated (FM) echolocation calls to accurately estimate range and resolve closely spaced objects in clutter and noise. These bats resolve glints spaced down to 2 μs in time delay, which surpasses what traditional signal processing techniques can achieve using the same echolocation call. The Matched Filter (MF) attains 10-12 μs resolution, while the Inverse Filter (IF) achieves higher resolution at the cost of significantly degraded detection performance. Recent work by Fontaine and Peremans [J. Acoust. Soc. Am. 125, 3052-3059 (2009)] demonstrated that a sparse representation of bat echolocation calls coupled with a decimating sensing method facilitates distinguishing closely spaced objects over realistic SNRs. Their work raises the intriguing question of whether sensing approaches structured more like a mammalian auditory system contain the necessary information for the hyper-resolution observed in behavioral tests. This research estimates sparse echo signatures using a gammatone filterbank decimation sensing method that loosely models the processing of the bat's auditory system. The decimated filterbank outputs are processed with ℓ1 minimization. Simulations demonstrate that this model maintains higher resolution than the MF and significantly better detection performance than the IF for SNRs of 5-45 dB while undersampling the return signal by a factor of six.
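The resolution gap the abstract describes, matched filtering versus sparse recovery, can be illustrated with a toy sparse-deconvolution sketch. This is not the authors' gammatone-filterbank model; it only shows how ℓ1 minimization (here a plain ISTA iteration on an invented synthetic chirp, with all pulse shapes and parameters assumed for illustration) can separate two closely spaced glints that overlap in the raw echo:

```python
import numpy as np

def ista_deconv(y, h, lam=0.05, n_iter=1000):
    """Sparse deconvolution via ISTA: minimize 0.5*||A x - y||^2 + lam*||x||_1,
    where A convolves x with the known pulse h. A sparse x localizes
    closely spaced reflectors that a matched filter smears together."""
    n = len(y)
    A = np.zeros((n, n))
    for i, hi in enumerate(h):
        A += np.diag(np.full(n - i, hi), -i)   # Toeplitz convolution operator
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(n)
    for _ in range(n_iter):
        x = x - A.T @ (A @ x - y) / L          # gradient step on the data term
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # soft threshold
    return x

# Two glints 3 samples apart, probed with a short windowed chirp (invented).
rng = np.random.default_rng(0)
k = np.arange(16)
h = np.hanning(16) * np.sin(2 * np.pi * (0.1 * k + 0.01 * k ** 2))
truth = np.zeros(64)
truth[20], truth[23] = 1.0, 0.8
y = np.convolve(truth, h)[:64] + 0.01 * rng.standard_normal(64)
x_hat = ista_deconv(y, h)
```

The recovered `x_hat` keeps large coefficients at the two true glint positions, whereas correlating `y` against `h` (the matched filter) produces a single broad lobe spanning both.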
Cragg, Jenna L.; Burger, Alan E.; Piatt, John F.
2015-01-01
Cryptic nest sites and secretive breeding behavior make population estimates and monitoring of Marbled Murrelets Brachyramphus marmoratus difficult and expensive. Standard audio-visual and radar protocols have been refined but require intensive field time by trained personnel. We examined the detection range of automated sound recorders (Song Meters; Wildlife Acoustics Inc.) and the reliability of automated recognition models (“recognizers”) for identifying and quantifying Marbled Murrelet vocalizations during the 2011 and 2012 breeding seasons at Kodiak Island, Alaska. The detection range of murrelet calls by Song Meters was estimated to be 60 m. Recognizers detected 20 632 murrelet calls (keer and keheer) from a sample of 268 h of recordings, yielding 5 870 call series, which compared favorably with human scanning of spectrograms (on average detecting 95% of the number of call series identified by a human observer, but not necessarily the same call series). The false-negative rate (percentage of murrelet call series that the recognizers failed to detect) was 32%, mainly involving weak calls and short call series. False positives (other sounds included by recognizers as murrelet calls) were primarily due to complex songs of other bird species, wind, and rain. False positives were lowest in forest nesting habitat (48%) and highest in shrubby vegetation where calls of other birds were common (97%–99%). Acoustic recorders tracked spatial and seasonal trends in vocal activity, with higher call detections in high-quality forested habitat and during late July/early August. Automated acoustic monitoring of Marbled Murrelet calls could provide cost-effective, valuable information for assessing habitat use and temporal and spatial trends in nesting activity; reliability depends on careful placement of sensors to minimize false positives and on prudent application of digital recognizers with visual checking of spectrograms.
Charged-particle emission tomography
NASA Astrophysics Data System (ADS)
Ding, Yijun
Conventional charged-particle imaging techniques, such as autoradiography, provide only two-dimensional (2D) images of thin tissue slices. To get volumetric information, images of multiple thin slices are stacked. This process is time-consuming and prone to distortions, as registration of the 2D images is required. We propose a direct three-dimensional (3D) autoradiography technique, which we call charged-particle emission tomography (CPET). This 3D imaging technique enables imaging of thick sections, thus increasing laboratory throughput and eliminating distortions due to registration. In CPET, molecules or cells of interest are labeled so that they emit charged particles without significant alteration of their biological function. Therefore, by imaging the source of the charged particles, one can gain information about the distribution of the molecules or cells of interest. Two special cases of CPET are beta emission tomography (BET) and alpha emission tomography (alphaET), where the charged particles employed are fast electrons and alpha particles, respectively. A crucial component of CPET is the charged-particle detector. Conventional charged-particle detectors are sensitive only to the 2D positions of the detected particles. We propose a new detector concept, which we call the particle-processing detector (PPD). A PPD measures attributes of each detected particle, including location, direction of propagation, and/or the energy deposited in the detector. Reconstruction algorithms for CPET are developed, and reconstruction results from simulated data are presented for both BET and alphaET. The results show that, in addition to position, direction and energy provide valuable information for 3D reconstruction in CPET. Several designs of particle-processing detectors are described. Experimental results for one detector are discussed.
With appropriate detector design and careful data analysis, it is possible to measure direction and energy, as well as position of each detected particle. The null functions of CPET with PPDs that measure different combinations of attributes are calculated through singular-value decomposition. In general, the more particle attributes are measured from each detection event, the smaller the null space of CPET is. In other words, the higher dimension the data space is, the more information about an object can be recovered from CPET.
Visualization techniques for computer network defense
NASA Astrophysics Data System (ADS)
Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew
2011-06-01
Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND comprises multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state of the practice in situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activity may be more easily identified and investigated.
A surface plasmon resonance based biochip for the detection of patulin toxin
NASA Astrophysics Data System (ADS)
Pennacchio, Anna; Ruggiero, Giuseppe; Staiano, Maria; Piccialli, Gennaro; Oliviero, Giorgia; Lewkowicz, Aneta; Synak, Anna; Bojarski, Piotr; D'Auria, Sabato
2014-08-01
Patulin is a toxic secondary metabolite of a number of fungal species belonging to the genera Penicillium and Aspergillus. One important aspect of patulin toxicity in vivo is injury to the gastrointestinal tract, including ulceration and inflammation of the stomach and intestine. Recently, patulin has been shown to be genotoxic by causing oxidative damage to DNA, and oxidative DNA base modifications are considered to play a role in mutagenesis and cancer initiation. Conventional analytical methods for patulin detection involve chromatographic analyses, such as HPLC and GC, and, more recently, techniques such as LC/MS and GC/MS. All of these methods require extensive protocols and expensive analytical instrumentation. In this work, the conjugation of a new derivative of patulin to bovine serum albumin for the production of polyclonal antibodies is described, and an innovative competitive immunoassay for the detection of patulin is presented. An important part of the detection method is based on the optical technique called surface plasmon resonance (SPR). Laser-beam-induced interactions between probe and target molecules in the vicinity of the gold surface of the biochip shift the resonance conditions and consequently produce a slight but easily detectable change in reflectivity.
2017-01-01
Singular perturbation theory is an advantageous framework for systems characterized by a two-time-scale separation, such as the longitudinal dynamics of aircraft, whose modes are called the phugoid and the short period. In this work, the combination of the NonLinear Geometric Approach and Singular Perturbations leads to an innovative Fault Detection and Isolation system dedicated to the isolation of faults affecting the air data system of a general aviation aircraft. The isolation capabilities obtained by means of the approach proposed in this work allow for the solution of a fault isolation problem otherwise not solvable by means of standard geometric techniques. Extensive Monte-Carlo simulations, exploiting a high-fidelity aircraft simulator, show the effectiveness of the proposed Fault Detection and Isolation system. PMID:28946673
Episodic inflation events at Akutan Volcano, Alaska, during 2005-2017
NASA Astrophysics Data System (ADS)
Ji, Kang Hyeun; Yun, Sang-Ho; Rim, Hyoungrea
2017-08-01
Detection of weak volcano deformation helps constrain characteristics of eruption cycles. We have developed a signal detection technique, called the Targeted Projection Operator (TPO), to monitor surface deformation with Global Positioning System (GPS) data. We applied the TPO to GPS data collected at Akutan Volcano from June 2005 to March 2017 and detected four inflation events, occurring in 2008, 2011, 2014, and 2016, with inflation rates of about 8-22 mm/yr above the background trend at a near-source site, AV13. Numerical modeling suggests that the events are driven by closely spaced sources, or a single source, in a shallow magma chamber at a depth of about 4 km. The inflation events suggest that magma has episodically accumulated in the shallow magma chamber.
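The abstract names the Targeted Projection Operator without giving its mathematics. As a hedged illustration of the general idea, projecting multi-station GPS time series onto a fixed target deformation pattern so that matching transients stand out in a single scalar series, one might sketch the following (the station pattern, noise level, and event onset are entirely invented):

```python
import numpy as np

def project_onto_target(displacements, target):
    """Project multi-station displacement time series (epochs x stations)
    onto a unit target pattern; excursions of the resulting scalar series
    flag deformation that matches the target source geometry."""
    t = np.asarray(target, float)
    t = t / np.linalg.norm(t)
    return np.asarray(displacements, float) @ t

# Synthetic example: 5 stations, 200 daily epochs; an inflation transient
# sharing the target's spatial pattern starts at epoch 120.
rng = np.random.default_rng(1)
pattern = np.array([1.0, 0.8, 0.5, 0.3, 0.2])   # hypothetical source response
noise = 0.5 * rng.standard_normal((200, 5))
signal = np.outer(np.clip(np.arange(200) - 120, 0, None) * 0.05, pattern)
series = project_onto_target(noise + signal, pattern)
```

Because the projection matches the assumed source geometry, uncorrelated station noise averages down while the common-mode transient accumulates in `series`.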
High-speed reference-beam-angle control technique for holographic memory drive
NASA Astrophysics Data System (ADS)
Yamada, Ken-ichiro; Ogata, Takeshi; Hosaka, Makoto; Fujita, Koji; Okuyama, Atsushi
2016-09-01
We developed a holographic memory drive for next-generation optical memory. In this study, we present the key technology for achieving a high-speed transfer rate in reproduction: a high-speed control technique for the reference beam angle. During reproduction in a holographic memory drive, the optimum reference beam angle varies owing to distortion of the medium, caused by, for example, temperature variation, beam irradiation, and moisture absorption. Therefore, a reference-beam-angle control technique that positions the reference beam at the optimum angle is crucial. We developed a new optical system that generates an angle-error signal to detect the optimum reference beam angle. To achieve high-speed control with the new optical system, we developed a control technique called adaptive final-state control (AFSC), which adds a second control input to the first one derived from conventional final-state control (FSC) at the time of angle-error-signal detection. We built an experimental system employing AFSC that achieves movement between pages (Page Seek) within 300 µs. In sequential multiple Page Seeks, we were able to position the reference beam at the optimum angles that maximize the diffracted beam intensity. We expect that applying the new control technique to the holographic memory drive will enable a gigabit/s-class transfer rate.
Episodic Upwelling of Zooplankton within a Bowhead Whale Feeding Area Near Barrow, AK
2011-09-30
the Beaufort year-round. Bowhead whales vocalize using both calls and songs. There was distinct seasonal variability in the detection of the...different species' calls/songs. Calls/songs from whale species were detected in fall and declined as ice concentration in the mooring vicinity increased...(Figs. 4 & 5). In the spring, however, whale calls/songs were detected beginning in April when the region was still covered with ice, and continued
Evaluation of the efficacy of a portable LIBS system for detection of CWA on surfaces.
L'Hermite, D; Vors, E; Vercouter, T; Moutiers, G
2016-05-01
Laser-induced breakdown spectroscopy (LIBS) is a laser-based optical technique particularly suited to in situ surface analysis. A portable LIBS instrument was tested for the detection of surface chemical contamination by chemical warfare agents (CWAs). Detection tests were carried out in a toxlab facility with four CWAs, sarin (GB), lewisite (L1), mustard gas (HD), and VX, which were deposited on different substrates: wood, concrete, military green paint, gloves, and ceramic. The CWAs were detected by means of atomic markers (As, P, F, Cl, and S). The LIBS instrument can give a direct detection response thanks to an integrated interface for non-expert users, or so-called end-users. We evaluated the capability of automatic detection of the selected CWAs. The sensitivity of our portable LIBS instrument was confirmed for the detection of a CWA at surface concentrations above 15 μg/cm². The simultaneous detection of two markers may lead to a decrease in the number of false positives.
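The closing point, that requiring two atomic markers simultaneously suppresses false positives, can be sketched as a simple decision rule. The marker pairings below follow the elements the abstract lists (As, P, F, Cl, S) and basic agent chemistry, but the thresholds are placeholders, not the instrument's calibrated values:

```python
# Illustrative marker assignments; thresholds and intensity units are invented.
MARKERS = {"GB": ["P", "F"], "L1": ["As", "Cl"], "HD": ["S", "Cl"], "VX": ["P", "S"]}

def detect_cwa(line_intensities, thresholds, agent):
    """Flag `agent` only when every one of its atomic emission markers
    exceeds its threshold, reducing single-marker false positives."""
    return all(line_intensities.get(m, 0.0) >= thresholds[m]
               for m in MARKERS[agent])
```

A single strong phosphorus line (e.g., from a fertilizer residue) would no longer trigger a VX alarm on its own, since the sulfur marker must fire as well.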
Leakey, Tatiana I; Zielinski, Jerzy; Siegfried, Rachel N; Siegel, Eric R; Fan, Chun-Yang; Cooney, Craig A
2008-06-01
DNA methylation at cytosines is a widely studied epigenetic modification. Methylation is commonly detected using bisulfite modification of DNA followed by PCR and additional techniques such as restriction digestion or sequencing. These additional techniques are either laborious, require specialized equipment, or are not quantitative. Here we describe a simple algorithm that yields quantitative results from analysis of conventional four-dye-trace sequencing. We call this method Mquant and we compare it with the established laboratory method of combined bisulfite restriction assay (COBRA). This analysis of sequencing electropherograms provides a simple, easily applied method to quantify DNA methylation at specific CpG sites.
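The peak-ratio idea behind quantifying methylation from a four-dye electropherogram can be sketched as follows. This is the generic C/(C+T) ratio after bisulfite conversion, not necessarily the exact normalization the published Mquant procedure applies:

```python
def methylation_fraction(c_peak, t_peak):
    """Estimate methylation at one CpG from bisulfite-sequencing peak
    heights: unmethylated C reads as T after conversion while methylated
    C stays C, so the C/(C+T) peak-height ratio estimates the
    methylated fraction at that position."""
    total = c_peak + t_peak
    if total == 0:
        raise ValueError("no signal at this CpG position")
    return c_peak / total
```

For example, a C peak of 300 units over a T peak of 100 units at the same position would be read as roughly 75% methylation.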
Identification of PAH Isomeric Structure in Cosmic Dust Analogs: The AROMA Setup
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sabbah, Hassan; Bonnamy, Anthony; Joblin, Christine
We developed a new analytical experimental setup called AROMA (Astrochemistry Research of Organics with Molecular Analyzer) that combines laser desorption/ionization techniques with ion trap mass spectrometry. We report here on the ability of the apparatus to detect aromatic species in complex materials of astrophysical interest and characterize their structures. A limit of detection of 100 femtograms has been achieved using pure polycyclic aromatic hydrocarbon (PAH) samples, which corresponds to 2 × 10⁸ molecules in the case of coronene (C₂₄H₁₂). We detected the PAH distribution in the Murchison meteorite, which is made of a complex mixture of extraterrestrial organic compounds. In addition, collision-induced dissociation experiments were performed on selected species detected in Murchison, which led to the first firm identification of pyrene and its methylated derivatives in this sample.
The dawn of the liquid biopsy in the fight against cancer
Domínguez-Vigil, Irma G.; Moreno-Martínez, Ana K.; Wang, Julia Y.; Roehrl, Michael H.A.; Barrera-Saldaña, Hugo A.
2018-01-01
Cancer is a molecular disease associated with alterations in the genome, which, thanks to the highly improved sensitivity of mutation detection techniques, can be identified in cell-free DNA (cfDNA) circulating in blood, a method also called liquid biopsy. This is a non-invasive alternative to surgical biopsy and has the potential of revealing the molecular signature of tumors to aid in the individualization of treatments. In this review, we focus on cfDNA analysis, its advantages, and clinical applications employing genomic tools (NGS and dPCR) particularly in the field of oncology, and highlight its valuable contributions to early detection, prognosis, and prediction of treatment response. PMID:29416824
Calculation of Cumulative Distributions and Detection Probabilities in Communications and Optics.
1986-03-31
result, Figure 3.1 shows the additional SNR required (often called the CFAR loss) for the MLD, CMLD, and OSD in a multiple-target environment to...Notice that although the CFAR loss increases with INR for the MLD, the CMLD and OSD have a bounded loss as the INR → ∞. These results have been more...false-alarm rate (CFAR) when the background noise level is unknown. In Section 2 we described the application of saddlepoint integration techniques to
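The fragment refers to ordered-statistic detectors (OSD) retaining a bounded CFAR loss in multiple-target environments. A minimal ordered-statistic CFAR sketch, with window and threshold parameters chosen purely for illustration, shows why ranking the reference cells keeps the estimate robust when interfering targets sit in the window:

```python
import numpy as np

def os_cfar(x, guard=2, ref=8, k=6, scale=3.0):
    """Ordered-statistic CFAR: estimate the background from the k-th
    smallest of the reference cells around each cell under test, so a
    few strong interfering targets in the window cannot inflate the
    threshold the way a cell-averaging estimate would."""
    n = len(x)
    hits = []
    for i in range(n):
        cells = []
        for j in range(i - guard - ref, i + guard + ref + 1):
            if 0 <= j < n and abs(j - i) > guard:
                cells.append(x[j])
        if len(cells) >= k and x[i] > scale * np.sort(cells)[k - 1]:
            hits.append(i)
    return hits

# Two closely spaced targets on a flat unit background (synthetic).
x = np.ones(100)
x[30] = 20.0
x[34] = 20.0
hits = os_cfar(x)
```

Even though each target lies inside the other's reference window, the k-th order statistic still reflects the unit background, so both are declared detections.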
2013-04-11
in the top monomolecular layer of a blend film using mass spectrometry. This technique we call Surface Layer-Matrix Assisted Laser Desorption...C., Foster, M.D. "Probing Surface Concentration of Cyclic/linear Blend Films Using Surface MALDI-TOF Mass Spectrometry," Dept. of Polymer Science...Isotopically Labeled Species in a Polymer Blend Using Tip Enhanced Raman Spectroscopy, ACS Macro Letters (11 2012)
Prevention of bacterial foodborne disease using nanobiotechnology.
Billington, Craig; Hudson, J Andrew; D'Sa, Elaine
2014-01-01
Foodborne disease is an important source of expense, morbidity, and mortality for society. Detection and control constitute significant components of the overall management of foodborne bacterial pathogens, and this review focuses on the use of nanosized biological entities and molecules to achieve these goals. There is an emphasis on the use of organisms called bacteriophages (phages: viruses that infect bacteria), which are increasingly being used in pathogen detection and biocontrol applications. Detection of pathogens in foods by conventional techniques is time-consuming and expensive, although it can also be sensitive and accurate. Nanobiotechnology is being used to decrease detection times and cost through the development of biosensors, exploiting the specific cell-recognition properties of antibodies and phage proteins. Although sensitivity per test can be excellent (e.g., the detection of one cell), the very small volumes tested mean that sensitivity per sample is less compelling. An ideal detection method needs to be inexpensive, sensitive, and accurate, but no approach yet achieves all three. For nanobiotechnology to displace existing methods (culture-based, antibody-based rapid methods, or those that detect amplified nucleic acid), it will need to focus on improving sensitivity. Although manufactured nonbiological nanoparticles have been used to kill bacterial cells, nanosized organisms called phages are increasingly finding favor in food safety applications. Phages are amenable to protein and nucleic acid labeling, can be very specific, and the typically large "burst size" resulting from phage amplification can be harnessed to produce a rapid increase in signal to facilitate detection. There are now several commercially available phages for pathogen control, and many reports in the literature demonstrate efficacy against a number of foodborne pathogens on diverse foods. As a method for control of pathogens, nanobiotechnology is therefore flourishing.
Bao, Riyue; Hernandez, Kyle; Huang, Lei; Kang, Wenjun; Bartom, Elizabeth; Onel, Kenan; Volchenboum, Samuel; Andrade, Jorge
2015-01-01
Whole exome sequencing has facilitated the discovery of causal genetic variants associated with human diseases at deep coverage and low cost. In particular, the detection of somatic mutations from tumor/normal pairs has provided insights into the cancer genome. Although there is an abundance of publicly available software for the detection of germline and somatic variants, concordance is generally limited among variant callers and alignment algorithms. Successful integration of variants detected by multiple methods requires in-depth knowledge of the software, access to high-performance computing resources, and advanced programming techniques. We present ExScalibur, a set of fully automated, highly scalable, and modular pipelines for whole exome data analysis. The suite integrates multiple alignment and variant calling algorithms for the accurate detection of germline and somatic mutations with close to 99% sensitivity and specificity. ExScalibur implements streamlined execution of analytical modules, real-time monitoring of pipeline progress, robust handling of errors, and intuitive documentation that allows for increased reproducibility and sharing of results and workflows. It runs on local computers, high-performance computing clusters, and cloud environments. In addition, we provide a data analysis report utility to facilitate visualization of the results that offers interactive exploration of quality control files, read alignments, and variant calls, assisting downstream customization of potential disease-causing mutations. ExScalibur is open-source and is also available as a public image on the Amazon cloud.
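The integration of variants detected by multiple callers can be illustrated with the generic "at least k of n callsets" consensus idea; ExScalibur's actual merging logic is not spelled out in the abstract, so this is only the common pattern:

```python
from collections import Counter

def consensus_calls(callsets, min_support=2):
    """Integrate variant calls from several caller/aligner combinations:
    keep a variant only if at least `min_support` callsets report it.
    Variants are represented as (chrom, pos, ref, alt) tuples."""
    counts = Counter(v for cs in callsets for v in set(cs))
    return {v for v, c in counts.items() if c >= min_support}

# Three hypothetical callsets with limited concordance.
calls_a = {("1", 100, "A", "T"), ("1", 200, "G", "C")}
calls_b = {("1", 100, "A", "T")}
calls_c = {("1", 100, "A", "T"), ("2", 50, "T", "G")}
confident = consensus_calls([calls_a, calls_b, calls_c])
```

Raising `min_support` trades sensitivity for specificity: singleton calls unique to one caller are discarded, while concordant calls survive.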
Using Puppets to Teach Schoolchildren to Detect Stroke and Call 911
ERIC Educational Resources Information Center
Sharkey, Sonya; Denke, Linda; Herbert, Morley A.
2016-01-01
To overcome barriers to improved outcomes, we undertook an intervention to teach schoolchildren how to detect a stroke and call emergency medical services (EMS). We obtained permission from parents and guardians to use an 8-min puppet show to instruct the fourth, fifth, and sixth graders about stroke detection, symptomatology, and calling EMS. A…
Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.
Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades
2015-01-01
DNA gel electrophoresis is a molecular biology technique for separating DNA fragments of different sizes. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error-prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of the migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image-processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands, and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILPs) at one locus of sugarcane. These gel images presented many challenges for automated lane/band segmentation, including lane distortion, band deformity, a high degree of background noise, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and the DNA bands contained within them are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into a non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists.
This work presents an automated genotyping tool from DNA gel electrophoresis images, called GELect, which was written in Java and made available through the imageJ framework. With a novel automated image processing workflow, the tool can accurately segment lanes from a gel matrix, intelligently extract distorted and even doublet bands that are difficult to identify by existing image processing tools. Consequently, genotyping from DNA gel electrophoresis can be performed automatically allowing users to efficiently conduct large scale DNA fingerprinting via DNA gel electrophoresis. The software is freely available from http://www.biotec.or.th/gi/tools/gelect.
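The first workflow step, lane segmentation, can be sketched in its simplest form: project the gel image onto its columns and split the intensity profile into contiguous spans. GELect's real pipeline additionally handles lane distortion, background noise, and doublets, which this toy version (with an invented threshold rule) does not:

```python
import numpy as np

def segment_lanes(image, threshold_frac=0.5):
    """Locate lanes from the column-wise intensity profile of a gel
    image: columns whose summed intensity exceeds a fraction of the
    profile maximum are grouped into contiguous lane spans."""
    profile = np.asarray(image, float).sum(axis=0)
    mask = profile > threshold_frac * profile.max()
    lanes, start = [], None
    for i, on in enumerate(mask):
        if on and start is None:
            start = i                       # lane begins
        elif not on and start is not None:
            lanes.append((start, i - 1))    # lane ends
            start = None
    if start is not None:
        lanes.append((start, len(mask) - 1))
    return lanes

# Tiny synthetic "gel": two bright lane stripes on a dark background.
img = np.zeros((10, 12))
img[:, 2:4] = 1.0
img[:, 7:10] = 1.0
lanes = segment_lanes(img)
```

Each returned tuple gives the inclusive column range of one lane; band extraction would then proceed row-wise within each span.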
Gravitational Wave Detection of Compact Binaries Through Multivariate Analysis
NASA Astrophysics Data System (ADS)
Atallah, Dany Victor; Dorrington, Iain; Sutton, Patrick
2017-01-01
The first detection of gravitational waves (GW), GW150914, produced by a binary black hole merger, has ushered in the era of GW astronomy. The detection technique used to find GW150914 considered only a fraction of the information available describing the candidate event: mainly the detector signal-to-noise ratios and chi-squared values. In hopes of greatly increasing detection rates, we want to take advantage of all the information available about candidate events. We employ a technique called Multivariate Analysis (MVA) to improve LIGO sensitivity to GW signals. MVA techniques are efficient ways to scan high-dimensional data spaces for signal/noise classification. Our goal is to use MVA to classify compact-object binary coalescence (CBC) events composed of any combination of black holes and neutron stars. CBC waveforms are modeled through numerical relativity. Templates of the modeled waveforms are used to search for CBCs and quantify candidate events. Different MVA pipelines are under investigation for CBC signals and un-modelled signals, with promising results. One such MVA pipeline, used for the un-modelled search, can theoretically analyze far more data than the MVA pipelines currently explored for CBCs, potentially making a more powerful classifier. In principle, this extra information could improve the sensitivity to GW signals. We will present the results from our efforts to adapt an MVA pipeline used in the un-modelled search to classify candidate events from the CBC search.
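A minimal stand-in for the MVA stage: rather than thresholding SNR alone, combine several candidate-event features into a single signal-versus-noise score. The sketch below uses plain logistic regression on synthetic (SNR, chi-squared) features; the pipelines the abstract describes use richer classifiers and many more features, and all numbers here are invented:

```python
import numpy as np

def train_logistic(X, y, lr=0.5, epochs=2000):
    """Fit a logistic-regression score on candidate-event feature vectors
    via batch gradient descent on the log loss."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)       # gradient of the log loss
    return w

def classify(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5).astype(int)

# Synthetic candidates: signals have high SNR and low chi-squared.
rng = np.random.default_rng(2)
X = np.column_stack([
    np.concatenate([rng.normal(12, 2, 200), rng.normal(6, 2, 200)]),   # SNR
    np.concatenate([rng.normal(1, 0.3, 200), rng.normal(3, 0.6, 200)]) # chi^2
])
y = np.concatenate([np.ones(200), np.zeros(200)])
X = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize features
w = train_logistic(X, y)
acc = (classify(w, X) == y).mean()
```

The point of the exercise is that the joint decision boundary in (SNR, chi-squared) space separates the classes better than a cut on either feature alone.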
A cross-correlation search for intermediate-duration gravitational waves from GRB magnetars
NASA Astrophysics Data System (ADS)
Coyne, Robert
2015-04-01
Since the discovery of the afterglow in 1997, the progress made in our understanding of gamma-ray bursts (GRBs) has been spectacular. Yet a direct proof of GRB progenitors is still missing. In the last few years, evidence for a long-lived and sustained central engine in GRBs has mounted. This has called attention to the so-called millisecond-magnetar model, which proposes that a highly magnetized, rapidly-rotating neutron star may exist at the heart of some of these events. The advent of advanced gravitational wave detectors such as LIGO and Virgo may enable us to probe directly, for the first time, the nature of GRB progenitors and their byproducts. In this context, we describe a novel application of a generalized cross-correlation technique optimized for the detection of long-duration gravitational wave signals that may be associated with bar-like deformations of GRB magnetars. The detection of these signals would allow us to answer some of the most intriguing questions on the nature of GRB progenitors, and serve as a starting point for a new class of intermediate-duration gravitational wave searches.
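The core of a cross-correlation search can be sketched as follows: correlate two detector streams over a range of lags and look for a peak at the relative delay of a signal common to both. This is a generic illustration, not the authors' optimized statistic, and all waveforms and noise levels below are invented:

```python
import numpy as np

def cross_correlate(a, b, max_lag):
    """Normalized cross-correlation of two detector streams over a lag
    range; a waveform common to both produces a peak at its relative
    delay even when it is invisible in either stream alone."""
    a = a - a.mean()
    b = b - b.mean()
    norm = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    lags = np.arange(-max_lag, max_lag + 1)
    r = np.empty(len(lags))
    for idx, L in enumerate(lags):
        if L >= 0:
            r[idx] = (a[L:] * b[:len(b) - L]).sum() / norm
        else:
            r[idx] = (a[:L] * b[-L:]).sum() / norm
    return lags, r

# A common burst, delayed by 5 samples in stream `a`, plus independent noise.
rng = np.random.default_rng(3)
s = np.zeros(256)
s[50:100] = np.sin(0.7 * np.arange(50))
a = np.roll(s, 5) + 0.1 * rng.standard_normal(256)
b = s + 0.1 * rng.standard_normal(256)
lags, r = cross_correlate(a, b, max_lag=10)
best_lag = lags[np.argmax(r)]
```

The independent noise in the two streams decorrelates across lags, while the shared waveform adds coherently at its true delay.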
A confusing world: what to call histology of three-dimensional tumour margins?
Moehrle, M; Breuninger, H; Röcken, M
2007-05-01
Complete three-dimensional histology of excised skin tumour margins has a long tradition and, unfortunately, a multitude of names as well. Mohs, who introduced it, called it 'microscopically controlled surgery'. Others have described it as 'micrographic surgery', 'Mohs' micrographic surgery', or simply 'Mohs' surgery'. Semantic confusion became truly rampant when variant forms, each useful in its own way for detecting subclinical outgrowths of malignant skin tumours, were later introduced under such names as histographic surgery, systematic histologic control of the tumour bed, histological control of excised tissue margins, the square procedure, the perimeter technique, etc. All of these methods are basically identical in concept. All involve complete, three-dimensional histological visualization and evaluation of excision margins. Their common goal is to detect unseen tumour outgrowths. For greater clarity, the authors of this paper recommend general adoption of '3D histology' as a collective designation for all the above methods. As an added advantage, 3D histology can also be used in other medical disciplines to confirm true R0 resection of, for example, breast cancer or intestinal cancer.
Crowe, D.E.; Longshore, K.M.
2010-01-01
We estimated relative abundance and density of Western Burrowing Owls (Athene cunicularia hypugaea) at two sites in the Mojave Desert (2003–2004). We made modifications to previously established Burrowing Owl survey techniques for use in desert shrublands and evaluated several factors that might influence the detection of owls. We tested the effectiveness of the call-broadcast technique for surveying this species, the efficiency of this technique at early and late breeding stages, and the effectiveness of various numbers of vocalization intervals during broadcasting sessions. Only 1 (3%) of 31 initial (new) owl responses was detected during passive-listening sessions. We found that surveying early in the nesting season was more likely to produce new owl detections compared to surveying later in the nesting season. New owls detected during each of the three vocalization intervals (each consisting of 30 sec of vocalizations followed by 30 sec of silence) of our broadcasting session were similar (37%, 40%, and 23%; n = 30). We used a combination of detection trials (sighting probability) and the double-observer method to estimate the components of detection probability, i.e., availability and perception. Availability for all sites and years, as determined by detection trials, ranged from 46.1–58.2%. Relative abundance, measured as frequency of occurrence and defined as the proportion of surveys with at least one owl, ranged from 19.2–32.0% for both sites and years. Density at our eastern Mojave Desert site was estimated at 0.09 ± 0.01 (SE) owl territories/km² and 0.16 ± 0.02 (SE) owl territories/km² during 2003 and 2004, respectively. In our southern Mojave Desert site, density estimates were 0.09 ± 0.02 (SE) owl territories/km² and 0.08 ± 0.02 (SE) owl territories/km² during 2004 and 2005, respectively. © 2010 The Raptor Research Foundation, Inc.
Thode, Aaron M; Kim, Katherine H; Blackwell, Susanna B; Greene, Charles R; Nations, Christopher S; McDonald, Trent L; Macrander, A Michael
2012-05-01
An automated procedure has been developed for detecting and localizing frequency-modulated bowhead whale sounds in the presence of seismic airgun surveys. The procedure was applied to four years of data, collected from over 30 directional autonomous recording packages deployed over a 280 km span of continental shelf in the Alaskan Beaufort Sea. The procedure has six sequential stages that begin by extracting 25-element feature vectors from spectrograms of potential call candidates. Two cascaded neural networks then classify some feature vectors as bowhead calls, and the procedure then matches calls between recorders to triangulate locations. To train the networks, manual analysts flagged 219 471 bowhead call examples from 2008 and 2009. Manual analyses were also used to identify 1.17 million transient signals that were not whale calls. The network output thresholds were adjusted to reject 20% of whale calls in the training data. Validation runs using 2007 and 2010 data found that the procedure missed 30%-40% of manually detected calls. Furthermore, 20%-40% of the sounds flagged as calls are not present in the manual analyses; however, these extra detections incorporate legitimate whale calls overlooked by human analysts. Both manual and automated methods produce similar spatial and temporal call distributions.
Kopsinis, Yannis; Aboutanios, Elias; Waters, Dean A; McLaughlin, Steve
2010-02-01
In this paper, techniques for time-frequency analysis and investigation of bat echolocation calls are studied. Particularly, enhanced resolution techniques are developed and/or used in this specific context for the first time. When compared to traditional time-frequency representation methods, the proposed techniques are more capable of showing previously unseen features in the structure of bat echolocation calls. It should be emphasized that although the study is focused on bat echolocation recordings, the results are more general and applicable to many other types of signal.
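A conventional spectrogram of the kind these enhanced-resolution methods improve upon can be computed with a plain short-time Fourier transform. The chirp parameters below are invented stand-ins for a bat call:

```python
import numpy as np

# Baseline short-time Fourier transform (STFT) spectrogram of a synthetic
# downward chirp standing in for a bat echolocation call (invented parameters).
fs = 250_000                       # 250 kHz sample rate, typical for ultrasonic work
n = 1250                           # 5 ms of signal
t = np.arange(n) / fs
f0, f1 = 80_000, 40_000            # linear sweep from 80 kHz down to 40 kHz
x = np.sin(2 * np.pi * (f0 + (f1 - f0) * t / (2 * t[-1])) * t)

win, hop = 256, 64                 # window and hop sizes in samples
frames = np.array([x[i:i + win] * np.hanning(win)
                   for i in range(0, n - win, hop)])
S = np.abs(np.fft.rfft(frames, axis=1))   # time x frequency magnitude matrix
print(S.shape)
```

The fixed window length is exactly the time-frequency resolution trade-off that motivates the enhanced-resolution techniques studied in the paper.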
NASA Astrophysics Data System (ADS)
Liu, Bin; Gang, Tie; Wan, Chuhao; Wang, Changxi; Luo, Zhiwei
2015-07-01
The vibro-acoustic modulation technique is a nonlinear ultrasonic method in nondestructive testing. It detects defects by monitoring the modulation components generated by the interaction between a low-frequency vibration and an ultrasound wave, a consequence of the nonlinear material behaviour caused by damage. In this work, a swept-frequency signal was used as the high-frequency excitation. Hilbert-transform-based amplitude and phase demodulation and synchronous demodulation (SD) were then used to extract the modulation information from the received signal, and the results were plotted in the time-frequency domain after a short-time Fourier transform. The two demodulation results differed substantially, and the reason for the difference was investigated by analysing the demodulation process of each method. The analysis and a subsequent verification test indicated that the SD method was better suited to this test, and a new index called MISD was defined to evaluate structural quality in vibro-acoustic modulation testing with swept probing excitation.
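The Hilbert-transform amplitude and phase demodulation mentioned above can be sketched with an FFT-based analytic signal. The toy amplitude-modulated carrier below stands in for a real vibro-acoustic measurement:

```python
import numpy as np

# Amplitude/phase demodulation via the analytic signal, built with NumPy's FFT
# (equivalent to the Hilbert-transform construction). Test signal is a toy
# amplitude-modulated carrier, not vibro-acoustic data.

def analytic_signal(x):
    """Analytic signal obtained by zeroing negative frequencies in the DFT."""
    X = np.fft.fft(x)
    h = np.zeros(len(x))
    h[0] = 1.0
    h[1:(len(x) + 1) // 2] = 2.0
    if len(x) % 2 == 0:
        h[len(x) // 2] = 1.0          # Nyquist bin for even-length signals
    return np.fft.ifft(X * h)

fs, fc, fm = 10_000, 1_000, 50        # sample, carrier, modulation rates (Hz)
t = np.arange(2_000) / fs             # 0.2 s: whole cycles of both tones
x = (1 + 0.5 * np.sin(2 * np.pi * fm * t)) * np.sin(2 * np.pi * fc * t)

z = analytic_signal(x)
envelope = np.abs(z)                  # amplitude demodulation
phase = np.unwrap(np.angle(z))        # phase demodulation
```

Because the toy signal contains whole cycles of both tones, the recovered envelope matches the 50 Hz modulation to machine precision; real swept-excitation data would show the edge effects the paper's comparison is concerned with.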
Assessing the use of multiple sources in student essays.
Hastings, Peter; Hughes, Simon; Magliano, Joseph P; Goldman, Susan R; Lawless, Kimberly
2012-09-01
The present study explored different approaches for automatically scoring student essays that were written on the basis of multiple texts. Specifically, these approaches were developed to classify whether or not important elements of the texts were present in the essays. The first was a simple pattern-matching approach called "multi-word" that allowed for flexible matching of words and phrases in the sentences. The second technique was latent semantic analysis (LSA), which was used to compare student sentences to original source sentences using its high-dimensional vector-based representation. Finally, the third was a machine-learning technique, support vector machines, which learned a classification scheme from the corpus. The results of the study suggested that the LSA-based system was superior for detecting the presence of explicit content from the texts, but the multi-word pattern-matching approach was better for detecting inferences outside or across texts. These results suggest that the best approach for analyzing essays of this nature should draw upon multiple natural language processing approaches.
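The two simpler approaches, flexible multi-word matching and vector-space similarity, can be contrasted in a toy sketch. Real LSA also applies SVD dimensionality reduction, which this bag-of-words version omits, and the sentences are invented:

```python
import re
import numpy as np

# Toy contrast of flexible "multi-word" pattern matching versus a
# vector-space (LSA-style) similarity score. Real LSA reduces the
# term space with SVD; this sketch stops at raw bag-of-words vectors.

def multiword_match(pattern_words, sentence):
    """True if all pattern words occur in the sentence, in any order."""
    tokens = set(re.findall(r"[a-z]+", sentence.lower()))
    return all(w in tokens for w in pattern_words)

def cosine_similarity(a, b):
    """Cosine of the angle between the two sentences' term-count vectors."""
    vocab = sorted(set(a.lower().split()) | set(b.lower().split()))
    va = np.array([a.lower().split().count(w) for w in vocab], float)
    vb = np.array([b.lower().split().count(w) for w in vocab], float)
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

source = "volcanic eruptions release ash into the atmosphere"
essay = "the atmosphere receives ash when volcanic eruptions occur"
print(multiword_match(["volcanic", "ash"], essay))
print(round(cosine_similarity(source, essay), 2))
```

The contrast mirrors the study's finding: pattern matching keys on explicit word presence, while the vector score rewards overall lexical overlap even when phrasing differs.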
Microscopic quantification of bacterial invasion by a novel antibody-independent staining method.
Agerer, Franziska; Waeckerle, Stephanie; Hauck, Christof R
2004-10-01
Microscopic discrimination between extracellular and invasive, intracellular bacteria is a valuable technique in microbiology and immunology. We describe a novel fluorescence staining protocol, called FITC-biotin-avidin (FBA) staining, which allows differentiation between extracellular and intracellular bacteria and is independent of specific antibodies directed against the microorganisms. FBA staining of eukaryotic cells infected with Gram-negative bacteria of the genus Neisseria or the Gram-positive pathogen Staphylococcus aureus is employed to validate the novel technique. Quantitative evaluation of intracellular pathogens by the FBA staining protocol yields results identical to those of parallel samples stained with conventional, antibody-dependent methods. FBA staining eliminates the need for cell permeabilization, resulting in robust and rapid detection of invasive microbes. Taken together, FBA staining provides a reliable and convenient alternative for the differential detection of intracellular and extracellular bacteria and should be a valuable technical tool for the quantitative analysis of the invasive properties of pathogenic bacteria and other microorganisms.
NASA Astrophysics Data System (ADS)
Naseralavi, S. S.; Salajegheh, E.; Fadaee, M. J.; Salajegheh, J.
2014-06-01
This paper presents a technique for damage detection in structures under unknown periodic excitations using the transient displacement response. The method is capable of identifying the damage parameters without finding the input excitations. We first define the concept of displacement space as a linear space in which each point represents displacements of structure under an excitation and initial condition. Roughly speaking, the method is based on the fact that structural displacements under free and forced vibrations are associated with two parallel subspaces in the displacement space. Considering this novel geometrical viewpoint, an equation called kernel parallelization equation (KPE) is derived for damage detection under unknown periodic excitations and a sensitivity-based algorithm for solving KPE is proposed accordingly. The method is evaluated via three case studies under periodic excitations, which confirm the efficiency of the proposed method.
Sadat, Umar; Usman, Ammara; Gillard, Jonathan H
2017-07-01
To provide a brief overview of developments in the use of ultrasmall superparamagnetic particles of iron oxide for imaging the pathobiology of carotid atherosclerosis. MRI is a promising technique capable of providing morphological and functional information about atheromatous plaques. MRI using iron oxide particles, called ultrasmall superparamagnetic iron oxide (USPIO) particles, allows detection of macrophages in atherosclerotic tissue. Ferumoxytol has emerged as a new USPIO agent with an excellent safety profile, and based on its macrophage-selective properties, an increasing number of recent reports suggest its effectiveness in detecting pathological inflammation. USPIO-enhanced MRI therefore has the potential to be used for imaging the pathobiology of atherosclerosis.
Mass Spectrometric Detection of Botulinum Neurotoxin by Measuring its Activity in Serum and Milk
NASA Astrophysics Data System (ADS)
Kalb, Suzanne R.; Pirkle, James L.; Barr, John R.
Botulinum neurotoxins (BoNTs) are bacterial protein toxins considered likely agents for bioterrorism due to their extreme toxicity and high availability. A new mass-spectrometry-based assay called Endopep-MS detects and defines the toxin serotype in clinical and food matrices via toxin activity upon a peptide substrate that mimics the toxin's natural target. Furthermore, the subtype of the toxin is differentiated by applying mass-spectrometry-based proteomic techniques to the same sample. The Endopep-MS assay selectively detects active BoNT and defines the serotype faster and with greater sensitivity than the mouse bioassay. One 96-well plate can be analyzed in under 7 h. On higher-level or "hot" samples, the subtype can then be differentiated in less than 2 h with no need for DNA.
Plane wave analysis of coherent holographic image reconstruction by phase transfer (CHIRPT).
Field, Jeffrey J; Winters, David G; Bartels, Randy A
2015-11-01
Fluorescent imaging plays a critical role in a myriad of scientific endeavors, particularly in the biological sciences. Three-dimensional imaging of fluorescent intensity often requires serial data acquisition, that is, voxel-by-voxel collection of fluorescent light emitted throughout the specimen with a nonimaging single-element detector. While nonimaging fluorescence detection offers some measure of scattering robustness, the rate at which dynamic specimens can be imaged is severely limited. Other fluorescent imaging techniques utilize imaging detection to enhance collection rates. A notable example is light-sheet fluorescence microscopy, also known as selective-plane illumination microscopy, which illuminates a large region within the specimen and collects emitted fluorescent light at an angle either perpendicular or oblique to the illumination light sheet. Unfortunately, scattering of the emitted fluorescent light can cause blurring of the collected images in highly turbid biological media. We recently introduced an imaging technique called coherent holographic image reconstruction by phase transfer (CHIRPT) that combines light-sheet-like illumination with nonimaging fluorescent light detection. By combining the speed of light-sheet illumination with the scattering robustness of nonimaging detection, CHIRPT is poised to have a dramatic impact on biological imaging, particularly for in vivo preparations. Here we present the mathematical formalism for CHIRPT imaging under spatially coherent illumination and present experimental data that verifies the theoretical model.
Protein remote homology detection based on bidirectional long short-term memory.
Li, Shumin; Chen, Junjie; Liu, Bin
2017-10-10
Protein remote homology detection plays a vital role in studies of protein structures and functions. Almost all traditional machine learning methods require fixed-length features to represent protein sequences; however, it is never an easy task to extract discriminative features with limited knowledge of proteins. On the other hand, deep learning has demonstrated its advantage in automatically learning representations, so it is worthwhile to explore its application to protein remote homology detection. In this study, we employ a Bidirectional Long Short-Term Memory (BLSTM) network to learn effective features from pseudo proteins and propose a predictor called ProDec-BLSTM, consisting of an input layer, a bidirectional LSTM, a time-distributed dense layer and an output layer. This neural network automatically extracts discriminative features via the bidirectional LSTM and the time-distributed dense layer. Experimental results on a widely used benchmark dataset show that ProDec-BLSTM outperforms other related methods in terms of both the mean ROC and mean ROC50 scores. This promising result shows that ProDec-BLSTM is a useful tool for protein remote homology detection. Furthermore, the hidden patterns learnt by ProDec-BLSTM can be interpreted and visualized, providing additional useful information.
Murillo, Gabriel H; You, Na; Su, Xiaoquan; Cui, Wei; Reilly, Muredach P; Li, Mingyao; Ning, Kang; Cui, Xinping
2016-05-15
Single nucleotide variant (SNV) detection procedures are being utilized as never before to analyze the recent abundance of high-throughput DNA sequencing data, both on single and multiple sample datasets. Building on previously published work with the single sample SNV caller genotype model selection (GeMS), a multiple sample version of GeMS (MultiGeMS) is introduced. Unlike other popular multiple sample SNV callers, the MultiGeMS statistical model accounts for enzymatic substitution sequencing errors. It also addresses the multiple testing problem endemic to multiple sample SNV calling and utilizes high performance computing (HPC) techniques. A simulation study demonstrates that MultiGeMS ranks highest in precision among a selection of popular multiple sample SNV callers, while showing exceptional recall in calling common SNVs. Further, both simulation studies and real data analyses indicate that MultiGeMS is robust to low-quality data. We also demonstrate that accounting for enzymatic substitution sequencing errors not only improves SNV call precision at low mapping quality regions, but also improves recall at reference allele-dominated sites with high mapping quality. The MultiGeMS package can be downloaded from https://github.com/cui-lab/multigems. Contact: xinping.cui@ucr.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
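The core statistical idea behind SNV calling, testing whether non-reference reads exceed what sequencing error alone would explain, can be sketched with a simple binomial tail test. MultiGeMS's actual model (genotype model selection with substitution-specific error terms and multiplicity correction) is considerably richer; the pileup numbers here are hypothetical:

```python
from math import comb

# Hedged sketch of the basic SNV-calling test: are the alternate-allele reads
# at a site more numerous than the sequencing error rate alone would explain?
# All numbers are invented; real callers model genotypes and error types.

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

depth, alt_reads = 30, 9        # hypothetical pileup at one genomic site
error_rate = 0.01               # assumed per-base substitution error rate
p_value = binom_sf(alt_reads, depth, error_rate)
is_variant = p_value < 1e-6     # a real threshold would be multiplicity-corrected
print(is_variant)
```

The multiple testing problem the abstract mentions arises because this test is repeated at millions of sites, so the per-site threshold must be far stricter than a nominal 0.05.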
A large high-efficiency multi-layered Micromegas thermal neutron detector
NASA Astrophysics Data System (ADS)
Tsiledakis, G.; Delbart, A.; Desforge, D.; Giomataris, I.; Menelle, A.; Papaevangelou, T.
2017-09-01
Due to the so-called 3He shortage crisis, many detection techniques used nowadays for thermal neutrons are based on alternative converters. Thin films of 10B or 10B4C are used to convert neutrons into ionizing particles which are subsequently detected in gas proportional counters, but only for small or medium sensitive areas so far. The micro-pattern gaseous detector Micromegas has been developed for several years in Saclay and is used in a wide variety of neutron experiments combining high accuracy, high rate capability, excellent timing properties and robustness. We propose here a large high-efficiency Micromegas-based neutron detector with several 10B4C thin layers mounted inside the gas volume for thermal neutron detection. The principle and the fabrication of a single detector unit prototype with overall dimension of ~ 15 × 15 cm2 and a flexibility of modifying the number of layers of 10B4C neutron converters are described and simulated results are reported, demonstrating that typically five 10B4C layers of 1-2 μm thickness can lead to a detection efficiency of 20-40% for thermal neutrons and a spatial resolution of sub-mm. The design is well adapted to large sizes making possible the construction of a mosaic of several such detector units with a large area coverage and a high detection efficiency, showing the good potential of this novel technique.
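The benefit of stacking converter layers can be seen in a back-of-envelope sketch: if each film converts a fraction eps of the neutrons reaching it, n layers capture roughly 1 - (1 - eps)^n. This ignores scattering and the escape probability of the conversion products, which the paper's simulations model properly; eps below is an assumed illustrative value:

```python
# Back-of-envelope efficiency gain from stacking thin 10B4C converter films:
# each layer converts a fraction eps of the neutrons reaching it, so n layers
# capture 1 - (1 - eps)**n of the beam. Product-escape probability and
# scattering are neglected; eps is a made-up illustrative number.

def stacked_efficiency(eps, n_layers):
    return 1.0 - (1.0 - eps) ** n_layers

eps = 0.06                      # hypothetical per-layer conversion fraction
for n in (1, 5, 10):
    print(n, round(stacked_efficiency(eps, n), 3))
```

Diminishing returns are built into the formula: each added layer only sees the neutrons that survived the layers above it, which is why the paper targets a handful of layers rather than many.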
NASA Astrophysics Data System (ADS)
Perez Saavedra, L.-M.; Mercier, G.; Yesou, H.; Liege, F.; Pasero, G.
2016-08-01
The Copernicus programme of ESA and the European Commission (six Sentinel missions, among them Sentinel-1 with a synthetic aperture radar sensor and Sentinel-2 with a 13-band optical sensor at 10 to 60 m resolution) offers a new opportunity for Earth observation, combining high temporal acquisition capability (12-day repeat cycle, and 5 days in some geographic areas of the world) with high spatial resolution. These high temporal and spatial resolutions open new challenges in several fields, such as image processing and new algorithms for time-series and big-data analysis. In addition, these missions can follow several aspects of the Earth's temporal evolution, such as crop vegetation, water bodies, land use and land cover (LULC), and sea and ice information. This is particularly useful for end users and policy makers to detect early signs of damage, vegetation illness, flooded areas, etc. The state of the art offers algorithms and methods that use a bi-date comparison for change detection [1-3] or time-series analysis; these methods are essentially used for target detection or for abrupt change detection, which requires only two observations. A Hölder-means-based change detection technique was proposed in [2,3] for high-resolution radar images. This so-called MIMOSA technique has mainly been dedicated to man-made change detection in urban areas and the CARABAS-II project, using a pair of SAR images. An extension to a multitemporal change detection technique has been investigated, but its application to land use and cover changes still has to be validated. The Hölder mean Hp is a pixel-by-pixel feature extracted from the time series (a stack of N images × S bands × t dates) and is defined by Hp[X] = [(1/n) Σ_{i=1}^{n} X_i^p]^(1/p), p ∈ ℝ, where n > 2 is the number of images in the time series. Hp[X] is continuous and monotonically increasing in p for −∞ < p < ∞.
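The Hölder mean feature can be sketched in a few lines of NumPy: p = 1 recovers the arithmetic mean, and large |p| approaches the per-pixel extremes. The image stack below is synthetic:

```python
import numpy as np

# Pixel-by-pixel Hoelder (power) mean over the time axis of an image stack.
# Synthetic data; p = 1 gives the ordinary temporal mean, large p tends
# toward the per-pixel maximum (and large negative p toward the minimum).

def holder_mean(stack, p):
    """Hp[X] = [(1/n) * sum_i X_i**p]**(1/p) along axis 0, for p != 0."""
    return np.mean(stack.astype(float) ** p, axis=0) ** (1.0 / p)

rng = np.random.default_rng(0)
stack = rng.uniform(1.0, 2.0, size=(6, 4, 4))   # 6 dates of a 4x4 tile

h1 = holder_mean(stack, 1)      # p = 1: ordinary temporal mean
h50 = holder_mean(stack, 50)    # large p: close to the per-pixel maximum
```

Sweeping p therefore tunes the feature between mean-like behaviour (stable background) and max-like behaviour (sensitive to transient changes), which is what makes it useful for change detection over a time series.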
Yasmin, Zannatul; Khachatryan, Edward; Lee, Yuan-Hao; Maswadi, Saher; Glickman, Randolph; Nash, Kelly L
2015-02-15
In this work, the assembly of gold nanoparticles (AuNPs) is used to detect the presence of the biomolecule glutathione (GSH) using a novel technique called "all-optical photoacoustic spectroscopy" (AOPAS). The AOPAS technique coupled with AuNPs forms the basis of a biosensing technique capable of probing the dynamic evolution of nano-bio interfaces within a microscopic volume. Dynamic light scattering (DLS) and ultraviolet-visible (UV-vis) spectra were measured to describe the kinetics governing the interparticle interactions by monitoring the AuNP assembly and the evolution of the surface plasmon resonance (SPR) band. A comparison of the same dynamic evolution of AuNP assembly was performed using the AOPAS technique to confirm the validity of this method. The fundamental study is complemented by a demonstration of the performance of this biosensing technique in the presence of cell culture medium containing fetal bovine serum (FBS), which forms a protein corona on the surface of the AuNPs. This work demonstrates that the in vitro monitoring capabilities of AOPAS provide sensitive measurement at the microscopic level and at low nanoparticle concentrations without the artifacts limiting the use of conventional biosensing methods, such as fluorescent indicators. The AOPAS technique not only provides a facile approach for in vitro biosensing, but also sheds light on the real-time detection of thiol-containing oxidative stress biomarkers in live systems using AuNPs. Copyright © 2014 Elsevier B.V. All rights reserved.
Nadeau, Christopher P.; Conway, Courtney J.; Piest, Linden; Burger, William P.
2013-01-01
Broadcasting calls of marsh birds during point-count surveys increases their detection probability and decreases variation in the number of birds detected across replicate surveys. However, multi-species monitoring using call-broadcast may reduce these benefits if birds are reluctant to call once they hear broadcasted calls of other species. We compared a protocol that uses call-broadcast for only one species (Yuma clapper rail [Rallus longirostris yumanensis]) to a protocol that uses call-broadcast for multiple species. We detected more of each of the following species using the multi-species protocol: 25 % more pied-billed grebes, 160 % more American bitterns, 52 % more least bitterns, 388 % more California black rails, 12 % more Yuma clapper rails, 156 % more Virginia rails, 214 % more soras, and 19 % more common gallinules. Moreover, the coefficient of variation was smaller when using the multi-species protocol: 10 % smaller for pied-billed grebes, 38 % smaller for American bitterns, 19 % smaller for least bitterns, 55 % smaller for California black rails, 5 % smaller for Yuma clapper rails, 38 % smaller for Virginia rails, 44 % smaller for soras, and 8 % smaller for common gallinules. Our results suggest that multi-species monitoring approaches may be more effective and more efficient than single-species approaches even when using call-broadcast.
Determination of plutonium in spent nuclear fuel using high resolution X-ray
McIntosh, Kathryn G.; Reilly, Sean D.; Havrilla, George J.
2015-05-30
Characterization of Pu is an essential aspect of safeguards operations at nuclear fuel reprocessing facilities. A novel analysis technique called hiRX (high resolution X-ray) has been developed for the direct measurement of Pu in spent nuclear fuel dissolver solutions. hiRX is based on monochromatic wavelength dispersive X-ray fluorescence (MWDXRF), which provides enhanced sensitivity and specificity compared with conventional XRF techniques. A breadboard setup of the hiRX instrument was calibrated using spiked surrogate spent fuel (SSF) standards prepared as dried residues. Samples of actual spent fuel were utilized to evaluate the performance of the hiRX. The direct detection of just 39 ng of Pu is demonstrated. Initial quantitative results, with error of 4–27% and precision of 2% relative standard deviation (RSD), were obtained for spent fuel samples. The limit of detection for Pu (100 s) within an excitation spot of 200 μm diameter was 375 pg. This study demonstrates the potential for the hiRX technique to be utilized for the rapid, accurate, and precise determination of Pu. Moreover, the results highlight the analytical capability of hiRX for other applications requiring sensitive and selective nondestructive analyses.
NASA Astrophysics Data System (ADS)
Climent-Font, A.; Cervera, M.; Hernández, M. J.; Muñoz-Martín, A.; Piqueras, J.
2008-04-01
Rutherford backscattering spectrometry (RBS) is a well-known, powerful technique for obtaining depth profiles of the constituent elements of a thin film deposited on a substrate made of lighter elements. In its standard use the probing beam is typically 2 MeV He. Its capability to obtain precise composition profiles is severely diminished when the overlying film is made of elements lighter than the substrate. In this situation, analysing the energy of the element recoiled from the sample in the elastic scattering event, the ERDA technique, may be advantageous. For the detection of light elements it is also possible to use beams at specific energies that produce elastic resonances with the light elements to be analyzed, with much higher scattering cross sections than the Rutherford values; this technique may be called non-RBS. In this work we report on the complementary use of ERDA with a 30 MeV Cl beam and non-RBS with 1756 keV H ions to characterize thin films of boron, carbon and nitrogen (BCN) deposited on Si substrates.
Visualization Techniques for Computer Network Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beaver, Justin M; Steed, Chad A; Patton, Robert M
2011-01-01
Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.
Pairing call-response surveys and distance sampling for a mammalian carnivore
Hansen, Sara J. K.; Frair, Jacqueline L.; Underwood, Harold B.; Gibbs, James P.
2015-01-01
Density estimates accounting for differential animal detectability are difficult to acquire for wide-ranging and elusive species such as mammalian carnivores. Pairing distance sampling with call-response surveys may provide an efficient means of tracking changes in populations of coyotes (Canis latrans), a species of particular interest in the eastern United States. Blind field trials in rural New York State indicated 119-m linear error for triangulated coyote calls, and a 1.8-km distance threshold for call detectability, which was sufficient to estimate a detection function with precision using distance sampling. We conducted statewide road-based surveys with sampling locations spaced ≥6 km apart from June to August 2010. Each detected call (whether from a single bird of the pair or the group) counted as a single object, representing 1 territorial pair, because of uncertainty in the number of vocalizing animals. From 524 survey points and 75 detections, we estimated the probability of detecting a calling coyote to be 0.17 ± 0.02 SE, yielding a detection-corrected index of 0.75 pairs/10 km2 (95% CI: 0.52–1.1, 18.5% CV) for a minimum of 8,133 pairs across rural New York State. Importantly, we consider this an index rather than a true estimate of abundance given the unknown probability of coyote availability for detection during our surveys. Even so, pairing distance sampling with call-response surveys provided a novel, efficient, and noninvasive means of monitoring populations of wide-ranging and elusive, albeit reliably vocal, mammalian carnivores. Our approach offers an effective new means of tracking species like coyotes, one that is readily extendable to other species and geographic extents, provided key assumptions of distance sampling are met.
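The arithmetic behind such a detection-corrected index is simple: raw counts are inflated by 1/p and divided by the area effectively surveyed. The detection count and probability below follow the abstract, but the effective-area figure is a made-up placeholder chosen only to reproduce the reported index:

```python
# Detection-corrected abundance index: counts / p_detect, scaled per unit area.
# detections and p_detect follow the abstract; effective_area_km2 is a
# hypothetical placeholder, not the study's actual survey coverage.

detections = 75
p_detect = 0.17
effective_area_km2 = 5_880            # assumed effective area sampled, km^2

corrected_pairs = detections / p_detect
density_per_10km2 = 10 * corrected_pairs / effective_area_km2
print(round(corrected_pairs), round(density_per_10km2, 2))
```

Dividing by a small detection probability inflates both the estimate and its variance, which is why the abstract's confidence interval (0.52–1.1 pairs/10 km2) is wide relative to the point value.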
Evaluation of copy number variation detection for a SNP array platform
2014-01-01
Background Copy Number Variations (CNVs) are usually inferred from Single Nucleotide Polymorphism (SNP) arrays by software packages based on given algorithms. However, there is no clear understanding of the performance of these packages, so it is difficult to select one or several of them for CNV detection on the SNP array platform. We selected four publicly available software packages designed for CNV calling from an Affymetrix SNP array: Birdsuite, dChip, Genotyping Console (GTC) and PennCNV. A publicly available dataset generated by array-based Comparative Genomic Hybridization (CGH), with a resolution of 24 million probes per sample, was considered the "gold standard". The success rate, average stability rate, sensitivity, consistency and reproducibility of these four software packages were assessed against the CGH-based gold standard. Specifically, we also compared the efficiency of detecting CNVs simultaneously by two, three or all of the software packages with that of a single package. Results Judged simply by the quantity of detected CNVs, Birdsuite detected the most and GTC the least. We found that Birdsuite and dChip had obvious detection bias, and GTC seemed inferior because of the small number of CNVs it detected. We then investigated the detection consistency between each software package and the remaining three, finding that the consistency of dChip was the lowest and that of GTC the highest. Compared with the CNV detection results of CGH, in the matching group GTC called the most matching CNVs, with PennCNV-Affy ranking second; in the non-overlapping group, GTC called the fewest CNVs. With regard to the reproducibility of CNV calling, larger CNVs were usually replicated better; PennCNV-Affy showed the best consistency while Birdsuite showed the poorest.
Conclusion We found that PennCNV outperformed the other three packages in the sensitivity and specificity of CNV calling. Obviously, each calling method had its own limitations and advantages for different data analysis. Therefore, the optimized calling methods might be identified using multiple algorithms to evaluate the concordance and discordance of SNP array-based CNV calling. PMID:24555668
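The matching of called CNVs against a CGH gold standard, as in the comparison above, is typically done by reciprocal interval overlap. The intervals and the 50% threshold below are illustrative choices, not the paper's exact matching rule:

```python
# Sketch of CNV concordance checking: a call "matches" the gold standard if
# it reciprocally overlaps some reference CNV by at least min_frac.
# Intervals (in bp) and the 50% threshold are illustrative assumptions.

def reciprocal_overlap(a, b):
    """Overlap length divided by the longer of the two intervals."""
    inter = max(0, min(a[1], b[1]) - max(a[0], b[0]))
    return inter / max(a[1] - a[0], b[1] - b[0])

def matched_calls(calls, gold, min_frac=0.5):
    return [c for c in calls
            if any(reciprocal_overlap(c, g) >= min_frac for g in gold)]

gold = [(100, 500), (2_000, 2_600)]                    # CGH reference CNVs
calls = [(150, 480), (2_550, 2_620), (9_000, 9_100)]   # one caller's output
print(matched_calls(calls, gold))
```

Using the longer interval in the denominator penalizes calls that only graze a reference CNV, which is one reason larger CNVs tend to replicate better across callers, as the study observed.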
Probability of detecting band-tailed pigeons during call-broadcast versus auditory surveys
Kirkpatrick, C.; Conway, C.J.; Hughes, K.M.; Devos, J.C.
2007-01-01
Estimates of population trend for the interior subspecies of band-tailed pigeon (Patagioenas fasciata fasciata) are not available because no standardized survey method exists for monitoring the interior subspecies. We evaluated 2 potential band-tailed pigeon survey methods (auditory and call-broadcast surveys) from 2002 to 2004 in 5 mountain ranges in southern Arizona, USA, and in mixed-conifer forest throughout the state. Both auditory and call-broadcast surveys produced low numbers of cooing pigeons detected per survey route (x̄ ≤ 0.67) and had relatively high temporal variance in average number of cooing pigeons detected during replicate surveys (CV ≥ 161%). However, compared to auditory surveys, use of call-broadcast increased 1) the percentage of replicate surveys on which ≥1 cooing pigeon was detected by an average of 16%, and 2) the number of cooing pigeons detected per survey route by an average of 29%, with this difference being greatest during the first 45 minutes of the morning survey period. Moreover, probability of detecting a cooing pigeon was 27% greater during call-broadcast (0.80) versus auditory (0.63) surveys. We found that cooing pigeons were most common in mixed-conifer forest in southern Arizona and density of male pigeons in mixed-conifer forest throughout the state averaged 0.004 (SE = 0.001) pigeons/ha. Our results are the first to show that call-broadcast increases the probability of detecting band-tailed pigeons (or any species of Columbidae) during surveys. Call-broadcast surveys may provide a useful method for monitoring populations of the interior subspecies of band-tailed pigeon in areas where other survey methods are inappropriate.
Characterization of ultrafast devices using novel optical techniques
NASA Astrophysics Data System (ADS)
Ali, Md Ershad
Optical techniques have been extensively used to examine the high frequency performance of a number of devices including High Electron Mobility Transistors (HEMTs), Heterojunction Bipolar Phototransistors (HPTs) and Low Temperature GaAs (LT-GaAs) Photoconductive Switches. To characterize devices, frequency and time domain techniques, namely optical heterodyning and electro-optic sampling, having measurement bandwidths in excess of 200 GHz, were employed. Optical mixing in three-terminal devices has been extended for the first time to submillimeter wave frequencies. Using a new generation of 50-nm gate pseudomorphic InP-based HEMTs, optically mixed signals were detected to 552 GHz with a signal-to-noise ratio of approximately 5 dB. To the best of our knowledge, this is the highest frequency optical mixing obtained in three- terminal devices to date. A novel harmonic three-wave detection scheme was used for the detection of the optically generated signals. The technique involved downconversion of the signal in the device by the second harmonic of a gate-injected millimeter wave local oscillator. Measurements were also conducted up to 212 GHz using direct optical mixing and up to 382 GHz using a fundamental three-wave detection scheme. New interesting features in the bias dependence of the optically mixed signals have been reported. An exciting novel development from this work is the successful integration of near-field optics with optical heterodyning. The technique, called near-field optical heterodyning (NFOH), allows for extremely localized injection of high-frequency stimulus to any arbitrary point of an ultrafast device or circuit. Scanning the point of injection across the sample provides details of the high frequency operation of the device with high spatial resolution. For the implementation of the technique, fiber-optic probes with 100 nm apertures were fabricated. 
A feedback controlled positioning system was built for accurate placement and scanning of the fiber probe with nanometric precision. The applicability of the NFOH technique was first confirmed by measurements on heterojunction phototransistors at 100 GHz. Later NFOH scans were performed at 63 GHz on two other important devices, HEMTs and LT-GaAs Photoconductive Switches. Spatially resolved response characteristics of these devices revealed interesting details of their operation.
Llusia, Diego; Márquez, Rafael; Beltrán, Juan F; Benítez, Maribel; do Amaral, José P
2013-09-01
Calling behaviour is strongly temperature-dependent and critical for sexual selection and reproduction in a variety of ectothermic taxa, including anuran amphibians, which are the most globally threatened vertebrates. However, few studies have explored how species respond to distinct thermal environments when displaying calling behaviour, and thus it is still unknown whether ongoing climate change might compromise the performance of calling activity in ectotherms. Here, we used new audio-trapping techniques (automated sound recording and detection systems) between 2006 and 2009 to examine annual calling temperatures of five temperate anurans and their patterns of geographical and seasonal variation at the thermal extremes of species ranges, providing insights into the thermal breadths of calling activity of species and the mechanisms that enable ectotherms to adjust to changing thermal environments. All species showed wide thermal breadths during calling behaviour (above 15 °C) and increases in calling temperatures in extremely warm populations and seasons. Accordingly, calling temperatures differed both geographically and seasonally, in both terrestrial and aquatic species, and were 8-22 °C below the species-specific upper critical thermal limits (CTmax) and strongly associated with the potential temperatures of each thermal environment (operative temperatures during the potential period of breeding). This suggests that calling behaviour in ectotherms may take place at population-specific thermal ranges, diverging when species are subjected to distinct thermal environments, and might imply plasticity of thermal adjustment mechanisms (seasonal and developmental acclimation) that supply species with means of coping with climate change. Furthermore, the thermal thresholds of calling at the onset of the breeding season were dissimilar between conspecific populations, suggesting that factors other than temperature are needed to trigger the onset of reproduction.
Our findings imply that global warming would not directly inhibit calling behaviour in the study species, although it might affect other temperature-dependent features of their acoustic communication system. © 2013 John Wiley & Sons Ltd.
Acharya, U. Rajendra; Sree, S. Vinitha; Kulshreshtha, Sanjeev; Molinari, Filippo; Koh, Joel En Wei; Saba, Luca; Suri, Jasjit S.
2014-01-01
Ovarian cancer is the fifth highest cause of cancer in women and the leading cause of death from gynecological cancers. Accurate diagnosis of ovarian cancer from acquired images depends on the expertise and experience of ultrasonographers or physicians and is therefore subject to inter-observer variability. Computer Aided Diagnostic (CAD) techniques use a number of different data mining techniques to automatically predict the presence or absence of cancer, and are therefore more reliable and accurate. A review of published literature in the field of CAD-based ovarian cancer detection indicates that many studies use ultrasound images as the basis for analysis. The key objective of this work is to propose an effective adjunct CAD technique, called GyneScan, for ovarian tumor detection in ultrasound images. In our proposed data mining framework, we extract several texture features based on first-order statistics, the Gray Level Co-occurrence Matrix, and the run length matrix. The significant features selected using the t-test are then used to train and test several supervised-learning-based classifiers such as Probabilistic Neural Networks (PNN), Support Vector Machine (SVM), Decision Tree (DT), k-Nearest Neighbor (KNN), and Naïve Bayes (NB). We evaluated the developed framework using 1300 benign and 1300 malignant images. Using 11 significant features in the KNN/PNN classifiers, we were able to achieve 100% classification accuracy, sensitivity, specificity, and positive predictive value in detecting ovarian tumors. Even though more validation using larger databases would better establish the robustness of our technique, the preliminary results are promising. This technique could be used as a reliable adjunct to existing imaging modalities to provide a more confident second opinion on the presence/absence of ovarian tumors. PMID:24325128
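The two-stage pipeline described here (t-test feature ranking followed by a supervised classifier) can be sketched in a few lines. This is a minimal, generic illustration on synthetic data, not the GyneScan system; the feature values and class labels below are invented for the example.

```python
import math

def welch_t(a, b):
    # Welch's t-statistic for one feature across two classes
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

def top_features(X, y, k):
    # rank features by |t| between the two classes, keep the k best indices
    scores = []
    for j in range(len(X[0])):
        a = [x[j] for x, lab in zip(X, y) if lab == 0]
        b = [x[j] for x, lab in zip(X, y) if lab == 1]
        scores.append((abs(welch_t(a, b)), j))
    return sorted(j for _, j in sorted(scores, reverse=True)[:k])

def knn_predict(X, y, query, k=3):
    # plain k-nearest-neighbor majority vote, squared Euclidean distance
    dists = sorted((sum((p - q) ** 2 for p, q in zip(x, query)), lab)
                   for x, lab in zip(X, y))
    votes = [lab for _, lab in dists[:k]]
    return max(set(votes), key=votes.count)
```

In practice the t-test would be applied to hundreds of texture features and only the statistically significant ones passed to the classifier, exactly as the abstract outlines.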
NASA Astrophysics Data System (ADS)
Swaminathan, K.; Asokane, C.; Sylvia, J. I.; Kalyanasundaram, P.; Swaminathan, P.
2012-02-01
An ultrasonic under-sodium scanner has been developed for deployment in the Prototype Fast Breeder Reactor (PFBR), which is in an advanced stage of construction at Kalpakkam, India. Its purpose is to scan the above-core plenum for detection of any displacement of sub-assemblies. During its burn-up in the reactor, the head of a Fuel Sub-Assembly (FSA) may undergo a lateral shift from its original position (called `bowing') due to fast-neutron-induced damage to its structural material. A simple scanning technique has been developed for measuring the extent of bowing in situ. This paper describes a PC-controlled mock-up of the scanner used to implement the scanning technique and the results obtained from scanning a mock-up FSA head under water. The details of the liquid-sodium-proof transducer developed for use in the PFBR scanner and its performance are also discussed.
A Touch Sensing Technique Using the Effects of Extremely Low Frequency Fields on the Human Body
Elfekey, Hatem; Bastawrous, Hany Ayad; Okamoto, Shogo
2016-01-01
Touch sensing is a fundamental approach in human-to-machine interfaces and is currently in widespread use. Many current applications use active touch sensing technologies. Passive touch sensing technologies are, however, better suited to implementing low-power or energy-harvesting touch sensing interfaces. This paper presents a passive touch sensing technique based on the fact that the human body is affected by surrounding extremely low frequency (ELF) electromagnetic fields, such as those of AC power lines. These external ELF fields induce electric potentials on the human body (because human tissues exhibit some conductivity at these frequencies), resulting in what is called AC hum. We therefore propose a passive touch sensing system that detects this hum noise when a human touch occurs, thus distinguishing between touch and non-touch events. The effectiveness of the proposed technique is validated by designing and implementing a flexible touch sensing keyboard. PMID:27918416
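The detection principle (a touch couples mains hum into the electrode, raising the power at the mains frequency) can be sketched with the standard Goertzel algorithm for single-bin power estimation. This is an illustrative sketch, not the authors' implementation; the sampling rate, mains frequency, and threshold are assumptions.

```python
import math

def goertzel_power(samples, fs, freq):
    # Goertzel algorithm: signal power in the DFT bin nearest `freq`
    n = len(samples)
    k = round(n * freq / fs)          # nearest bin index
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s1 = s2 = 0.0
    for x in samples:
        s1, s2 = x + coeff * s1 - s2, s1
    return s1 * s1 + s2 * s2 - coeff * s1 * s2

def is_touch(samples, fs=1000.0, mains=50.0, threshold=1e3):
    # decide touch vs non-touch from the 50 Hz (or 60 Hz) hum power;
    # threshold is a placeholder that would be calibrated per device
    return goertzel_power(samples, fs, mains) > threshold
```

The Goertzel recursion is preferred over a full FFT here because only one frequency bin is needed, which keeps the computation cheap enough for a low-power sensing front end.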
Rouleau, Etienne; Lefol, Cédrick; Bourdon, Violaine; Coulet, Florence; Noguchi, Tetsuro; Soubrier, Florent; Bièche, Ivan; Olschwang, Sylviane; Sobol, Hagay; Lidereau, Rosette
2009-06-01
Several techniques have been developed to screen mismatch repair (MMR) genes for deleterious mutations. Until now, two different techniques were required to screen for both point mutations and large rearrangements. For the first time, we propose a new approach, called "quantitative PCR (qPCR) high-resolution melting (HRM) curve analysis (qPCR-HRM)," which combines qPCR and HRM to obtain a rapid and cost-effective method suitable for testing a large series of samples. We designed PCR amplicons to scan the MLH1 gene using qPCR-HRM. Seventy-six patients were fully scanned in replicate, including 14 wild-type patients and 62 patients with known mutations (57 point mutations and five rearrangements). To validate the detected mutations, we used sequencing and/or hybridization on a dedicated MLH1 array-comparative genomic hybridization (array-CGH) platform. All point mutations and rearrangements detected by denaturing high-performance liquid chromatography (dHPLC) plus multiplex ligation-dependent probe amplification (MLPA) were successfully detected by qPCR-HRM. Three large rearrangements were characterized with the dedicated MLH1 array-CGH. One variant was detected with qPCR-HRM in a wild-type patient and was located within the reverse primer. One variant was not detected with qPCR-HRM or with dHPLC due to its proximity to a T-stretch. With qPCR-HRM, prescreening for point mutations and large rearrangements is performed in one tube and in one step with a single machine, without the need for any automated sequencer in the prescreening process. In replicate, its reagent cost, sensitivity, and specificity are comparable to those of the dHPLC+MLPA techniques. However, qPCR-HRM outperformed the other techniques in terms of rapidity and the amount of data provided.
Kasama, Toshihiro; Kaji, Noritada; Tokeshi, Manabu; Baba, Yoshinobu
2017-01-01
Owing to inherent characteristics such as confinement of molecular diffusion and a high surface-to-volume ratio, microfluidic device-based immunoassays have great advantages in cost, speed, and sensitivity compared with conventional techniques such as microtiter-plate-based ELISA, the latex agglutination method, and lateral flow immunochromatography. In this paper, we describe the detection of C-reactive protein, as a model antigen, using our microfluidic immunoassay device, the so-called immuno-pillar device. We describe in detail how we fabricated and used the immuno-pillar devices.
High order filtering methods for approximating hyperbolic systems of conservation laws
NASA Technical Reports Server (NTRS)
Lafon, F.; Osher, S.
1991-01-01
The essentially nonoscillatory (ENO) schemes, while potentially useful in the computation of discontinuous solutions of hyperbolic conservation-law systems, are computationally costly relative to simple central-difference methods. A filtering technique is presented which employs central differencing of arbitrarily high-order accuracy except where a local test detects the presence of spurious oscillations and calls upon the full ENO apparatus to remove them. A factor-of-three speedup is thus obtained over the full-ENO method for a wide range of problems, with high-order accuracy in regions of smooth flow.
Status of the LISA On Table experiment: an electro-optical simulator for LISA
NASA Astrophysics Data System (ADS)
Laporte, M.; Halloin, H.; Bréelle, E.; Buy, C.; Grüning, P.; Prat, P.
2017-05-01
The LISA project is a space mission that aims to detect gravitational waves in space. An electro-optical simulator called LISA On Table (LOT) is being developed at APC in order to test noise-reduction techniques (such as Time-Delay Interferometry) and instruments that will be used. This document presents its latest results: first-generation Time-Delay Interferometry works in the case of simulated white noise with static, unequal arms. Future and ongoing developments of the experiment are also addressed.
NASA Astrophysics Data System (ADS)
Lawrence, Chris C.; Polack, J. K.; Febbraro, Michael; Kolata, J. J.; Flaska, Marek; Pozzi, S. A.; Becchetti, F. D.
2017-02-01
The literature discussing pulse-shape discrimination (PSD) in organic scintillators dates back several decades. However, little has been written about PSD techniques that are optimized for neutron spectrum unfolding. Variation in n-γ misclassification rates and in the γ/n ratio of incident fields can distort the neutron pulse-height response of scintillators, and these distortions can in turn cause large errors in unfolded spectra. New applications in arms-control verification call for detection of lower-energy neutrons, for which PSD is particularly problematic. In this article, we propose techniques for removing distortions in the pulse-height response that result from the merging of PSD distributions in the low-pulse-height region. These techniques take advantage of the repeatable shapes of PSD distributions, which are governed by the counting statistics of scintillation-photon populations. We validate the proposed techniques using accelerator-based time-of-flight measurements and then demonstrate them by unfolding the Watt spectrum from measurements with a 252Cf neutron source.
Huang, Lei; Kang, Wenjun; Bartom, Elizabeth; Onel, Kenan; Volchenboum, Samuel; Andrade, Jorge
2015-01-01
Whole exome sequencing has facilitated the discovery of causal genetic variants associated with human diseases at deep coverage and low cost. In particular, the detection of somatic mutations from tumor/normal pairs has provided insights into the cancer genome. Although there is an abundance of publicly available software for the detection of germline and somatic variants, concordance is generally limited among variant callers and alignment algorithms. Successful integration of variants detected by multiple methods requires in-depth knowledge of the software, access to high-performance computing resources, and advanced programming techniques. We present ExScalibur, a set of fully automated, highly scalable and modular pipelines for whole exome data analysis. The suite integrates multiple alignment and variant calling algorithms for the accurate detection of germline and somatic mutations with close to 99% sensitivity and specificity. ExScalibur implements streamlined execution of analytical modules, real-time monitoring of pipeline progress, robust handling of errors and intuitive documentation that allows for increased reproducibility and sharing of results and workflows. It runs on local computers, high-performance computing clusters and cloud environments. In addition, we provide a data analysis report utility to facilitate visualization of the results that offers interactive exploration of quality control files, read alignment and variant calls, assisting downstream customization of potential disease-causing mutations. ExScalibur is open-source and is also available as a public image on Amazon cloud. PMID:26271043
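The integration step that the abstract alludes to (combining variants detected by multiple callers to offset their limited concordance) can be sketched as a simple support-count consensus over normalized variant keys. This is a minimal illustration of the idea, not the ExScalibur pipeline itself; the callsets in the example are placeholders.

```python
from collections import Counter

def consensus_calls(callsets, min_support=2):
    # callsets: one set of (chrom, pos, ref, alt) tuples per variant caller;
    # keep variants reported by at least `min_support` of the callers
    counts = Counter(v for cs in callsets for v in set(cs))
    return {v for v, n in counts.items() if n >= min_support}
```

Requiring support from two or more callers trades a little sensitivity for a large gain in specificity, which is the usual motivation for multi-caller ensembles.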
Development of dual PZT transducers for reference-free crack detection in thin plate structures.
Sohn, Hoon; Kim, Seuno Bum
2010-01-01
A new Lamb-wave-based nondestructive testing (NDT) technique, which does not rely on previously stored baseline data, is developed for crack monitoring in plate structures. Commonly, the presence of damage is identified by comparing "current data" measured from a potentially damaged state of a structure with "baseline data" previously obtained at the intact condition of the structure. In practice, structural defects typically take place long after collection of the baseline data, and the baseline data can also be affected by external loading, temperature variations, and changing boundary conditions. To eliminate the dependence on baseline data comparison, the authors previously developed a reference-free NDT technique using two pairs of collocated lead zirconate titanate (PZT) transducers placed on both sides of a plate. This reference-free technique is further advanced in the present study so that transducers need to be attached to only a single surface of a structure, as required for certain applications such as aircraft. To achieve this goal, a new design of PZT transducer, called the dual PZT transducer, is proposed. Crack formation creates Lamb wave mode conversion due to a sudden thickness change of the structure. This crack appearance is instantly detected from the measured Lamb wave signals using the dual PZT transducers. This study also suggests a reference-free statistical approach that enables damage classification using only the currently measured data set. Numerical simulations and experiments were conducted using an aluminum plate with uniform thickness and fundamental Lamb wave modes to demonstrate the applicability of the proposed technique to reference-free crack detection.
Detecting dark matter in the Milky Way with cosmic and gamma radiation
NASA Astrophysics Data System (ADS)
Carlson, Eric C.
Over the last decade, experiments in high-energy astroparticle physics have reached unprecedented precision and sensitivity spanning the electromagnetic and cosmic-ray spectra. These advances have opened a new window onto the universe about which little was previously known. Such dramatic increases in sensitivity lead naturally to claims of excess emission, which call for either revised astrophysical models or the existence of exotic new sources such as particle dark matter. Here we stand firmly with Occam, sharpening his razor by (i) developing new techniques for discriminating astrophysical signatures from those of dark matter, and (ii) developing detailed foreground models which can explain excess signals and shed light on the underlying astrophysical processes at hand. We concentrate most directly on observations of Galactic gamma and cosmic rays, factoring the discussion into three related parts which each contain significant advancements from our cumulative works. In Part I we introduce concepts which are fundamental to the indirect detection of particle dark matter, including motivations, targets, experiments, production of Standard Model particles, and a variety of statistical techniques. In Part II we introduce basic and advanced modelling techniques for the propagation of cosmic rays through the Galaxy and describe astrophysical gamma-ray production, as well as presenting state-of-the-art propagation models of the Milky Way. Finally, in Part III, we employ these models and techniques in order to study several indirect detection signals, including the Fermi GeV excess at the Galactic center, the Fermi 135 GeV line, the 3.5 keV line, and the WMAP-Planck haze.
Intentional Voice Command Detection for Trigger-Free Speech Interface
NASA Astrophysics Data System (ADS)
Obuchi, Yasunari; Sumiyoshi, Takashi
In this paper we introduce a new framework of audio processing, which is essential to achieve a trigger-free speech interface for home appliances. If the speech interface works continually in real environments, it must extract occasional voice commands and reject everything else. It is extremely important to reduce the number of false alarms because the number of irrelevant inputs is much larger than the number of voice commands even for heavy users of appliances. The framework, called Intentional Voice Command Detection, is based on voice activity detection, but enhanced by various speech/audio processing techniques such as emotion recognition. The effectiveness of the proposed framework is evaluated using a newly-collected large-scale corpus. The advantages of combining various features were tested and confirmed, and the simple LDA-based classifier demonstrated acceptable performance. The effectiveness of various methods of user adaptation is also discussed.
EAS thermal neutron detection with the PRISMA-LHAASO-16 experiment
NASA Astrophysics Data System (ADS)
Li, B.-B.; Alekseenko, V. V.; Cui, S.-w.; Chen, T.-L.; Dangzengluobu; Feng, S.-H.; Gao, Q.; Liu, Y.; Huang, Q.-C.; He, Y.-Y.; Liu, M.-Y.; Ma, X.-H.; Pozdnyakov, E. I.; Shchegolev, O. B.; Shen, F.-Z.; Stenkin, Yu. V.; Stepanov, V. I.; Yanin, Ya. V.; Yao, J.-D.; Zhou, R.
2017-12-01
EAS (extensive air shower) thermal neutron measurements offer advantages for studying the energy and mass composition of primary cosmic rays, especially in the knee region. After the success of the PRISMA-YBJ experiment, we built a new EAS thermal neutron detection array at Tibet University, Lhasa, China (3700 m a.s.l.) in March 2017. This prototype array, called "PRISMA-LHAASO-16", consists of 16 EAS EN-detectors ("EN" is an abbreviation for electron and neutron) measuring the two main EAS components: the hadronic and the electromagnetic one. Different from PRISMA-YBJ, these detectors use a thin layer of a novel type of ZnS(Ag) scintillator alloyed with a natural boron compound for thermal neutron capture. PRISMA-LHAASO-16 will be moved to the LHAASO site in the near future. In this paper, we introduce the principle of the detection technique, the deployment of the array, and the test results of the array.
Infrared small target detection based on multiscale center-surround contrast measure
NASA Astrophysics Data System (ADS)
Fu, Hao; Long, Yunli; Zhu, Ran; An, Wei
2018-04-01
Infrared (IR) small target detection plays a critical role in Infrared Search And Track (IRST) systems. Although it has been studied for years, some difficulties remain in cluttered environments. According to the principle of human discrimination of small targets from a natural scene, namely that there is a signature of discontinuity between the object and its neighboring regions, we develop an efficient method for infrared small target detection called the multiscale center-surround contrast measure (MCSCM). First, to determine the maximum neighboring window size, an entropy-based window selection technique is used. Then, we construct a novel multiscale center-surround contrast measure to calculate the saliency map. Compared with the original image, the MCSCM map has less background clutter and residual noise. Subsequently, a simple threshold is used to segment the target. Experimental results show that our method achieves better performance.
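The center-surround idea can be sketched directly: for each pixel, compare the mean of a small center window with the mean of the ring around it, and keep the maximum contrast over several scales. This is a generic sketch of the principle, not the paper's exact MCSCM (the entropy-based window selection step is omitted, and the window sizes are illustrative).

```python
def _window(img, r0, c0, r1, c1):
    # sum and pixel count of img over the clipped window [r0,r1) x [c0,c1)
    vals = [img[r][c] for r in range(max(r0, 0), min(r1, len(img)))
                      for c in range(max(c0, 0), min(c1, len(img[0])))]
    return sum(vals), len(vals)

def saliency_map(img, scales=(1, 2)):
    # multiscale center-surround contrast: mean of a small center window
    # minus mean of the surrounding ring, maximized over scales and
    # clamped at zero (only bright-on-dark targets are of interest)
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            best = 0.0
            for s in scales:
                cs, cn = _window(img, r - s, c - s, r + s + 1, c + s + 1)
                ts, tn = _window(img, r - 3 * s, c - 3 * s,
                                 r + 3 * s + 1, c + 3 * s + 1)
                ring = (ts - cs) / max(tn - cn, 1)
                best = max(best, cs / cn - ring)
            out[r][c] = best
    return out
```

Thresholding the resulting saliency map then segments the target, as in the final step the abstract describes.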
Rosas-Cholula, Gerardo; Ramirez-Cortes, Juan Manuel; Alarcon-Aquino, Vicente; Gomez-Gil, Pilar; Rangel-Magdaleno, Jose de Jesus; Reyes-Garcia, Carlos
2013-08-14
This paper presents a project on the development of a cursor control emulating the typical operations of a computer mouse, using gyroscope and eye-blinking electromyographic signals obtained through a commercial 16-electrode wireless headset, recently released by Emotiv. The cursor position is controlled using information from a gyroscope included in the headset. The clicks are generated through the user's blinking with an adequate detection procedure based on the spectral-like technique called Empirical Mode Decomposition (EMD). EMD is proposed as a simple and quick, yet effective, computational tool aimed at artifact reduction from head movements as well as a method to detect blinking signals for mouse control. A Kalman filter is used as a state estimator for mouse position control and jitter removal. The detection rate obtained on average was 94.9%. The experimental setup and some obtained results are presented.
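The jitter-removal stage can be illustrated with a minimal scalar Kalman filter under a constant-position state model (one filter per cursor coordinate). This is a sketch of the standard filter, not the authors' implementation; the process- and measurement-noise variances q and r are illustrative values.

```python
def kalman_smooth(zs, q=1e-3, r=0.1):
    # scalar constant-position Kalman filter: state = one cursor coordinate,
    # q = process-noise variance, r = measurement-noise variance
    x, p = zs[0], 1.0
    out = []
    for z in zs:
        p += q                     # predict (position assumed nearly static)
        k = p / (p + r)            # Kalman gain
        x += k * (z - x)           # update with the new measurement
        p *= (1.0 - k)
        out.append(x)
    return out
```

Raising q makes the cursor track fast intentional motion more closely at the cost of letting more jitter through; lowering it smooths harder.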
Differentiating retroperitoneal liposarcoma tumors with optical coherence tomography
NASA Astrophysics Data System (ADS)
Lev, Dina; Baranov, Stepan A.; Carbajal, Esteban F.; Young, Eric D.; Pollock, Raphael E.; Larin, Kirill V.
2011-03-01
Liposarcoma (LS) is a rare and heterogeneous group of malignant mesenchymal neoplasms exhibiting characteristics of adipocytic differentiation. Currently, radical surgical resection represents the most effective and most widely used therapy for patients with abdominal/retroperitoneal (A/RP) LS, but the presence of contiguous essential organs, such as the kidney, pancreas, spleen, adrenal glands, esophagus or colon, as well as the frequent recurrence of LS in the A/RP, calls for the enhancement of surgical techniques to minimize resection and avoid LS recurrences. Difficulty in detecting the margins of neoplasms, due to their affinity to healthy fat tissue, accounts for the high recurrence of LS within the A/RP. At present, microscopic detection of margins is possible only by use of biopsy, and the minimization of surgical resection of healthy tissues is challenging. In this presentation we demonstrate initial OCT results for the imaging and distinction of LS and normal human fat tissues and clear detection of tumor boundaries.
A removal model for estimating detection probabilities from point-count surveys
Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.
2000-01-01
We adapted a removal model to estimate detection probability during point count surveys. The model assumes that one factor influencing detection during point counts is the singing frequency of birds. This may be true for surveys recording forest songbirds, for which most detections are by sound. The model requires counts to be divided into several time intervals. We used time intervals of 2, 5, and 10 min to develop a maximum-likelihood estimator for the detectability of birds during such surveys. We applied this technique to data from bird surveys conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. The overall detection probability for all birds was 75%. We found differences in detection probability among species. Species that sing frequently, such as Winter Wren and Acadian Flycatcher, had high detection probabilities (about 90%), while species that call infrequently, such as Pileated Woodpecker, had low detection probabilities (36%). We also found that detection probabilities varied with the time of day for some species (e.g. thrushes) and between observers for other species. This method of estimating detectability during point count surveys offers a promising new approach to using count data to address questions of bird abundance, density, and population trends.
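The removal estimator can be sketched as follows: model the time to a bird's first detection as exponential with rate lam, and maximize the conditional multinomial likelihood of the interval counts. This is an illustrative sketch (grid-search maximization, synthetic counts), not the authors' estimator or data; the interval bounds mirror the 2, 5, and 10 min cut points mentioned above.

```python
import math

def removal_estimate(counts, bounds):
    # counts[i] = birds first detected between bounds[i] and bounds[i+1] min;
    # time to first detection modeled as exponential with rate lam.
    # Maximize the multinomial likelihood, conditional on detection,
    # by a simple grid search over lam.
    total_time = bounds[-1]
    def loglik(lam):
        denom = 1.0 - math.exp(-lam * total_time)
        ll = 0.0
        for n, t0, t1 in zip(counts, bounds, bounds[1:]):
            p = (math.exp(-lam * t0) - math.exp(-lam * t1)) / denom
            ll += n * math.log(p)
        return ll
    grid = [0.01 + 0.0025 * i for i in range(2000)]      # rates per minute
    lam = max(grid, key=loglik)
    return lam, 1.0 - math.exp(-lam * total_time)        # overall detectability
```

The second return value is the probability that a singing bird is detected at least once during the full count, which is the quantity used to correct raw counts.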
Toward the detection of abnormal chest radiographs the way radiologists do it
NASA Astrophysics Data System (ADS)
Alzubaidi, Mohammad; Patel, Ameet; Panchanathan, Sethuraman; Black, John A., Jr.
2011-03-01
Computer Aided Detection (CADe) and Computer Aided Diagnosis (CADx) are relatively recent areas of research that attempt to employ feature extraction, pattern recognition, and machine learning algorithms to aid radiologists in detecting and diagnosing abnormalities in medical images. These computational methods are based on the assumption that there are distinct classes of abnormalities and that each class has some distinguishing features that set it apart from other classes. In practice, however, abnormalities in chest radiographs tend to be very heterogeneous. The literature suggests that thoracic (chest) radiologists develop their ability to detect abnormalities by developing a sense of what is normal, so that anything abnormal attracts their attention. This paper discusses an approach to CADe that is based on a technique called anomaly detection (which aims to detect outliers in data sets) for the purpose of detecting atypical regions in chest radiographs. In order to apply anomaly detection to chest radiographs, however, it is necessary to develop a basis for extracting features from corresponding anatomical locations in different chest radiographs. This paper proposes a method for doing this and describes how it can be used to support CADe.
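One common way to realize this "learn what is normal, flag everything else" idea is to fit a simple statistical model to feature vectors from normal regions and score new regions by their deviation. The sketch below uses a per-feature z-score (a diagonal-covariance simplification of the Mahalanobis distance); the feature values are hypothetical, and this is not the paper's method, only an illustration of anomaly detection.

```python
def fit_normal_model(X):
    # per-feature mean and standard deviation of "normal" training vectors
    n, d = len(X), len(X[0])
    mean = [sum(x[j] for x in X) / n for j in range(d)]
    std = [(sum((x[j] - mean[j]) ** 2 for x in X) / (n - 1)) ** 0.5
           for j in range(d)]
    return mean, std

def anomaly_score(x, mean, std):
    # largest absolute z-score over features: a high score marks a region
    # as atypical relative to the normal training population
    return max(abs(v - m) / s for v, m, s in zip(x, mean, std))
```

Thresholding the score then separates typical from atypical regions without ever defining abnormality classes, which is exactly why the approach suits heterogeneous findings.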
Bellin, Daniel L; Sakhtah, Hassan; Rosenstein, Jacob K; Levine, Peter M; Thimot, Jordan; Emmett, Kevin; Dietrich, Lars E P; Shepard, Kenneth L
2014-01-01
Despite advances in monitoring spatiotemporal expression patterns of genes and proteins with fluorescent probes, direct detection of metabolites and small molecules remains challenging. A technique for spatially resolved detection of small molecules would benefit the study of redox-active metabolites that are produced by microbial biofilms and can affect their development. Here we present an integrated circuit-based electrochemical sensing platform featuring an array of working electrodes and parallel potentiostat channels. 'Images' over a 3.25 × 0.9 mm² area can be captured with a diffusion-limited spatial resolution of 750 μm. We demonstrate that square wave voltammetry can be used to detect, identify and quantify (for concentrations as low as 2.6 μM) four distinct redox-active metabolites called phenazines. We characterize phenazine production in both wild-type and mutant Pseudomonas aeruginosa PA14 colony biofilms, and find correlations with fluorescent reporter imaging of phenazine biosynthetic gene expression.
Spatial detection of TV channel logos as outliers from the content
NASA Astrophysics Data System (ADS)
Ekin, Ahmet; Braspenning, Ralph
2006-01-01
This paper proposes a purely image-based TV channel logo detection algorithm that can detect logos independently of their motion and transparency features. The proposed algorithm can robustly detect any type of logo, such as transparent and animated logos, without requiring any temporal constraints, whereas known methods have to wait for the occurrence of large motion in the scene and assume stationary logos. The algorithm models logo pixels as outliers from the actual scene content, which is represented by multiple 3-D histograms in the YCbCr space. We use four scene histograms corresponding to each of the four corners because the content characteristics change from one image corner to another. A further novelty of the proposed algorithm is that we define the image corners and the areas where we compute the scene histograms by a cinematic technique called the Golden Section Rule that is used by professionals. The robustness of the proposed algorithm is demonstrated over a dataset of representative TV content.
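The outlier-modeling step can be sketched with a coarse 3-D color histogram: pixels of a corner region that fall into rare histogram bins do not belong to the dominant scene content and become logo candidates. This is a simplified illustration of the principle; the bin count, threshold, and flat pixel list are assumptions, and the paper's Golden-Section corner definition is omitted.

```python
from collections import Counter

def logo_candidates(pixels, bins=8, rel_thresh=0.05):
    # pixels: list of (Y, Cb, Cr) triples (0..255) from one image corner.
    # Build a coarse 3-D color histogram of the corner; pixels falling in
    # rare bins are outliers from the scene content -> logo candidates.
    def q(p):
        return tuple(v * bins // 256 for v in p)
    hist = Counter(q(p) for p in pixels)
    cutoff = rel_thresh * max(hist.values())
    return [i for i, p in enumerate(pixels) if hist[q(p)] <= cutoff]
```

Because the test is purely statistical on a single frame, it works equally for static, transparent, and animated logos, which is the property the abstract emphasizes.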
Alexakis, Dimitrios; Sarris, Apostolos; Astaras, Theodoros; Albanakis, Konstantinos
2009-01-01
Thessaly is a low relief region in Greece where hundreds of Neolithic settlements/tells called magoules were established from the Early Neolithic period until the Bronze Age (6,000 – 3,000 BC). Multi-sensor remote sensing was applied to the study area in order to evaluate its potential to detect Neolithic settlements. Hundreds of sites were geo-referenced through systematic GPS surveying throughout the region. Data from four primary sensors were used, namely Landsat ETM, ASTER, EO1 - HYPERION and IKONOS. A range of image processing techniques were originally applied to the hyperspectral imagery in order to detect the settlements and validate the results of GPS surveying. Although specific difficulties were encountered in the automatic classification of archaeological features composed by a similar parent material with the surrounding landscape, the results of the research suggested a different response of each sensor to the detection of the Neolithic settlements, according to their spectral and spatial resolution. PMID:22399961
Morphological filtering and multiresolution fusion for mammographic microcalcification detection
NASA Astrophysics Data System (ADS)
Chen, Lulin; Chen, Chang W.; Parker, Kevin J.
1997-04-01
Mammographic images are often of relatively low contrast and poor sharpness, with a non-stationary background or clutter, and are usually corrupted by noise. In this paper, we propose a new method for microcalcification detection using gray-scale morphological filtering followed by multiresolution fusion, and present a unified general filtering form, called the local operating transformation, for whitening filtering and adaptive thresholding. The gray-scale morphological filters are used to remove all large areas that are considered non-stationary background or clutter variations, i.e., to prewhiten the images. The multiresolution fusion decision is based on matched filter theory. In addition to the normal matched filter, the Laplacian matched filter, which is directly related through the wavelet transform to multiresolution analysis, is exploited for microcalcification feature detection. At the multiresolution fusion stage, region growing techniques are used at each resolution level. The parent-child relations between resolution levels are adopted to make the final detection decision. An FROC curve is computed from tests on the Nijmegen database.
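The prewhitening step can be sketched in one dimension: a grayscale morphological opening (erosion followed by dilation) estimates the slowly varying background, and subtracting it leaves only narrow bright spots. This is a simplified 1-D sketch of the general idea, not the paper's 2-D implementation; the window size and test signal are illustrative.

```python
def erode(sig, w):
    """Grayscale erosion: minimum over a sliding window of width 2*w+1."""
    n = len(sig)
    return [min(sig[max(0, i - w):min(n, i + w + 1)]) for i in range(n)]

def dilate(sig, w):
    """Grayscale dilation: maximum over a sliding window of width 2*w+1."""
    n = len(sig)
    return [max(sig[max(0, i - w):min(n, i + w + 1)]) for i in range(n)]

def prewhiten(sig, w):
    """Morphological opening (erosion then dilation) estimates the slowly
    varying background; subtracting it leaves narrow bright structures."""
    background = dilate(erode(sig, w), w)
    return [s - b for s, b in zip(sig, background)]

# Smooth background ramp with one narrow spike (a microcalcification stand-in).
signal = [i * 0.1 for i in range(20)]
signal[10] += 5.0
residual = prewhiten(signal, 3)
print(max(range(20), key=lambda i: residual[i]))   # -> 10, the spike location
```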
Bellin, Daniel L.; Sakhtah, Hassan; Rosenstein, Jacob K.; Levine, Peter M.; Thimot, Jordan; Emmett, Kevin; Dietrich, Lars E. P.; Shepard, Kenneth L.
2014-01-01
Despite advances in monitoring spatiotemporal expression patterns of genes and proteins with fluorescent probes, direct detection of metabolites and small molecules remains challenging. A technique for spatially resolved detection of small molecules would benefit the study of redox-active metabolites produced by microbial biofilms, which can drastically affect colony development. Here we present an integrated circuit-based electrochemical sensing platform featuring an array of working electrodes and parallel potentiostat channels. “Images” over a 3.25 × 0.9 mm area can be captured with a diffusion-limited spatial resolution of 750 μm. We demonstrate that square wave voltammetry can be used to detect, identify, and quantify (for concentrations as low as 2.6 μM) four distinct redox-active metabolites called phenazines. We characterize phenazine production in both wild-type and mutant Pseudomonas aeruginosa PA14 colony biofilms, and find correlations with fluorescent reporter imaging of phenazine biosynthetic gene expression. PMID:24510163
Flood mapping from Sentinel-1 and Landsat-8 data: a case study from river Evros, Greece
NASA Astrophysics Data System (ADS)
Kyriou, Aggeliki; Nikolakopoulos, Konstantinos
2015-10-01
Floods are sudden and temporary natural events, affecting areas that are not normally covered by water. Floods have a significant influence on both society and the natural environment, so flood mapping is crucial. Remote sensing data can be used to develop flood maps in an efficient and effective way. This work focuses on the expansion of water bodies overtopping the natural levees of the river Evros and flooding the surrounding areas. Different flood mapping techniques were applied using data from active and passive remote sensing sensors, namely Sentinel-1 and Landsat-8 respectively. Spaceborne pairs obtained from Sentinel-1 were processed in this study. Each pair included an image acquired during the flood, called the "crisis image", and another acquired before the event, called the "archived image". Both images covering the same area were processed to produce a map showing the spread of the flood. Multispectral data from Landsat-8 were also processed in order to detect and map the flooded areas. Different image processing techniques were applied and the results were compared to the respective results of the radar data processing.
Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation
NASA Technical Reports Server (NTRS)
Koshti, Ajay
2014-01-01
The paper deals with infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of the depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper also presents an anomaly edge detection technique, called the half-max technique, which is used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. Finally, the paper provides some information on the "IR Contrast" software application, the half-max technique, and the IR Contrast feature imaging application, which are based on the models provided in this paper.
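The half-max width estimate can be illustrated on a synthetic contrast profile: find where the curve crosses half of its peak value and take the span between the outermost crossings. A minimal sketch (the triangular profile is an illustrative stand-in for a measured contrast curve):

```python
def crossings(xs, ys, level):
    """x positions where the piecewise-linear curve crosses `level`."""
    out = []
    for i in range(1, len(xs)):
        y0, y1 = ys[i - 1], ys[i]
        if (y0 - level) * (y1 - level) < 0:          # sign change => crossing
            t = (level - y0) / (y1 - y0)             # linear interpolation
            out.append(xs[i - 1] + t * (xs[i] - xs[i - 1]))
    return out

def half_max_width(xs, ys):
    """Indication width: span between the outermost half-max crossings."""
    c = crossings(xs, ys, max(ys) / 2.0)
    return c[-1] - c[0]

# Triangular contrast profile peaking at x = 5 with peak value 2.0.
xs = list(range(11))
ys = [2.0 - abs(x - 5) * 0.4 for x in xs]
print(round(half_max_width(xs, ys), 3))   # -> 5.0
```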
NASA Astrophysics Data System (ADS)
Schooneveld, E. M.; Pietropaolo, A.; Andreani, C.; Perelli Cippo, E.; Rhodes, N. J.; Senesi, R.; Tardocchi, M.; Gorini, G.
2016-09-01
Neutron scattering techniques are attracting increasing interest from scientists in various research fields, ranging from physics and chemistry to biology and archaeometry. The success of these neutron scattering applications is stimulated by the development of higher performance instrumentation. The development of new techniques and concepts, including radiative capture based neutron detection, is therefore a key issue to be addressed. Radiative capture based neutron detectors utilize the emission of prompt gamma rays after neutron absorption in a suitable isotope and the detection of those gammas by a photon counter. They can be used as simple counters in the thermal region and (simultaneously) as energy selectors and counters for neutrons in the eV energy region. Several years of extensive development have made eV neutron spectrometers operating in the so-called resonance detector spectrometer (RDS) configuration outperform their conventional counterparts. In fact, the VESUVIO spectrometer, a flagship instrument at ISIS serving a continuous user programme for eV inelastic neutron spectroscopy measurements, has been operating in the RDS configuration since 2007. In this review, we discuss the physical mechanism underlying the RDS configuration and the development of associated instrumentation. A few successful neutron scattering experiments that utilize the radiative capture counting techniques are presented, together with the potential of this technique for thermal neutron diffraction measurements. We also outline possible improvements and future perspectives for radiative capture based neutron detectors in neutron scattering applications at pulsed neutron sources.
New target and detection methods: active detectors
NASA Astrophysics Data System (ADS)
Mittig, W.; Savajols, H.; Demonchy, C. E.; Giot, L.; Roussel-Chomaz, P.; Wang, H.; Ter-Akopian, G.; Fomichev, A.; Golovkov, M. S.; Stepansov, S.; Wolski, R.; Alamanos, N.; Drouart, A.; Gillibert, A.; Lapoux, V.; Pollacco, E.
2003-07-01
The study of nuclei far from stability interacting with simple target nuclei, such as protons, deuterons, 3He and 4He, implies the use of inverse kinematics. The very special kinematics, together with the low intensities of the beams, calls for special techniques. In July 2002 we tested a new detector in which the detector gas is the target. This allows, in principle, a 4π detection solid angle and a large effective target thickness without loss of resolution. The detector developed, called Maya, used isobutane C4H10 as gas in the present tests, and other gases are possible. The multiplexed electronics of more than 1000 channels allows the reconstruction in 3D of the events occurring between the incoming particle and the detector gas atoms. Here we were interested in the elastic scattering of 8He on protons for the study of the isobaric analogue states (IAS) of 9He. The beam, in this case, is stopped in the detector. The resonance energy is determined by the location of the interaction and the energy of the recoiling proton. The design of the detector is shown, and some preliminary results are discussed.
Riera, Amalis; Ford, John K; Ross Chapman, N
2013-09-01
Killer whales in British Columbia are at risk, and little is known about their winter distribution. Passive acoustic monitoring of their year-round habitat is a valuable supplemental method to traditional visual and photographic surveys. However, long-term acoustic studies of odontocetes have some limitations, including the generation of large amounts of data that require highly time-consuming processing. There is a need to develop tools and protocols to maximize the efficiency of such studies. Here, two types of analysis, real-time and long-term spectral averages, were compared to assess their performance at detecting killer whale calls in long-term acoustic recordings. In addition, two different duty cycles, 1/3 and 2/3, were tested. Both the use of long-term spectral averages and a lower duty cycle resulted in a decrease in call detection and positive pod identification, leading to underestimates of the amount of time the whales were present. The impact of these limitations should be considered in future killer whale acoustic surveys. A compromise between a lower resolution data processing method and a higher duty cycle is suggested for maximum methodological efficiency.
Gentilini, Fabio; Turba, Maria E
2014-01-01
A novel technique, called Divergent, for single-tube real-time PCR genotyping of point mutations without the use of fluorescently labeled probes has recently been reported. This novel PCR technique utilizes a set of four primers and a particular denaturation temperature for simultaneously amplifying two different amplicons which extend in opposite directions from the point mutation. The two amplicons can readily be detected using melt curve analysis downstream of a closed-tube real-time PCR. In the present study, some critical aspects of the original method were specifically addressed to further implement the technique for genotyping the DNM1 c.G767T mutation responsible for exercise-induced collapse in Labrador retriever dogs. The improved Divergent assay was easily set up using a standard two-step real-time PCR protocol. The melting temperature difference between the mutated and the wild-type amplicons was approximately 5°C, which could be promptly detected by all the thermal cyclers. The upgraded assay yielded accurate results with 157 pg of genomic DNA per reaction. This optimized technique represents a flexible and inexpensive alternative to the minor groove binder fluorescently labeled method and to high resolution melt analysis for high-throughput, robust and cheap genotyping of single nucleotide variations. Copyright © 2014 Elsevier B.V. All rights reserved.
CD process control through machine learning
NASA Astrophysics Data System (ADS)
Utzny, Clemens
2016-10-01
For the specific requirements of the 14nm and 20nm site applications, a new CD map approach was developed at the AMTC. This approach relies on a well-established machine learning technique called recursive partitioning. Recursive partitioning is a powerful technique which creates a decision tree by successively testing whether the quantity of interest can be explained by one of the supplied covariates. The test performed is generally a statistical test with a pre-supplied significance level. Once the test indicates a significant association between the variable of interest and a covariate, a split is performed at a threshold value which minimizes the variation within the newly obtained groups. The partitioning recurses until either no significant association can be detected or the resulting subgroup size falls below a pre-supplied level.
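A minimal sketch of recursive partitioning in this spirit, with a permutation test standing in for the significance test (the test choice, significance level, minimum group size, and synthetic data are illustrative assumptions, not the AMTC implementation):

```python
import random
import statistics

def pcor(xs, ys):
    """Pearson correlation of two equal-length lists."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    if sx == 0 or sy == 0:
        return 0.0
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

def best_split(xs, ys):
    """Threshold on x that minimises the summed within-group variance of y."""
    best = None
    for cut in sorted(set(xs))[1:]:
        left = [y for x, y in zip(xs, ys) if x < cut]
        right = [y for x, y in zip(xs, ys) if x >= cut]
        if len(left) < 2 or len(right) < 2:
            continue
        cost = statistics.pvariance(left) * len(left) \
             + statistics.pvariance(right) * len(right)
        if best is None or cost < best[0]:
            best = (cost, cut)
    return best

def partition(xs, ys, alpha=0.05, n_perm=200, min_size=4):
    """Split only while a permutation test finds a significant association
    between covariate and response; otherwise emit a leaf (group mean)."""
    if len(ys) < min_size:
        return {"leaf": statistics.fmean(ys)}
    obs = abs(pcor(xs, ys))
    perm, exceed = ys[:], 0
    for _ in range(n_perm):
        random.shuffle(perm)
        if abs(pcor(xs, perm)) >= obs:
            exceed += 1
    if exceed / n_perm > alpha:                 # not significant: stop splitting
        return {"leaf": statistics.fmean(ys)}
    split = best_split(xs, ys)
    if split is None:
        return {"leaf": statistics.fmean(ys)}
    cut = split[1]
    lx, ly = zip(*[(x, y) for x, y in zip(xs, ys) if x < cut])
    rx, ry = zip(*[(x, y) for x, y in zip(xs, ys) if x >= cut])
    return {"cut": cut,
            "lo": partition(list(lx), list(ly), alpha, n_perm, min_size),
            "hi": partition(list(rx), list(ry), alpha, n_perm, min_size)}

random.seed(0)
# Toy CD map: the response jumps at covariate value 5 -- one split suffices.
xs = [i for i in range(10) for _ in range(3)]
ys = [0.0 if x < 5 else 1.0 for x in xs]
tree = partition(xs, ys)
print(tree["cut"])   # -> 5
```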
Logistic regression applied to natural hazards: rare event logistic regression with replications
NASA Astrophysics Data System (ADS)
Guns, M.; Vanacker, V.
2012-06-01
Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulation into rare event logistic regression. This technique, called rare event logistic regression with replications, combines the strengths of probabilistic and statistical methods, and allows overcoming some of the limitations of previous developments through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.
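The replication idea can be sketched as follows: keep every (rare) event, resample an equal number of non-events in each replication, refit the logistic model, and record how often each covariate is selected. The gradient-descent fit, thresholded-coefficient selection rule, and synthetic landslide data below are illustrative assumptions, not the authors' exact scheme:

```python
import math
import random

def fit_logistic(X, y, lr=0.1, epochs=150):
    """Plain gradient-descent logistic regression; returns [bias, w1, w2, ...]."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-max(-30.0, min(30.0, z))))
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / len(y) for wj, g in zip(w, grad)]
    return w

def replicated_selection(X, y, n_rep=30, thresh=0.5):
    """Keep every event, resample an equal number of non-events per replication,
    and count how often each covariate's coefficient exceeds a magnitude threshold."""
    events = [i for i, yi in enumerate(y) if yi == 1]
    nonevents = [i for i, yi in enumerate(y) if yi == 0]
    counts = [0] * len(X[0])
    for _ in range(n_rep):
        idx = events + random.sample(nonevents, len(events))
        w = fit_logistic([X[i] for i in idx], [y[i] for i in idx])
        for j, wj in enumerate(w[1:]):
            if abs(wj) > thresh:
                counts[j] += 1
    return [c / n_rep for c in counts]

random.seed(1)
# Synthetic inventory: covariate 0 (e.g. slope) drives landslides, covariate 1 is noise.
X, y = [], []
for _ in range(400):
    x0, x1 = random.gauss(0, 1), random.gauss(0, 1)
    p = 1.0 / (1.0 + math.exp(-(3 * x0 - 3)))   # rare events, driven by x0 only
    X.append([x0, x1])
    y.append(1 if random.random() < p else 0)
freq = replicated_selection(X, y)
print(freq)   # covariate 0 selected far more often than covariate 1
```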
Havens: Explicit Reliable Memory Regions for HPC Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hukerikar, Saurabh; Engelmann, Christian
2016-01-01
Supporting error resilience in future exascale-class supercomputing systems is a critical challenge. Due to transistor scaling trends and increasing memory density, scientific simulations are expected to experience more interruptions caused by transient errors in the system memory. Existing hardware-based detection and recovery techniques will be inadequate to manage the presence of high memory fault rates. In this paper we propose a partial memory protection scheme based on region-based memory management. We define the concept of regions called havens that provide fault protection for program objects. We provide reliability for the regions through a software-based parity protection mechanism. Our approach enables critical program objects to be placed in these havens. The fault coverage provided by our approach is application agnostic, unlike algorithm-based fault tolerance techniques.
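The software parity idea can be sketched as a region guarded by a single XOR parity word: writes update the parity incrementally, and any one corrupted word can be rebuilt from the parity and the remaining words. A minimal sketch (the `Haven` class and its interface are hypothetical, not the paper's API):

```python
class Haven:
    """Sketch of a parity-protected memory region: one parity word guards the
    whole region, so any single corrupted word can be rebuilt from the others."""

    def __init__(self, words):
        self.words = list(words)
        self.parity = 0
        for w in self.words:
            self.parity ^= w

    def write(self, i, value):
        # Incremental parity update: XOR out the old word, XOR in the new one.
        self.parity ^= self.words[i] ^ value
        self.words[i] = value

    def recover(self, i):
        """Rebuild word i by XOR-ing the parity with every other word."""
        v = self.parity
        for j, w in enumerate(self.words):
            if j != i:
                v ^= w
        return v

h = Haven([3, 7, 1, 9])
h.write(2, 5)
good = h.words[1]
h.words[1] = 999                 # simulate a transient memory error
print(h.recover(1) == good)      # -> True
```

A real scheme would interleave multiple parity words to tolerate multi-word upsets; one parity word per region keeps the sketch minimal.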
Microfluidic devices for sample preparation and rapid detection of foodborne pathogens.
Kant, Krishna; Shahbazi, Mohammad-Ali; Dave, Vivek Priy; Ngo, Tien Anh; Chidambara, Vinayaka Aaydha; Than, Linh Quyen; Bang, Dang Duong; Wolff, Anders
2018-03-10
Rapid detection of foodborne pathogens at an early stage is imperative for preventing the outbreak of foodborne diseases, known as serious threats to human health. Conventional bacterial culturing methods for foodborne pathogen detection are time consuming, laborious, and offer poor diagnostic competence. This has prompted researchers to call the current status of detection approaches into question and leverage new technologies for superior pathogen sensing outcomes. Novel strategies mainly rely on incorporating all the steps from sample preparation to detection in miniaturized devices for online monitoring of pathogens with high accuracy and sensitivity in a time-saving and cost-effective manner. Lab-on-a-chip is a blooming area in diagnosis, which exploits different mechanical and biological techniques to detect very low concentrations of pathogens in food samples. This is achieved through streamlining the sample handling and concentrating procedures, which will subsequently reduce human errors and enhance the accuracy of the sensing methods. Integration of sample preparation techniques into these devices can effectively minimize the impact of complex food matrices on pathogen diagnosis and improve the limits of detection. Integration of pathogen-capturing bio-receptors on microfluidic devices is a crucial step, which can facilitate recognition abilities in harsh chemical and physical conditions, offering a great commercial benefit to the food-manufacturing sector. This article reviews recent advances in the current state of the art of sample preparation and concentration from food matrices, with focus on bacterial capturing methods and sensing technologies, along with their advantages and limitations when integrated into microfluidic devices for online rapid detection of pathogens in foods and food production lines. Copyright © 2018. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Kabiri Rahani, Ehsan
Condition-based monitoring of Thermal Protection Systems (TPS) is necessary for safe operation of space shuttles when a quick turn-around time is desired. In the current research, terahertz radiation (T-rays) has been used to detect mechanical and heat-induced damage in TPS tiles. Voids and cracks inside the foam tile are denoted as mechanical damage, while property changes due to long- and short-term exposure of tiles to high heat are denoted as heat-induced damage. Ultrasonic waves cannot detect cracks and voids inside the tile because the tile material (silica foam) strongly attenuates ultrasonic energy. Electromagnetic terahertz radiation, by contrast, can easily penetrate the foam material and detect internal voids, although it has difficulty detecting delaminations between the foam tile and the substrate plate. Thus these two technologies are complementary for TPS inspection. Ultrasonic and T-ray field modeling in free and mounted tiles with different types of mechanical and thermal damage has been the focus of this research. The shortcomings and limitations of the finite element method in modeling 3D problems, especially at high frequencies, are discussed, and a newly developed semi-analytical technique called the Distributed Point Source Method (DPSM) has been used for this purpose. A FORTRAN code called DPSM3D has been developed to model both ultrasonic and electromagnetic problems using the conventional DPSM method. This code is designed in a general form capable of modeling a variety of geometries. DPSM has been extended from ultrasonic applications to electromagnetics to model THz Gaussian beams, multilayered dielectrics, and Gaussian beam-scatterer interaction problems. Since the conventional DPSM has some drawbacks, two modified methods, called G-DPSM and ESM, have been proposed to overcome them. The conventional DPSM in the past was only capable of solving time-harmonic (frequency domain) problems.
The time history was previously obtained by the FFT (Fast Fourier Transform) algorithm. In this research, DPSM has been extended to model transient problems without using the FFT. This modified technique is denoted t-DPSM. Using DPSM, the scattering of focused ultrasonic fields by single and multiple cavities in fluid and solid media is studied. It is investigated when two cavities in close proximity can be distinguished and when they cannot. A comparison between the radiation forces generated by the ultrasonic energies reflected from two small cavities versus a single large cavity is also carried out.
Data Randomization and Cluster-Based Partitioning for Botnet Intrusion Detection.
Al-Jarrah, Omar Y; Alhussein, Omar; Yoo, Paul D; Muhaidat, Sami; Taha, Kamal; Kim, Kwangjo
2016-08-01
Botnets, which consist of remotely controlled compromised machines called bots, provide a distributed platform for several threats against cyber world entities and enterprises. An intrusion detection system (IDS) provides an efficient countermeasure against botnets. It continually monitors and analyzes network traffic for potential vulnerabilities and the possible existence of active attacks. A payload-inspection-based IDS (PI-IDS) identifies active intrusion attempts by inspecting transmission control protocol and user datagram protocol packet payloads and comparing them with signatures of previously seen attacks. However, the PI-IDS's ability to detect intrusions might be incapacitated by packet encryption. A traffic-based IDS (T-IDS) alleviates the shortcomings of the PI-IDS: it does not inspect the packet payload but instead analyzes the packet header to identify intrusions. As network traffic grows rapidly, not only the detection rate but also the efficiency and scalability of the IDS become significant. In this paper, we propose a state-of-the-art T-IDS built on a novel randomized data partitioned learning model (RDPLM), relying on a compact network feature set and feature selection techniques, simplified subspacing and a multiple randomized meta-learning technique. The proposed model achieved 99.984% accuracy and a 21.38 s training time on a well-known benchmark botnet dataset. Experimental results demonstrate that the proposed methodology outperforms other well-known machine-learning models used in the same detection task, namely, sequential minimal optimization, deep neural network, C4.5, reduced error pruning tree, and RandomTree.
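A toy version of randomized data partitioning with simplified subspacing: each base learner (a one-feature threshold stump here, purely for illustration) is trained on a random half of the records and a random two-feature subspace, and predictions are combined by majority vote. The features and data are invented stand-ins for network-header traffic statistics, not the RDPLM itself:

```python
import random

def train_stump(rows, labels, feats):
    """Best one-feature threshold rule over the allowed feature subset."""
    best = (len(rows) + 1, 0, 0.0, 1)            # (errors, feature, cut, polarity)
    for f in feats:
        for cut in sorted(set(r[f] for r in rows)):
            for pol in (1, -1):
                errs = sum(1 for r, y in zip(rows, labels)
                           if ((r[f] >= cut) == (pol == 1)) != (y == 1))
                if errs < best[0]:
                    best = (errs, f, cut, pol)
    return best[1:]

def predict_stump(stump, row):
    f, cut, pol = stump
    return 1 if (row[f] >= cut) == (pol == 1) else 0

def train_ensemble(rows, labels, n_models=15, subspace=2):
    """Each base learner sees a random half of the records (data partitioning)
    and a random 2-feature subspace (simplified subspacing)."""
    models = []
    for _ in range(n_models):
        idx = random.sample(range(len(rows)), len(rows) // 2)
        feats = random.sample(range(len(rows[0])), subspace)
        models.append(train_stump([rows[i] for i in idx],
                                  [labels[i] for i in idx], feats))
    return models

def predict(models, row):
    """Majority vote over the randomized base learners."""
    return 1 if sum(predict_stump(m, row) for m in models) * 2 >= len(models) else 0

random.seed(7)
rows, labels = [], []
for _ in range(200):
    bot = random.random() < 0.5
    rows.append([random.gauss(80 if bot else 20, 10),   # packets/s
                 random.gauss(300, 50),                  # payload size (uninformative)
                 random.gauss(30 if bot else 5, 4)])     # distinct ports contacted
    labels.append(1 if bot else 0)
models = train_ensemble(rows, labels)
acc = sum(predict(models, r) == y for r, y in zip(rows, labels)) / len(rows)
print(round(acc, 2))
```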
An inexpensive active optical remote sensing instrument for assessing aerosol distributions.
Barnes, John E; Sharma, Nimmi C P
2012-02-01
Air quality studies on a broad variety of topics, from health impacts to source/sink analyses, require information on the distributions of atmospheric aerosols over both altitude and time. An inexpensive, simple-to-implement, ground-based optical remote sensing technique has been developed to assess aerosol distributions. The technique, called CLidar (Charge Coupled Device Camera Light Detection and Ranging), provides aerosol altitude profiles over time. In the CLidar technique a relatively low-power laser transmits light vertically into the atmosphere. The transmitted laser light scatters off air molecules, clouds, and aerosols. The entire beam from ground to zenith is imaged using a CCD camera and wide-angle (100 degree) optics located a few hundred meters from the laser. The CLidar technique is optimized for low-altitude (boundary layer and lower troposphere) measurements, where most aerosols are found and where many other profiling techniques face difficulties. Currently the technique is limited to nighttime measurements. Using the CLidar technique, aerosols may be mapped over both altitude and time. The instrumentation required is portable and can easily be moved to locations of interest (e.g. downwind from factories or power plants, near highways). This paper describes the CLidar technique, implementation and data analysis, and offers specifics for users wishing to apply the technique for aerosol profiles.
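The CLidar geometry reduces to simple triangulation: with the camera a known baseline from the vertical beam, each pixel's elevation angle maps to a scattering altitude. A sketch (the 300 m baseline is an illustrative value, not the paper's setup):

```python
import math

def clidar_altitude(baseline_m, elevation_deg):
    """Altitude of the laser-beam point seen at a given camera elevation angle.
    The camera sits `baseline_m` from the vertical beam, so a pixel's
    elevation angle maps to altitude z = baseline * tan(elevation)."""
    return baseline_m * math.tan(math.radians(elevation_deg))

baseline = 300.0
for angle in (10, 45, 80):
    print(angle, round(clidar_altitude(baseline, angle), 1))
```

Note how equal angular steps near the horizon map to small altitude steps: the angular sampling is densest at low altitudes, which is one reason the technique suits boundary-layer profiling.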
Mahrooghy, Majid; Yarahmadian, Shantia; Menon, Vineetha; Rezania, Vahid; Tuszynski, Jack A
2015-10-01
Microtubules (MTs) are intra-cellular cylindrical protein filaments. They exhibit a unique phenomenon of stochastic growth and shrinkage, called dynamic instability. In this paper, we introduce a theoretical framework for applying Compressive Sensing (CS) to the sampled data of the microtubule length in the process of dynamic instability. To reduce data density and reconstruct the original signal with relatively low sampling rates, we have applied CS to experimental MT filament length time series modeled as a Dichotomous Markov Noise (DMN). The results show that using CS along with the wavelet transform significantly reduces the recovery errors compared with reconstruction without the wavelet transform, especially at low and medium sampling rates. For sampling rates between 0.2 and 0.5, the Root-Mean-Squared Error (RMSE) decreases by approximately a factor of three, and between 0.5 and 1 the RMSE is small. We also apply a peak detection technique to the wavelet coefficients to detect and closely approximate the growth and shrinkage phases of MTs, in order to compute the essential dynamic instability parameters, i.e., transition frequencies and, especially, growth and shrinkage rates. The results show that using compressive sensing along with the peak detection technique and wavelet transform reduces the recovery errors for these parameters. Copyright © 2015 Elsevier Ltd. All rights reserved.
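The peak-detection step can be illustrated on an idealized sawtooth-like length trace: turning points separate growth and shrinkage phases, and per-phase slopes give the rates. This sketch works on a noise-free synthetic trace and omits the CS/wavelet stages entirely; the rates and phase durations are invented:

```python
def turning_points(lengths):
    """Indices where the series switches between growth and shrinkage."""
    pts = [0]
    for i in range(1, len(lengths) - 1):
        if (lengths[i] - lengths[i - 1]) * (lengths[i + 1] - lengths[i]) < 0:
            pts.append(i)
    pts.append(len(lengths) - 1)
    return pts

def phase_rates(lengths, dt=1.0):
    """Mean growth and shrinkage rates from segments between turning points."""
    grow, shrink = [], []
    pts = turning_points(lengths)
    for a, b in zip(pts, pts[1:]):
        rate = (lengths[b] - lengths[a]) / ((b - a) * dt)
        (grow if rate > 0 else shrink).append(rate)
    return sum(grow) / len(grow), sum(shrink) / len(shrink)

# Idealized MT trace: grow at +2 um/s for 5 steps, shrink at -5 um/s for 2 steps.
trace, L = [], 10.0
for cycle in range(3):
    for _ in range(5):
        L += 2.0
        trace.append(L)
    for _ in range(2):
        L -= 5.0
        trace.append(L)
g, s = phase_rates(trace)
print(g, s)   # -> 2.0 -5.0
```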
Detection of Lipid and Amphiphilic Biomarkers for Disease Diagnostics
Vu, Dung M.; Mendez, Heather M.; Jakhar, Shailja; Mukundan, Harshini
2017-01-01
Rapid diagnosis is crucial to effectively treating any disease. Biological markers, or biomarkers, have been widely used to diagnose a variety of infectious and non-infectious diseases. The detection of biomarkers in patient samples can also provide valuable information regarding progression and prognosis. Interestingly, many such biomarkers are composed of lipids, and are amphiphilic in biochemistry, which leads them to be often sequestered by host carriers. Such sequestration enhances the difficulty of developing sensitive and accurate sensors for these targets. Many of the physiologically relevant molecules involved in pathogenesis and disease are indeed amphiphilic. This chemical property is likely essential for their biological function, but also makes them challenging to detect and quantify in vitro. In order to understand pathogenesis and disease progression while developing effective diagnostics, it is important to account for the biochemistry of lipid and amphiphilic biomarkers when creating novel techniques for the quantitative measurement of these targets. Here, we review techniques and methods used to detect lipid and amphiphilic biomarkers associated with disease, as well as their feasibility for use as diagnostic targets, highlighting the significance of their biochemical properties in the design and execution of laboratory and diagnostic strategies. The biochemistry of biological molecules is clearly relevant to their physiological function, and calling out the need to consider this feature in their study, and in their use as vaccine, diagnostic, and therapeutic targets, is the overarching motivation for this review. PMID:28677660
Assessing bat detectability and occupancy with multiple automated echolocation detectors
Gorresen, P.M.; Miles, A.C.; Todd, C.M.; Bonaccorso, F.J.; Weller, T.J.
2008-01-01
Occupancy analysis and its ability to account for differential detection probabilities is important for studies in which detecting echolocation calls is used as a measure of bat occurrence and activity. We examined the feasibility of remotely acquiring bat encounter histories to estimate detection probability and occupancy. We used echolocation detectors coupled to digital recorders operating at a series of proximate sites on consecutive nights in 2 trial surveys for the Hawaiian hoary bat (Lasiurus cinereus semotus). Our results confirmed that the technique is readily amenable for use in occupancy analysis. We also conducted a simulation exercise to assess the effects of sampling effort on parameter estimation. The results indicated that the precision and bias of parameter estimation were often more influenced by the number of sites sampled than number of visits. Acceptable accuracy often was not attained until at least 15 sites or 15 visits were used to estimate detection probability and occupancy. The method has significant potential for use in monitoring trends in bat activity and in comparative studies of habitat use. © 2008 American Society of Mammalogists.
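The occupancy model behind such an analysis can be sketched directly: a site is occupied with probability psi, each visit to an occupied site detects the species with probability p, and the likelihood is maximized here by a crude grid search. The encounter histories below are invented for illustration:

```python
import math

def occupancy_loglik(psi, p, histories):
    """Log-likelihood of the single-season occupancy model: each site is
    occupied with probability psi, and an occupied site yields a detection
    on each visit independently with probability p."""
    ll = 0.0
    for h in histories:
        k, d = len(h), sum(h)
        if d > 0:          # detected at least once: the site must be occupied
            ll += math.log(psi) + d * math.log(p) + (k - d) * math.log(1 - p)
        else:              # never detected: occupied-but-missed, or truly empty
            ll += math.log(psi * (1 - p) ** k + (1 - psi))
    return ll

def fit_occupancy(histories):
    """Crude maximum-likelihood fit by grid search over (psi, p)."""
    grid = [i / 100.0 for i in range(1, 100)]
    return max(((psi, p) for psi in grid for p in grid),
               key=lambda c: occupancy_loglik(c[0], c[1], histories))

# 3-visit encounter histories from 8 sites (1 = calls detected on that visit).
histories = [[1, 0, 1], [0, 1, 1], [0, 0, 0], [1, 1, 1],
             [0, 0, 1], [0, 0, 0], [1, 0, 0], [1, 1, 0]]
psi_hat, p_hat = fit_occupancy(histories)
print(round(psi_hat, 2), round(p_hat, 2))
```

Note that the estimated occupancy exceeds the naive proportion of sites with detections (6/8), since some never-detected sites were plausibly occupied but missed.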
Wavelet based detection of manatee vocalizations
NASA Astrophysics Data System (ADS)
Gur, Berke M.; Niezrecki, Christopher
2005-04-01
The West Indian manatee (Trichechus manatus latirostris) has become endangered partly because of watercraft collisions in Florida's coastal waterways. Several boater warning systems, based upon manatee vocalizations, have been proposed to reduce the number of collisions. Three detection methods based on the Fourier transform (threshold, harmonic content and autocorrelation methods) were previously suggested and tested. In the last decade, the wavelet transform has emerged as an alternative to the Fourier transform and has been successfully applied in various fields of science and engineering including the acoustic detection of dolphin vocalizations. As of yet, no prior research has been conducted in analyzing manatee vocalizations using the wavelet transform. Within this study, the wavelet transform is used as an alternative to the Fourier transform in detecting manatee vocalizations. The wavelet coefficients are analyzed and tested against a specified criterion to determine the existence of a manatee call. The performance of the method presented is tested on the same data previously used in the prior studies, and the results are compared. Preliminary results indicate that using the wavelet transform as a signal processing technique to detect manatee vocalizations shows great promise.
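A minimal sketch of the wavelet idea: a one-level Haar transform splits the signal into approximation and detail coefficients, and a brief high-frequency transient (a call stand-in) shows up as a burst of detail energy. The frame size, threshold, and synthetic signal are illustrative assumptions, not the study's detection criterion:

```python
import math
import random

def haar_dwt(x):
    """One level of the Haar wavelet transform: per pair of samples, an
    approximation (scaled sum) and a detail (scaled difference) coefficient."""
    approx = [(x[i] + x[i + 1]) / math.sqrt(2) for i in range(0, len(x) - 1, 2)]
    detail = [(x[i] - x[i + 1]) / math.sqrt(2) for i in range(0, len(x) - 1, 2)]
    return approx, detail

def detect_call(x, frame=16, k=4.0):
    """Flag frames (of `frame` samples) whose Haar detail energy exceeds
    k times the median frame energy."""
    _, detail = haar_dwt(x)
    half = frame // 2                     # detail coefficients per frame
    energies = [sum(d * d for d in detail[i:i + half])
                for i in range(0, len(detail), half)]
    med = sorted(energies)[len(energies) // 2]
    return [i for i, e in enumerate(energies) if e > k * med]

random.seed(3)
# Slow background with a brief high-frequency burst (a vocalization stand-in).
sig = [math.sin(0.05 * t) + random.gauss(0, 0.05) for t in range(512)]
for t in range(256, 288):
    sig[t] += math.sin(2.5 * t)           # fast oscillation => large details
print(detect_call(sig))                   # frames around index 16-17 flagged
```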
Using Runtime Analysis to Guide Model Checking of Java Programs
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Norvig, Peter (Technical Monitor)
2001-01-01
This paper describes how two runtime analysis algorithms, an existing data race detection algorithm and a new deadlock detection algorithm, have been implemented to analyze Java programs. Runtime analysis is based on the idea of executing the program once and observing the generated run to extract various kinds of information. This information can then be used to predict whether other, different runs may violate some properties of interest, in addition, of course, to demonstrating whether the generated run itself violates such properties. These runtime analyses can be performed stand-alone to generate a set of warnings. It is furthermore demonstrated how these warnings can be used to guide a model checker, thereby reducing the search space. The described techniques have been implemented in the home-grown Java model checker called PathFinder.
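Data race warnings of this kind are commonly produced by lockset refinement in the style of the Eraser algorithm: for each shared variable, intersect the sets of locks held at every access, and warn when the intersection becomes empty. A minimal sketch over an abstract event trace (the trace format is invented for illustration):

```python
def lockset_analysis(trace):
    """Eraser-style lockset refinement over one observed run: for each shared
    variable, intersect the sets of locks held at every access; an empty
    candidate set warns that no single lock consistently protects it."""
    candidate = {}    # variable -> locks seen held at every access so far
    held = {}         # thread -> locks currently held
    warnings = set()
    for event in trace:
        kind, thread = event[0], event[1]
        locks = held.setdefault(thread, set())
        if kind == "acquire":
            locks.add(event[2])
        elif kind == "release":
            locks.discard(event[2])
        elif kind == "access":
            var = event[2]
            if var not in candidate:
                candidate[var] = set(locks)       # copy: `locks` mutates later
            else:
                candidate[var] &= locks
            if not candidate[var]:
                warnings.add(var)
    return warnings

# Thread 1 guards x with lock L; thread 2 touches x without the lock -> warning.
trace = [
    ("acquire", 1, "L"), ("access", 1, "x"), ("release", 1, "L"),
    ("acquire", 2, "L"), ("access", 2, "y"), ("release", 2, "L"),
    ("access", 2, "x"),                      # unprotected access to x
    ("acquire", 1, "L"), ("access", 1, "y"), ("release", 1, "L"),
]
print(lockset_analysis(trace))   # -> {'x'}
```

As the abstract notes, such warnings are predictive: the observed run need not itself exhibit the race for the discipline violation to be reported.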
Automation of disbond detection in aircraft fuselage through thermal image processing
NASA Technical Reports Server (NTRS)
Prabhu, D. R.; Winfree, W. P.
1992-01-01
A procedure for interpreting thermal images obtained during the nondestructive evaluation of aircraft bonded joints is presented. The procedure operates on time-derivative thermal images and results in a disbond image with disbonds highlighted. The size of the 'black clusters' in the output disbond image is a quantitative measure of disbond size. The procedure is illustrated using simulation data as well as data obtained through experimental testing of fabricated samples and aircraft panels. Good results are obtained, and, except in pathological cases, 'false calls' in the cases studied appeared only as noise in the output disbond image, which was easily filtered out. The thermal detection technique, coupled with an automated image interpretation capability, will be a very fast and effective method for inspecting bonded joints in an aircraft structure.
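Measuring the 'black clusters' reduces to connected-component labeling of the thresholded disbond image; isolated small clusters can then be discarded as noise. A minimal flood-fill sketch (the binary image and the 4-connectivity choice are illustrative):

```python
def cluster_sizes(image):
    """Sizes of 4-connected clusters of 1-pixels via iterative flood fill."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] == 1 and not seen[r][c]:
                stack, size = [(r, c)], 0
                seen[r][c] = True
                while stack:
                    i, j = stack.pop()
                    size += 1
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < rows and 0 <= nj < cols \
                                and image[ni][nj] == 1 and not seen[ni][nj]:
                            seen[ni][nj] = True
                            stack.append((ni, nj))
                sizes.append(size)
    return sorted(sizes)

# Thresholded image: one 4-pixel disbond plus one single-pixel noise speck,
# which a minimum-size filter would remove.
img = [[0, 1, 1, 0],
       [0, 1, 1, 0],
       [0, 0, 0, 0],
       [1, 0, 0, 0]]
print(cluster_sizes(img))   # -> [1, 4]
```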
The endpoint detection technique for deep submicrometer plasma etching
NASA Astrophysics Data System (ADS)
Wang, Wei; Du, Zhi-yun; Zeng, Yong; Lan, Zhong-went
2009-07-01
The availability of reliable optical sensor technology provides opportunities to better characterize and control plasma etching processes in real time; such sensors can play an important role in endpoint detection, fault diagnostics, and process feedback control. The optical emission spectroscopy (OES) method becomes deficient in the case of deep submicrometer gate etching. In the newly developed high-density inductively coupled plasma (HD-ICP) etching system, interferometry endpoint (IEP) detection is introduced to determine the endpoint. An IEP fringe-count algorithm is investigated to predict the endpoint, and its signal is then used to control the etching rate and to call the endpoint together with the OES signal in the over-etch (OE) process step. The experimental results show that IEP together with OES provides extra process-control margin for advanced devices with thinner gate oxides.
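A fringe-count endpoint idea can be sketched as below. The wavelength, film index, and zero-crossing counter are illustrative assumptions, not the actual HD-ICP controller: as the film thins, the reflected intensity oscillates, and each full interference fringe corresponds to a removed depth of lambda / (2 n_film), so counting fringes tracks etch depth.

```python
# Sketch: interferometric fringe counting for etch-depth estimation.
import numpy as np

def fringe_count(signal):
    """Count full oscillation periods via rising zero crossings."""
    s = signal - signal.mean()
    rising = (s[:-1] < 0) & (s[1:] >= 0)
    return int(rising.sum())

def etched_depth(signal, wavelength_nm=254.0, n_film=1.46):
    # Assumed probe wavelength and SiO2-like film index.
    return fringe_count(signal) * wavelength_nm / (2.0 * n_film)

# Synthetic reflectance trace containing exactly 5 fringes.
t = np.linspace(0.0, 5.0, 2000, endpoint=False)
trace = 1.0 - 0.3 * np.cos(2 * np.pi * t)
depth = etched_depth(trace)
```

An endpoint call would then compare the running fringe count (or the inferred depth) against the target film thickness.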
Fault tolerant system based on IDDQ testing
NASA Astrophysics Data System (ADS)
Guibane, Badi; Hamdi, Belgacem; Mtibaa, Abdellatif; Bensalem, Brahim
2018-06-01
Offline test is essential to ensure good manufacturing quality. However, for permanent or transient faults that occur during the use of the integrated circuit in an application, an online integrated test is needed as well. This procedure should ensure the detection and possibly the correction or the masking of these faults. This requirement of self-correction is sometimes necessary, especially in critical applications that require high security, such as automotive, space or biomedical applications. We propose a fault-tolerant design for analogue and mixed-signal complementary metal-oxide-semiconductor (CMOS) circuits based on quiescent supply current (IDDQ) testing. A defect can cause an increase in current consumption. The IDDQ testing technique is based on the measurement of the power supply current to distinguish between functional and failed circuits. The technique has been an effective testing method for detecting physical defects such as gate-oxide shorts, floating gates (opens) and bridging defects in CMOS integrated circuits. An architecture called BICS (Built-In Current Sensor) is used for monitoring the supply current (IDDQ) of the connected integrated circuit. If the measured current is not within the normal range, a defect is signalled and the system switches connection from the defective to a functional integrated circuit. The fault-tolerant technique is composed essentially of a double-mirror built-in current sensor, allowing the detection of abnormal current consumption, and of blocks allowing the connection to redundant circuits if a defect occurs. SPICE simulations are performed to validate the proposed design.
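The fault-tolerance logic described above reduces to a simple decision rule, sketched here with an illustrative current window and circuit names that are not from the paper: a built-in current sensor flags an IDDQ outside the normal quiescent range, and the system switches to a redundant circuit.

```python
# Minimal sketch of IDDQ-based failover between a primary and a spare circuit.
NORMAL_IDDQ_UA = (0.1, 50.0)   # assumed healthy quiescent-current window, in uA

def iddq_fault(current_ua, window=NORMAL_IDDQ_UA):
    """A current outside the normal window signals a physical defect."""
    lo, hi = window
    return not (lo <= current_ua <= hi)

def select_circuit(primary_ua, spare_ua):
    """Return which circuit to connect: 'primary', 'spare', or 'none'."""
    if not iddq_fault(primary_ua):
        return "primary"
    if not iddq_fault(spare_ua):
        return "spare"
    return "none"   # both defective: signal an unrecoverable fault

# A bridging defect raises the primary's IDDQ far above the window.
choice = select_circuit(primary_ua=850.0, spare_ua=12.0)
```

In the actual design this comparison is done in analogue hardware by the double-mirror BICS, not in software; the sketch only shows the decision it implements.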
NASA Astrophysics Data System (ADS)
Škarková, Pavlína; Novotný, Karel; Lubal, Přemysl; Jebavá, Alžběta; Pořízka, Pavel; Klus, Jakub; Farka, Zdeněk; Hrdlička, Aleš; Kaiser, Jozef
2017-05-01
In this study, the feasibility of quantum dot (QD) 2D distribution mapping on a substrate by Laser-Induced Breakdown Spectroscopy (LIBS) was examined. The major objective of this study was to describe the phenomena occurring after applying aqueous solutions of QDs onto filtration paper. In particular, the influence of pH and of the presence of Cu2+ cations in QD solutions on the LIBS signal was investigated. Cadmium telluride QDs (CdTe QDs) were prepared by formation of nanosized semiconductor particles in a so-called "one-pot" synthesis. The CdTe QDs were capped by glutathione or by 3-mercaptopropionic acid. The technique described in this work allows detection of QDs applied to the selected substrate, filtration paper. Results obtained from the LIBS experiments were compared with a reference method, fluorescence microscopy, which showed variations in the distribution of QDs on the substrate surface and possibilities for quenching. Due to its immediate signal response, relatively simple instrumentation and potential for automation, LIBS offers a promising and fast alternative to other techniques, as it is able to detect even nanoparticles with no visible luminescence.
A new approach to measuring tortuosity
NASA Astrophysics Data System (ADS)
Wert, Amanda; Scott, Sherry E.
2012-03-01
The detection and measurement of the tortuosity - i.e. the bending and winding - of vessels has been shown to be potentially useful in the assessment of cancer progression and treatment response. Although several metrics for tortuosity are used, no single measure is able to capture all types of tortuosity. This report presents a new multiscale technique for measuring vessel tortuosity. The approach is based on a method - called the ergodicity defect - which gives a scale-dependent measure of deviation from ergodicity. Ergodicity is a concept that captures the manner in which trajectories or signals sample the space; thus, ergodicity and vessel tortuosity both involve the notion of how a signal samples space. Here we begin to explore this connection. We first apply the ergodicity defect tortuosity measure to both 2D and 3D synthetic data in order to demonstrate the response of the method to three types of tortuosity observed in clinical patterns. We then implement the technique on segmented vessels extracted from brain tumor MRA images. Results indicate that the method can be effectively used to detect and measure several types of vessel tortuosity.
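For context, the simplest of the existing tortuosity metrics the abstract alludes to is the distance metric: arc length divided by chord length. The ergodicity-defect measure itself is more involved and is not reproduced here; the sketch below only shows the baseline against which multiscale methods are motivated.

```python
# Sketch: the classic distance metric for centerline tortuosity.
import numpy as np

def distance_metric(points):
    """points: (N, d) array of ordered vessel-centerline coordinates."""
    seg = np.diff(points, axis=0)
    arc = np.linalg.norm(seg, axis=1).sum()      # path length along the vessel
    chord = np.linalg.norm(points[-1] - points[0])
    return arc / chord                           # 1.0 for a straight vessel

t = np.linspace(0.0, 2.0 * np.pi, 200)
straight = np.column_stack([t, np.zeros_like(t)])
wavy = np.column_stack([t, np.sin(4 * t)])       # more winding, same endpoints
ratio_straight = distance_metric(straight)
ratio_wavy = distance_metric(wavy)
```

A single ratio like this cannot distinguish a few large bends from many small oscillations with the same total length, which is exactly the limitation multiscale measures address.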
Post-processing of auditory steady-state responses to correct spectral leakage.
Felix, Leonardo Bonato; de Sá, Antonio Mauricio Ferreira Leite Miranda; Mendes, Eduardo Mazoni Andrade Marçal; Moraes, Márcio Flávio Dutra
2009-06-30
Auditory steady-state responses (ASSRs) are electrical manifestations of the brain in response to high-rate sound stimulation. These evoked responses can be used to assess the hearing capabilities of a subject in an objective, automatic fashion. Usually, the detection protocol is accomplished by frequency-domain techniques, such as magnitude-squared coherence, whose estimation is based on the fast Fourier transform (FFT) of several data segments. In practice, the FFT-based spectrum may spread the energy of a given frequency into its side bins, and this escape of energy in the spectrum is called spectral leakage. The distortion of the spectrum due to leakage may severely compromise the statistical significance of objective detection. This work presents an offline, a posteriori method for spectral-leakage minimization in the frequency-domain analysis of ASSRs using a coherent-sampling criterion and interpolation in time. The technique was applied to the local field potentials of 10 Wistar rats and the results, together with those from simulated data, indicate that a leakage-free analysis of ASSRs is possible for any dataset, provided the methods shown in this paper are followed.
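The leakage-correction idea can be sketched as follows. The sampling rate, stimulus frequency, and linear interpolation below are illustrative assumptions, not the paper's exact parameters: resampling the record so that the analysis window holds a whole number of stimulus cycles places the response exactly on an FFT bin, so its energy stops spilling into neighbouring bins.

```python
# Sketch: coherent resampling to suppress spectral leakage.
import numpy as np

fs, f0, dur = 1000.0, 39.1, 1.0          # sample rate, stimulus Hz, seconds
t = np.arange(int(fs * dur)) / fs
x = np.sin(2 * np.pi * f0 * t)           # idealized steady-state response

def coherent_resample(x, fs, f0):
    """Interpolate x so the window spans an integer number of cycles."""
    n = len(x)
    cycles = np.floor(f0 * n / fs)       # whole stimulus cycles that fit
    new_dur = cycles / f0                # window length holding them exactly
    t_old = np.arange(n) / fs
    t_new = np.linspace(0.0, new_dur, n, endpoint=False)
    return np.interp(t_new, t_old, x)

def adjacent_leak(x):
    """Energy in the bins adjacent to the peak, relative to the peak."""
    mag = np.abs(np.fft.rfft(x))
    k = int(np.argmax(mag[1:])) + 1
    return max(mag[k - 1], mag[k + 1]) / mag[k]

before = adjacent_leak(x)                        # 39.1 Hz falls between bins
after = adjacent_leak(coherent_resample(x, fs, f0))
```

Because 39.1 Hz is not an integer number of cycles in a 1 s window, the raw spectrum leaks heavily; after resampling, the tone sits on an exact bin and the adjacent-bin leakage drops sharply.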
Faint source detection in ISOCAM images
NASA Astrophysics Data System (ADS)
Starck, J. L.; Aussel, H.; Elbaz, D.; Fadda, D.; Cesarsky, C.
1999-08-01
We present a tool adapted to the detection of faint mid-infrared sources within ISOCAM mosaics. This tool is based on a wavelet analysis which allows us to discriminate sources from cosmic ray impacts at the very limit of the instrument, four orders of magnitude below IRAS. It is called PRETI, for Pattern REcognition Technique for ISOCAM data, because glitches with transient behaviors are isolated in the wavelet (i.e., frequency) space, where they present peculiar signatures in the form of patterns that are automatically identified and then reconstructed. We have tested PRETI with Monte-Carlo simulations of fake ISOCAM data. These simulations allowed us to define the fraction of remaining false sources due to cosmic rays, the sensitivity and completeness limits, as well as the photometric accuracy as a function of the observation parameters. Although the main scientific applications of this technique have appeared or will appear in separate papers, we present here an application to the ISOCAM-Hubble Deep Field image. This work completes and confirms the results already published (Aussel et al. 1999).
Weak signal detection: A discrete window of opportunity for achieving ‘Vision 90:90:90’?
Burman, Christopher J.; Aphane, Marota; Delobelle, Peter
2016-01-01
Introduction: UNAIDS’ Vision 90:90:90 is a call to ‘end AIDS’. Developing predictive foresight of the unpredictable changes that this journey will entail could contribute to the ambition of ‘ending AIDS’. There are few opportunities for managing unpredictable changes. We introduce ‘weak signal detection’ as a potential opportunity to fill this void. Method: Combining futures and complexity theory, we reflect on two pilot case studies that involved the Archetype Extraction technique and the SenseMaker® Collector™ tool. Results: Both piloted techniques have the potential to surface weak signals, but there is room for improvement. Discussion: A management response to a complex weak signal requires pattern management, rather than an exclusive focus on behaviour management. Conclusion: Weak signal detection is a window of opportunity to improve resilience to unpredictable changes in the HIV/AIDS landscape; it can both reduce the risk that emerges from those changes and increase the visibility of opportunities to exploit them in ways that could contribute to ‘ending AIDS’. PMID:26821952
Stafford, Kathleen M; Mellinger, David K; Moore, Sue E; Fox, Christopher G
2007-12-01
Five species of large whales, including the blue (Balaenoptera musculus), fin (B. physalus), sei (B. borealis), humpback (Megaptera novaeangliae), and North Pacific right (Eubalaena japonica), were the target of commercial harvests in the Gulf of Alaska (GoA) during the 19th through mid-20th centuries. Since that time, there have been a few summertime visual surveys for these species, but no overview of year-round use of these waters by endangered whales, primarily because standard visual surveys are difficult and costly. From October 1999 to May 2002, moored hydrophones were deployed in six locations in the GoA to record whale calls. Reception of calls from fin, humpback, and blue whales and an unknown source, called Watkins' whale, showed seasonal and geographic variation. Calls were detected more often during the winter than during the summer, suggesting that animals inhabit the GoA year-round. To estimate the distance at which species-diagnostic calls could be heard, parabolic-equation propagation-loss models were run for frequencies characteristic of each call type. Maximum detection ranges in the subarctic North Pacific ranged from 45 to 250 km among three species (fin, humpback, blue), although modeled detection ranges varied greatly with input parameters and choice of ambient noise level.
Selected Aspects of the eCall Emergency Notification System
NASA Astrophysics Data System (ADS)
Kaminski, Tomasz; Nowacki, Gabriel; Mitraszewska, Izabella; Niezgoda, Michał; Kruszewski, Mikołaj; Kaminska, Ewa; Filipek, Przemysław
2012-02-01
The article describes problems associated with road collision detection for the purpose of automatic emergency calling. At the moment a collision is detected, the eCall device installed in the vehicle automatically contacts the Emergency Notification Centre and sends a set of essential information on the vehicle and the place of the accident. Information about airbag deployment will not be used to activate the alarm, because connecting the eCall device to those systems might interfere with the vehicle’s safety systems. It is therefore necessary to develop a method for detecting a road collision that is similar to the one used in airbag systems and based on the signals available from the acceleration sensors.
Helble, Tyler A; D'Spain, Gerald L; Campbell, Greg S; Hildebrand, John A
2013-11-01
This paper demonstrates the importance of accounting for environmental effects on passive underwater acoustic monitoring results. The situation considered is the reduction in shipping off the California coast between 2008 and 2010 due to the recession and environmental legislation. The resulting variations in ocean noise change the probability of detecting marine mammal vocalizations. An acoustic model was used to calculate the time-varying probability of detecting humpback whale vocalizations under best-guess environmental conditions and varying noise. The uncorrected call counts suggest a diel pattern and an increase in calling over the two-year period; the corrected call counts show minimal evidence of these features.
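The correction described above can be sketched as dividing each period's raw call count by that period's modeled probability of detection. The numbers below are illustrative, not from the study:

```python
# Sketch: detectability-corrected call counts.
def corrected_counts(raw_counts, p_detect):
    """Estimate true calling activity from raw counts and p(detect)."""
    return [n / p for n, p in zip(raw_counts, p_detect)]

# Noise dropped over time, so p(detect) rose; the apparent increase in
# raw counts disappears once detectability is accounted for.
raw = [40, 60, 80]
p = [0.2, 0.3, 0.4]
est = corrected_counts(raw, p)
```

The hard part, which the paper addresses with an acoustic propagation model, is estimating p(detect) itself as noise and environment vary.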
NASA Astrophysics Data System (ADS)
Aydogan, D.
2007-04-01
An image processing technique called the cellular neural network (CNN) approach is used in this study to locate geological features giving rise to gravity anomalies, such as faults or the boundary of two geologic zones. CNN is a stochastic image processing technique based on template optimization using the neighborhood relationships of cells. These cells can be characterized by a functional block diagram that is typical of neural network theory. The functionality of CNN is described in its entirety by a number of small matrices (A, B and I) called the cloning template; CNN can thus also be considered a nonlinear convolution with these matrices. The template describes the strength of the nearest-neighbor interconnections in the network. The recurrent perceptron learning algorithm (RPLA) is used to optimize the cloning template. The CNN and standard Canny algorithms were first tested on two sets of synthetic gravity data with the aim of checking the reliability of the proposed approach. The CNN method was compared with classical derivative techniques by applying the cross-correlation (CC) method to the same anomaly map, since this latter approach can detect some features that are difficult to identify on Bouguer anomaly maps. The approach was then applied to the Bouguer anomaly map of Biga and its surrounding area in Turkey. Structural features in the area between Bandirma, Biga, Yenice and Gonen in the southwest Marmara region are investigated by applying the CNN and CC to the Bouguer anomaly map. Faults identified by these algorithms are generally in accordance with previously mapped surface faults. These examples show that geologic boundaries can be detected from Bouguer anomaly maps using the cloning-template approach. A visual evaluation of the outputs of the CNN and CC approaches is carried out, and the results are compared with each other.
This approach provides quantitative solutions based on just a few assumptions, which makes the method more powerful than the classical methods.
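The "nonlinear convolution with a template" view can be illustrated with a minimal sketch. The Sobel-like template below is a generic edge detector of our choosing, not the RPLA-optimized cloning template from the paper: one feedforward pass convolves the anomaly map with a 3x3 matrix B, adds a bias I, and squashes the result, highlighting sharp boundaries such as faults.

```python
# Sketch: one feedforward CNN-style pass over a gravity anomaly map.
import numpy as np

def cnn_step(anomaly, B, I):
    """Convolve with template B, add bias I, apply a saturating output."""
    H, W = anomaly.shape
    out = np.zeros((H - 2, W - 2))
    for y in range(H - 2):
        for x in range(W - 2):
            out[y, x] = (B * anomaly[y:y + 3, x:x + 3]).sum() + I
    return np.tanh(out)                       # saturating cell output

# Step-like anomaly: two geologic zones with different densities.
anomaly = np.zeros((10, 10))
anomaly[:, 5:] = 1.0
B = np.array([[-1.0, 0.0, 1.0],
              [-2.0, 0.0, 2.0],
              [-1.0, 0.0, 1.0]])              # assumed edge-detecting template
edges = cnn_step(anomaly, B, I=0.0)
```

In the full CNN formulation the A matrix also feeds the output back into neighboring cells and the template entries are learned; here only the feedforward B-and-I part is shown.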
Automatic classification of animal vocalizations
NASA Astrophysics Data System (ADS)
Clemins, Patrick J.
2005-11-01
Bioacoustics, the study of animal vocalizations, has begun to use increasingly sophisticated analysis techniques in recent years. Some common tasks in bioacoustics are repertoire determination, call detection, individual identification, stress detection, and behavior correlation. Each research study, however, uses a wide variety of different measured variables, called features, and classification systems to accomplish these tasks. The well-established field of human speech processing has developed a number of different techniques to perform many of the aforementioned bioacoustics tasks. Mel-frequency cepstral coefficients (MFCCs) and perceptual linear prediction (PLP) coefficients are two popular feature sets. The hidden Markov model (HMM), a statistical model similar to a finite automaton, is the most commonly used supervised classification model and is capable of modeling both temporal and spectral variations. This research designs a framework that applies models from human speech processing to bioacoustic analysis tasks. The development of the generalized perceptual linear prediction (gPLP) feature extraction model is one of the more important novel contributions of the framework. Perceptual information from the species under study can be incorporated into the gPLP feature extraction model to represent the vocalizations as the animals might perceive them. By including this perceptual information and modifying parameters of the HMM classification system, this framework can be applied to a wide range of species. The effectiveness of the framework is shown by analyzing African elephant and beluga whale vocalizations. The features extracted from the African elephant data are used as input to a supervised classification system and compared to results from traditional statistical tests. The gPLP features extracted from the beluga whale data are used in an unsupervised classification system and the results are compared to labels assigned by experts.
The development of a framework from which to build animal vocalization classifiers will provide bioacoustics researchers with a consistent platform to analyze and classify vocalizations. A common framework will also allow studies to compare results across species and institutions. In addition, the use of automated classification techniques can speed analysis and uncover behavioral correlations not readily apparent using traditional techniques.
Avian predators are less abundant during periodical cicada emergences, but why?
Koenig, Walter D; Ries, Leslie; Olsen, V Beth K; Liebhold, Andrew M
2011-03-01
Despite a substantial resource pulse, numerous avian insectivores known to depredate periodical cicadas (Magicicada spp.) are detected less commonly during emergence years than in either the previous or following years. We used data on periodical cicada calls collected by volunteers conducting North American Breeding Bird Surveys within the range of cicada Brood X to test three hypotheses for this observation: lower detection rates could be caused by bird calls being obscured by cicada calls ("detectability" hypothesis), by birds avoiding areas with cicadas ("repel" hypothesis), or because bird abundances are generally lower during emergence years for some reason unrelated to the current emergence event ("true decline" hypothesis). We tested these hypotheses by comparing bird detections at stations coincident with calling cicadas vs. those without calling cicadas in the year prior to and during cicada emergences. At four distinct levels (stop, route, range, and season), parallel declines of birds in groups exposed and not exposed to cicada calls supported the true decline hypothesis. We discuss several potential mechanisms for this pattern, including the possibility that it is a consequence of the ecological and evolutionary interactions between predators of this extraordinary group of insects.
Raman spectroscopy for cancer detection and characterization in metastasis models
NASA Astrophysics Data System (ADS)
Koga, Shigehiro; Oshima, Yusuke; Sato, Mitsunori; Ishimaru, Kei; Yoshida, Motohira; Yamamoto, Yuji; Matsuno, Yusuke; Watanabe, Yuji
2017-02-01
Raman spectroscopy provides a wealth of diagnostic information to the surgeon, enabling in situ cancer detection and label-free histopathology in clinical practice. Raman spectroscopy is a developing optical technique that analyzes biological tissues through light scattering. The differences in frequency between the incident and scattered light are called Raman shifts, and they correspond to the vibrational energies of the molecular bonds. A Raman spectrum thus gives information about the molecular structure and composition of biological specimens. We previously reported that Raman spectroscopy could distinguish various histological types of human lung cancer cells from normal cells in vitro. However, identifying and detecting cancer diagnostic biomarkers in vivo with Raman spectroscopy is still challenging, because malignancy can be characterized not only by the cancer cells but also by environmental factors including immune cells, stromal cells, secretory vesicles and extracellular matrix. Here we investigate morphological and molecular dynamics in both cancer cells and their environment in xenograft models and spontaneous metastasis models using Raman spectroscopy combined with fluorescence microscopy and photoluminescence imaging. We are also constructing a custom-designed Raman spectral imaging system for both in vitro and in vivo assays of tumor tissues, to reveal the metastasis process and to evaluate the therapeutic effects of anti-cancer drugs and their drug delivery, toward clinical application of the technique.
Burnette, Dylan T; Sengupta, Prabuddha; Dai, Yuhai; Lippincott-Schwartz, Jennifer; Kachar, Bechara
2011-12-27
Superresolution imaging techniques based on the precise localization of single molecules, such as photoactivated localization microscopy (PALM) and stochastic optical reconstruction microscopy (STORM), achieve high resolution by fitting images of single fluorescent molecules with a theoretical Gaussian to localize them with a precision on the order of tens of nanometers. PALM/STORM rely on photoactivated proteins or photoswitching dyes, respectively, which makes them technically challenging. We present a simple and practical way of producing point localization-based superresolution images that does not require photoactivatable or photoswitching probes. Called bleaching/blinking assisted localization microscopy (BaLM), the technique relies on the intrinsic bleaching and blinking behaviors characteristic of all commonly used fluorescent probes. To detect single fluorophores, we simply acquire a stream of fluorescence images. Fluorophore bleach or blink-off events are detected by subtracting from each image of the series the subsequent image. Similarly, blink-on events are detected by subtracting from each frame the previous one. After image subtractions, fluorescence emission signals from single fluorophores are identified and the localizations are determined by fitting the fluorescence intensity distribution with a theoretical Gaussian. We also show that BaLM works with a spectrum of fluorescent molecules in the same sample. Thus, BaLM extends single molecule-based superresolution localization to samples labeled with multiple conventional fluorescent probes.
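The BaLM detection step can be sketched as below. The data are synthetic, and centroid localization stands in for the theoretical-Gaussian fit used in the paper: bleach or blink-off events appear in frame[i] - frame[i+1], blink-on events in frame[i] - frame[i-1], and each single-fluorophore event is then localized from its intensity distribution.

```python
# Sketch: BaLM-style event detection by frame subtraction.
import numpy as np

def event_maps(stack):
    off_events = stack[:-1] - stack[1:]   # positive where emission vanished
    on_events = stack[1:] - stack[:-1]    # positive where emission appeared
    return np.clip(off_events, 0, None), np.clip(on_events, 0, None)

def localize(event_img, thresh):
    """Centroid of above-threshold intensity (sub-pixel position)."""
    ys, xs = np.nonzero(event_img > thresh)
    w = event_img[ys, xs]
    return float((ys * w).sum() / w.sum()), float((xs * w).sum() / w.sum())

# One fluorophore near (5.3, 7.6) blinks off between frames 0 and 1.
y, x = np.mgrid[0:16, 0:16]
psf = np.exp(-((y - 5.3) ** 2 + (x - 7.6) ** 2) / 2.0)
stack = np.stack([psf, np.zeros_like(psf)])
off, on = event_maps(stack)
pos = localize(off[0], thresh=0.1)       # recovered sub-pixel position
```

In real data the subtraction images also contain shot noise, so a detection threshold and the Gaussian fit (rather than a plain centroid) are what deliver the tens-of-nanometers precision.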
Latif, Rabia; Abbas, Haider; Latif, Seemab; Masood, Ashraf
2016-07-01
Security and privacy are the first and foremost concerns that should be given special attention when dealing with Wireless Body Area Networks (WBANs). As WBAN sensors operate in an unattended environment and carry critical patient health information, the Distributed Denial of Service (DDoS) attack is one of the major attacks in the WBAN environment; it not only exhausts the available resources but also influences the reliability of the information being transmitted. This research work is an extension of our previous work, in which a machine-learning-based attack detection algorithm was proposed to detect DDoS attacks in the WBAN environment. However, in order to avoid complexity, no consideration was given to the traceback mechanism. During traceback, the challenge lies in reconstructing the attack path so as to identify the attack source. Among existing traceback techniques, the Probabilistic Packet Marking (PPM) approach is the most commonly used in conventional IP-based networks. However, since the marking probability assignment has a significant effect on both the convergence time and the performance of a scheme, it is not directly applicable in the WBAN environment, due to high convergence time and overhead on intermediate nodes. Therefore, in this paper we propose a new scheme called the Efficient Traceback Technique (ETT), which is based on the Dynamic Probability Packet Marking (DPPM) approach and uses the MAC header in place of the IP header. Instead of using a fixed marking probability, the proposed scheme uses a variable marking probability based on the number of hops travelled by a packet to reach the target node. Finally, path reconstruction algorithms are proposed to trace back an attacker. Evaluation and simulation results indicate that the proposed solution outperforms fixed PPM in terms of convergence time and computational overhead on nodes.
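The dynamic-marking idea can be sketched as follows. The 1/d probability rule below is a common DPPM formulation and an assumption on our part, not necessarily ETT's exact rule: each forwarding node marks a packet with a probability that depends on the distance the packet has already travelled, so marks from far and near nodes survive to the victim at comparable rates, unlike with a fixed marking probability.

```python
# Sketch: dynamic probability packet marking along a forwarding path.
import random

def forward_through_path(path, packet, rng):
    """Each node marks with probability 1/d, where d = hops travelled so far."""
    for d, node in enumerate(path, start=1):
        if rng.random() < 1.0 / d:
            packet["mark"] = node      # later markers overwrite earlier ones
    return packet

rng = random.Random(7)
path = ["n1", "n2", "n3", "n4"]        # attacker-side node -> ... -> victim side
marks = [forward_through_path(path, {"mark": None}, rng)["mark"]
         for _ in range(4000)]
frac_n1 = marks.count("n1") / len(marks)
```

With p = 1/d, the probability that node i's mark survives all later overwrites works out to exactly 1/n for every node on an n-hop path, which is what makes path reconstruction converge quickly.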
NASA Astrophysics Data System (ADS)
Vargas, E.; Cifuentes, A.; Alvarado, S.; Cabrera, H.; Delgado, O.; Calderón, A.; Marín, E.
2018-02-01
Photothermal beam deflection is a well-established technique for measuring thermal diffusivity. In this technique, a pump laser beam generates temperature variations on the surface of the sample to be studied. These variations transfer heat to the surrounding medium, which may be air or any other fluid. The medium in turn experiences a change in refractive index, which is proportional to the temperature field on the sample surface when the distance to this surface is small. A probe laser beam suffers a deflection due to these periodic refractive-index changes, which is usually monitored by means of a quadrant photodetector or a similar device aided by lock-in amplification. A linear relationship that arises in this technique is that between the phase lag of the thermal wave and the distance to a point heat source when one-dimensional heat diffusion can be guaranteed. This relationship is useful for calculating the sample's thermal diffusivity, which can be obtained straightforwardly by the so-called slope method if the pump-beam modulation frequency is well known. The measurement procedure requires the experimenter to displace the probe beam to a given distance from the heat source, measure the phase lag at that offset, and repeat this for as many points as desired. This process can be quite lengthy, depending on the number of points. In this paper, we propose a detection scheme that overcomes this limitation and simplifies the experimental setup, using a digital camera that substitutes for all detection hardware by means of motion-detection techniques and software digital lock-in post-processing. In this work, the method is demonstrated using thin metallic filaments as samples.
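The slope method mentioned above can be worked through on synthetic data (the sample values below are illustrative): in the one-dimensional regime the thermal-wave phase lag grows linearly with distance from the source, phi(x) = x * sqrt(pi * f / D), so a linear fit of phase versus offset gives the thermal diffusivity D = pi * f / slope^2.

```python
# Sketch: extracting thermal diffusivity with the slope method.
import numpy as np

def diffusivity_from_phase(x_m, phase_rad, f_hz):
    """Fit phase lag vs. offset; slope = sqrt(pi * f / D)."""
    slope, _ = np.polyfit(x_m, phase_rad, 1)
    return np.pi * f_hz / slope ** 2

# Synthetic phase scan for a copper-like sample, D ~ 1.11e-4 m^2/s.
f = 10.0                                   # modulation frequency, Hz
D_true = 1.11e-4
x = np.linspace(0.0, 3e-3, 20)             # probe-beam offsets, m
phase = x * np.sqrt(np.pi * f / D_true)    # ideal 1-D thermal-wave phase lag
D_est = diffusivity_from_phase(x, phase, f)
```

The camera-based scheme in the paper effectively acquires all these offset points at once instead of displacing the probe beam point by point.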
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shao, Michael; Nemati, Bijan; Zhai, Chengxing
We present an approach that significantly increases the sensitivity for finding and tracking small and fast near-Earth asteroids (NEAs). This approach relies on a combined use of a new generation of high-speed cameras, which allow short, high frame-rate exposures of moving objects, effectively 'freezing' their motion, and a computationally enhanced implementation of the 'shift-and-add' data processing technique that helps to improve the signal-to-noise ratio (SNR) for detection of NEAs. The SNR of a single short exposure of a dim NEA is insufficient to detect it in one frame, but by computationally searching for an appropriate velocity vector, shifting successive frames relative to each other, and then co-adding the shifted frames in post-processing, we synthetically create a long-exposure image as if the telescope were tracking the object. This approach, which we call 'synthetic tracking,' enhances the familiar shift-and-add technique with the ability to do a wide blind search, detect, and track dim and fast-moving NEAs in near real time. We also discuss how synthetic tracking improves the astrometry of fast-moving NEAs. We apply this technique to observations of two known asteroids conducted on the Palomar 200 inch telescope and demonstrate improved SNR and a 10-fold improvement of astrometric precision over the traditional long-exposure approach. In the past 5 yr, about 150 NEAs with absolute magnitudes H = 28 (∼10 m in size) or fainter have been discovered. With an upgraded version of our camera and a field of view of (28 arcmin)^2 on the Palomar 200 inch telescope, synthetic tracking could allow detecting up to 180 such objects per night, including very small NEAs with sizes down to 7 m.
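The shift-and-add search at the heart of synthetic tracking can be sketched at toy scale (real pipelines search a dense velocity grid, typically on a GPU, and use sub-pixel shifts): shift each short-exposure frame back along a trial velocity, co-add, and keep the velocity that maximizes the peak of the stacked image.

```python
# Sketch: blind velocity search by shift-and-add over a frame stack.
import numpy as np

def shift_and_add(frames, vy, vx):
    """Co-add frames after undoing integer per-frame motion (vy, vx)."""
    acc = np.zeros_like(frames[0], dtype=float)
    for i, f in enumerate(frames):
        acc += np.roll(np.roll(f, -i * vy, axis=0), -i * vx, axis=1)
    return acc

def best_velocity(frames, vmax=2):
    best = None
    for vy in range(-vmax, vmax + 1):
        for vx in range(-vmax, vmax + 1):
            p = shift_and_add(frames, vy, vx).max()
            if best is None or p > best[0]:
                best = (p, vy, vx)
    return best

# A source near the single-frame noise floor, moving 1 px/frame in y
# and 2 px/frame in x, recovered only after stacking along its motion.
rng = np.random.default_rng(0)
amp, vy_true, vx_true = 3.0, 1, 2
frames = []
for i in range(16):
    img = rng.normal(0.0, 1.0, (32, 32))
    img[(10 + vy_true * i) % 32, (4 + vx_true * i) % 32] += amp
    frames.append(img)
peak, vy, vx = best_velocity(frames)
```

Stacking 16 frames along the correct velocity grows the source linearly while the noise grows only as the square root of the frame count, which is the SNR gain the abstract describes.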
Anazawa, Takashi; Yokoi, Takahide; Uchiho, Yuichi
2015-09-01
A simple and highly sensitive technique for laser-induced fluorescence detection on multiple channels in a plastic microchip was developed, and its effectiveness was demonstrated by laser-beam ray-trace simulations and experiments. In the microchip, with refractive index nC, A channels and B channels are arrayed alternately and filled with materials of refractive index nA (for electrophoresis analysis) and nB (for laser-beam control), respectively. It was shown that a laser beam entering from the side of the channel array traveled straight and irradiated all A channels simultaneously and effectively, because the refractive actions of the A and B channels were counterbalanced under the condition nA < nC < nB. This technique is thus called "side-entry laser-beam zigzag irradiation". As a demonstration of the technique, when nC = 1.53, nA = 1.41, nB = 1.66, and the cross sections of both the eight A channels and the seven B channels were the same isosceles trapezoids with 97° base angles, the laser-beam irradiation efficiency on the eight A channels obtained by the simulations was 89% on average, with a coefficient of variation of 4.4%. These results are far superior to those achieved by other conventional methods such as laser-beam expansion and scanning. Furthermore, the fluorescence intensity on the eight A channels determined by the experiments agreed well with that determined by the simulations. Therefore, highly sensitive and uniform fluorescence detection on eight A channels was achieved. It is also possible to fabricate the microchips at low cost by plastic-injection molding and to make a simple and compact detection system, thereby promoting actual use of the proposed side-entry laser-beam zigzag irradiation in various fields.
Novel optical scanning cryptography using Fresnel telescope imaging.
Yan, Aimin; Sun, Jianfeng; Hu, Zhijuan; Zhang, Jingtao; Liu, Liren
2015-07-13
We propose a new method, called modified optical scanning cryptography, that uses a Fresnel telescope imaging technique for encryption and decryption of remote objects. An image or object can be optically encrypted on the fly by the Fresnel telescope scanning system together with an encryption key. For image decryption, the encrypted signals are received and processed with an optical coherent heterodyne detection system. The proposed method achieves strong performance through the use of secure Fresnel telescope scanning with orthogonally polarized beams and efficient all-optical information processing. The validity of the proposed method is demonstrated by numerical simulations and experimental results.
Huang, David; Swanson, Eric A.; Lin, Charles P.; Schuman, Joel S.; Stinson, William G.; Chang, Warren; Hee, Michael R.; Flotte, Thomas; Gregory, Kenton; Puliafito, Carmen A.; Fujimoto, James G.
2015-01-01
A technique called optical coherence tomography (OCT) has been developed for noninvasive cross-sectional imaging in biological systems. OCT uses low-coherence interferometry to produce a two-dimensional image of optical scattering from internal tissue microstructures in a way that is analogous to ultrasonic pulse-echo imaging. OCT has longitudinal and lateral spatial resolutions of a few micrometers and can detect reflected signals as small as ~10⁻¹⁰ of the incident optical power. Tomographic imaging is demonstrated in vitro in the peripapillary area of the retina and in the coronary artery, two clinically relevant examples that are representative of transparent and turbid media, respectively. PMID:1957169
Helble, Tyler A; D'Spain, Gerald L; Hildebrand, John A; Campbell, Gregory S; Campbell, Richard L; Heaney, Kevin D
2013-09-01
Passive acoustic monitoring of marine mammal calls is an increasingly important method for assessing population numbers, distribution, and behavior. A common mistake in the analysis of marine mammal acoustic data is formulating conclusions about these animals without first understanding how environmental properties such as bathymetry, sediment properties, water column sound speed, and ocean acoustic noise influence the detection and character of vocalizations in the acoustic data. The approach in this paper is to use Monte Carlo simulations with a full wave field acoustic propagation model to characterize the site-specific probability of detection of six types of humpback whale calls at three passive acoustic monitoring locations off the California coast. Results show that the probability of detection can vary by factors greater than ten when comparing detections across locations, or comparing detections at the same location over time, due to environmental effects. Effects of uncertainties in the inputs to the propagation model are also quantified, and the model accuracy is assessed using calling statistics amassed from 24,690 humpback call units recorded during October 2008. Under certain conditions, the probability of detection can be estimated with uncertainties sufficiently small to allow for accurate density estimates.
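The Monte Carlo idea can be illustrated with a toy model. All dB levels below are hypothetical, and a Gaussian transmission-loss draw stands in for the paper's full wave-field propagation model:

```python
import random
random.seed(1)

def p_detect(source_db, tl_mean_db, tl_sigma_db, noise_db, snr_req_db, n=20000):
    """Monte Carlo estimate of the probability of detection: draw a
    transmission loss (TL) for each simulated call and count how often
    the received SNR clears the detector threshold."""
    hits = 0
    for _ in range(n):
        tl = random.gauss(tl_mean_db, tl_sigma_db)
        if source_db - tl - noise_db >= snr_req_db:
            hits += 1
    return hits / n

# Hypothetical levels (dB): identical calls heard at two sites whose
# propagation differs by 10 dB of mean transmission loss.
p_site_a = p_detect(170.0, 85.0, 5.0, 70.0, 10.0)
p_site_b = p_detect(170.0, 95.0, 5.0, 70.0, 10.0)
print(p_site_a, p_site_b)
```

With a 10 dB difference in mean transmission loss, the simulated probability of detection drops from roughly 0.84 to roughly 0.16; environmentally driven site-to-site factors like the greater-than-ten ratios reported above arise in exactly this way.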
Contour-Based Corner Detection and Classification by Using Mean Projection Transform
Kahaki, Seyed Mostafa Mousavi; Nordin, Md Jan; Ashtari, Amir Hossein
2014-01-01
Image corner detection is a fundamental task in computer vision. Many applications require reliable detectors to accurately detect corner points, commonly achieved by using image contour information. The curvature definition is sensitive to local variation and edge aliasing, and available smoothing methods are not sufficient to address these problems properly. Hence, we propose Mean Projection Transform (MPT) as a corner classifier and parabolic fit approximation to form a robust detector. The first step is to extract corner candidates using MPT based on the integral properties of the local contours in both the horizontal and vertical directions. Then, an approximation of the parabolic fit is calculated to localize the candidate corner points. The proposed method presents fewer false-positive (FP) and false-negative (FN) points compared with recent standard corner detection techniques, especially in comparison with curvature scale space (CSS) methods. Moreover, a new evaluation metric, called accuracy of repeatability (AR), is introduced. AR combines repeatability and the localization error (Le) for finding the probability of correct detection in the target image. The output results exhibit better repeatability, localization, and AR for the detected points compared with the criteria in original and transformed images. PMID:24590354
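The parabolic-fit localization step admits a compact generic sketch. This is the standard three-point vertex formula for sub-sample peak refinement, not necessarily the authors' exact formulation:

```python
def parabolic_refine(r_left, r_center, r_right):
    """Vertex offset (in samples, relative to the center sample) of the
    parabola through three equally spaced response values."""
    denom = r_left - 2.0 * r_center + r_right
    if denom == 0.0:
        return 0.0
    return 0.5 * (r_left - r_right) / denom

# Samples of a quadratic peak located at x = 0.3, taken at x = -1, 0, +1:
vals = [-(x - 0.3) ** 2 for x in (-1, 0, 1)]
offset = parabolic_refine(*vals)
print(offset)  # ≈ 0.3, the true peak location
```

For a truly quadratic response the recovery is exact; on real corner-response profiles it yields a sub-sample estimate of the corner position from the coarse candidate found by the classifier.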
Weir, L.A.; Royle, J. Andrew; Nanjappa, P.; Jung, R.E.
2005-01-01
One of the most fundamental problems in monitoring animal populations is that of imperfect detection. Although imperfect detection can be modeled, studies examining patterns in occurrence often ignore detection and thus fail to properly partition variation in detection from that of occurrence. In this study, we used anuran calling survey data collected on North American Amphibian Monitoring Program (NAAMP) routes in eastern Maryland to investigate factors that influence detection probability and site occupancy for 10 anuran species. In 2002, 17 calling survey routes in eastern Maryland were each surveyed nine or more times to collect environmental and species data. To analyze these data, we developed models incorporating detection probability and site occupancy. The results suggest that, for more than half of the 10 species, detection probabilities vary most with season (i.e., day-of-year), air temperature, time, and moon illumination, whereas site occupancy may vary with the amount of palustrine forested wetland habitat. Our results suggest anuran calling surveys should document air temperature, time of night, moon illumination, observer skill, and habitat change over time, as these factors can be important to model-adjusted estimates of site occupancy. Our study represents the first formal modeling effort aimed at developing an analytic assessment framework for NAAMP calling survey data.
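Why ignoring detection biases occupancy can be seen in a small simulation. The occupancy and per-visit detection probabilities below are hypothetical, not estimates from the Maryland data:

```python
import random
random.seed(0)

def naive_occupancy(psi, p, n_sites=5000, n_visits=9):
    """Simulate repeated calling surveys: each site is occupied with
    probability psi, and each visit to an occupied site detects the
    species with probability p. Returns the naive occupancy estimate,
    i.e. the fraction of sites with at least one detection."""
    hits = 0
    for _ in range(n_sites):
        if random.random() < psi and any(random.random() < p for _ in range(n_visits)):
            hits += 1
    return hits / n_sites

# A hypothetical anuran with true occupancy 0.6 but per-visit detection 0.3:
naive = naive_occupancy(psi=0.6, p=0.3)
expected = 0.6 * (1.0 - (1.0 - 0.3) ** 9)  # psi times P(detected at least once)
print(naive, expected)
```

The naive estimate converges to psi·(1 − (1 − p)^K), which falls below the true occupancy psi whenever per-visit detection p < 1; the models described above estimate p and psi jointly to remove this bias.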
Effects of Airgun Sounds on Bowhead Whale Calling Rates: Evidence for Two Behavioral Thresholds
Blackwell, Susanna B.; Nations, Christopher S.; McDonald, Trent L.; Thode, Aaron M.; Mathias, Delphine; Kim, Katherine H.; Greene, Charles R.; Macrander, A. Michael
2015-01-01
In proximity to seismic operations, bowhead whales (Balaena mysticetus) decrease their calling rates. Here, we investigate the transition from normal calling behavior to decreased calling and identify two threshold levels of received sound from airgun pulses at which calling behavior changes. Data were collected in August–October 2007–2010, during the westward autumn migration in the Alaskan Beaufort Sea. Up to 40 directional acoustic recorders (DASARs) were deployed at five sites offshore of the Alaskan North Slope. Using triangulation, whale calls localized within 2 km of each DASAR were identified and tallied every 10 minutes each season, so that the detected call rate could be interpreted as the actual call production rate. Moreover, airgun pulses were identified on each DASAR, analyzed, and a cumulative sound exposure level was computed for each 10-min period each season (CSEL10-min). A Poisson regression model was used to examine the relationship between the received CSEL10-min from airguns and the number of detected bowhead calls. Calling rates increased as soon as airgun pulses were detectable, compared to calling rates in the absence of airgun pulses. After the initial increase, calling rates leveled off at a received CSEL10-min of ~94 dB re 1 μPa2-s (the lower threshold). In contrast, once CSEL10-min exceeded ~127 dB re 1 μPa2-s (the upper threshold), whale calling rates began decreasing, and when CSEL10-min values were above ~160 dB re 1 μPa2-s, the whales were virtually silent. PMID:26039218
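The Poisson-regression relationship between received exposure and call counts can be sketched with a stdlib-only Newton-Raphson fit. The synthetic counts below are illustrative, not DASAR data, and the covariate x merely stands in for CSEL:

```python
import math

def fit_poisson(xs, ys, iters=40):
    """Newton-Raphson fit of log E[y] = b0 + b1*x, the Poisson-regression
    form used to relate received sound exposure to detected call counts."""
    b0 = math.log(sum(ys) / len(ys))  # start at the intercept-only fit
    b1 = 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for x, y in zip(xs, ys):
            mu = math.exp(b0 + b1 * x)
            g0 += y - mu            # score for b0
            g1 += (y - mu) * x      # score for b1
            h00 += mu               # Fisher information terms
            h01 += mu * x
            h11 += mu * x * x
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

# Toy counts decaying with "exposure" x as exp(3.3 - 0.5x):
xs = [i / 10 for i in range(31)]
ys = [round(10 * math.exp(1 - 0.5 * x)) for x in xs]
b0, b1 = fit_poisson(xs, ys)
print(b0, b1)  # ≈ 3.3 and -0.5
```

A negative fitted slope, as in this toy data, corresponds to the above-threshold regime in which call counts decline with increasing cumulative sound exposure.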
Three Reading Comprehension Strategies: TELLS, Story Mapping, and QARs.
ERIC Educational Resources Information Center
Sorrell, Adrian L.
1990-01-01
Three reading comprehension strategies are presented to assist learning-disabled students: an advance organizer technique called "TELLS Fact or Fiction" used before reading a passage, a schema-based technique called "Story Mapping" used while reading, and a postreading method of categorizing questions called "QARs" (Question-Answer Relationships).
ERIC Educational Resources Information Center
Lu, Hui-Chuan; Chu, Yu-Hsin; Chang, Cheng-Yu
2013-01-01
Compared with learners of English, learners of Spanish have fewer resources for automatic error detection and revision. Following the current integrative Computer-Assisted Language Learning (CALL) approach, we combined a corpus-based approach with CALL to create the System of Error Detection and Revision Suggestion (SEDRS) for learning Spanish. Through…
Identification of Age-Related Macular Degeneration Using OCT Images
NASA Astrophysics Data System (ADS)
Arabi, Punal M., Dr; Krishna, Nanditha; Ashwini, V.; Prathibha, H. M.
2018-02-01
Age-related Macular Degeneration (AMD) is among the leading retinal diseases of recent years. Macular degeneration occurs when the central portion of the retina, called the macula, deteriorates; because the deterioration occurs with age, it is commonly referred to as Age-related Macular Degeneration. The disease can be visualized by several imaging modalities, such as fundus imaging and Optical Coherence Tomography (OCT). OCT is the most widely used technique for screening AMD because it can detect very minute changes in the retina. Healthy and AMD-affected OCT images are classified by extracting the Retinal Pigmented Epithelium (RPE) layer of the images using image processing techniques. The extracted layer is divided into samples, the number of white pixels in each sample is counted, and the mean pixel count is calculated. The average mean value is computed for both the healthy and the AMD-affected images, a threshold value is fixed, and a decision rule is framed to classify the images of interest. The proposed method showed an accuracy of 75%.
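The counting-and-threshold decision rule can be sketched as follows. The strip counts, the threshold value, and the direction of the rule (fewer white RPE pixels suggesting a disrupted layer, hence AMD) are illustrative assumptions:

```python
def classify_rpe(layer_mask, n_samples, threshold):
    """Split a binary RPE-layer mask into vertical strips, count white
    pixels per strip, and compare the mean count against a threshold.
    layer_mask is a list of rows of 0/1 pixels."""
    width = len(layer_mask[0])
    step = width // n_samples
    counts = []
    for s in range(n_samples):
        cols = range(s * step, (s + 1) * step)
        counts.append(sum(row[c] for row in layer_mask for c in cols))
    mean_count = sum(counts) / n_samples
    return ("AMD" if mean_count < threshold else "Healthy"), mean_count

# Toy 4x8 masks: a continuous extracted layer vs. a fragmented one.
healthy_mask = [[1] * 8, [1] * 8, [1] * 8, [0] * 8]
amd_mask = [[1, 0, 1, 0, 0, 1, 0, 0], [0] * 8, [0] * 8, [0] * 8]
print(classify_rpe(healthy_mask, 4, 6))
print(classify_rpe(amd_mask, 4, 6))
```

In the paper the threshold is set from the average mean counts of the two training groups rather than fixed by hand as here.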
Smith, R J; Weber, T E
2016-11-01
The technique of fiber-optic pulsed polarimetry, which provides a distributed (local) measurement of the magnetic field along an optical fiber, has been improved to the point where, for the first time, photocathode-based optical detection of backscatter is possible with sub-mm spatial resolution. This has been realized by writing an array of deterministic fiber Bragg gratings along the fiber, a so-called backscatter-tailored optical fiber, producing a 34,000-fold increase in backscatter levels over Rayleigh scattering. With such high backscatter levels, high-repetition-rate lasers are now sufficiently bright to allow near-continuous field sensing in both space and time, with field resolutions as low as 0.005 T and as high as 170 T over a ~mm interval, given available fiber materials.
Toward interactive search in remote sensing imagery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Porter, Reid B; Hush, Do; Harvey, Neal
2010-01-01
To move from data to information in almost all science and defense applications requires a human-in-the-loop to validate information products, resolve inconsistencies, and account for incomplete and potentially deceptive sources of information. This is a key motivation for visual analytics, which aims to develop techniques that complement and empower human users. By contrast, the vast majority of algorithms developed in machine learning aim to replace human users in data exploitation. In this paper we describe a recently introduced machine learning problem, called rare category detection, which may be a better match to visual analytic environments. We describe a new design criterion for this problem, and present comparisons to existing techniques on both synthetic and real-world datasets. We conclude by describing an application in broad-area search of remote sensing imagery.
Homozygous and hemizygous CNV detection from exome sequencing data in a Mendelian disease cohort
Gambin, Tomasz; Akdemir, Zeynep C.; Yuan, Bo; Gu, Shen; Chiang, Theodore; Carvalho, Claudia M.B.; Shaw, Chad; Jhangiani, Shalini; Boone, Philip M.; Eldomery, Mohammad K.; Karaca, Ender; Bayram, Yavuz; Stray-Pedersen, Asbjørg; Muzny, Donna; Charng, Wu-Lin; Bahrambeigi, Vahid; Belmont, John W.; Boerwinkle, Eric; Beaudet, Arthur L.; Gibbs, Richard A.
2017-01-01
We developed an algorithm, HMZDelFinder, that uses whole exome sequencing (WES) data to identify rare and intragenic homozygous and hemizygous (HMZ) deletions that may represent complete loss of function of the indicated gene. HMZDelFinder was applied to 4866 samples in the Baylor–Hopkins Center for Mendelian Genomics (BHCMG) cohort and detected 773 HMZ deletion calls (567 homozygous and 206 hemizygous) with an estimated sensitivity of 86.5% (82% for single-exonic and 88% for multi-exonic calls) and precision of 78% (53% for single-exonic and 96% for multi-exonic calls). Of the 773 HMZDelFinder-detected deletion calls, 82 were subjected to array comparative genomic hybridization (aCGH) and/or breakpoint PCR, and 64 were confirmed. These include 18 single-exon deletions, of which 8 were detected exclusively by HMZDelFinder and by none of the seven other CNV detection tools examined. Further investigation of the 64 validated deletion calls revealed at least 15 pathogenic HMZ deletions. Of those, 7 accounted for 17–50% of pathogenic CNVs in different disease cohorts in which 7.1–11% of the molecular diagnostic rate was attributed to CNVs. In summary, we present an algorithm to detect rare, intragenic, single-exon deletion CNVs using WES data; this tool can be useful for disease gene discovery efforts and clinical WES analyses. PMID:27980096
Detection of baleen whales on an ocean-bottom seismometer array in the Lau Basin
NASA Astrophysics Data System (ADS)
Brodie, D.; Dunn, R.
2011-12-01
Long-term deployment of ocean-bottom seismometer arrays provides a unique opportunity for identifying and tracking whales in a manner not usually possible in biological studies. Large baleen whales emit low-frequency (>5 Hz) sounds called 'calls' or 'songs' that can be detected on either the hydrophone or vertical channel of the instrument at distances in excess of 50 km. The calls are distinct to individual species and even to geographical groups within species, and are thought to serve a variety of purposes. Distinct repeating calls can be automatically identified using matched-filter processing, and whales can be located in a manner similar to that of earthquakes. Many baleen whale species are endangered, and little is known about their geographic distribution, population dynamics, and basic behaviors. The Lau back-arc basin, a tectonically active, elongated basin bounded by volcanic shallows, lies in the southwestern Pacific Ocean between Fiji and Tonga. Although whales are known to exist around Fiji and Tonga, little is understood about the population dynamics and migration patterns throughout the basin. Twenty-nine broadband ocean-bottom seismometers deployed in the basin recorded data for approximately ten months during 2009-2010. To date, four species of whales have been identified in the data: blue (one call type), humpback (two call types, including long-lasting 'songs'), Bryde's (one call type), and fin whales (three call types). Three as-yet-unknown call types have also been identified. After the calls were identified, idealized spectrograms of the known calls were matched against the entire data set using an auto-detection algorithm. The auto-detection output provides the number of calls and the times of year when each call type was recorded. Based on the results, whales migrate seasonally through the basin with some overlapping of species.
Initial results also indicate that different species of whales are more common in some parts of the basin than others, suggesting preferences in water depth and distance to land. In future work, whales will be tracked through the basin using call localization information to illustrate migration patterns of the various species.
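The matched-filter core of such auto-detection can be sketched in a few lines. The 'call' template and record below are toy waveform data, whereas the study matches idealized spectrograms:

```python
def matched_filter(signal, template):
    """Slide a call template over the record and return the index and
    value of the peak correlation, normalized by the template energy."""
    energy = sum(t * t for t in template) or 1.0
    best_i, best_v = 0, float("-inf")
    for i in range(len(signal) - len(template) + 1):
        v = sum(s * t for s, t in zip(signal[i:], template)) / energy
        if v > best_v:
            best_i, best_v = i, v
    return best_i, best_v

# A toy "call" buried at offset 30 in an otherwise quiet record:
call = [0.0, 1.0, 2.0, 1.0, 0.0, -1.0, -2.0, -1.0]
record = [0.0] * 100
for k, c in enumerate(call):
    record[30 + k] += c
idx, score = matched_filter(record, call)
print(idx, score)  # peak at the embedded call's offset, score 1.0
```

A real detector thresholds the normalized peak to declare a detection and logs its time, which is how the per-call-type counts and seasonal timing above are accumulated.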
Use of a Parabolic Microphone to Detect Hidden Subjects in Search and Rescue.
Bowditch, Nathaniel L; Searing, Stanley K; Thomas, Jeffrey A; Thompson, Peggy K; Tubis, Jacqueline N; Bowditch, Sylvia P
2018-03-01
This study compares a parabolic microphone to unaided hearing in detecting and comprehending hidden callers at ranges of 322 to 2510 m. Eight subjects were placed 322 to 2510 m away from a central listening point. The subjects were concealed, and their calling volume was calibrated. In random order, subjects were asked to call the name of a state for 5 minutes. Listeners with parabolic microphones and others with unaided hearing recorded the direction of the call (detection) and the name of the state (comprehension). The parabolic microphone was superior to unaided hearing in both detecting subjects and comprehending their calls, with an effect size (Cohen's d) of 1.58 for detection and 1.55 for comprehension. For each of the 8 hidden subjects, there were 24 detection attempts with the parabolic microphone and 54 to 60 attempts by unaided listeners. At the longer distances (1529-2510 m), the parabolic microphone was better at detection (83% vs 51%; P<0.00001 by χ²) and comprehension (57% vs 12%; P<0.00001). At the shorter distances (322-1190 m), the parabolic microphone offered advantages in detection (100% vs 83%; P=0.000023) and comprehension (86% vs 51%; P<0.00001), although not as pronounced as at the longer distances. Use of a 66-cm (26-inch) parabolic microphone significantly improved detection and comprehension of hidden calling subjects at distances between 322 and 2510 m when compared with unaided hearing. This study supports the use of a parabolic microphone in search and rescue to locate responsive subjects in favorable weather and terrain. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
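The effect size reported above is Cohen's d with a pooled standard deviation, which can be computed directly. The per-session scores below are hypothetical, not the study's data:

```python
import math

def cohens_d(a, b):
    """Effect size between two samples using the pooled standard deviation."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

# Hypothetical per-session comprehension rates, parabolic vs. unaided:
parabolic = [0.9, 0.8, 0.85, 0.7, 0.95]
unaided = [0.5, 0.6, 0.55, 0.4, 0.65]
d = cohens_d(parabolic, unaided)
print(d)
```

Values of d near 1.5, as reported in the study, indicate group means separated by about one and a half pooled standard deviations, a very large effect by conventional benchmarks.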
Wind Lidar Edge Technique Shuttle Demonstration Mission: Anemos
NASA Technical Reports Server (NTRS)
Leete, Stephen J.; Bundas, David J.; Martino, Anthony J.; Carnahan, Timothy M.; Zukowski, Barbara J.
1998-01-01
A NASA mission is planned to demonstrate the technology for a wind lidar, implementing the direct-detection edge technique. The Anemos instrument will fly on the Space Transportation System (STS), or shuttle, aboard a Hitchhiker bridge. The instrument is being managed by the Goddard Space Flight Center as an in-house build, with science leadership from the GSFC Laboratory for Atmospheres, Mesoscale Atmospheric Processes Branch. During a roughly ten-day mission, the instrument will self-calibrate, adjust for launch-induced misalignments, and perform a campaign of measurements of tropospheric winds. The mission is planned for early 2001. The instrument is being developed under the auspices of NASA's New Millennium Program, in parallel with a comparable mission being managed by the Marshall Space Flight Center. That mission, called SPARCLE, will implement the coherent technique. NASA plans to fly the two missions together on the same shuttle flight, to allow synergy of wind measurements and a direct comparison of performance.
Measuring Young’s modulus the easy way, and tracing the effects of measurement uncertainties
NASA Astrophysics Data System (ADS)
Nunn, John
2015-09-01
The speed of sound in a solid is determined by the density and elasticity of the material. Young’s modulus can therefore be calculated once the density and the speed of sound in the solid are measured. The density can be measured relatively easily, and the speed of sound through a rod can be measured very inexpensively by setting up a longitudinal standing wave and using a microphone to record its frequency. This is a simplified version of a technique called ‘impulse excitation’, and it is a good educational technique for school pupils. This paper includes the description and free provision of custom software to calculate the frequency spectrum of a recorded sound so that the resonant peaks can be readily identified. A discussion of the effect of measurement uncertainties is included to help the more thorough experimental student improve the accuracy of their method. The technique is sensitive enough to detect changes in the elastic modulus with a temperature change of just a few degrees.
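A minimal sketch of the calculation: for the fundamental longitudinal mode of a free-free rod, half a wavelength spans the rod, so the sound speed is v = 2Lf and Young's modulus follows as E = ρv². The rod parameters below are hypothetical:

```python
def youngs_modulus(density, length, f0):
    """E = rho * v^2, with sound speed v = 2 * L * f0 from the fundamental
    longitudinal standing wave of a free-free rod (lambda = 2L)."""
    v = 2.0 * length * f0
    return density * v * v

# Hypothetical aluminium rod: density 2700 kg/m^3, length 0.5 m, and a
# measured fundamental resonance near 5080 Hz.
E = youngs_modulus(2700.0, 0.5, 5080.0)
print(E / 1e9)  # in GPa; close to aluminium's textbook ~69 GPa
```

Since E scales with f0², a fractional frequency uncertainty propagates doubled into E, which is the kind of tracing of measurement uncertainties the paper discusses.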
The DCU: the detector control unit for SPICA-SAFARI
NASA Astrophysics Data System (ADS)
Clénet, Antoine; Ravera, Laurent; Bertrand, Bernard; den Hartog, Roland H.; Jackson, Brian D.; van Leeuven, Bert-Joost; van Loon, Dennis; Parot, Yann; Pointecouteau, Etienne; Sournac, Anthony
2014-08-01
IRAP is developing the warm electronics, the so-called Detector Control Unit (DCU), in charge of the readout of SPICA-SAFARI's TES-type detectors. The architecture of the electronics used to read out the 3,500 sensors of the 3 focal-plane arrays is based on the frequency-domain multiplexing (FDM) technique. In each of the 24 detection channels, the data of up to 160 pixels are multiplexed in the frequency domain between 1 and 3.3 MHz. The DCU provides the AC signals to voltage-bias the detectors; it demodulates the detector data, which are read out in the cold stage by a SQUID; and it computes a feedback signal for the SQUID to linearize the detection chain and optimize its dynamic range. The feedback is computed with a specific technique, so-called baseband feedback (BBFB), which ensures that the loop is stable even with long propagation and processing delays (i.e., several µs) and with fast signals (i.e., frequency carriers at 3.3 MHz). This digital signal processing is complex and has to be done simultaneously for the 3,500 pixels; it thus requires an optimization of the power consumption. We took advantage of the relatively narrow science signal bandwidth (i.e., 20-40 Hz) to decouple the signal sampling frequency (10 MHz) from the data processing rate. Thanks to this method we reduced the total number of operations per second, and thus the power consumption of the digital processing circuit, by a factor of 10. Moreover, we used time-multiplexing techniques to share the resources of the circuit (e.g., a single BBFB module processes 32 pixels). The current version of the firmware is under validation in a Xilinx Virtex 5 FPGA; the final version will be developed in a space-qualified digital ASIC. Beyond the firmware architecture, the optimization of the instrument concerns the characterization routines and the definition of optimal parameters.
Indeed, the operation of the detection and readout chains requires properly defining more than 17,500 parameters (about 5 per pixel), so it is mandatory to work out an automatic procedure to set these optimal values. We defined a fast algorithm that characterizes the phase correction to be applied by the BBFB firmware and the pixel resonance frequencies. We also defined a technique to set the AC-carrier initial phases in such a way that the amplitude of their sum is minimized (for better use of the DAC dynamic range).
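One FDM demodulation channel can be sketched as digital quadrature mixing followed by low-pass averaging. The carrier frequencies and 10 MHz sampling rate below match the abstract; the amplitudes and the code itself are illustrative, and the real DCU additionally closes the BBFB loop around this stage:

```python
import math

def demodulate(samples, fs, f_carrier):
    """Recover one pixel's amplitude from the summed FDM signal by mixing
    with quadrature references and averaging (a crude low-pass)."""
    n = len(samples)
    i_sum = q_sum = 0.0
    for k, s in enumerate(samples):
        ph = 2.0 * math.pi * f_carrier * k / fs
        i_sum += s * math.cos(ph)
        q_sum += s * math.sin(ph)
    return 2.0 * i_sum / n, 2.0 * q_sum / n

# Two pixel carriers (1.0 and 3.3 MHz) summed, sampled at 10 MHz:
fs = 10e6
sig = [0.7 * math.cos(2 * math.pi * 1.0e6 * k / fs)
       + 0.4 * math.cos(2 * math.pi * 3.3e6 * k / fs)
       for k in range(10000)]
amp_i, amp_q = demodulate(sig, fs, 1.0e6)
print(amp_i, amp_q)  # recovers the 1.0 MHz pixel's amplitude, ~0.7
```

Because each carrier completes an integer number of cycles in the averaging window, the pixels are exactly orthogonal in this sketch; in hardware the same separation is achieved with proper filtering.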
A New Tool for Quality Control
NASA Technical Reports Server (NTRS)
1988-01-01
Diffracto, Ltd. is now offering a new product inspection system that allows detection of minute flaws previously difficult or impossible to observe. Called D-Sight, it represents a revolutionary technique for inspection of flat or curved surfaces to find such imperfections as dings, dents and waviness. System amplifies defects, making them highly visible to simplify decision making as to corrective measures or to identify areas that need further study. CVA 3000 employs a camera, high intensity lamps and a special reflective screen to produce a D- Sight image of light reflected from a surface. Image is captured and stored in a computerized vision system then analyzed by a computer program. A live image of surface is projected onto a video display and compared with a stored master image to identify imperfections. Localized defects measuring less than 1/1000 of an inch are readily detected.
NOTE: Impedance magnetocardiogram
NASA Astrophysics Data System (ADS)
Kandori, Akihiko; Miyashita, Tsuyoshi; Suzuki, Daisuke; Yokosawa, Koichi; Tsukada, Keiji
2001-02-01
We have developed an impedance magnetocardiogram (IMCG) system to detect the change of magnetic field corresponding to changes in blood volume in the heart. A low magnetic field from the electrical activity of the human heart - the so-called magnetocardiogram (MCG) - can be simultaneously detected by using this system. Because the mechanical and electrical functions in the heart can be monitored by non-invasive and non-contact measurements, it is easy to observe the cardiovascular functions from an accurate sensor position. This system uses a technique to demodulate induced current in a subject. A flux-locked circuit of a superconducting quantum interference device has a wide frequency range (above 1 MHz) because a constant current (40 kHz) is fed through the subject. It is shown for the first time that the system could measure IMCG signals at the same time as MCG signals.
Mapping brain activity in gradient-echo functional MRI using principal component analysis
NASA Astrophysics Data System (ADS)
Khosla, Deepak; Singh, Manbir; Don, Manuel
1997-05-01
The detection of sites of brain activation in functional MRI has been a topic of immense research interest, and many techniques have been proposed to this end. Recently, principal component analysis (PCA) has been applied to extract the activated regions and their time course of activation. This method is based on the assumption that the activation is orthogonal to other signal variations such as brain motion, physiological oscillations and other uncorrelated noises. A distinct advantage of this method is that it does not require any knowledge of the time course of the true stimulus paradigm. This technique is well suited to EPI image sequences, where the sampling rate is high enough to capture the effects of physiological oscillations. In this work, we propose and apply two methods that are based on PCA to conventional gradient-echo images and investigate their usefulness as tools to extract reliable information on brain activation. The first method is a conventional technique in which a single image sequence with alternating on and off stages is subjected to a principal component analysis. The second method is a PCA-based approach called the common spatial factor analysis (CSF) technique. As the name suggests, this method relies on common spatial factors between the above fMRI image sequence and a background fMRI. We have applied these methods to identify active brain areas during visual stimulation and motor tasks. The results from these methods are compared to those obtained by using the standard cross-correlation technique. We found good agreement in the areas identified as active across all three techniques. The results suggest that the PCA and CSF methods have good potential for detecting true stimulus-correlated changes in the presence of other interfering signals.
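The PCA step can be sketched with power iteration on mean-centered data, a stdlib stand-in for a full eigendecomposition. The alternating on/off "voxel" time courses below are synthetic:

```python
def first_pc(data, iters=50):
    """Power iteration for the leading principal component of
    mean-centered data (rows = time points, cols = voxels)."""
    n, m = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(m)]
    x = [[row[j] - means[j] for j in range(m)] for row in data]
    v = [1.0] * m
    for _ in range(iters):
        s = [sum(xi[j] * vj for j, vj in enumerate(v)) for xi in x]  # X v
        w = [sum(s[i] * x[i][j] for i in range(n)) for j in range(m)]  # X^T X v
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v

# Synthetic "voxels": the first two carry an alternating on/off response,
# the third is a flat baseline.
ts = [[1.0, 0.9, 0.0] if t % 2 else [-1.0, -0.9, 0.0] for t in range(20)]
pc = first_pc(ts)
print(pc)
```

The leading component loads on the two covarying "active" voxels and ignores the flat one, which is how stimulus-correlated structure can be extracted without knowing the stimulus time course.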
Microwave-based medical diagnosis using particle swarm optimization algorithm
NASA Astrophysics Data System (ADS)
Modiri, Arezoo
This dissertation proposes and investigates a novel architecture intended for microwave-based medical diagnosis (MBMD), along with novel modifications of the particle swarm optimization (PSO) algorithm for achieving enhanced convergence performance. MBMD has been investigated through a variety of innovative techniques in the literature since the 1990s and has shown significant promise in the early detection of some specific health threats. In comparison to X-ray- and gamma-ray-based diagnostic tools, MBMD does not expose patients to ionizing radiation, and owing to the maturity of microwave technology, it lends itself to miniaturization of the supporting systems. This modality has been shown to be effective in detecting breast malignancy, and hence this study focuses on that application. A novel radiator device and detection technique are proposed and investigated in this dissertation. As expected, hardware design and implementation are of paramount importance in such a study, and a good deal of research, analysis, and evaluation has been done in this regard, as reported in the ensuing chapters. An important element of any detection system is the algorithm used for extracting signatures. Herein, the strong intrinsic potential of swarm-intelligence-based algorithms in solving complicated electromagnetic problems is brought to bear, by addressing both mathematical and electromagnetic problems. These problems are called benchmark problems throughout this dissertation, since they have known answers. After evaluating the performance of the algorithm on the chosen benchmark problems, the algorithm is applied to the MBMD tumor detection problem. The chosen benchmark problems have already been tackled by solution techniques other than the PSO algorithm, the results of which can be found in the literature.
However, owing to the relatively high level of complexity and randomness inherent in electromagnetic benchmark problems, the literature has tended to resort to oversimplification in order to arrive at reasonable solutions when using analytical techniques. Here, an attempt has been made to avoid oversimplification when using the proposed swarm-based optimization algorithms.
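A plain global-best PSO on one benchmark problem with a known answer (the sphere function, minimum 0 at the origin) can be sketched as follows. This is the generic textbook baseline, not the modified variants the dissertation proposes, and all parameter values are conventional defaults:

```python
import random
random.seed(7)

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, span=5.0):
    """Global-best particle swarm optimization minimizing f over [-span, span]^dim."""
    pos = [[random.uniform(-span, span) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # personal bests
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            value = f(pos[i])
            if value < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], value
                if value < gbest_val:
                    gbest, gbest_val = pos[i][:], value
    return gbest, gbest_val

best, val = pso(lambda p: sum(x * x for x in p), dim=3)
print(val)  # near zero: the swarm collapses onto the known minimum
```

Benchmarks like this one, with known optima, are exactly what allows convergence performance of modified PSO variants to be quantified before tackling the tumor-detection problem.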
Fabrication of Polymer Microspheres for Optical Resonator and Laser Applications.
Yamamoto, Yohei; Okada, Daichi; Kushida, Soh; Ngara, Zakarias Seba; Oki, Osamu
2017-06-02
This paper describes three methods of preparing fluorescent microspheres comprising π-conjugated or non-conjugated polymers: vapor diffusion, interface precipitation, and mini-emulsion. In all methods, well-defined, micrometer-sized spheres are obtained from a self-assembling process in solution. The vapor diffusion method can result in spheres with the highest sphericity and surface smoothness, yet the types of polymers that can form such spheres are limited. On the other hand, in the mini-emulsion method, microspheres can be made from various types of polymers, even from highly crystalline polymers with coplanar, π-conjugated backbones. The photoluminescent (PL) properties of single isolated microspheres are unusual: the PL is confined inside the spheres, propagates at the circumference of the spheres via total internal reflection at the polymer/air interface, and self-interferes to show sharp and periodic resonant PL lines. These resonant modes are the so-called "whispering gallery modes" (WGMs). This work demonstrates how to measure WGM PL from single isolated spheres using the micro-photoluminescence (µ-PL) technique. In this technique, a focused laser beam irradiates a single microsphere, and the luminescence is detected by a spectrometer. A micromanipulation technique is then used to connect the microspheres one by one and to demonstrate intersphere PL propagation and color conversion from coupled microspheres upon excitation at the perimeter of one sphere and detection of PL from the other microsphere. These techniques, µ-PL and micromanipulation, are useful for experiments on micro-optic applications using polymer materials.
Ozone and Aerosol Retrieval from Backscattered Ultraviolet Radiation
NASA Technical Reports Server (NTRS)
Bhartia, Pawan K.
2012-01-01
In this presentation we will discuss the techniques to estimate total column ozone and aerosol absorption optical depth from the measurements of back scattered ultraviolet (buv) radiation. The total ozone algorithm has been used to create a unique record of the ozone layer, spanning more than 3 decades, from a series of instruments (BUV, SBUV, TOMS, SBUV/2) flown on NASA, NOAA, Japanese and Russian satellites. We will discuss how this algorithm can be considered a generalization of the well-known Dobson/Brewer technique that has been used to process data from ground-based instruments for many decades, and how it differs from the DOAS techniques that have been used to estimate vertical column densities of a host of trace gases from data collected by GOME and SCIAMACHY instruments. The buv aerosol algorithm is most suitable for the detection of UV absorbing aerosols (smoke, desert dust, volcanic ash) and is the only technique that can detect aerosols embedded in clouds. This algorithm has been used to create a quarter century record of aerosol absorption optical depth using the buv data collected by a series of TOMS instruments. We will also discuss how the data from the OMI instrument launched on July 15, 2004 will be combined with data from MODIS and CALIPSO lidar data to enhance the accuracy and information content of satellite-derived aerosol measurements. The OMI and MODIS instruments are currently flying on EOS Aura and EOS Aqua satellites respectively, part of a constellation of satellites called the "A-train".
Estimating Lion Abundance using N-mixture Models for Social Species
Belant, Jerrold L.; Bled, Florent; Wilton, Clay M.; Fyumagwa, Robert; Mwampeta, Stanslaus B.; Beyer, Dean E.
2016-01-01
Declining populations of large carnivores worldwide, and the complexities of managing human-carnivore conflicts, require accurate population estimates of large carnivores to promote their long-term persistence through well-informed management. We used N-mixture models to estimate lion (Panthera leo) abundance from call-in and track surveys in southeastern Serengeti National Park, Tanzania. Because of potential habituation to broadcasted calls and social behavior, we developed a hierarchical observation process within the N-mixture model, conditioning lion detectability on their group response to call-ins and individual detection probabilities. We estimated 270 lions (95% credible interval = 170–551) using call-ins but were unable to estimate lion abundance from track data. We found a weak negative relationship between predicted track density and predicted lion abundance from the call-in surveys. Luminosity was negatively correlated with individual detection probability during call-in surveys. Lion abundance and track density were influenced by landcover, but the directions of the corresponding effects were undetermined. N-mixture models allowed us to incorporate multiple parameters (e.g., landcover, luminosity, observer effect) influencing lion abundance and probability of detection directly into abundance estimates. We suggest that N-mixture models employing a hierarchical observation process can be used to estimate abundance of other social, herding, and grouping species. PMID:27786283
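The study's hierarchical observation process is specific to lion groups, but the core N-mixture idea it extends — latent site abundance N_i ~ Poisson(λ), repeated-visit counts y_ij ~ Binomial(N_i, p), with the unobserved N_i summed out of the likelihood — can be sketched as follows. The simulated survey and grid values are hypothetical, not the paper's data; note that λ and p trade off, so their product (the expected count) is the best-identified quantity in this toy setting.

```python
import math
import random

def nmix_loglik(site_counts, lam, p, n_max=60):
    """N-mixture log-likelihood: latent abundance N_i ~ Poisson(lam), repeated
    counts y_ij ~ Binomial(N_i, p); the unobserved N_i is summed out."""
    ll = 0.0
    for counts in site_counts:
        marginal = 0.0
        for n in range(max(counts), n_max + 1):
            term = math.exp(-lam) * lam ** n / math.factorial(n)  # Poisson prior on N
            for y in counts:
                term *= math.comb(n, y) * p ** y * (1 - p) ** (n - y)
            marginal += term
        ll += math.log(marginal)
    return ll

# Simulate 25 sites with 3 call-in visits each (hypothetical truth: lam=6, p=0.4).
random.seed(1)
true_lam, true_p = 6.0, 0.4
sites = []
for _ in range(25):
    n = sum(random.random() < true_lam / 1000 for _ in range(1000))  # crude Poisson draw
    sites.append([sum(random.random() < true_p for _ in range(n)) for _ in range(3)])

# Grid-search maximum likelihood estimate.
lam_hat, p_hat = max(((l, q) for l in (2, 4, 6, 8, 10) for q in (0.2, 0.3, 0.4, 0.5, 0.6)),
                     key=lambda t: nmix_loglik(sites, t[0], t[1]))
```

A production analysis would use a dedicated fitting routine (e.g., the `pcount` function in the R package `unmarked`) rather than a grid, and would model covariates such as landcover and luminosity on λ and p.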
Poliva, Oren
2017-01-01
In the brain of primates, the auditory cortex connects with the frontal lobe via the temporal pole (auditory ventral stream; AVS) and via the inferior parietal lobe (auditory dorsal stream; ADS). The AVS is responsible for sound recognition, and the ADS for sound-localization, voice detection and integration of calls with faces. I propose that the primary role of the ADS in non-human primates is the detection and response to contact calls. These calls are exchanged between tribe members (e.g., mother-offspring) and are used for monitoring location. Detection of contact calls occurs by the ADS identifying a voice, localizing it, and verifying that the corresponding face is out of sight. Once a contact call is detected, the primate produces a contact call in return via descending connections from the frontal lobe to a network of limbic and brainstem regions. Because the ADS of present day humans also performs speech production, I further propose an evolutionary course for the transition from contact call exchange to an early form of speech. In accordance with this model, structural changes to the ADS endowed early members of the genus Homo with partial vocal control. This development was beneficial as it enabled offspring to modify their contact calls with intonations for signaling high or low levels of distress to their mother. Eventually, individuals were capable of participating in yes-no question-answer conversations. In these conversations the offspring emitted a low-level distress call for inquiring about the safety of objects (e.g., food), and his/her mother responded with a high- or low-level distress call to signal approval or disapproval of the interaction. Gradually, the ADS and its connections with brainstem motor regions became more robust and vocal control became more volitional. Speech emerged once vocal control was sufficient for inventing novel calls. PMID:28928931
Ultrahigh-resolution CT and DR scanner
NASA Astrophysics Data System (ADS)
DiBianca, Frank A.; Gupta, Vivek; Zou, Ping; Jordan, Lawrence M.; Laughter, Joseph S.; Zeman, Herbert D.; Sebes, Jeno I.
1999-05-01
A new technique called Variable-Resolution X-ray (VRX) detection that dramatically increases the spatial resolution in computed tomography (CT) and digital radiography (DR) is presented. The technique is based on a principle called 'projective compression' that allows the resolution element of a CT detector to scale with the subject or field size. For very large (40 - 50 cm) field sizes, resolution exceeding 2 cy/mm is possible, and for very small fields, microscopy is attainable with resolution exceeding 100 cy/mm. Several effects that could limit the performance of VRX detectors are considered. Experimental measurements on a 16-channel CdWO4 scintillator + photodiode test array yield a limiting MTF of 64 cy/mm (8 µm) in the highest-resolution configuration reported. Preliminary CT images have been made of small anatomical specimens and small animals using a storage phosphor screen in the VRX mode. Measured detector resolution of the CT projection data exceeds 20 cy/mm (less than 25 µm); however, the final, reconstructed CT images produced thus far exhibit 10 cy/mm (50 µm) resolution because of non-flatness of the storage phosphor plates, focal spot effects, and the use of a rudimentary CT reconstruction algorithm. A 576-channel solid-state detector is being fabricated that is expected to achieve CT image resolution in excess of that of the 16-channel test array.
Scaling of echolocation call parameters in bats.
Jones, G
1999-12-01
I investigated the scaling of echolocation call parameters (frequency, duration and repetition rate) in bats in a functional context. Low-duty-cycle bats operate with search-phase duty cycles of usually less than 20%. They process echoes in the time domain and are therefore intolerant of pulse-echo overlap. High-duty-cycle (>30%) species use Doppler shift compensation, and they separate pulse and echo in the frequency domain. Call frequency scales negatively with body mass in at least five bat families. Pulse duration scales positively with mass in low-duty-cycle quasi-constant-frequency (QCF) species because the large aerial-hawking species that emit these signals fly fast in open habitats. They therefore detect distant targets and experience pulse-echo overlap later than do smaller bats. Pulse duration also scales positively with mass in the Hipposideridae, which show at least partial Doppler shift compensation. Pulse repetition rate corresponds closely with wingbeat frequency in QCF bat species that fly relatively slowly. Larger, fast-flying species often skip pulses when detecting distant targets. There is probably a trade-off between call intensity and repetition rate, because 'whispering' bats (and hipposiderids) produce several calls per predicted wingbeat and because batches of calls are emitted per wingbeat during terminal buzzes. Severe atmospheric attenuation at high frequencies limits the range of high-frequency calls. Low-duty-cycle bats that call at high frequencies must therefore use short pulses to avoid pulse-echo overlap. Rhinolophids escape this constraint by Doppler shift compensation and, importantly, can exploit advantages associated with the emission of both high-frequency and long-duration calls. Low frequencies are unsuited for the detection of small prey, and low repetition rates may limit prey detection rates. Echolocation parameters may therefore constrain maximum body size in aerial-hawking bats.
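The pulse-echo overlap constraint underlying this scaling argument has a simple geometric form: an echo from a target at distance d returns after 2d/c, so a bat intolerant of overlap can only range targets beyond d = c·τ/2 for a pulse of duration τ. A minimal sketch (the two call durations are hypothetical examples, not values from the paper):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def overlap_free_distance(pulse_ms):
    """Closest target detectable without pulse-echo overlap: the echo must
    arrive only after the pulse of duration tau ends, i.e. 2d/c >= tau,
    giving d_min = c * tau / 2."""
    return SPEED_OF_SOUND * (pulse_ms / 1000.0) / 2.0

# Hypothetical call durations: a 3 ms call (small bat) vs a 15 ms call
# (large aerial-hawking QCF bat).
d_small = overlap_free_distance(3.0)   # ~0.51 m
d_large = overlap_free_distance(15.0)  # ~2.57 m
```

This is why long pulses are only affordable for fast-flying bats hunting distant targets in open space, and why high-frequency callers, whose range is attenuation-limited, must keep pulses short.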
Detecting Damage in Composite Material Using Nonlinear Elastic Wave Spectroscopy Methods
NASA Astrophysics Data System (ADS)
Meo, Michele; Polimeno, Umberto; Zumpano, Giuseppe
2008-05-01
Modern aerospace structures make increasing use of fibre-reinforced plastic composites, due to their high specific mechanical properties. However, because of their brittleness, low-velocity impact can cause delaminations beneath the surface while the surface appears undamaged upon visual inspection. Such damage is called barely visible impact damage (BVID). This internal damage leads to significant reductions in local strength and ultimately could lead to catastrophic failures. It is therefore important to detect and monitor damage in highly loaded composite components to receive an early warning for well-timed maintenance of the aircraft. Non-linear ultrasonic spectroscopy methods are promising damage detection and material characterization tools. In this paper, two different non-linear elastic wave spectroscopy (NEWS) methods are presented: single-mode nonlinear resonance ultrasound (NRUS) and the nonlinear wave modulation technique (NWMS). The NEWS methods were applied to detect delamination damage due to low-velocity impact (<12 J) on various composite plates. The results showed that the proposed methodology appears to be highly sensitive to the presence of damage, with very promising future NDT and structural health monitoring applications.
Anomaly Monitoring Method for Key Components of Satellite
Fan, Linjun; Xiao, Weidong; Tang, Jun
2014-01-01
This paper presented a fault diagnosis method for key components of satellites, called the Anomaly Monitoring Method (AMM), which is made up of state estimation based on Multivariate State Estimation Techniques (MSET) and anomaly detection based on the Sequential Probability Ratio Test (SPRT). On the basis of an analysis of lithium-ion battery (LIB) failures, we divided the failures of LIBs into internal failure, external failure, and thermal runaway, and selected the electrolyte resistance (Re) and the charge transfer resistance (Rct) as the key parameters for state estimation. Then, using actual in-orbit telemetry data for the key parameters of LIBs, we obtained the actual residual value (RX) and healthy residual value (RL) of LIBs from the state estimation of MSET, and from these residual values we detected anomalous states with the anomaly detection of SPRT. Lastly, we conducted an example of AMM for LIBs and, according to the results of AMM, validated its feasibility and effectiveness by comparing it with the results of the threshold detection method (TDM). PMID:24587703
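MSET itself requires a trained memory matrix of healthy states, but the SPRT stage applied to the resulting residuals follows Wald's classical sequential test and can be sketched as below. The Gaussian hypotheses, error rates, and residual streams here are illustrative stand-ins, not the paper's telemetry.

```python
import math

def sprt(residuals, mean_shift=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test on a residual stream.
    H0: residual ~ N(0, sigma^2) (healthy) vs H1: residual ~ N(mean_shift, sigma^2)
    (faulty). Returns the decision and the index at which it was reached."""
    upper = math.log((1 - beta) / alpha)  # cross upward   -> accept H1 (anomaly)
    lower = math.log(beta / (1 - alpha))  # cross downward -> accept H0 (healthy)
    llr = 0.0
    for i, r in enumerate(residuals):
        # Per-sample Gaussian log-likelihood ratio.
        llr += (mean_shift / sigma ** 2) * (r - mean_shift / 2.0)
        if llr >= upper:
            return "anomaly", i
        if llr <= lower:
            return "healthy", i
    return "undecided", len(residuals) - 1

# Illustrative residual streams (not the paper's telemetry).
healthy_stream = [0.1, -0.2, 0.0, 0.15, -0.1, 0.05, -0.05, 0.1, 0.0, -0.15]
faulty_stream = [0.9, 1.1, 1.0, 1.2, 0.8, 1.05, 0.95, 1.1, 1.0, 0.9]
verdict_h = sprt(healthy_stream)
verdict_f = sprt(faulty_stream)
```

The appeal of SPRT for on-orbit monitoring is that it reaches a decision with a bounded false-alarm rate (alpha) and missed-detection rate (beta) using as few samples as the evidence allows, rather than thresholding each sample independently as TDM does.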
An x ray scatter approach for non-destructive chemical analysis of low atomic numbered elements
NASA Technical Reports Server (NTRS)
Ross, H. Richard
1993-01-01
A non-destructive x-ray scatter (XRS) approach has been developed, along with a rapid atomic scatter algorithm, for the detection and analysis of low atomic-numbered elements in solids, powders, and liquids. The present method of energy-dispersive x-ray fluorescence spectroscopy (EDXRF) makes the analysis of light elements (i.e., atomic number less than that of sodium, 11) extremely difficult. Detection and measurement become progressively worse as atomic number decreases: a competing process called Auger emission reduces fluorescent intensity, and this, coupled with the high mass absorption coefficients exhibited by low-energy x-rays, limits the detection and determination of low atomic-numbered elements by x-ray spectrometry. However, an indirect approach based on the intensity ratio of Compton and Rayleigh scattered radiation has been used to determine light-element components in alloys, plastics and other materials. This XRS technique provides qualitative and quantitative information about the overall constituents of a variety of samples.
Fast Coherent Differential Imaging for Exoplanet Imaging
NASA Astrophysics Data System (ADS)
Gerard, Benjamin; Marois, Christian; Galicher, Raphael; Veran, Jean-Pierre; Macintosh, B.; Guyon, O.; Lozi, J.; Pathak, P.; Sahoo, A.
2018-06-01
Direct detection and detailed characterization of exoplanets using extreme adaptive optics (ExAO) is a key science goal of future extremely large telescopes and space observatories. However, quasi-static wavefront errors will limit the sensitivity of this endeavor. Additional limitations for ground-based telescopes arise from residual AO-corrected atmospheric wavefront errors, which generate short-lived aberrations that average into a halo over a long exposure, also limiting the sensitivity of exoplanet detection. We develop the framework for a solution to both of these problems using the self-coherent camera (SCC), to be applied to ground-based telescopes, called the Fast Atmospheric SCC Technique (FAST). Simulations show that for typical ExAO targets the FAST approach can reach a raw contrast about 100 times better than what is currently achieved with ExAO instruments when extrapolated over an hour of observing time, illustrating that the sensitivity improvement from this method could play an essential role in the future ground-based detection and characterization of lower-mass, colder exoplanets.
High-speed railway real-time localization auxiliary method based on deep neural network
NASA Astrophysics Data System (ADS)
Chen, Dongjie; Zhang, Wensheng; Yang, Yang
2017-11-01
High-speed railway intelligent monitoring and management systems combine schedule integration, geographic information, location services, and data mining technology to integrate time and space data. Assistant localization is a significant submodule of the intelligent monitoring system. In practical application, the general approach is to capture image sequences of the components using a high-definition camera and then to apply digital image processing, target detection, tracking, and even behavior analysis methods. In this paper, we present an end-to-end character recognition method for high-speed railway pillar plate numbers based on a deep CNN called YOLO-toc. Different from other deep CNNs, YOLO-toc is an end-to-end multi-target detection framework; furthermore, it exhibits state-of-the-art performance on real-time detection, achieving nearly 50 fps on a GPU (GTX 960). Finally, we realize a real-time yet high-accuracy pillar plate number recognition system and integrate natural-scene OCR into a dedicated classification YOLO-toc model.
NASA Astrophysics Data System (ADS)
Tomàs-Buliart, Joan; Fernández, Marcel; Soriano, Miguel
Critical infrastructures are usually controlled by software entities. To monitor the correct functioning of these entities, a solution based on the use of mobile agents is proposed. Some proposals to detect modifications of mobile agents, such as digital signatures of code, exist, but they are oriented toward protecting software against modification or toward verifying that an agent has been executed correctly. The aim of our proposal is to guarantee that the software is being executed correctly by a non-trusted host. We achieve this objective by improving the Self-Validating Branch-Based Software Watermarking scheme of Myles et al. The proposed modification is the incorporation of an external element, called a sentinel, which controls branch targets. Applied to mobile agents, this technique can guarantee the correct operation of an agent or, at least, detect suspicious behaviour of a malicious host during the execution of the agent rather than only after the execution has finished.
NASA Technical Reports Server (NTRS)
Cockrum, R. H.
1982-01-01
One method being used to determine energy level(s) and electrical activity of impurities in silicon is described. The method is called capacitance transient spectroscopy (CTS). It can be classified into three basic categories: the thermally stimulated capacitance method, the voltage-stimulated capacitance method, and the light-stimulated capacitance method; the first two categories are discussed. From the total change in capacitance and the time constant of the capacitance response, emission rates, energy levels, and trap concentrations can be determined. A major advantage of using CTS is its ability to detect the presence of electrically active impurities that are invisible to other techniques, such as Zeeman effect atomic absorption, and the ability to detect more than one electrically active impurity in a sample. Examples of detection of majority and minority carrier traps from gold donor and acceptor centers in silicon using the capacitance transient spectrometer are given to illustrate the method and its sensitivity.
REVIEWS OF TOPICAL PROBLEMS: Relic gravitational waves and cosmology
NASA Astrophysics Data System (ADS)
Grishchuk, Leonid P.
2005-12-01
The paper begins with a brief recollection of interactions of the author with Ya B Zeldovich in the context of the study of relic gravitational waves. The principles and early results on the quantum-mechanical generation of cosmological perturbations are then summarized. The expected amplitudes of relic gravitational waves differ in various frequency windows, and therefore the techniques and prospects of their detection are distinct. One section of the paper describes the present state of efforts in direct detection of relic gravitational waves. Another section is devoted to indirect detection via the anisotropy and polarization measurements of the cosmic microwave background (CMB) radiation. It is emphasized throughout the paper that the inference about the existence and expected amount of relic gravitational waves is based on a solid theoretical foundation and the best available cosmological observations. It is also explained in great detail what went wrong with the so-called 'inflationary gravitational waves', whose amount is predicted by inflationary theorists to be negligibly small, thus depriving them of any observational significance.
Environmental Influences On Diel Calling Behavior In Baleen Whales
2015-09-30
and calm seas were infrequent and short (Figure 1b), making traditional shipboard marine mammal observations difficult. The real time detection...first use of real-time detection and reporting of marine mammal calls from autonomous underwater vehicles to adaptively plan research activities. 3...conferences: • 6th International Workshop on Detection, Classification, Localization, and Density Estimation (DCLDE) of Marine Mammals using
Brodie, Dana C; Dunn, Robert A
2015-01-01
Ten months of broadband seismic data, recorded on six ocean-bottom seismographs located in the Lau Basin, were examined to identify baleen whale species. As the first systematic survey of baleen whales in this part of the southwest Pacific Ocean, this study reveals the variety of species present and their temporal occurrence in and near the basin. Baleen whales produce species-specific low frequency calls that can be identified by distinct patterns in data spectrograms. By matching spectrograms with published accounts, fin, Bryde's, Antarctic blue, and New Zealand blue whale calls were identified. Probable whale sounds that could not be matched to published spectrograms, as well as non-biologic sounds that are likely of volcanogenic origin, were also recorded. Detections of fin whale calls (mid-June to mid-October) and blue whale calls (June through September) suggest that these species migrate through the region seasonally. Detections of Bryde's whale calls (primarily February to June, but also other times of the year) suggest this species resides around the basin nearly year round. The discovery of previously unpublished call types emphasizes the limited knowledge of the full call repertoires of baleen whales and the utility of using seismic survey data to enhance understanding in understudied regions.
An aerial-hawking bat uses stealth echolocation to counter moth hearing.
Goerlitz, Holger R; ter Hofstede, Hannah M; Zeale, Matt R K; Jones, Gareth; Holderied, Marc W
2010-09-14
Ears evolved in many nocturnal insects, including some moths, to detect bat echolocation calls and evade capture [1, 2]. Although there is evidence that some bats emit echolocation calls that are inconspicuous to eared moths, it is difficult to determine whether this was an adaptation to moth hearing or originally evolved for a different purpose [2, 3]. Aerial-hawking bats generally emit high-amplitude echolocation calls to maximize detection range [4, 5]. Here we present the first example of an echolocation counterstrategy to overcome prey hearing at the cost of reduced detection distance. We combined comparative bat flight-path tracking and moth neurophysiology with fecal DNA analysis to show that the barbastelle, Barbastella barbastellus, emits calls that are 10 to 100 times lower in amplitude than those of other aerial-hawking bats, remains undetected by moths until close, and captures mainly eared moths. Model calculations demonstrate that only bats emitting such low-amplitude calls hear moth echoes before their calls are conspicuous to moths. This stealth echolocation allows the barbastelle to exploit food resources that are difficult to catch for other aerial-hawking bats emitting calls of greater amplitude. Copyright © 2010 Elsevier Ltd. All rights reserved.
Detection of Lipid and Amphiphilic Biomarkers for Disease Diagnostics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kubicek-Sutherland, Jessica Z.; Vu, Dung M.; Mendez, Heather M.
Rapid diagnosis is crucial to effectively treating any disease. Biological markers, or biomarkers, have been widely used to diagnose a variety of infectious and non-infectious diseases. The detection of biomarkers in patient samples can also provide valuable information regarding progression and prognosis. Interestingly, many such biomarkers are composed of lipids and are amphiphilic in biochemistry, which leads them to be often sequestered by host carriers. Such sequestration enhances the difficulty of developing sensitive and accurate sensors for these targets. Many of the physiologically relevant molecules involved in pathogenesis and disease are indeed amphiphilic. This chemical property is likely essential for their biological function, but also makes them challenging to detect and quantify in vitro. In order to understand pathogenesis and disease progression while developing effective diagnostics, it is important to account for the biochemistry of lipid and amphiphilic biomarkers when creating novel techniques for the quantitative measurement of these targets. Here, we review techniques and methods used to detect lipid and amphiphilic biomarkers associated with disease, as well as their feasibility for use as diagnostic targets, highlighting the significance of their biochemical properties in the design and execution of laboratory and diagnostic strategies. Furthermore, the biochemistry of biological molecules is clearly relevant to their physiological function, and calling out the need for consideration of this feature in their study, and use as vaccine, diagnostic and therapeutic targets, is the overarching motivation for this review.
Echolocation calls of Poey's flower bat (Phyllonycteris poeyi) unlike those of other phyllostomids.
Mora, Emanuel C; Macías, Silvio
2007-05-01
Unlike any other foraging phyllostomid bat studied to date, Poey's flower bats (Phyllonycteris poeyi-Phyllostomidae) emit relatively long (up to 7.2 ms), intense, single-harmonic echolocation calls. These calls are readily detectable at distances of at least 15 m. Furthermore, the echolocation calls contain only the first harmonic, which is usually filtered out in the vocal tract of phyllostomids. The foraging echolocation calls of P. poeyi are more like search-phase echolocation calls of sympatric aerial-feeding bats (Molossidae, Vespertilionidae, Mormoopidae). Intense, long, narrowband, single-harmonic echolocation calls focus acoustic energy maximizing range and favoring detection, which may be particularly important for cruising bats, like P. poeyi, when flying in the open. Flying in enclosed spaces, P. poeyi emit short, low-intensity, frequency-modulated, multiharmonic echolocation calls typical of other phyllostomids. This is the first report of a phyllostomid species emitting long, intense, single-harmonic echolocation calls with most energy in the first harmonic.
Effect of temporal and spectral noise features on gap detection behavior by calling green treefrogs.
Höbel, Gerlinde
2014-10-01
Communication plays a central role in the behavioral ecology of many animals, yet the background noise generated by large breeding aggregations may impair effective communication. A common behavioral strategy to ameliorate noise interference is gap detection, where signalers display primarily during lulls in the background noise. When attempting gap detection, signalers have to deal with the fact that the spacing and duration of silent gaps is often unpredictable, and that noise varies in its spectral composition and may thus vary in the degree in which it impacts communication. I conducted playback experiments to examine how male treefrogs deal with the problem that refraining from calling while waiting for a gap to appear limits a male's ability to attract females, yet producing calls during noise also interferes with effective sexual communication. I found that the temporal structure of noise (i.e., duration of noise and silent gap segments) had a stronger effect on male calling behavior than the spectral composition. Males placed calls predominantly during silent gaps and avoided call production during short, but not long, noise segments. This suggests that male treefrogs use a calling strategy that maximizes the production of calls without interference, yet allows for calling to persist if lulls in the background noise are infrequent. Copyright © 2014 Elsevier B.V. All rights reserved.
Luo, Jinhong; Koselj, Klemen; Zsebők, Sándor; Siemers, Björn M.; Goerlitz, Holger R.
2014-01-01
Climate change impacts the biogeography and phenology of plants and animals, yet the underlying mechanisms are little known. Here, we present a functional link between rising temperature and the prey detection ability of echolocating bats. The maximum distance for echo-based prey detection is physically determined by sound attenuation. Attenuation is more pronounced for high-frequency sound, such as echolocation, and is a nonlinear function of both call frequency and ambient temperature. Hence, the prey detection ability, and thus possibly the foraging efficiency, of echolocating bats is susceptible to rising temperatures through climate change. Using present-day climate data and projected temperature rises, we modelled this effect for the entire range of bat call frequencies and climate zones around the globe. We show that depending on call frequency, the prey detection volume of bats will either decrease or increase: species calling above a crossover frequency will lose and species emitting lower frequencies will gain prey detection volume, with crossover frequency and magnitude depending on the local climatic conditions. Within local species assemblages, this may cause a change in community composition. Global warming can thus directly affect the prey detection ability of individual bats and indirectly their interspecific interactions with competitors and prey. PMID:24335559
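The dependence of detection distance on attenuation described above can be sketched with a simplified active-sonar budget. This is a minimal illustration, not the authors' model: the 40·log10(r) term assumes two-way spherical spreading for call and echo, target strength is folded into the threshold, and the absorption coefficients passed in are placeholders rather than ISO 9613-1 values.

```python
import math

def max_detection_range(source_level_db, threshold_db, alpha_db_per_m):
    """Solve 40*log10(r) + 2*alpha*r = source_level - threshold for r (metres).

    Spreading and absorption are doubled for the call-and-echo round trip.
    The left side is monotonic in r, so bisection converges reliably."""
    budget = source_level_db - threshold_db
    loss = lambda r: 40.0 * math.log10(r) + 2.0 * alpha_db_per_m * r
    lo, hi = 1e-3, 1e4
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if loss(mid) < budget:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Since prey detection volume scales roughly with the cube of this range, even modest changes in the temperature-dependent absorption coefficient alpha translate into large volume changes.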
NASA Astrophysics Data System (ADS)
Yang, C. H.; Kenduiywo, B. K.; Soergel, U.
2016-06-01
Persistent Scatterer Interferometry (PSI) is a technique to detect a network of extracted persistent scatterer (PS) points which feature temporal phase stability and strong radar signal throughout a time-series of SAR images. The small surface deformations at such PS points are estimated. PSI works particularly well in monitoring human settlements because the regular substructures of man-made objects give rise to a large number of PS points. If such structures and/or substructures substantially alter or even vanish due to a big change such as construction, their PS points are discarded without additional exploration during the standard PSI procedure. Such rejected points are called big change (BC) points. On the other hand, incoherent change detection (ICD) relies on local comparison of multi-temporal images (e.g., image difference, image ratio) to highlight scene modifications at a larger size rather than at detail level. However, image noise inevitably degrades ICD accuracy. We propose a change detection approach based on PSI to combine the benefits of PSI and ICD. PS points are extracted by the PSI procedure. A local change index is introduced to quantify the probability of a big change for each point. We propose an automatic thresholding method on this change index to extract BC points along with a clue to the period in which they emerged. In the end, PS and BC points are integrated into a change detection image. Our method is tested at a site north of Berlin main station, where steady, demolished, and erected building substructures are successfully detected. The results are consistent with ground truth derived from a time-series of aerial images provided by Google Earth. In addition, we apply our technique to traffic infrastructure, business district, and sports playground monitoring.
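The abstract does not spell out the automatic thresholding step on the change index. A common generic choice for splitting a roughly bimodal index into "steady PS" and "big change" classes is Otsu's method, sketched below purely as an illustrative stand-in for whatever the authors actually use:

```python
def otsu_threshold(values, bins=64):
    """Return the histogram threshold maximizing between-class variance
    (Otsu's method); points above it would be labelled BC candidates."""
    vmin, vmax = min(values), max(values)
    width = (vmax - vmin) / bins or 1.0
    hist = [0] * bins
    for v in values:
        hist[min(int((v - vmin) / width), bins - 1)] += 1
    total = len(values)
    total_sum = sum((vmin + (i + 0.5) * width) * h for i, h in enumerate(hist))
    best, thresh, w0, sum0 = 0.0, vmin, 0, 0.0
    for i in range(bins):
        w0 += hist[i]
        if w0 == 0 or w0 == total:
            continue
        sum0 += (vmin + (i + 0.5) * width) * hist[i]
        m0 = sum0 / w0                       # mean of the low class
        m1 = (total_sum - sum0) / (total - w0)  # mean of the high class
        var_between = w0 * (total - w0) * (m0 - m1) ** 2
        if var_between > best:
            best, thresh = var_between, vmin + (i + 1) * width
    return thresh
```

Points whose change index exceeds the returned threshold would then be flagged as BC points and integrated with the surviving PS network.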
NASA Astrophysics Data System (ADS)
Alkilani, Amjad; Shirkhodaie, Amir
2013-05-01
Handling, manipulation, and placement of objects, hereon called Human-Object Interaction (HOI), in the environment generate sounds. Such sounds are readily identifiable by human hearing. However, in the presence of background environment noises, recognition of minute HOI sounds is challenging, though vital for improvement of multi-modality sensor data fusion in Persistent Surveillance Systems (PSS). Identification of HOI sound signatures can be used as a precursor to detection of pertinent threats that other sensor modalities may otherwise fail to detect. In this paper, we present a robust method for detection and classification of HOI events via clustering of features extracted from training HOI acoustic sound waves. In this approach, salient sound events are preliminarily identified and segmented from the background via a sound energy tracking method. Upon this segmentation, the frequency spectral pattern of each sound event is modeled and its features are extracted to form a feature vector for training. To reduce the dimensionality of the training feature space, a Principal Component Analysis (PCA) technique is employed. To expedite classification of test feature vectors, kd-tree and Random Forest classifiers are trained for rapid classification of sound waves. Each classifier employs a different similarity-distance matching technique for classification. The performance of the classifiers is compared on a batch of training HOI acoustic signatures. Furthermore, to facilitate semantic annotation of acoustic sound events, a scheme based on Transducer Markup Language (TML) is proposed. The results demonstrate that the proposed approach is both reliable and effective, and can be extended to future PSS applications.
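The sound-energy tracking segmentation stage can be sketched as a short-time-energy detector. The frame length, adaptation rate, and threshold factor below are assumed parameters for illustration, not the authors' exact method:

```python
def segment_events(samples, frame=256, hop=128, factor=3.0):
    """Segment salient sound events from background by short-time energy.

    A frame is 'active' when its energy exceeds `factor` times the running
    background estimate; consecutive active frames form one (start, end)
    event. The background estimate adapts only on quiet frames, so loud
    events do not inflate it."""
    bg, events, start = None, [], None
    for i in range(0, len(samples) - frame + 1, hop):
        e = sum(s * s for s in samples[i:i + frame]) / frame
        bg = e if bg is None else bg
        if e > factor * bg:
            start = i if start is None else start
        else:
            if start is not None:
                events.append((start, i + frame))
                start = None
            bg = 0.95 * bg + 0.05 * e   # slow adaptation on quiet frames
    if start is not None:
        events.append((start, len(samples)))
    return events
```

Each returned segment would then be handed to the spectral feature extraction and PCA stages described in the abstract.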
Byoun, Mun Sub; Yoo, Changhoon; Sim, Sang Jun; Lim, Chae Seung; Kim, Sung Woo
2018-01-01
Real-time PCR, also called quantitative PCR (qPCR), has been a powerful analytical tool for the detection of nucleic acids since its development. For both biological research and diagnostic needs, the qPCR technique has in recent years required the capacity to detect multiple genes. Solid-phase PCR (SP-PCR), in which primers of one or both directions are immobilized on solid substrates, can analyze multiplex genetic targets. However, conventional SP-PCR has been restricted in application by a lack of PCR efficiency and quantitative resolution. Here we introduce an advanced qPCR with a primer-incorporated network (PIN). Primers of one direction are immobilized in a porous hydrogel particle by covalent bonds, and primers of the other direction are temporarily immobilized on so-called 'Supplimers'. The Supplimers release the primers into the aqueous phase in the hydrogel during the thermal cycling of PCR. This yielded a high PCR efficiency of over 92% with high reliability. It reduced the formation of primer dimers and improved the selectivity of qPCR thanks to the strategy of 'the right primers supplied to the right place only'. By conducting a six-plex qPCR in 30 minutes, we analyzed DNA samples originating from malaria patients and successfully identified malaria species in a single reaction. PMID:29293604
Tzvetkov, Mladen V; Becker, Christian; Kulle, Bettina; Nürnberg, Peter; Brockmöller, Jürgen; Wojnowski, Leszek
2005-02-01
Whole-genome DNA amplification by multiple displacement (MD-WGA) is a promising tool to obtain sufficient DNA amounts from samples of limited quantity. Using Affymetrix' GeneChip Human Mapping 10K Arrays, we investigated the accuracy and allele amplification bias in DNA samples subjected to MD-WGA. We observed an excellent concordance (99.95%) between single-nucleotide polymorphisms (SNPs) called both in the nonamplified and the corresponding amplified DNA. This concordance was only 0.01% lower than the intra-assay reproducibility of the genotyping technique used. However, MD-WGA failed to amplify an estimated 7% of polymorphic loci. Due to the algorithm used to call genotypes, this was detected only for heterozygous loci. We achieved a 4.3-fold reduction of noncalled SNPs by combining the results from two independent MD-WGA reactions. This indicated that inter-reaction variations rather than specific chromosomal loci reduced the efficiency of MD-WGA. Consistently, we detected no regions of reduced amplification, with the exception of several SNPs located near chromosomal ends. Altogether, despite a substantial loss of polymorphic sites, MD-WGA appears to be the current method of choice to amplify genomic DNA for array-based SNP analyses. The number of nonamplified loci can be substantially reduced by amplifying each DNA sample in duplicate.
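The duplicate-reaction combination step reported above lends itself to a simple illustration. The merge rule below (rescue a locus called in either replicate, conservatively no-call on discordance) is an assumed policy consistent with the abstract, not the authors' published algorithm:

```python
def combine_replicates(calls_a, calls_b, no_call="NC"):
    """Combine SNP genotype calls from two independent MD-WGA reactions.

    A locus called in either replicate is rescued; discordant calls are
    conservatively reset to no-call."""
    combined = {}
    for snp in set(calls_a) | set(calls_b):
        a = calls_a.get(snp, no_call)
        b = calls_b.get(snp, no_call)
        if a == no_call:
            combined[snp] = b
        elif b == no_call or a == b:
            combined[snp] = a
        else:
            combined[snp] = no_call   # discordant between reactions
    return combined
```

Because non-calls are driven by inter-reaction variation rather than fixed chromosomal loci, the two replicates fail at largely independent loci, which is why this merge reduces the no-call set substantially.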
Detection of inter-frame forgeries in digital videos.
K, Sitara; Mehtre, B M
2018-05-26
Videos are acceptable as evidence in a court of law, provided their authenticity and integrity are scientifically validated. Videos recorded by surveillance systems are susceptible to malicious alterations of visual content by perpetrators, locally or remotely. Such malicious alterations of video contents (called video forgeries) are categorized into inter-frame and intra-frame forgeries. In this paper, we propose inter-frame forgery detection techniques using tamper traces from spatio-temporal and compressed domains. Pristine videos containing frames that are recorded during a sudden camera zooming event may get wrongly classified as tampered videos, leading to an increase in false positives. To address this issue, we propose a method for zooming detection, and it is incorporated in video tampering detection. Frame shuffling detection, which has not been explored so far, is also addressed in our work. Our method is capable of differentiating various inter-frame tamper events and localizing them in the temporal domain. The proposed system is tested on 23,586 videos, of which 2346 are pristine and the rest are candidates for inter-frame forged videos. Experimental results show that we have successfully detected frame shuffling with encouraging accuracy rates. We have achieved improved accuracy on forgery detection in frame insertion, frame deletion and frame duplication. Copyright © 2018. Published by Elsevier B.V.
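Of the inter-frame forgeries discussed, frame duplication is the easiest to caricature: compare decoded frames and look for repeated runs. This toy detector (function name and parameters are hypothetical) assumes bit-identical copies and ignores recompression, which the tamper-trace methods above must handle in practice:

```python
def find_duplicated_runs(frames, min_len=3):
    """Report (orig_start, copy_start, length) for non-overlapping runs of
    at least `min_len` frames whose content repeats earlier in the video.

    `frames` is a list of hashable frame buffers (e.g. bytes)."""
    first, runs = {}, []
    i = 0
    while i < len(frames):
        f = frames[i]
        if f in first and first[f] != i:
            j, k = first[f], 0
            # extend the match; j + k < i keeps the two runs disjoint
            while (i + k < len(frames) and frames[j + k] == frames[i + k]
                   and j + k < i):
                k += 1
            if k >= min_len:
                runs.append((j, i, k))
                i += k
                continue
        first.setdefault(f, i)
        i += 1
    return runs
```

The `min_len` floor suppresses false hits from static scenes, where short runs of identical frames occur legitimately.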
NASA Astrophysics Data System (ADS)
Liu, Shaoying; King, Michael A.; Brill, Aaron B.; Stabin, Michael G.; Farncombe, Troy H.
2008-02-01
Monte Carlo (MC) simulation is a well-utilized tool for simulating photon transport in single photon emission computed tomography (SPECT) due to its ability to accurately model the physical processes of photon transport. As a consequence of this accuracy, it suffers from a relatively low detection efficiency and long computation time. One technique used to improve the speed of MC modeling is the effective and well-established variance reduction technique (VRT) known as forced detection (FD). With this method, photons are followed as they traverse the object under study but are then forced to travel in the direction of the detector surface, whereby they are detected at a single detector location. Another method, called convolution-based forced detection (CFD), is based on the fundamental idea of FD with the exception that photons are detected at multiple detector locations, with contributions determined by a distance-dependent blurring kernel. In order to further increase the speed of MC, a method named multiple projection convolution-based forced detection (MP-CFD) is presented. Rather than forcing photons to hit a single detector, the MP-CFD method follows the photon transport through the object but then, at each scatter site, forces the photon to interact with a number of detectors at a variety of angles surrounding the object. This way, it is possible to simulate all the projection images of a SPECT simulation in parallel, rather than as independent projections. The result of this is vastly improved simulation time, as much of the computational load of simulating photon transport through the object is incurred only once for all projection angles. The results of the proposed MP-CFD method agree well with the experimental data in measurements of point spread function (PSF), producing a correlation coefficient (r²) of 0.99 compared to experimental data. The speed of MP-CFD is shown to be about 60 times faster than a regular forced detection MC program with similar results.
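The variance-reduction idea behind forced detection can be illustrated in a deliberately stripped-down setting: estimating the chance that an isotropically emitted photon enters a small detector cone, in vacuum from a single point (no attenuation or scatter modelling, unlike real SPECT transport). The analog estimator counts rare hits; the forced estimator steers every photon into the cone and carries the solid-angle fraction as a weight:

```python
import math
import random

def solid_angle_fraction(half_angle_rad):
    """Fraction of the unit sphere subtended by a cone of given half-angle."""
    return (1.0 - math.cos(half_angle_rad)) / 2.0

def analog_mc(n, half_angle_rad, rng):
    """Analog Monte Carlo: sample isotropic directions (cos(theta) uniform
    on [-1, 1]) and count how many fall inside the detector cone."""
    cos_min = math.cos(half_angle_rad)
    hits = sum(1 for _ in range(n) if rng.uniform(-1.0, 1.0) >= cos_min)
    return hits / n

def forced_detection(n, half_angle_rad):
    """Forced detection: every photon is forced into the detector cone and
    weighted by the cone's solid-angle fraction. In this attenuation-free
    toy case the weight is constant, so the estimator has zero variance."""
    w = solid_angle_fraction(half_angle_rad)
    return sum(w for _ in range(n)) / n
```

In real FD/CFD codes the weight additionally includes attenuation along the forced path and, for CFD, the distance-dependent blurring kernel; the toy case only shows why forcing removes the wasted histories.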
Environmental Influences on Diel Calling Behavior in Baleen Whales
2013-09-30
to allow known calls (e.g., right whale upcall and gunshot, fin whale 20-Hz pulses, humpback whale downsweeps, sei whale low-frequency downsweeps...fin, humpback, sei, and North Atlantic right whales. Real-time detections were evaluated after recovery of the gliders by (1) comparing the acoustic...from both an aircraft and ship. The overall false detection rate for individual calls was 14%, and for right, humpback, and fin whales, false
Andergassen, Ulrich; Kölbl, Alexandra C; Mahner, Sven; Jeschke, Udo
2016-04-01
Cells that detach from a primary epithelial tumour and migrate through lymphatic vessels and the blood stream are called 'circulating tumour cells'. These cells are considered to be the main source of distant metastasis and are correlated with a worse prognosis concerning progression-free and overall survival of the patients. Therefore, the detection of minimal residual disease is of great importance regarding therapeutic decisions. Many different detection strategies are already available, but only one method, the CellSearch® system, has reached FDA approval. The present review focusses on the detection of circulating tumour cells by means of real-time PCR, a highly sensitive method based on differences in gene expression between normal and malignant cells. Strategies for an enrichment of tumour cells are mentioned, as well as a large panel of potential marker genes. Drawbacks and advantages of the technique are elucidated; the greatest advantage might be that, by selection of appropriate marker genes, tumour cells that have already undergone epithelial-to-mesenchymal transition can also be detected. Finally, the application of real-time PCR in different gynaecological malignancies is described, with breast cancer being the most studied cancer entity.
NASA Astrophysics Data System (ADS)
Baziw, Erick; Verbeek, Gerald
2012-12-01
Among engineers there is considerable interest in the real-time identification of "events" within time series data with a low signal to noise ratio. This is especially true for acoustic emission analysis, which is utilized to assess the integrity and safety of many structures and is also applied in the field of passive seismic monitoring (PSM). Here an array of seismic receivers is used to acquire acoustic signals to monitor locations where seismic activity is expected: underground excavations, deep open pits and quarries, reservoirs into which fluids are injected or from which fluids are produced, permeable subsurface formations, or sites of large underground explosions. The most important element of PSM is event detection: the monitoring of seismic acoustic emissions is a continuous, real-time process which typically runs 24 h a day, 7 days a week, and therefore a PSM system with poor event detection can easily acquire terabytes of useless data as it does not identify crucial acoustic events. This paper outlines a new algorithm developed for this application, the so-called SEED™ (Signal Enhancement and Event Detection) algorithm. The SEED™ algorithm uses real-time Bayesian recursive estimation digital filtering techniques for PSM signal enhancement and event detection.
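The abstract does not detail the SEED™ filter internals. As a point of reference only, the classic recursive STA/LTA (short-term/long-term average) trigger below performs continuous event detection in O(1) per sample, which is the baseline SEED™-style systems are typically compared against; all parameters and thresholds here are illustrative:

```python
def sta_lta_trigger(samples, n_sta=20, n_lta=200, threshold=4.0):
    """Recursive STA/LTA event picker for a continuous sample stream.

    Exponential averages of the instantaneous energy approximate the short-
    and long-term means; an event is picked when their ratio first crosses
    `threshold`, and the trigger re-arms once the ratio decays below 1.5."""
    sta = lta = 1e-12
    a_s, a_l = 1.0 / n_sta, 1.0 / n_lta
    picks, armed = [], True
    for i, x in enumerate(samples):
        e = x * x
        sta += a_s * (e - sta)
        lta += a_l * (e - lta)
        ratio = sta / lta
        if armed and i > n_lta and ratio > threshold:
            picks.append(i)
            armed = False          # one pick per event
        elif ratio < 1.5:
            armed = True           # re-arm after the event decays
    return picks
```

A 24/7 monitoring system would archive only the windows around returned picks, which is exactly the data-volume problem the abstract describes.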
Yegnasubramanian, Srinivasan; Lin, Xiaohui; Haffner, Michael C; DeMarzo, Angelo M; Nelson, William G
2006-02-09
Hypermethylation of CpG island (CGI) sequences is a nearly universal somatic genome alteration in cancer. Rapid and sensitive detection of DNA hypermethylation would aid in cancer diagnosis and risk stratification. We present a novel technique, called COMPARE-MS, that can rapidly and quantitatively detect CGI hypermethylation with high sensitivity and specificity in hundreds of samples simultaneously. To quantitate CGI hypermethylation, COMPARE-MS uses real-time PCR of DNA that was first digested by methylation-sensitive restriction enzymes and then precipitated by methyl-binding domain polypeptides immobilized on a magnetic solid matrix. We show that COMPARE-MS could detect five genome equivalents of methylated CGIs in a 1000- to 10,000-fold excess of unmethylated DNA. COMPARE-MS was used to rapidly quantitate hypermethylation at multiple CGIs in >155 prostate tissues, including benign and malignant prostate specimens, and prostate cell lines. This analysis showed that GSTP1, MDR1 and PTGS2 CGI hypermethylation as determined by COMPARE-MS could differentiate between malignant and benign prostate with sensitivities >95% and specificities approaching 100%. This novel technology could significantly improve our ability to detect CGI hypermethylation.
Scarano, Simona; Ermini, Maria Laura; Spiriti, Maria Michela; Mascini, Marco; Bogani, Patrizia; Minunni, Maria
2011-08-15
Surface plasmon resonance imaging (SPRi) was used as the transduction principle for the development of optical-based sensing for transgene detection in human cell lines. The objective was to develop a multianalyte, label-free, and real-time approach for DNA sequences that are identified as markers of transgenosis events. The strategy exploits SPRi sensing to detect the transgenic event by targeting selected marker sequences, which are present on the shuttle vector backbone used to carry out the transfection of human embryonic kidney (HEK) cell lines. Here, we identified DNA sequences belonging to the Cytomegalovirus promoter and the Enhanced Green Fluorescent Protein gene. System development is discussed in terms of probe efficiency and the influence of secondary structures on the biorecognition reaction on the sensor; moreover, optimization of PCR sample pretreatment was carried out to allow hybridization on the biosensor, together with an approach to increase SPRi signals by in situ mass enhancement. Real-time PCR was also employed as a reference technique for marker sequence detection in human HEK cells. We can foresee that the developed system may have potential applications in the field of antidoping research focused on the so-called gene doping.
Homozygous and hemizygous CNV detection from exome sequencing data in a Mendelian disease cohort.
Gambin, Tomasz; Akdemir, Zeynep C; Yuan, Bo; Gu, Shen; Chiang, Theodore; Carvalho, Claudia M B; Shaw, Chad; Jhangiani, Shalini; Boone, Philip M; Eldomery, Mohammad K; Karaca, Ender; Bayram, Yavuz; Stray-Pedersen, Asbjørg; Muzny, Donna; Charng, Wu-Lin; Bahrambeigi, Vahid; Belmont, John W; Boerwinkle, Eric; Beaudet, Arthur L; Gibbs, Richard A; Lupski, James R
2017-02-28
We developed an algorithm, HMZDelFinder, that uses whole exome sequencing (WES) data to identify rare and intragenic homozygous and hemizygous (HMZ) deletions that may represent complete loss-of-function of the indicated gene. HMZDelFinder was applied to 4866 samples in the Baylor-Hopkins Center for Mendelian Genomics (BHCMG) cohort and detected 773 HMZ deletion calls (567 homozygous and 206 hemizygous) with an estimated sensitivity of 86.5% (82% for single-exonic and 88% for multi-exonic calls) and precision of 78% (53% for single-exonic and 96% for multi-exonic calls). Out of 773 HMZDelFinder-detected deletion calls, 82 were subjected to array comparative genomic hybridization (aCGH) and/or breakpoint PCR and 64 were confirmed. These include 18 single-exon deletions, of which 8 were exclusively detected by HMZDelFinder and not by any of seven other CNV detection tools examined. Further investigation of the 64 validated deletion calls revealed at least 15 pathogenic HMZ deletions. Of those, 7 accounted for 17-50% of pathogenic CNVs in different disease cohorts in which 7.1-11% of the molecular diagnostic solved rate was attributed to CNVs. In summary, we present an algorithm to detect rare, intragenic, single-exon deletion CNVs using WES data; this tool can be useful for disease gene discovery efforts and clinical WES analyses. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
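Coverage-based HMZ deletion calling of the kind HMZDelFinder performs can be caricatured in a few lines: an exon is a candidate when one sample's read depth collapses while the cohort shows the exon is normally well captured. The data layout and thresholds below are assumptions for illustration, not HMZDelFinder's actual statistical model:

```python
import statistics

def call_hmz_deletions(depths, sample, max_ratio=0.1, min_cohort_median=20):
    """Flag candidate homozygous/hemizygous deletions from exon read depths.

    `depths` maps exon -> {sample_name: mean read depth}. An exon is called
    when the sample's depth is at most `max_ratio` of the cohort median,
    and the cohort median shows the exon is normally well covered (this
    second test guards against poorly captured exons)."""
    calls = []
    for exon, per_sample in depths.items():
        others = [d for name, d in per_sample.items() if name != sample]
        med = statistics.median(others)
        if med >= min_cohort_median and per_sample[sample] <= max_ratio * med:
            calls.append(exon)
    return calls
```

Real callers additionally normalize for batch and GC effects and merge adjacent exon calls into deletion intervals before validation by aCGH or breakpoint PCR.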
VarDict: a novel and versatile variant caller for next-generation sequencing in cancer research
Lai, Zhongwu; Markovets, Aleksandra; Ahdesmaki, Miika; Chapman, Brad; Hofmann, Oliver; McEwen, Robert; Johnson, Justin; Dougherty, Brian; Barrett, J. Carl; Dry, Jonathan R.
2016-01-01
Accurate variant calling in next-generation sequencing (NGS) is critical to understand cancer genomes better. Here we present VarDict, a novel and versatile variant caller for both DNA- and RNA-sequencing data. VarDict simultaneously calls SNVs, MNVs, InDels, complex and structural variants, expanding the detected genetic driver landscape of tumors. It performs local realignments on the fly for more accurate allele frequency estimation. VarDict performance scales linearly with sequencing depth, enabling the ultra-deep sequencing used to explore tumor evolution or detect tumor DNA circulating in blood. In addition, VarDict performs amplicon-aware variant calling for polymerase chain reaction (PCR)-based targeted sequencing often used in diagnostic settings, and is able to detect PCR artifacts. Finally, VarDict also detects differences in somatic and loss of heterozygosity variants between paired samples. VarDict reprocessing of The Cancer Genome Atlas (TCGA) Lung Adenocarcinoma dataset called known driver mutations in KRAS, EGFR, BRAF, PIK3CA and MET in 16% more patients than previously published variant calls. We believe VarDict will greatly facilitate application of NGS in clinical cancer research. PMID:27060149
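The allele-frequency side of variant calling can be sketched naively: count alleles in a pileup and apply simple filters. This omits VarDict's local realignment, quality modelling, and paired-sample logic entirely; the function name and thresholds are illustrative:

```python
def call_variants(pileup, min_depth=8, min_af=0.01, min_reads=2):
    """Naive allele-frequency caller over a pileup summary.

    `pileup` maps position -> (ref_base, {base: read count}). Emits
    (pos, ref, alt, af) tuples for non-reference alleles passing minimal
    depth, supporting-read, and allele-frequency filters."""
    variants = []
    for pos, (ref, counts) in pileup.items():
        depth = sum(counts.values())
        if depth < min_depth:
            continue
        for base, n in counts.items():
            af = n / depth
            if base != ref and n >= min_reads and af >= min_af:
                variants.append((pos, ref, base, af))
    return variants
```

Because each position is processed independently, runtime grows linearly with depth, which is the property that makes ultra-deep sequencing tractable for callers of this style.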
Zou, Zhengxia; Shi, Zhenwei
2018-03-01
We propose a new paradigm for target detection in high resolution aerial remote sensing images under small target priors. Previous remote sensing target detection methods frame the detection as learning of a detection model + inference of class-label and bounding-box coordinates. Instead, we formulate it from a Bayesian view that, at the inference stage, the detection model is adaptively updated to maximize its posterior, which is determined by both training and observation. We call this paradigm "random access memories (RAM)." In this paradigm, "memories" can be interpreted as any model distribution learned from training data, and "random access" means accessing memories and randomly adjusting the model at the detection phase to obtain better adaptivity to any unseen distribution of test data. By leveraging recent detection techniques, e.g., deep Convolutional Neural Networks and multi-scale anchors, experimental results on a public remote sensing target detection data set show that our method outperforms several other state-of-the-art methods. We also introduce a new data set, "LEarning, VIsion and Remote sensing laboratory (LEVIR)", which is one order of magnitude larger than other data sets in this field. LEVIR consists of a large set of Google Earth images, with over 22k images and 10k independently labeled targets. RAM gives a noticeable accuracy upgrade (a mean average precision improvement of 1%-4%) over our baseline detectors with acceptable computational overhead.
Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N
2017-09-01
In our previous work, we have demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The developed fault detection algorithm provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) with different window sizes. However, most real systems are nonlinear, which makes the linear PCA method unable to tackle nonlinearity to a great extent. Thus, in this paper, first, we apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model. The KPCA is among the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that applies exponential weights to the residuals in the moving window (instead of equal weighting), as this might further improve fault detection performance by reducing the FAR using an exponentially weighted moving average (EWMA). The developed detection method, which is called EWMA-GLRT, provides improved properties, such as smaller missed detection rates, smaller FARs, and a smaller average run length. The idea behind the developed EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data information in a decreasing exponential fashion, giving more weight to more recent data. This provides a more accurate estimation of the GLRT statistic and a stronger memory that enables better decision making with respect to fault detection. Therefore, in this paper, a KPCA-based EWMA-GLRT method is developed and utilized in practice to improve fault detection in biological phenomena modeled by S-systems and to enhance monitoring of the process mean.
The idea behind a KPCA-based EWMA-GLRT fault detection algorithm is to combine the advantages brought forward by the proposed EWMA-GLRT fault detection chart with the KPCA model. Thus, it is used to enhance fault detection of the Cad System in E. coli model through monitoring some of the key variables involved in this model such as enzymes, transport proteins, regulatory proteins, lysine, and cadaverine. The results demonstrate the effectiveness of the proposed KPCA-based EWMA-GLRT method over Q, GLRT, EWMA, Shewhart, and moving window-GLRT methods. The detection performance is assessed and evaluated in terms of FAR, missed detection rates, and average run length (ARL1) values.
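The EWMA recursion at the heart of the chart is compact enough to sketch. The GLRT layer and KPCA residual generation are omitted; this shows only the exponentially weighted statistic and its control limit, with an assumed in-control residual standard deviation:

```python
import math

def ewma_alarms(residuals, lam=0.2, L=3.0, sigma=1.0):
    """EWMA monitoring chart: z_t = lam*r_t + (1 - lam)*z_{t-1}.

    Alarms when |z_t| exceeds L times the EWMA's asymptotic standard
    deviation sigma*sqrt(lam/(2 - lam)). Recent residuals carry
    exponentially more weight, which shortens the detection delay (run
    length) for small, persistent mean shifts."""
    limit = L * sigma * math.sqrt(lam / (2.0 - lam))
    z, alarms = 0.0, []
    for t, r in enumerate(residuals):
        z = lam * r + (1.0 - lam) * z
        if abs(z) > limit:
            alarms.append(t)
    return alarms
```

Note the trade-off the abstract exploits: a smaller `lam` gives a stronger memory and a tighter limit, improving sensitivity to small shifts at some cost in responsiveness to abrupt ones.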
Structural health management of aerospace hotspots under fatigue loading
NASA Astrophysics Data System (ADS)
Soni, Sunilkumar
Sustainability and life-cycle assessments of aerospace systems, such as aircraft structures and propulsion systems, represent growing challenges in engineering. Hence, there has been an increasing demand in using structural health monitoring (SHM) techniques for continuous monitoring of these systems in an effort to improve safety and reduce maintenance costs. The current research is part of an ongoing multidisciplinary effort to develop a robust SHM framework resulting in improved models for damage-state awareness and life prediction, and enhancing capability of future aircraft systems. Lug joints, a typical structural hotspot, were chosen as the test article for the current study. The thesis focuses on integrated SHM techniques for damage detection and characterization in lug joints. Piezoelectric wafer sensors (PZTs) are used to generate guided Lamb waves as they can be easily used for onboard applications. Sensor placement in certain regions of a structural component is not feasible due to the inaccessibility of the area to be monitored. Therefore, a virtual sensing concept is introduced to acquire sensor data from finite element (FE) models. A full three dimensional FE analysis of lug joints with piezoelectric transducers, accounting for piezoelectrical-mechanical coupling, was performed in Abaqus and the sensor signals were simulated. These modeled sensors are called virtual sensors. A combination of real data from PZTs and virtual sensing data from FE analysis is used to monitor and detect fatigue damage in aluminum lug joints. Experiments were conducted on lug joints under fatigue loads and sensor signals collected were used to validate the simulated sensor response. An optimal sensor placement methodology for lug joints is developed based on a detection theory framework to maximize the detection rate and minimize the false alarm rate. The placement technique is such that the sensor features can be directly correlated to damage. 
The technique accounts for a number of factors, such as actuation frequency and strength, minimum damage size, damage detection scheme, material damping, signal to noise ratio and sensing radius. Advanced information processing methodologies are discussed for damage diagnosis. A new, instantaneous approach for damage detection, localization and quantification is proposed for applications to practical problems associated with changes in reference states under different environmental and operational conditions. Such an approach improves feature extraction for state awareness, resulting in robust life prediction capabilities.
Mana Kialengila, Didi; Wolfs, Kris; Bugalama, John; Van Schepdael, Ann; Adams, Erwin
2013-11-08
Determination of volatile organic components (VOCs) is often done by static headspace gas chromatography, as this technique is very robust and combines easy sample preparation with good selectivity and low detection limits. This technique is used nowadays in different applications which have in common a dirty matrix that would be problematic in direct injection approaches. Headspace by nature favors the most volatile compounds, preventing the less volatile ones from reaching the injector and column. As a consequence, determination of a high-boiling solvent in a lower-boiling matrix becomes challenging. Determination of VOCs such as xylenes, cumene, N,N-dimethylformamide (DMF), dimethyl sulfoxide (DMSO), N,N-dimethylacetamide (DMA), N-methyl-2-pyrrolidone (NMP), 1,3-dimethyl-2-imidazolidinone (DMI), benzyl alcohol (BA) and anisole in water or water-soluble products is an interesting example of the arising problems. In this work, a headspace variant called the full evaporation technique is worked out and validated for the mentioned solvents. Detection limits below 0.1 μg/vial are reached with RSD values below 10%. Mean recovery values ranged from 92.5 to 110%. The optimized method was applied to determine residual DMSO in a water-based cell culture and DMSO and DMA in tetracycline hydrochloride (a water soluble sample). Copyright © 2013 Elsevier B.V. All rights reserved.
Flanagan, R J; Widdop, B; Ramsey, J D; Loveland, M
1988-09-01
1. Major advances in analytical toxicology followed the introduction of spectroscopic and chromatographic techniques in the 1940s and early 1950s and thin layer chromatography remains important together with some spectrophotometric and other tests. However, gas- and high performance-liquid chromatography together with a variety of immunoassay techniques are now widely used. 2. The scope and complexity of forensic and clinical toxicology continues to increase, although the compounds for which emergency analyses are needed to guide therapy are few. Exclusion of the presence of hypnotic drugs can be important in suspected 'brain death' cases. 3. Screening for drugs of abuse has assumed greater importance not only for the management of the habituated patient, but also in 'pre-employment' and 'employment' screening. The detection of illicit drug administration in sport is also an area of increasing importance. 4. In industrial toxicology, the range of compounds for which blood or urine measurements (so-called 'biological monitoring') can indicate the degree of exposure is increasing. The monitoring of environmental contaminants (lead, chlorinated pesticides) in biological samples has also proved valuable. 5. In the near future a consensus as to the units of measurement to be used is urgently required and more emphasis will be placed on interpretation, especially as regards possible behavioural effects of drugs or other poisons. Despite many advances in analytical techniques there remains a need for reliable, simple tests to detect poisons for use in smaller hospital and other laboratories.
NASA Astrophysics Data System (ADS)
Marks, Fay A.; Tomlinson, Harold W.; Brooksby, Glen W.
1993-09-01
A new technique called Ultrasound Tagging of Light (UTL) for imaging breast tissue is described. In this approach, photon localization in turbid tissue is achieved by cross-modulating a laser beam with focused, pulsed ultrasound. Light which passes through the ultrasound focal spot is 'tagged' with the frequency of the ultrasound pulse. The experimental system uses an argon-ion laser, a single PIN photodetector, and a 1 MHz fixed-focus pulsed ultrasound transducer. The utility of UTL as a photon localization technique in scattering media is examined using tissue phantoms consisting of gelatin and Intralipid. In a separate study, in vivo optical reflectance spectrophotometry was performed on human breast tumors implanted intramuscularly and subcutaneously in nineteen nude mice. The validity of applying a quadruple-wavelength breast cancer discrimination metric (developed using breast biopsy specimens) to the in vivo condition was tested. A scatter diagram for the in vivo model tumors based on this metric is presented, using the hands and fingers of volunteers as the 'normal' controls. Tumors at different growth stages were studied; these tumors ranged in size from a few millimeters to two centimeters. It is expected that, when coupled with a suitable photon localization technique like UTL, spectral discrimination methods like this one will prove useful in the detection of breast cancer by non-ionizing means.
Yan, Yuling; Petchprayoon, Chutima; Mao, Shu; Marriott, Gerard
2013-01-01
Optical switch probes undergo rapid and reversible transitions between two distinct states, one of which may fluoresce. This class of probe is used in various super-resolution imaging techniques and in the high-contrast imaging technique of optical lock-in detection (OLID) microscopy. Here, we introduce optimized optical switches for studies in living cells under standard conditions of cell culture. In particular, a highly fluorescent cyanine probe (Cy or Cy3) is directly or indirectly linked to naphthoxazine (NISO), a highly efficient optical switch that undergoes robust, 405/532 nm-driven transitions between a colourless spiro (SP) state and a coloured merocyanine (MC) state. The intensity of Cy fluorescence in these Cy/Cy3-NISO probes is reversibly modulated between a low and a high value in the SP and MC states, respectively, as a result of Förster resonance energy transfer. Cy/Cy3-NISO probes are targeted to specific proteins in living cells, where defined waveforms of Cy3 fluorescence are generated by optical switching of the SP and MC states. Finally, we introduce a new imaging technique (called OLID-immunofluorescence microscopy) that combines optical modulation of Cy3 fluorescence from Cy3/NISO co-labelled antibodies within fixed cells with OLID analysis to significantly improve image contrast in samples having high background or rare antigens. PMID:23267183
Speech enhancement using the modified phase-opponency model.
Deshmukh, Om D; Espy-Wilson, Carol Y; Carney, Laurel H
2007-06-01
In this paper we present a model called the Modified Phase-Opponency (MPO) model for single-channel speech enhancement when the speech is corrupted by additive noise. The MPO model is based on the auditory PO model, proposed for detection of tones in noise. The PO model includes a physiologically realistic mechanism for processing the information in neural discharge times and exploits the frequency-dependent phase properties of the tuned filters in the auditory periphery by using a cross-auditory-nerve-fiber coincidence detection for extracting temporal cues. The MPO model alters the components of the PO model such that the basic functionality of the PO model is maintained but the properties of the model can be analyzed and modified independently. The MPO-based speech enhancement scheme does not need to estimate the noise characteristics nor does it assume that the noise satisfies any statistical model. The MPO technique leads to the lowest value of the LPC-based objective measures and the highest value of the perceptual evaluation of speech quality measure compared to other methods when the speech signals are corrupted by fluctuating noise. Combining the MPO speech enhancement technique with our aperiodicity, periodicity, and pitch detector further improves its performance.
Algorithm based on the short-term Rényi entropy and IF estimation for noisy EEG signals analysis.
Lerga, Jonatan; Saulig, Nicoletta; Mozetič, Vladimir
2017-01-01
Stochastic electroencephalogram (EEG) signals are known to be nonstationary and often multicomponent. Detecting and extracting their components may help clinicians to localize brain neurological dysfunction in patients with motor control disorders, because movement-related cortical activities are reflected in spectral EEG changes. A new algorithm for detecting EEG signal components from the signal's time-frequency distribution (TFD) is proposed in this paper. The algorithm utilizes a modification of the Rényi entropy-based technique for estimating the number of components, called the short-term Rényi entropy (STRE), upgraded by an iterative algorithm which was shown to enhance existing approaches. Combined with instantaneous frequency (IF) estimation, the proposed method was applied to the analysis of limb-movement EEG signals in both noise-free and noisy environments, and was shown to be an efficient technique providing a spectral description of brain activities at each electrode location up to moderate additive noise levels. Furthermore, the obtained information concerning the number of EEG signal components and their IFs shows potential to enhance the diagnosis and treatment of neurological disorders in patients with motor control disorders. Copyright © 2016 Elsevier Ltd. All rights reserved.
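The abstract gives no implementation details; as a rough illustration of the short-term Rényi entropy idea, the sketch below slides an entropy window along a magnitude-squared STFT and converts entropy differences into component counts. The TFD choice, window length, and entropy order are our assumptions, not the authors' settings.

```python
import numpy as np

def renyi_entropy(tfd_slice, alpha=3):
    """Rényi entropy (order alpha) of a normalized time-frequency slice."""
    p = tfd_slice / tfd_slice.sum()
    p = p[p > 0]
    return np.log2((p ** alpha).sum()) / (1 - alpha)

def estimate_components(tfd, ref_entropy, alpha=3, win=16):
    """Short-term Rényi entropy: slide a window along the TFD's time axis
    and turn each entropy value into a component count, relative to the
    entropy of a known single-component reference window."""
    counts = []
    for t0 in range(0, tfd.shape[1] - win + 1, win):
        h = renyi_entropy(tfd[:, t0:t0 + win], alpha)
        counts.append(2 ** (h - ref_entropy))   # components ~ 2^(H - H_ref)
    return np.array(counts)

# Toy signal: one tone in the first half, two tones in the second half.
fs, n = 256.0, 2048
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 30 * t)
x[n // 2:] += np.sin(2 * np.pi * 90 * t[n // 2:])

# Magnitude-squared STFT as a simple stand-in for the TFD used in the paper.
frame, hop = 64, 32
tfd = np.stack([np.abs(np.fft.rfft(x[i:i + frame])) ** 2
                for i in range(0, n - frame, hop)], axis=1)

# Reference entropy from a window known to hold a single component.
ref = renyi_entropy(tfd[:, :16])
counts = estimate_components(tfd, ref)
print(counts.round(1))  # ~1 for the single-tone half, ~2 for the two-tone half
```

The count rises from about one to about two as the second tone appears, which is the quantity the iterative STRE refinement in the paper would then sharpen.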
Huang, Pin-Chieh; Pande, Paritosh; Ahmad, Adeel; Marjanovic, Marina; Spillman, Darold R.; Odintsov, Boris; Boppart, Stephen A.
2016-01-01
Magnetic nanoparticles (MNPs) have been used in many diagnostic and therapeutic biomedical applications over the past few decades to enhance imaging contrast, steer drugs to targets, and treat tumors via hyperthermia. Optical coherence tomography (OCT) is an optical biomedical imaging modality that relies on the detection of backscattered light to generate high-resolution cross-sectional images of biological tissue. MNPs have been utilized as imaging contrast and perturbative mechanical agents in OCT in techniques called magnetomotive OCT (MM-OCT) and magnetomotive elastography (MM-OCE), respectively. MNPs have also been independently used for magnetic hyperthermia treatments, enabling therapeutic functions such as killing tumor cells. It is well known that the localized tissue heating during hyperthermia treatments results in a change in the biomechanical properties of the tissue. Therefore, we propose a novel dosimetric technique for hyperthermia treatment based on the viscoelasticity change detected by MM-OCE, further enabling the theranostic function of MNPs. In this paper, we first review the basic principles and applications of MM-OCT, MM-OCE, and magnetic hyperthermia, and present new preliminary results supporting the concept of MM-OCE-based hyperthermia dosimetry. PMID:28163565
Klemm, Matthias; Blum, Johannes; Link, Dietmar; Hammer, Martin; Haueisen, Jens; Schweitzer, Dietrich
2016-09-01
Fluorescence lifetime imaging ophthalmoscopy (FLIO) is a new technique to detect changes in the human retina. The autofluorescence decay over time, generated by endogenous fluorophores, is measured in vivo. The strong autofluorescence of the crystalline lens, however, superimposes the intensity decay of the retina fluorescence, as the confocal principle is not able to suppress it sufficiently. Thus, the crystalline lens autofluorescence causes artifacts in the retinal fluorescence lifetimes determined from the intensity decays. Here, we present a new technique to suppress the autofluorescence of the crystalline lens by introducing an annular stop into the detection light path, which we call Schweitzer's principle. The efficacy of annular stops with an outer diameter of 7 mm and inner diameters of 1 to 5 mm is analyzed in an experimental setup using a model eye based on fluorescent dyes. Compared to the confocal principle, Schweitzer's principle with an inner diameter of 3 mm is able to reduce the simulated crystalline lens fluorescence to 4%, while 42% of the simulated retina fluorescence is preserved. Thus, we recommend the implementation of Schweitzer's principle in scanning laser ophthalmoscopes used for fundus autofluorescence measurements, especially the FLIO device, for improved image quality.
Some attributes of a language for property-based testing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neagoe, Vicentiu; Bishop, Matt
Property-based testing is a testing technique that evaluates executions of a program. The method checks that specifications, called properties, hold throughout the execution of the program. TASpec is a language used to specify these properties. This paper compares some attributes of the language with the specification patterns used for model-checking languages, and then presents some descriptions of properties that can be used to detect common security flaws in programs. This report describes the results of a one-year research project at the University of California, Davis, which was funded by a University Collaboration LDRD entitled "Property-based Testing for Cyber Security Assurance".
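TASpec itself is not reproduced in the report, but the underlying idea of property-based testing can be illustrated generically. The sketch below checks an invented security property (a resolved path must never escape the server root) against randomly generated inputs; `resolve`, `no_escape_property`, and the input generator are hypothetical stand-ins, not TASpec constructs.

```python
import random
import string
import posixpath

def resolve(root, user_path):
    """Hypothetical routine under test: join a user-supplied path onto a
    server root and normalize it (an invented example, not from the paper)."""
    return posixpath.normpath(posixpath.join(root, user_path.lstrip("/")))

def no_escape_property(root, user_path):
    """The property: the resolved path must never escape the root."""
    resolved = resolve(root, user_path)
    return resolved == root or resolved.startswith(root + "/")

def random_path(rng, max_parts=6):
    """Generator for adversarial path inputs, including '..' segments."""
    words = ["".join(rng.choices(string.ascii_lowercase, k=3)) for _ in range(3)]
    parts = [rng.choice(["..", ".", ""] + words)
             for _ in range(rng.randint(1, max_parts))]
    return "/".join(parts)

# Property-based test loop: generate random inputs, check the property on
# each execution, and keep the first counterexample found.
rng = random.Random(0)
counterexample = None
for _ in range(1000):
    p = random_path(rng)
    if not no_escape_property("/srv/data", p):
        counterexample = p
        break

print(counterexample)  # a path with '..' that escapes /srv/data
```

The loop quickly finds a directory-traversal input, the kind of common security flaw the paper's properties target; a language like TASpec states such properties declaratively instead of as ad hoc loops.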
Biochips: A fruitful product of solid state physics and molecular biology
NASA Astrophysics Data System (ADS)
Mendoza-Alvarez, Julio G.
1998-08-01
The application of the standard high-resolution photolithography techniques used in the semiconductor device industry to the growth of a chain of nucleotides with a precise and well-known sequence has made possible the fabrication of a new kind of device, the so-called biochip. At the National Polytechnic Institute in Mexico we have joined a multidisciplinary scientific group, and we are in the process of developing the technical capabilities to set up a processing lab to fabricate biochips focused on very specific applications in the area of cancer detection. We present here the main lines along which this project is being developed.
The robot's eyes - Stereo vision system for automated scene analysis
NASA Technical Reports Server (NTRS)
Williams, D. S.
1977-01-01
Attention is given to the robot stereo vision system which maintains the image produced by solid-state detector television cameras in a dynamic random access memory called RAPID. The imaging hardware consists of sensors (two solid-state image arrays using a charge injection technique), a video-rate analog-to-digital converter, the RAPID memory, various types of computer-controlled displays, and preprocessing equipment (for reflexive actions, processing aids, and object detection). The software is aimed at locating objects and determining traversability. An object-tracking algorithm is discussed, and it is noted that tracking speed is in the 50-75 pixels/s range.
Categorisation of full waveform data provided by laser scanning devices
NASA Astrophysics Data System (ADS)
Ullrich, Andreas; Pfennigbauer, Martin
2011-11-01
In 2004, a laser scanner device for commercial airborne laser scanning applications, the RIEGL LMS-Q560, was introduced to the market, making use of a radical alternative approach to the traditional analogue signal detection and processing schemes found in LIDAR instruments so far: digitizing the echo signals received by the instrument for every laser pulse and analysing these echo signals off-line in a so-called full waveform analysis in order to retrieve almost all information contained in the echo signal using transparent algorithms adaptable to specific applications. In the field of laser scanning the somewhat unspecific term "full waveform data" has since been established. We attempt a categorisation of the different types of the full waveform data found in the market. We discuss the challenges in echo digitization and waveform analysis from an instrument designer's point of view and we will address the benefits to be gained by using this technique, especially with respect to the so-called multi-target capability of pulsed time-of-flight LIDAR instruments.
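The instrument's actual waveform-analysis algorithms are not described here; as a minimal illustration of the multi-target capability of pulsed time-of-flight LIDAR, the sketch below extracts several discrete echoes from a digitized waveform by thresholded peak detection. The pulse shapes, sampling interval, noise level, and threshold are all invented.

```python
import numpy as np

C = 3e8  # speed of light in m/s

def detect_echoes(waveform, dt, threshold):
    """Multi-target extraction from a digitized echo waveform: report the
    range and amplitude of every local maximum above the threshold."""
    hits = []
    for i in range(1, len(waveform) - 1):
        if (waveform[i] > threshold
                and waveform[i] >= waveform[i - 1]
                and waveform[i] > waveform[i + 1]):
            hits.append((C * i * dt / 2.0, waveform[i]))  # two-way travel time
    return hits

# Synthetic waveform: two Gaussian echoes (e.g. a canopy and a ground
# return) plus receiver noise, sampled at 1 ns.
dt = 1e-9
t = np.arange(600) * dt
rng = np.random.default_rng(1)
wave = (0.6 * np.exp(-((t - 100e-9) / 4e-9) ** 2)    # target at 15 m
        + 1.0 * np.exp(-((t - 400e-9) / 4e-9) ** 2)  # target at 60 m
        + 0.01 * rng.standard_normal(t.size))

echoes = detect_echoes(wave, dt, threshold=0.3)
for dist, amp in echoes:
    print(f"target at {dist:.1f} m, amplitude {amp:.2f}")
```

A production full waveform analysis would go further, e.g. fitting pulse models to each echo to recover width and amplitude calibrated to target properties, but the two-return geometry above is the essence of multi-target ranging.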
NASA Astrophysics Data System (ADS)
Christensen-Dalsgaard, Jakob
Most of the 3,500 species of anuran amphibians (frogs and toads) that exist today are highly vocal animals. In most frogs, males will spend considerable energy on calling and incur sizeable predation risks, and the females' detection and localization of the calls of conspecific males is often a prerequisite for successful mating. Therefore, acoustic communication is evidently evolutionarily important in the anurans, and their auditory system is probably shaped by the selective pressures associated with the production, detection and localization of the communication calls.
Leroy, Emmanuelle C; Samaran, Flore; Bonnel, Julien; Royer, Jean-Yves
2016-01-01
Passive acoustic monitoring is an efficient way to provide insights on the ecology of large whales. This approach allows for long-term and species-specific monitoring over large areas. In this study, we examined six years (2010 to 2015) of continuous acoustic recordings at up to seven different locations in the Central and Southern Indian Basin to assess the peak periods of presence, seasonality and migration movements of Antarctic blue whales (Balaenoptera musculus intermedia). An automated method is used to detect the Antarctic blue whale stereotyped call, known as Z-call. Detection results are analyzed in terms of distribution, seasonal presence and diel pattern of emission at each site. Z-calls are detected year-round at each site, except for one located in the equatorial Indian Ocean, and display highly seasonal distribution. This seasonality is stable across years for every site, but varies between sites. Z-calls are mainly detected during autumn and spring at the subantarctic locations, suggesting that these sites are on the Antarctic blue whale migration routes, and mostly during winter at the subtropical sites. In addition to these seasonal trends, there is a significant diel pattern in Z-call emission, with more Z-calls in daytime than in nighttime. This diel pattern may be related to the blue whale feeding ecology.
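The study's automated Z-call detector is not specified in this abstract; a generic matched-filter sketch conveys the idea of detecting a stereotyped call in long recordings. The 27 Hz tonal template, sampling rate, threshold, and grouping window below are stand-ins, not the authors' parameters.

```python
import numpy as np

def matched_filter_detect(rec, template, fs, threshold):
    """Slide a normalized call template over the recording and return the
    start times of segments whose normalized correlation exceeds threshold."""
    tpl = template - template.mean()
    tpl /= np.linalg.norm(tpl)
    corr = np.correlate(rec, tpl, mode="valid")
    # Normalize by the local energy of the recording under the template.
    energy = np.sqrt(np.convolve(rec ** 2, np.ones(tpl.size), mode="valid"))
    score = corr / np.maximum(energy, 1e-12)
    # Group above-threshold samples separated by less than 1 s into one call.
    times, last = [], None
    for i in np.flatnonzero(score > threshold):
        t = i / fs
        if last is None or t - last > 1.0:
            times.append(t)
        last = t
    return times, score

# Toy recording: a 2 s, 27 Hz tonal unit (a crude stand-in for the first
# segment of a Z-call) embedded twice in noise.
fs = 250
template = np.sin(2 * np.pi * 27 * np.arange(0, 2.0, 1 / fs))
rng = np.random.default_rng(0)
rec = 0.3 * rng.standard_normal(fs * 60)
for start in (10, 40):                      # calls begin at 10 s and 40 s
    rec[start * fs:start * fs + template.size] += template

times, score = matched_filter_detect(rec, template, fs, threshold=0.5)
print([round(t, 1) for t in times])
```

Run over six years of recordings, the per-call detection times from a scheme like this are what get binned into the seasonal and diel distributions the study analyzes.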
Robust detection, isolation and accommodation for sensor failures
NASA Technical Reports Server (NTRS)
Emami-Naeini, A.; Akhter, M. M.; Rock, S. M.
1986-01-01
The objective is to extend the recent advances in robust control system design of multivariable systems to sensor failure detection, isolation, and accommodation (DIA), and estimator design. This effort provides analysis tools to quantify the trade-off between performance robustness and DIA sensitivity, which are to be used to achieve higher levels of performance robustness for given levels of DIA sensitivity. An innovations-based DIA scheme is used. Estimators, which depend upon a model of the process and process inputs and outputs, are used to generate these innovations. Thresholds used to determine failure detection are computed based on bounds on modeling errors, noise properties, and the class of failures. The applicability of the newly developed tools is demonstrated on a multivariable aircraft turbojet engine example. A new concept called the threshold selector was developed. It represents a significant and innovative tool for the analysis and synthesis of DIA algorithms. The estimators were made robust by the introduction of an internal model and by frequency shaping. The internal model provides asymptotically unbiased filter estimates. The incorporation of frequency shaping of the Linear Quadratic Gaussian cost functional modifies the estimator design to make it suitable for sensor failure DIA. The results are compared with previous studies which used thresholds that were selected empirically. Comparison of these two techniques on a nonlinear dynamic engine simulation shows improved performance of the new method compared to previous techniques.
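As a toy illustration of the innovations-based scheme, the sketch below flags sensor samples whose innovation exceeds a threshold assembled from a noise bound and a modeling-error bound; the estimator, signal, and all numbers are invented, and a real design would derive the bounds as the abstract describes.

```python
import numpy as np

def detect_failures(measurements, predictions, noise_sigma,
                    model_error_bound, k=3.0):
    """Innovations-based sensor-failure detection: flag samples whose
    innovation (measurement minus estimator prediction) exceeds a threshold
    built from the sensor noise level plus a bound on the modeling error."""
    innovations = measurements - predictions
    threshold = k * noise_sigma + model_error_bound
    return np.abs(innovations) > threshold, innovations

# Toy example: a sensor tracking a slow ramp, with a bias failure at t = 60.
rng = np.random.default_rng(2)
t = np.arange(100)
truth = 0.05 * t
noise_sigma = 0.1
meas = truth + noise_sigma * rng.standard_normal(t.size)
meas[60:] += 1.0                     # sensor develops a 1.0 bias

# A perfect-model prediction stands in for the Kalman-style estimator here;
# model_error_bound absorbs the mismatch a real estimator would have.
flags, innov = detect_failures(meas, truth, noise_sigma,
                               model_error_bound=0.2)
print(int(flags[:60].sum()), int(flags[60:].sum()))
```

The trade-off the paper quantifies lives in the threshold: a larger modeling-error bound tolerates model mismatch (robustness) at the cost of missing smaller failures (DIA sensitivity).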
RNA-FISH to Study Regulatory RNA at the Site of Transcription.
Soler, Marta; Boque-Sastre, Raquel; Guil, Sonia
2017-01-01
The increasing role of all types of regulatory RNAs in the orchestration of cellular programs has driven the development of a variety of techniques that allow their precise detection, quantification, and functional scrutiny. Recent advances in imaging and fluorescence in situ hybridization (FISH) methods have enabled user-friendly protocols that provide highly sensitive and accurate detection of ribonucleic acid molecules at both the single-cell and subcellular levels. We herein describe the approach originally developed by Stellaris®, in which the target RNA molecule is fluorescently labeled with multiple tiled complementary probes, each carrying a fluorophore, thus improving sensitivity and reducing the chance of false positives. We have applied this method to the detection of nascent RNAs that partake of special regulatory structures called R loops. Their growing role in active gene expression regulation (Aguilera and Garcia-Muse, Mol Cell 46:115-124, 2012; Ginno et al., Mol Cell 45:814-825, 2012; Sun et al., Science 340:619-621, 2013; Bhatia et al., Nature 511:362-365, 2014) imposes the use of a combination of in vivo and in vitro techniques for the detailed analysis of the transcripts involved. Therefore, their study is a good example to illustrate how RNA FISH, combined with transcriptional arrest and/or cell synchronization, permits localization and temporal characterization of potentially regulatory RNA sequences.
Optical biopsy fiber-based fluorescence spectroscopy instrumentation
NASA Astrophysics Data System (ADS)
Katz, Alvin; Ganesan, Singaravelu; Yang, Yuanlong; Tang, Gui C.; Budansky, Yury; Celmer, Edward J.; Savage, Howard E.; Schantz, Stimson P.; Alfano, Robert R.
1996-04-01
Native fluorescence spectroscopy of biomolecules has emerged as a new modality for the medical community in characterizing the various physiological conditions of tissues. In the past several years, many groups have been working to introduce spectroscopic methods to diagnose cancer. Researchers have successfully used native fluorescence to distinguish cancerous from normal tissue samples in rat and human tissue. We have developed three generations of instruments, called the CD-scan, CD-ratiometer and CD-map, to allow the medical community to use optics for diagnosing tissue. Using ultraviolet excitation, emission spectral measurements can separate normal or benign from cancerous tissue of the breast, gynecological tract, colon, and aerodigestive tract. For example, from emission intensities at 340 nm to 440 nm (300 nm excitation), a statistically consistent difference between malignant tissue and normal or benign tissue is observed. In order to utilize optical biopsy techniques in a clinical setting, the CD-scan instrument was developed, which allows for rapid and reliable in-vitro and in-vivo fluorescence measurements of the aerodigestive tract with high accuracy. The instrumentation employs high-sensitivity detection techniques, which allow for lamp excitation and small-diameter optical fiber probes; the higher spatial resolution afforded by the small-diameter probes can increase the ability to detect smaller tumors. The fiber optic probes allow for usage in the aerodigestive tract, cervix and colon. Needle-based fiber probes have been developed for in-vivo detection of breast cancer.
[Utilization of Big Data in Medicine and Future Outlook].
Kinosada, Yasutomi; Uematsu, Machiko; Fujiwara, Takuya
2016-03-01
"Big data" is a new buzzword. The point is not to be dazzled by the volume of data, but rather to analyze it, and convert it into insights, innovations, and business value. There are also real differences between conventional analytics and big data. In this article, we show some results of big data analysis using open DPC (Diagnosis Procedure Combination) data in areas of the central part of JAPAN: Toyama, Ishikawa, Fukui, Nagano, Gifu, Aichi, Shizuoka, and Mie Prefectures. These 8 prefectures contain 51 medical administration areas called the second medical area. By applying big data analysis techniques such as k-means, hierarchical clustering, and self-organizing maps to DPC data, we can visualize the disease structure and detect similarities or variations among the 51 second medical areas. The combination of a big data analysis technique and open DPC data is a very powerful method to depict real figures on patient distribution in Japan.
Kim, Min-Gu; Moon, Hae-Min; Chung, Yongwha; Pan, Sung Bum
2012-01-01
Biometrics verification can be efficiently used for intrusion detection and intruder identification in video surveillance systems. Biometrics techniques can be largely divided into traditional and so-called soft biometrics. Whereas traditional biometrics deals with physical characteristics such as facial features, eye iris, and fingerprints, soft biometrics is concerned with such information as gender, national origin, and height. Traditional biometrics is versatile and highly accurate, but it is very difficult to collect traditional biometric data from a distance and without personal cooperation. Soft biometrics, although less accurate, can be collected much more freely. Recently, much research has been conducted on human identification using soft biometrics data collected from a distance. In this paper, we use both traditional and soft biometrics for human identification and propose a framework for solving such problems as lighting, occlusion, and shadowing. PMID:22919273
Nanoscale chemical imaging by photoinduced force microscopy
Nowak, Derek; Morrison, William; Wickramasinghe, H. Kumar; Jahng, Junghoon; Potma, Eric; Wan, Lei; Ruiz, Ricardo; Albrecht, Thomas R.; Schmidt, Kristin; Frommer, Jane; Sanders, Daniel P.; Park, Sung
2016-01-01
Correlating spatial chemical information with the morphology of closely packed nanostructures remains a challenge for the scientific community. For example, supramolecular self-assembly, which provides a powerful and low-cost way to create nanoscale patterns and engineered nanostructures, is not easily interrogated in real space via existing nondestructive techniques based on optics or electrons. A novel scanning probe technique called infrared photoinduced force microscopy (IR PiFM) directly measures the photoinduced polarizability of the sample in the near field by detecting the time-integrated force between the tip and the sample. By imaging at multiple IR wavelengths corresponding to absorption peaks of different chemical species, PiFM has demonstrated the ability to spatially map nm-scale patterns of the individual chemical components of two different types of self-assembled block copolymer films. With chemical-specific nanometer-scale imaging, PiFM provides a powerful new analytical method for deepening our understanding of nanomaterials. PMID:27051870
Web Navigation Sequences Automation in Modern Websites
NASA Astrophysics Data System (ADS)
Montoto, Paula; Pan, Alberto; Raposo, Juan; Bellas, Fernando; López, Javier
Most of today's web sources are designed to be used by humans, but they do not provide suitable interfaces for software programs. That is why a growing interest has arisen in so-called web automation applications, which are widely used for different purposes such as B2B integration, automated testing of web applications, or technology and business watch. Previous proposals assume models for generating and reproducing navigation sequences that are not able to correctly deal with new websites using technologies such as AJAX: on the one hand, existing systems only allow recording simple navigation actions and, on the other hand, they are unable to detect the end of the effects caused by a user action. In this paper, we propose a set of new techniques to record and execute web navigation sequences able to deal with all the complexity existing in AJAX-based websites. We also present an exhaustive evaluation of the proposed techniques that shows very promising results.
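One generic way to detect "the end of the effects caused by a user action" is to poll a snapshot of the page state until it stops changing. The sketch below is our own heuristic illustration, not the paper's technique; in practice the snapshot callable would wrap a hash of the rendered DOM.

```python
import itertools
import time

def wait_until_stable(snapshot, timeout=10.0, interval=0.25, settle=3):
    """Poll a page-state snapshot function (e.g. a hash of the rendered DOM)
    until it stops changing for `settle` consecutive polls -- a heuristic
    for detecting the end of the AJAX-driven effects of a user action."""
    deadline = time.monotonic() + timeout
    last, stable = None, 0
    while time.monotonic() < deadline:
        cur = snapshot()
        stable = stable + 1 if cur == last else 0
        if stable >= settle:
            return True
        last = cur
        time.sleep(interval)
    return False

# Simulated page: the "DOM" changes for a few polls, then settles.
state = iter([1, 2, 3, 4, 4, 4, 4, 4, 4, 4, 4, 4])
settled = wait_until_stable(lambda: next(state), timeout=5.0, interval=0.01)

# A page that never settles times out instead.
unstable = wait_until_stable(itertools.count().__next__,
                             timeout=0.2, interval=0.01)
print(settled, unstable)
```

The timeout path matters as much as the happy path: pages with continuously updating widgets never fully settle, which is one reason simple record-and-replay models break on AJAX sites.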
Lensless Photoluminescence Hyperspectral Camera Employing Random Speckle Patterns.
Žídek, Karel; Denk, Ondřej; Hlubuček, Jiří
2017-11-10
We propose and demonstrate a spectrally resolved photoluminescence imaging setup based on the so-called single-pixel camera, a compressive-sensing technique which enables imaging with a single-pixel photodetector. The method relies on encoding an image with a series of random patterns. In our approach, the image encoding was maintained via laser speckle patterns generated by an excitation laser beam scattered on a diffusor. By using a spectrometer as the single-pixel detector, we attained a realization of a spectrally resolved photoluminescence camera of unmatched simplicity. We present reconstructed hyperspectral images of several model scenes. We also discuss parameters affecting the imaging quality, such as the correlation degree of the speckle patterns, pattern fineness, and number of data points. Finally, we compare the presented technique to hyperspectral imaging using sample scanning. The presented method enables photoluminescence imaging for a broad range of coherent excitation sources and detection spectral areas.
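A minimal numerical sketch of the single-pixel idea: each random pattern yields one detector reading, and the image is recovered from the pattern/reading pairs. Here we use more patterns than pixels so a plain least-squares inversion suffices; a truly compressive setup (fewer readings than pixels) would need a sparsity-regularized solver, and all dimensions below are invented.

```python
import numpy as np

# Single-pixel imaging sketch: the scene x is never imaged directly; each
# random pattern a_i yields one detector reading y_i = a_i . x, and the
# image is recovered from the pattern/reading pairs.
rng = np.random.default_rng(4)
side = 16
n = side * side

# Ground-truth scene: a bright square on a dark background (a stand-in for
# a photoluminescent object).
scene = np.zeros((side, side))
scene[5:11, 4:12] = 1.0
x = scene.ravel()

m = 400                                # more patterns than pixels, so plain
patterns = rng.random((m, n))          # least squares suffices; the rows
readings = patterns @ x                # stand in for speckle intensities

x_hat, *_ = np.linalg.lstsq(patterns, readings, rcond=None)
recovered = x_hat.reshape(side, side)
print(float(np.abs(recovered - scene).max()))
```

Repeating the readings for each spectrometer wavelength bin and reconstructing every bin separately is what turns this scalar-detector scheme into the hyperspectral camera the abstract describes.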
NASA Astrophysics Data System (ADS)
Acquisti, Claudia; Allegrini, Paolo; Bogani, Patrizia; Buiatti, Marcello; Catanese, Elena; Fronzoni, Leone; Grigolini, Paolo; Mersi, Giuseppe; Palatella, Luigi
2004-04-01
We investigate a possible way to connect the presence of Low-Complexity Sequences (LCS) in DNA genomes with the nonstationary properties of base correlations. Under the hypothesis that these variations signal a change in DNA function, we use a new technique, called the Non-Stationarity Entropic Index (NSEI) method, and we prove that this technique is an efficient way to detect functional changes with respect to a random baseline. The remarkable aspect is that NSEI requires no training data or fitting parameters, the only arbitrary choice being that of a marker in the sequence. We make this choice on the basis of biological information about LCS distributions in genomes. We show that there exists a correlation between changes in the amount of LCS and the ratio of long- to short-range correlation.
Nie, Min; Ren, Jie; Li, Zhengjun; Niu, Jinhai; Qiu, Yihong; Zhu, Yisheng; Tong, Shanbao
2009-01-01
Without visual information, blind people face various hardships with shopping, reading, finding objects, etc. Therefore, we developed a portable auditory guide system, called SoundView, for visually impaired people. This prototype system consists of a mini-CCD camera, a digital signal processing unit and an earphone, working with built-in customizable auditory coding algorithms. Employing environment understanding techniques, SoundView processes the images from the camera and detects objects tagged with barcodes. The recognized objects in the environment are then encoded into stereo speech signals delivered to the blind user through an earphone. The user is able to recognize the type, motion state and location of the objects of interest with the help of SoundView. Compared with other visual assistance techniques, SoundView is object-oriented and has the advantages of low cost, small size, light weight, low power consumption and easy customization.
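SoundView's auditory coding algorithms are not specified in this abstract; the sketch below shows one invented mapping from an object's angle and distance to a stereo cue, purely to make the idea of sonifying object location concrete.

```python
import numpy as np

def encode_object(azimuth_deg, distance_m, fs=22050, dur=0.2, base_freq=440.0):
    """Map an object's horizontal angle to stereo panning and its distance
    to loudness, returning an (n, 2) stereo buffer. The mapping is invented
    for illustration and is not SoundView's actual coding scheme."""
    t = np.arange(int(fs * dur)) / fs
    tone = np.sin(2 * np.pi * base_freq * t)
    gain = 1.0 / max(distance_m, 1.0)                        # quieter when farther
    pan = min(max((azimuth_deg + 90.0) / 180.0, 0.0), 1.0)   # 0 = left, 1 = right
    return np.column_stack([gain * (1.0 - pan) * tone,
                            gain * pan * tone])

# An object 45 degrees to the right, two metres away: the right channel
# should dominate the resulting buffer.
buf = encode_object(azimuth_deg=45.0, distance_m=2.0)
print(buf.shape)
```

A real system would additionally vary timbre or spoken labels to convey the recognized object type, as the abstract indicates SoundView does.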
The MOF+ Technique: A Significant Synergic Effect Enables High Performance Chromate Removal.
Luo, Ming Biao; Xiong, Yang Yang; Wu, Hui Qiong; Feng, Xue Feng; Li, Jian Qiang; Luo, Feng
2017-12-18
A significant synergic effect between a metal-organic framework (MOF) and FeSO4, the so-called MOF+ technique, is exploited for the first time to remove toxic chromate from aqueous solutions. The results show that, relative to the pristine MOF samples (no detectable chromate removal), the MOF+ method delivers superior performance, giving an adsorption capacity of 796 mg(Cr) g-1. This value is almost eight-fold higher than the best value among established MOF adsorbents, and the highest of all reported porous adsorbents for such use. The adsorption mechanism, unlike the anion-exchange process that dominates chromate removal in all other MOF adsorbents, is, as unveiled by X-ray photoelectron spectroscopy (XPS), scanning electron microscopy (SEM), and transmission electron microscopy (TEM), due to the surface formation of Fe0.75Cr0.25(OH)3 nanospheres on the MOF samples. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Bui, Thuy-Vy D.; Takekawa, John Y.; Overton, Cory T.; Schultz, Emily R.; Hull, Joshua M.; Casazza, Michael L.
2015-01-01
The California Ridgway's rail Rallus obsoletus obsoletus (hereafter California rail) is a secretive marsh bird endemic to tidal marshes in the San Francisco Bay (hereafter bay) of California. The California rail has undergone significant range contraction and population declines due to a variety of factors, including predation and the degradation and loss of habitat. Call-count surveys, which include call playbacks and are based on the standardized North American marsh bird monitoring protocol, have been conducted throughout the bay since 2005 to monitor the population size and distribution of the California rail. However, call-count surveys are difficult to evaluate for efficacy or accuracy. To measure the accuracy of call-count surveys and investigate whether radio-marked California rails moved in response to call-count surveys, we compared locations of radio-marked California rails collected at frequent intervals (15 min) to California rail detections recorded during call-count surveys conducted over the same time periods. Overall, 60% of radio-marked California rails within 200 m of observers were not detected during call-count surveys. Movements of radio-marked California rails showed no directional bias (P = 0.92) irrespective of whether or not playbacks of five marsh bird species (including the California rail) were broadcast from listening stations. Our findings suggest that playbacks of rail vocalizations do not consistently influence California rail movements during surveys. However, call-count surveys may underestimate California rail presence; therefore, caution should be used when relating raw numbers of call-count detections to population abundance.
Schmidtke, Daniel; Schulz, Jochen; Hartung, Jörg; Esser, Karl-Heinz
2013-01-01
In the 1970s, Tavolga conducted a series of experiments in which he found behavioral evidence that the vocalizations of the catfish species Ariopsis felis may play a role in a coarse form of echolocation. Based on his findings, he postulated a similar function for the calls of closely related catfish species. Here, we describe the physical characteristics of the predominant call-type of Ariopsis seemanni. In two behavioral experiments, we further explore whether A. seemanni uses these calls for acoustic obstacle detection by testing the hypothesis that the call-emission rate of individual fish should increase when subjects are confronted with novel objects, as is known from other vertebrate species that use pulse-type signals to actively probe the environment. Audio-video monitoring of the fish under different obstacle conditions did not reveal a systematic increase in the number of emitted calls in the presence of novel objects or in dependence on the proximity between individual fish and different objects. These negative findings, in combination with our current understanding of directional hearing in fishes (a prerequisite for acoustic obstacle detection), make it highly unlikely that A. seemanni uses its calls for acoustic obstacle detection. We argue that the calls are more likely to play a role in intra- or interspecific communication (e.g. in school formation or predator deterrence) and present results from a preliminary Y-maze experiment that are indicative of positive phonotaxis of A. seemanni towards the calls of conspecifics. PMID:23741408
Evaluation of listener-based anuran surveys with automated audio recording devices
Shearin, A. F.; Calhoun, A.J.K.; Loftin, C.S.
2012-01-01
Volunteer-based audio surveys are used to document long-term trends in anuran community composition and abundance. Current sampling protocols, however, are not region- or species-specific and may not detect relatively rare or audibly cryptic species. We used automated audio recording devices to record calling anurans during 2006–2009 at wetlands in Maine, USA. We identified species calling, chorus intensity, time of day, and environmental variables when each species was calling and developed logistic and generalized mixed models to determine the time interval and environmental variables that optimize detection of each species during peak calling periods. We detected eight of nine anurans documented in Maine. Individual recordings selected from the sampling period (0.5 h past sunset to 0100 h) described in the North American Amphibian Monitoring Program (NAAMP) detected fewer species than were detected in recordings from 30 min past sunset until sunrise. Time of maximum detection of presence and full chorusing for three species (green frogs, mink frogs, pickerel frogs) occurred after the NAAMP sampling end time (0100 h). The NAAMP protocol’s sampling period may result in omissions and misclassifications of chorus sizes for certain species. These potential errors should be considered when interpreting trends generated from standardized anuran audio surveys.
Optimization technique for problems with an inequality constraint
NASA Technical Reports Server (NTRS)
Russell, K. J.
1972-01-01
The general technique uses a modified version of an existing method termed the pattern search technique. A new procedure, called the parallel move strategy, permits the pattern search technique to be used with problems involving an inequality constraint.
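The abstract names the method but gives no algorithmic detail. As a minimal sketch only, the snippet below implements a plain coordinate pattern search with a quadratic penalty standing in for the inequality constraint; the paper's parallel move strategy is not reproduced, and all function names and parameter values are illustrative.

```python
def pattern_search(f, g, x0, step=0.5, shrink=0.5, tol=1e-6, penalty=1e6):
    """Minimise f(x) subject to g(x) <= 0 by coordinate pattern search.

    Infeasible points are discouraged with a quadratic penalty on g; this
    is a generic penalized variant, not the paper's parallel move strategy.
    """
    def cost(x):
        return f(x) + penalty * max(0.0, g(x)) ** 2

    x = list(x0)
    best = cost(x)
    while step > tol:
        improved = False
        for i in range(len(x)):            # probe each coordinate direction
            for d in (step, -step):
                trial = x[:]
                trial[i] += d
                c = cost(trial)
                if c < best:
                    x, best, improved = trial, c, True
                    break
        if not improved:
            step *= shrink                 # no progress: refine the mesh
    return x

# Example: minimise (x - 2)^2 subject to x <= 1; the optimum is x = 1.
xopt = pattern_search(lambda x: (x[0] - 2) ** 2, lambda x: x[0] - 1.0, [0.0])
```

The penalty term turns the constrained problem into an unconstrained one the pattern search can probe directly; mesh refinement on failure is what gives the method its derivative-free convergence.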
Use of a Parasitic Wasp as a Biosensor
Olson, Dawn; Rains, Glen
2014-01-01
Screening cargo for illicit substances is in need of rapid high-throughput inspection systems that accurately identify suspicious cargo. Here we investigate the ability of a parasitic wasp, Microplitis croceipes, to detect and respond to methyl benzoate, the volatile component of cocaine, by examining their response to training concentrations, their sensitivity at low concentrations, and their ability to detect methyl benzoate when two concealment substances (green tea and ground coffee) are added to the testing arena. Utilizing classical associative learning techniques with sucrose as reward, we found that M. croceipes learns individual concentrations of methyl benzoate, and they can generalize this learning to concentrations 100× lower than the training concentration. Their detection threshold for methyl benzoate is very low, at an estimated 3 ppb. They are also able to detect methyl benzoate when it is covered completely by green tea, but were not able to detect methyl benzoate when it was covered completely by coffee grounds. Habituation to the tea and coffee odors prior to testing improves their responses, resulting in effective detection of methyl benzoate covered by the coffee grounds. With the aid of the portable device called 'the wasp hound', the wasps appear to have potential to be effective on-site biosensors for the detection of cocaine. PMID:25587415
Towards human behavior recognition based on spatio temporal features and support vector machines
NASA Astrophysics Data System (ADS)
Ghabri, Sawsen; Ouarda, Wael; Alimi, Adel M.
2017-03-01
Security and surveillance are vital issues in today's world. The recent acts of terrorism have highlighted the urgent need for efficient surveillance. There is indeed a need for an automated video surveillance system that can detect the identity and activity of a person. In this article, we propose a new paradigm to recognize aggressive human behavior such as a boxing action. Our proposed system for human activity detection fuses Spatio Temporal Interest Point (STIP) and Histogram of Oriented Gradient (HoG) features into a novel feature called the Spatio Temporal Histogram of Oriented Gradient (STHOG). To evaluate the robustness of our proposed paradigm, with local application of the HoG technique at STIP points, we conducted experiments on the KTH human action dataset using multi-class Support Vector Machine classification. The proposed scheme outperforms basic descriptors such as HoG and STIP, achieving a classification accuracy of 82.26%.
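The fusion step described above, concatenating two complementary descriptors into one feature vector, can be sketched as follows. All data, dimensions, and the classifier are illustrative: a nearest-centroid rule stands in for the paper's multi-class SVM to keep the example dependency-free beyond NumPy.

```python
import numpy as np

def fuse_descriptors(stip, hog):
    """Early fusion: L2-normalise each descriptor, then concatenate.

    `stip` and `hog` are 1-D feature vectors; the dimensions used below
    are placeholders, not those of the actual STHOG descriptor.
    """
    def l2(v):
        n = np.linalg.norm(v)
        return v / n if n > 0 else v
    return np.concatenate([l2(stip), l2(hog)])

# Toy per-clip descriptors standing in for two KTH action classes.
rng = np.random.default_rng(0)
class_means = {"boxing": 0.0, "walking": 3.0}
train = {c: [fuse_descriptors(rng.normal(m, 1, 8), rng.normal(m, 1, 16))
             for _ in range(20)] for c, m in class_means.items()}

# The paper trains a multi-class SVM on such fused vectors; a
# nearest-centroid rule stands in for it to keep the sketch short.
centroids = {c: np.mean(v, axis=0) for c, v in train.items()}

def predict(x):
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

label = predict(fuse_descriptors(rng.normal(3, 1, 8), rng.normal(3, 1, 16)))
```

Normalising each descriptor before concatenation keeps one modality from dominating the fused vector, which is the usual motivation for this kind of early fusion.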
Seismic Characterization of EGS Reservoirs
NASA Astrophysics Data System (ADS)
Templeton, D. C.; Pyle, M. L.; Matzel, E.; Myers, S.; Johannesson, G.
2014-12-01
To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance the traditional microearthquake detection and location methodologies at two EGS systems. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP are typically smaller magnitude events or events that occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event seismic location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining if a seismic lineation could be real or simply within the anticipated error range. We apply this methodology to the Basel EGS data set and compare it to another EGS dataset. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Alkaline Comet Assay for Assessing DNA Damage in Individual Cells.
Pu, Xinzhu; Wang, Zemin; Klaunig, James E
2015-08-06
Single-cell gel electrophoresis, commonly called the comet assay, is a simple and sensitive method for assessing DNA damage at the single-cell level. It is an important technique in genetic toxicological studies. The comet assay performed under alkaline conditions (pH >13) is considered the optimal version for identifying agents with genotoxic activity. The alkaline comet assay is capable of detecting DNA double-strand breaks, single-strand breaks, alkali-labile sites, DNA-DNA/DNA-protein cross-linking, and incomplete excision repair sites. The inclusion of digestion with lesion-specific DNA repair enzymes in the procedure allows the detection of various DNA base alterations, such as oxidative base damage. This unit describes alkaline comet assay procedures for assessing DNA strand breaks and oxidative base alterations. These methods can be applied to a variety of cells from in vitro and in vivo experiments, as well as human studies. Copyright © 2015 John Wiley & Sons, Inc.
Listening to sounds from an exploding meteor and oceanic waves
NASA Astrophysics Data System (ADS)
Evers, L. G.; Haak, H. W.
Low frequency sound (infrasound) measurements have been selected within the Comprehensive Nuclear-Test-Ban Treaty (CTBT) as a technique to detect and identify possible nuclear explosions. The Seismology Division of the Royal Netherlands Meteorological Institute (KNMI) has operated an experimental infrasound array of 16 micro-barometers since 1999. Here we show the rare detection and identification of an exploding meteor above Northern Germany on November 8th, 1999 with data from the Deelen Infrasound Array (DIA). At the same time, sound was radiated from the Atlantic Ocean, south of Iceland, due to the atmospheric coupling of standing ocean waves, called microbaroms. Although the two sources differed by only 0.04 Hz in dominant frequency, DIA proved able to discriminate between these physically different sources of infrasound through its unique lay-out and instruments. The explosive power of the meteor, about 1.5 kT of TNT, is in the range of nuclear explosions and is therefore relevant to the CTBT.
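Resolving two sources only 0.04 Hz apart in dominant frequency implies a record at least 1/0.04 = 25 s long, since FFT bin width is the reciprocal of the record length. A toy illustration with two pure tones (the real meteor and microbarom signals are broadband, and all values here are invented):

```python
import numpy as np

# Two superimposed tones 0.04 Hz apart stand in for the two infrasound
# sources; amplitudes, frequencies, and sample rate are illustrative.
fs = 10.0                       # sample rate, Hz
t = np.arange(0, 400, 1 / fs)   # 400 s record: bin width 0.0025 Hz << 0.04 Hz
sig = np.sin(2 * np.pi * 0.20 * t) + 0.7 * np.sin(2 * np.pi * 0.24 * t)

spec = np.abs(np.fft.rfft(sig))
freqs = np.fft.rfftfreq(len(sig), 1 / fs)

# The two largest spectral peaks recover both dominant frequencies; a
# record shorter than 1 / 0.04 Hz = 25 s could not separate them.
top2 = np.sort(freqs[np.argsort(spec)[-2:]])
```

The same frequency-resolution argument sets the minimum averaging time an infrasound array needs before two close narrowband sources become distinguishable.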
Arnau, Antonio
2008-01-01
From the first applications of AT-cut quartz crystals as sensors in solutions more than 20 years ago, the so-called quartz crystal microbalance (QCM) sensor has become a good alternative analytical method in a great many applications, such as biosensors, analysis of biomolecular interactions, study of bacterial adhesion at specific interfaces, pathogen and microorganism detection, study of polymer film-biomolecule or cell-substrate interactions, and immunosensors, with extensive use in fluids and polymer characterization and electrochemical applications, among others. The appropriate evaluation of this analytical method requires recognizing the different steps involved and being conscious of their importance and limitations. The first step in a QCM system is the accurate and appropriate characterization of the sensor in relation to the specific application. Using the piezoelectric sensor in contact with solutions strongly affects its behavior, and appropriate electronic interfaces must be used for adequate sensor characterization. Systems based on different principles and techniques have been implemented over the last 25 years. Selecting the interface for a specific application is important, and its limitations must be known in order to judge its suitability and to avoid propagating errors into the interpretation of results. This article presents a comprehensive overview of the different techniques used for AT-cut quartz crystal microbalances in in-solution applications, which are based on the following principles: network or impedance analyzers, decay methods, oscillators, and lock-in techniques. The electronic interfaces based on oscillators and phase-locked techniques are treated in detail, with descriptions of different configurations, since these techniques are the most widely used in applications for detection of analytes in solutions and in those where a fast sensor response is necessary. PMID:27879713
Crompot, Emerence; Van Damme, Michael; Duvillier, Hugues; Pieters, Karlien; Vermeesch, Marjorie; Perez-Morga, David; Meuleman, Nathalie; Mineur, Philippe; Bron, Dominique; Lagneaux, Laurence; Stamatopoulos, Basile
2015-01-01
Microparticles (MPs), also called microvesicles (MVs), are plasma membrane-derived fragments with sizes ranging from 0.1 to 1μm. Characterization of these MPs is often performed by flow cytometry, but there is no consensus on the appropriate negative control to use, which can lead to false-positive results. We analyzed MPs from platelets, B-cells, T-cells, NK-cells, monocytes, and chronic lymphocytic leukemia (CLL) B-cells. Cells were purified by positive magnetic separation and cultured for 48h. Cells and MPs were characterized using the following monoclonal antibodies (CD19,20 for B-cells, CD3,8,5,27 for T-cells, CD16,56 for NK-cells, CD14,11c for monocytes, CD41,61 for platelets). Isolated MPs were stained with annexin-V-FITC and gated between 300nm and 900nm. The latex bead technique was then performed for easy detection of MPs. Samples were analyzed by transmission (TEM) and scanning (SEM) electron microscopy. Annexin-V positive events within a gate of 300-900nm were detected and defined as MPs. Our results confirmed that the characteristic antigens CD41/CD61 were found on platelet-derived MPs, validating our technique. However, for MPs derived from other cell types, we were unable to detect any antigen, although these antigens were clearly expressed on the MP-producing cells, contrary to several reports published in the literature. Using the latex bead technique, we confirmed detection of CD41,61. However, the apparent expression of other antigens (already deemed positive in several studies) was determined to be false positive, as indicated by negative controls (the same labeling was used on MPs from different origins). We observed that mother cell antigens were not always detected on corresponding MPs by direct flow cytometry or latex bead cytometry. Our data highlight that false-positive results can be generated by antibody aspecificity and that phenotypic characterization of MPs is a difficult field requiring the use of several negative controls.
Miller, Ross A; Mody, Dina R; Tams, Kimberlee C; Thrall, Michael J
2015-11-01
The Papanicolaou (Pap) test has indisputably decreased cervical cancer mortality, as rates have declined by up to 80% in the United States since its implementation. However, the Pap test is considered less sensitive for detecting glandular lesions than for detecting those of squamous origin. Some studies have even suggested an increasing incidence of cervical adenocarcinoma, which may be a consequence of a relatively reduced ability to detect glandular lesions with cervical cancer screening techniques. To evaluate the detection rate of glandular lesions with screening techniques currently used for cervical cancer screening and to provide insight as to which techniques are most efficacious in our study population. We retrospectively reviewed any available cytology, human papillomavirus (HPV), and histologic malignancy data in patients diagnosed with adenocarcinoma in situ and adenocarcinoma from 2 geographically and socioeconomically disparate hospital systems. Identified patients having had a negative/unsatisfactory Pap test within 5 years of adenocarcinoma in situ or adenocarcinoma tissue diagnosis were considered Pap test screening failures. Patients with negative HPV tests on cytology samples were considered HPV screening failures. One hundred thirty cases were identified (age range, 22-93 years); 39 (30%) had no Pap history in our files. Eight of 91 remaining cases (8.8%) were screening failures. The detected sensitivity for identifying adenocarcinoma in situ/adenocarcinoma in this study was 91.2% by cytology alone and 92.3% when incorporating HPV testing. The most common cytologic diagnosis was atypical glandular cells (25 cases), and those diagnosed with adenocarcinoma were 7.4 years older than those diagnosed with adenocarcinoma in situ (50.3 versus 42.9 years). Nine of 24 HPV-tested cases (37.5%) were called atypical squamous cells of undetermined significance on cytology. Our results highlight the importance of combined Pap and HPV cotesting.
Although the number of cases identified is relatively small, our data suggest screening for squamous lesions facilitates the recognition of glandular lesions in the cervix. Additionally, increased use of combined Pap and HPV cotesting may decrease detection failure rates with regard to glandular lesions.
A-Track: A new approach for detection of moving objects in FITS images
NASA Astrophysics Data System (ADS)
Atay, T.; Kaplan, M.; Kilic, Y.; Karapinar, N.
2016-10-01
We have developed a fast, open-source, cross-platform pipeline, called A-Track, for detecting the moving objects (asteroids and comets) in sequential telescope images in FITS format. The pipeline is coded in Python 3. The moving objects are detected using a modified line detection algorithm, called MILD. We tested the pipeline on astronomical data acquired by an SI-1100 CCD with a 1-meter telescope. We found that A-Track performs very well in terms of detection efficiency, stability, and processing time. The code is hosted on GitHub under the GNU GPL v3 license.
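The core idea behind detecting movers as lines in sequential frames can be sketched as follows. This is only the constant-velocity collinearity test at the heart of such pipelines, not A-Track's actual MILD algorithm, and all positions below are invented.

```python
from itertools import product

def find_moving_objects(detections, tol=1.0, min_motion=2.0):
    """Flag source triples that move at constant velocity across three frames.

    detections -- per-frame lists of (x, y) source positions.
    A triple (p0, p1, p2) is kept when p1 lies (within tol) at the midpoint
    of p0 and p2 and the total displacement exceeds min_motion, so that
    stationary stars are rejected.
    """
    hits = []
    for p0, p1, p2 in product(*detections[:3]):
        dx, dy = p2[0] - p0[0], p2[1] - p0[1]
        if dx * dx + dy * dy < min_motion ** 2:
            continue  # stationary across the sequence: a star, not a mover
        if (abs(p0[0] + p2[0] - 2 * p1[0]) < tol and
                abs(p0[1] + p2[1] - 2 * p1[1]) < tol):
            hits.append((p0, p1, p2))
    return hits

frames = [
    [(10.0, 10.0), (55.2, 80.1)],  # frame 1: a star and an asteroid
    [(10.0, 10.0), (57.1, 82.0)],  # frame 2: star fixed, asteroid drifted
    [(10.0, 10.0), (59.0, 83.9)],  # frame 3
]
moving = find_moving_objects(frames)  # only the asteroid track survives
```

In (x, y, time) space a constant-velocity object traces a straight line, which is why line-detection algorithms are a natural fit for asteroid and comet searches in image sequences.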
2016-03-11
Control and Prevention Evaluation of a National Call Center and a Local Alerts System for Detection of New Cases of Ebola Virus Disease — Guinea, 2014...principally through the use of a telephone alert system. Community members and health facilities report deaths and suspected Ebola cases to local alert ...sensitivity of the national call center with the local alerts system, the CDC country team performed probabilistic record linkage of the combined
Schuchmann, Maike; Siemers, Björn M
2010-09-17
Only recently have data on bat echolocation call intensities started to accumulate. Yet intensity is an ecologically crucial parameter, as it determines the extent of the bats' perceptual space and, specifically, prey detection distance. Interspecifically, we thus asked whether sympatric, congeneric bat species differ in call intensities and whether such differences play a role in niche differentiation. Specifically, we investigated whether R. mehelyi, which calls at a frequency clearly above what is predicted by allometry, compensates for the frequency-dependent loss in detection distance by using elevated call intensity. Maximum echolocation call intensities might depend on body size or condition and thus serve as an honest signal of quality for intraspecific communication. We investigated for the first time whether a size-intensity relation is present in echolocating bats. We measured maximum call intensities and frequencies for all five European horseshoe bat species. Maximum intensity differed among species, largely due to R. euryale. Furthermore, we found no compensation for frequency-dependent loss in detection distance in R. mehelyi. Intraspecifically, there is a negative correlation between forearm length and intensity in R. euryale and a trend for a negative correlation between body condition index and intensity in R. ferrumequinum. In R. hipposideros, females had 8 dB higher intensities than males. For the other species, there were no correlations between intensity and body size, and no sex differences. Based on call intensity and frequency measurements, we estimated echolocation ranges for our study community. These suggest that intensity differences result in different prey detection distances and thus likely play some role in resource access. It is interesting and at first glance counter-intuitive that, where a correlation was found, smaller bats called louder than large individuals.
Such a negative relationship between size or condition and vocal amplitude may indicate an as yet unknown physiological or sexual selection pressure.
Classification of ring artifacts for their effective removal using type adaptive correction schemes.
Anas, Emran Mohammad Abu; Lee, Soo Yeol; Hasan, Kamrul
2011-06-01
High-resolution tomographic images acquired with a digital X-ray detector are often degraded by so-called ring artifacts. In this paper, a detailed analysis covering the classification, detection, and correction of these ring artifacts is presented. First, a novel scheme for classifying rings into two categories, type I and type II, is proposed based on their statistical characteristics: defective detector elements and dusty scintillator screens produce type I rings, while mis-calibrated detector elements lead to type II rings. Unlike conventional approaches, we emphasize separate detection and correction schemes for each type of ring for their effective removal. For the detection of type I rings, the histogram of the responses of the detector elements is used, and a modified fast image inpainting algorithm is adopted to correct the responses of the defective pixels. To detect type II rings, a simple filtering scheme based on the fast Fourier transform (FFT) is first used to smooth the sum curve derived from the type I ring-corrected projection data. The difference between the sum curve and its smoothed version is then used to detect their positions. Then, to remove the constant bias that the responses of the mis-calibrated detector elements carry across view angles, an estimated dc shift is subtracted from them. The performance of the proposed algorithm is evaluated using real micro-CT images and is compared with three recently reported algorithms. Simulation results demonstrate superior performance of the proposed technique compared to the techniques reported in the literature. Copyright © 2011 Elsevier Ltd. All rights reserved.
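The type II detection step described above (smooth the sum curve, then threshold the difference) can be sketched as follows. The FFT cutoff, threshold, and synthetic sum curve are illustrative choices, not values from the paper.

```python
import numpy as np

def detect_type2_rings(sum_curve, keep=8, thresh=3.0):
    """Locate mis-calibrated detector elements from the sum curve of the
    projection data: low-pass the curve by truncating its FFT, then
    threshold the residual. `keep` and `thresh` are illustrative values.
    """
    spec = np.fft.rfft(sum_curve)
    spec[keep:] = 0                          # keep only the slow variation
    smooth = np.fft.irfft(spec, n=len(sum_curve))
    resid = sum_curve - smooth               # difference curve
    bad = np.nonzero(np.abs(resid) > thresh * np.std(resid))[0]
    return bad, resid

# Synthetic sum curve: a band-limited profile plus a constant bias on one
# element, mimicking a mis-calibrated (type II) detector channel.
n = np.arange(256)
curve = 1000.0 + 200.0 * np.cos(2 * np.pi * 2 * n / 256)
curve[120] += 50.0
bad, resid = detect_type2_rings(curve)       # flags element 120
```

Because a mis-calibrated element adds a bias that is constant over view angle, it survives in the residual of the angle-summed curve while the genuine object profile is absorbed into the smooth component.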
NASA Technical Reports Server (NTRS)
Janches, D.; Hocking, W.; Pifko, S.; Hormaechea, J. L.; Fritts, D. C.; Brunini, C; Michell, R.; Samara, M.
2013-01-01
A radar meteor echo is the radar scattering signature from the free electrons in a plasma trail generated by the entry of extraterrestrial particles into the atmosphere. Three categories of scattering mechanisms exist: specular trails, non-specular trails, and head-echoes. Generally, there are two types of radars utilized to detect meteors. Traditional VHF meteor radars (often called all-sky radars) primarily detect the specular reflection of meteor trails traveling perpendicular to the line of sight of the scattering trail, while High Power and Large Aperture (HPLA) radars efficiently detect meteor head-echoes and, in some cases, non-specular trails. The fact that head-echo measurements can be performed only with HPLA radars limits these studies in several ways. HPLA radars are very sensitive instruments, constraining the studies to the lower masses, and these observations cannot be performed continuously because they take place at national observatories with limited allocated observing time. These drawbacks can be addressed by developing head-echo observing techniques with modified all-sky meteor radars. In addition, the fact that all the different scattering mechanisms can be detected simultaneously with the same instrument, rather than requiring assorted classes of radars, can help clarify observed differences between the different methodologies. In this study, we demonstrate that such concurrent observations are now possible, enabled by the enhanced design of the Southern Argentina Agile Meteor Radar (SAAMER) deployed at the Estacion Astronomica Rio Grande (EARG) in Tierra del Fuego, Argentina. The results presented here are derived from observations performed over a period of 12 days in August 2011, and include meteoroid dynamical parameter distributions, radiants, and estimated masses. Overall, SAAMER's head-echo detections appear to be produced by larger particles than those which have been studied thus far using this technique.
Ozone and Aerosol Retrieval from Backscattered Ultraviolet Radiation
NASA Technical Reports Server (NTRS)
Bhartia, Pawan K.
2004-01-01
In this presentation we will discuss the techniques to estimate total column ozone and aerosol absorption optical depth from measurements of backscattered ultraviolet (buv) radiation. The total ozone algorithm has been used to create a unique record of the ozone layer, spanning more than 3 decades, from a series of instruments (BUV, SBUV, TOMS, SBUV/2) flown on NASA, NOAA, Japanese and Russian satellites. We will discuss how this algorithm can be considered a generalization of the well-known Dobson/Brewer technique that has been used to process data from ground-based instruments for many decades, and how it differs from the DOAS techniques that have been used to estimate vertical column densities of a host of trace gases from data collected by the GOME and SCIAMACHY instruments. The BUV aerosol algorithm is most suitable for the detection of UV-absorbing aerosols (smoke, desert dust, volcanic ash) and is the only technique that can detect aerosols embedded in clouds. This algorithm has been used to create a quarter-century record of aerosol absorption optical depth using the BUV data collected by a series of TOMS instruments. We will also discuss how the data from the OMI instrument launched on July 15, 2004 will be combined with data from MODIS and CALIPSO lidar data to enhance the accuracy and information content of satellite-derived aerosol measurements. The OMI and MODIS instruments are currently flying on the EOS Aura and EOS Aqua satellites respectively, part of a constellation of satellites called the "A-train". The CALIPSO satellite is expected to join this constellation in mid 2005.
A new encoding scheme for visible light communications with applications to mobile connections
NASA Astrophysics Data System (ADS)
Benton, David M.; St. John Brittan, Paul
2017-10-01
A novel, unconventional encoding scheme called concurrent coding has recently been demonstrated and shown to offer interesting features and benefits in comparison to conventional techniques, such as robustness against burst errors and improved efficiency of transmitted power. Free space optical communications can suffer particularly from issues of alignment, which require stable, fixed links to be established, and from beam wander, which can interrupt communications. Concurrent coding has the potential to ease these difficulties and enable mobile, flexible optical communications through the use of a source encoding technique. This concept has been applied for the first time to optical communications, where standard light emitting diodes (LEDs) have been used to transmit information encoded with concurrent coding. The technique successfully transmits and decodes data despite unpredictable interruptions to the transmission causing significant drop-outs in the detected signal. The technique also shows how it is possible to send a single block of data in isolation with no pre-synchronisation required between transmitter and receiver, and no specific synchronisation sequence appended to the transmission. Such systems are robust against interference - intentional or otherwise - as well as intermittent beam blockage.
Petitot, Maud; Manceau, Nicolas; Geniez, Philippe; Besnard, Aurélien
2014-09-01
Setting up effective conservation strategies requires the precise determination of the targeted species' distribution area and, if possible, its local abundance. However, detection issues make these objectives complex for most vertebrates. The detection probability is usually <1 and is highly dependent on species phenology and other environmental variables. The aim of this study was to define an optimized survey protocol for the Mediterranean amphibian community, that is, to determine the most favorable periods and the most effective sampling techniques for detecting all species present on a site in a minimum number of field sessions and a minimum amount of prospecting effort. We visited 49 ponds located in the Languedoc region of southern France on four occasions between February and June 2011. Amphibians were detected using three methods: nighttime call count, nighttime visual encounter, and daytime netting. The detection/nondetection data obtained were then modeled using site-occupancy models. The detection probability of amphibians differed sharply between species, the survey method used, and the date of the survey, and these three covariates also interacted. Thus, a minimum of three visits spread over the breeding season, using a combination of all three survey methods, is needed to reach a 95% detection level for all species in the Mediterranean region. Synthesis and applications: detection/nondetection surveys combined with a site-occupancy modeling approach are a powerful method for estimating detection probability and determining the prospecting effort necessary to assert that a species is absent from a site.
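A standard design quantity behind such protocols is the cumulative detection probability p* = 1 - (1 - p)^K over K independent visits with per-visit detectability p. A small sketch of the resulting survey-effort calculation (the detectability values used here are illustrative, not taken from the paper):

```python
import math

def visits_for_detection(p, target=0.95):
    """Minimum number of independent visits K such that a species present
    at a site is detected at least once with probability >= target,
    using the cumulative detection probability p* = 1 - (1 - p)**K."""
    return math.ceil(math.log(1 - target) / math.log(1 - p))

# Illustrative per-visit detectabilities (not values from the paper):
k_high = visits_for_detection(0.65)  # a conspicuous caller
k_low = visits_for_detection(0.30)   # a cryptic species needs many more visits
```

Inverting the same formula is how one justifies a claim of absence: without enough visits, a nondetection says little about occupancy.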
Roberson, A.M.; Andersen, D.E.; Kennedy, P.L.
2005-01-01
Broadcast surveys using conspecific calls are currently the most effective method for detecting northern goshawks (Accipiter gentilis) during the breeding season. These surveys typically use alarm calls during the nestling phase and juvenile food-begging calls during the fledgling-dependency phase. Because goshawks are most vocal during the courtship phase, we hypothesized that this phase would be an effective time to detect goshawks. Our objective was to improve current survey methodology by evaluating the probability of detecting goshawks at active nests in northern Minnesota in 3 breeding phases and at 4 broadcast distances, and to determine the effective area surveyed per broadcast station. Unlike previous studies, we broadcast calls at only 1 distance per trial. This approach better quantifies (1) the relationship between distance and probability of detection, and (2) the effective area surveyed (EAS) per broadcast station. We conducted 99 broadcast trials at 14 active breeding areas. When pooled over all distances, detection rates were highest during the courtship (70%) and fledgling-dependency phases (68%). Detection rates were lowest during the nestling phase (28%), when there appeared to be higher variation in the likelihood of detecting individuals. EAS per broadcast station was 39.8 ha during courtship and 24.8 ha during fledgling-dependency. Consequently, in northern Minnesota, broadcast stations may be spaced 712 m and 562 m apart when conducting systematic surveys during courtship and fledgling-dependency, respectively. We could not calculate EAS for the nestling phase because probability of detection was not a simple function of distance from nest. Calculation of EAS could be applied to other areas where the probability of detection is a known function of distance.
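The reported station spacings follow from treating the EAS as a circle and spacing stations two radii apart; a quick check of the quoted numbers under that assumption:

```python
import math

def station_spacing_m(eas_ha):
    """Spacing between broadcast stations, treating the effective area
    surveyed (EAS) as a circle and placing stations two radii apart."""
    radius_m = math.sqrt(eas_ha * 10_000 / math.pi)  # 1 ha = 10,000 m^2
    return 2 * radius_m

print(round(station_spacing_m(39.8)))  # courtship phase
print(round(station_spacing_m(24.8)))  # fledgling-dependency phase
```

Both values reproduce the 712 m and 562 m spacings quoted in the abstract.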
Detection of tumor DNA at the margins of colorectal cancer liver metastasis
Holdhoff, Matthias; Schmidt, Kerstin; Diehl, Frank; Aggrawal, Nishant; Angenendt, Philipp; Romans, Katharine; Edelstein, Daniel L.; Torbenson, Michael; Kinzler, Kenneth W.; Vogelstein, Bert; Choti, Michael A.; Diaz, Luis A.
2012-01-01
Purpose: Defining an adequate resection margin of colorectal cancer liver metastases is essential for optimizing surgical technique. We have attempted to evaluate the resection margin through a combination of histopathologic and genetic analyses. Experimental Design: We evaluated 88 samples of tumor margins from 12 patients with metastatic colon cancer who each underwent partial hepatectomy of one to six liver metastases. Punch biopsies of surrounding liver tissue were obtained at 4, 8, 12, and 16 mm from the tumor border. DNA from these biopsies was analyzed by a sensitive PCR-based technique, called BEAMing, for mutations of KRAS, PIK3CA, APC, or TP53 identified in the corresponding tumor. Results: Mutations were identified in each patient's resected tumor and used to analyze the 88 samples circumscribing the tumor-normal border. Tumor-specific mutant DNA was detectable in surrounding liver tissue in five of these 88 samples, all within 4 mm of the tumor border. Biopsies that were 8, 12, and 16 mm from the macroscopically visible margin were devoid of detectable mutant tumor DNA as well as of microscopically visible cancer cells. Tumors with a significant radiologic response to chemotherapy were not associated with any increase in mutant tumor DNA beyond 4 mm of the main tumor. Conclusions: Mutant tumor-specific DNA can be detected beyond the visible tumor margin, but never beyond 4 mm, even in patients whose tumors were larger prior to chemotherapy. These data provide a rational basis for determining the extent of surgical excision required in patients undergoing resection of liver metastases. PMID:21531819
NASA Astrophysics Data System (ADS)
Derusova, D. A.; Vavilov, V. P.; Pawar, S. S.
2015-04-01
Low-velocity impact is a frequently observed event during the operation of an aircraft composite structure. This type of damage is aptly called "blind-side impact damage" as it is barely visible as a dent on the impacted surface but may produce extended delaminations closer to the rear surface. One-sided thermal nondestructive testing is considered a promising technique for detecting impact damage, but because of the diffusive nature of optical thermal signals there is a drop in the detectability of deeper subsurface defects. Ultrasonic infrared thermography is a potentially attractive nondestructive evaluation technique used to detect defects through observation of vibration-induced heat generation. Evaluation of the energy released by such defects is a challenging task. In this study, the thin delaminations caused by impact damage in composites and subjected to ultrasonic excitation are considered as local heat sources. The actual impact damage in a carbon epoxy composite, detected by applying a magnetostrictive ultrasonic device, is then modeled as a pyramid-like defect with a set of delaminations acting as air-filled heat sources. The temperature rise expected on the surface of the specimen was matched by varying the energy contribution from each delamination through trial and error. Finally, by comparing the experimental temperature elevations in the defective area with the results of temperature simulations, we estimated the energy generated by each defect and the power of the impact damage as a whole. The results show good correlation between simulations and measurements, thus validating the simulation approach.
Acoustic Manifestations of Natural versus Triggered Lightning
NASA Astrophysics Data System (ADS)
Arechiga, R. O.; Johnson, J. B.; Edens, H. E.; Rison, W.; Thomas, R. J.; Eack, K.; Eastvedt, E. M.; Aulich, G. D.; Trueblood, J.
2010-12-01
Positive leaders are rarely detected by VHF lightning detection systems; positive leader channels are usually outlined only by recoil events. Positive cloud-to-ground (CG) channels are usually not mapped. The goal of this work is to study the types of thunder produced by natural versus triggered lightning and to assess which types of thunder signals have electromagnetic activity detected by the lightning mapping array (LMA). Towards this end we are investigating the lightning detection capabilities of acoustic techniques, and comparing them with the LMA. In a previous study we used array beam forming and time of flight information to locate acoustic sources associated with lightning. Even though there was some mismatch, generally LMA and acoustic techniques saw the same phenomena. To increase the database of acoustic data from lightning, we deployed a network of three infrasound arrays (30 m aperture) during the summer of 2010 (August 3 to present) in the Magdalena mountains of New Mexico, to monitor infrasound (below 20 Hz) and audio range sources due to natural and triggered lightning. The arrays were located at a range of distances (60 to 1400 m) surrounding the triggering site, called the Kiva, used by Langmuir Laboratory to launch rockets. We have continuous acoustic measurements of lightning data from July 20 to September 18 of 2009, and from August 3 to September 1 of 2010. So far, lightning activity around the Kiva was higher during the summer of 2009. We will present acoustic data from several interesting lightning flashes including a comparison between a natural and a triggered one.
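The basic acoustic ranging underlying such LMA-versus-thunder comparisons is time of flight: the RF emission mapped by the LMA arrives effectively instantaneously, so the delay until the thunder arrival gives the range. A minimal sketch (the temperature-dependent sound-speed formula is a standard dry-air approximation):

```python
def thunder_range_m(delay_s, temp_c=20.0):
    """Range to an acoustic source from the delay between the (effectively
    instantaneous) LMA-detected RF emission and the thunder arrival."""
    c = 331.3 + 0.606 * temp_c  # speed of sound in dry air, m/s (approximation)
    return c * delay_s

# A 3 s delay at 20 degrees C corresponds to roughly one kilometre.
print(round(thunder_range_m(3.0)))
```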
A Machine Learning Ensemble Classifier for Early Prediction of Diabetic Retinopathy.
S K, Somasundaram; P, Alli
2017-11-09
Diabetic retinopathy (DR), a retinal vascular disease, is the main complication of diabetes and can lead to blindness. Regular screening for early DR detection is a labor- and resource-intensive task, so automatic detection of DR using computational techniques is an attractive solution. An automatic method is more reliable for determining the presence of an abnormality in fundus images (FI), but the classification step is often performed poorly. Recently, a few research works have been designed to analyze texture discrimination capacity in FI to distinguish healthy from diseased images. However, the feature extraction (FE) process has not performed well, due to high dimensionality. Therefore, to identify retinal features for DR diagnosis and early detection, a machine learning and ensemble classification method called the Machine Learning Bagging Ensemble Classifier (ML-BEC) is designed. The ML-BEC method comprises two stages. The first stage extracts the candidate objects from retinal images (RI). The candidate objects, or features, for DR diagnosis include blood vessels, the optic nerve, neural tissue, the neuroretinal rim, and optic disc size, thickness, and variance. These features are initially extracted by applying a machine learning technique called t-distributed Stochastic Neighbor Embedding (t-SNE). t-SNE generates a probability distribution across pairs of high-dimensional images, separating them into similar and dissimilar pairs, and then describes a similar probability distribution across the points in a low-dimensional map. This lessens the Kullback-Leibler divergence between the two distributions with respect to the locations of the points on the map. The second stage applies ensemble classifiers to the extracted features to provide accurate analysis of digital FI using machine learning.
In this stage, automatic detection for a DR screening system using a Bagging Ensemble Classifier (BEC) is investigated. Through its voting process, bagging in ML-BEC minimizes the error due to the variance of the base classifier. Using publicly available retinal image databases, our classifier is trained with 25% of the RI. Results show that the ensemble classifier achieves better classification accuracy (CA) than single classification models. Empirical experiments suggest that the machine learning-based ensemble classifier is also efficient at reducing DR classification time (CT).
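The bagging stage can be illustrated independently of t-SNE: train weak learners on bootstrap resamples and combine them by majority vote to reduce variance. A minimal sketch on hypothetical one-dimensional data (decision stumps stand in for the paper's base classifiers):

```python
import random

random.seed(0)

# Toy 1-D feature (e.g. one extracted retinal feature) with binary labels;
# hypothetical data, for illustration only.
data = [(x / 10.0, 1 if x >= 50 else 0) for x in range(100)]

def train_stump(sample):
    """Weak base classifier: choose the threshold minimising training error
    for the rule 'predict 1 iff feature >= threshold'."""
    best_t, best_err = None, float("inf")
    for t in sorted({x for x, _ in sample}):
        err = sum((x >= t) != bool(y) for x, y in sample)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def bagging_fit(data, n_models=15):
    """Bagging: each stump is trained on a bootstrap resample of the data."""
    return [train_stump(random.choices(data, k=len(data))) for _ in range(n_models)]

def bagging_predict(models, x):
    """Majority vote across the ensemble reduces the base learner's variance."""
    votes = sum(x >= t for t in models)
    return int(votes > len(models) / 2)

models = bagging_fit(data)
accuracy = sum(bagging_predict(models, x) == y for x, y in data) / len(data)
print(round(accuracy, 2))
```

Each individual stump's threshold wobbles with its bootstrap sample; the vote averages that wobble out, which is the variance reduction the abstract refers to.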
Covert Channels in SIP for VoIP Signalling
NASA Astrophysics Data System (ADS)
Mazurczyk, Wojciech; Szczypiorski, Krzysztof
In this paper, we evaluate available steganographic techniques for SIP (Session Initiation Protocol) that can be used to create covert channels during the signalling phase of a VoIP (Voice over IP) call. Apart from characterizing existing steganographic methods, we provide new insights by introducing new techniques. We also estimate the amount of data that can be transferred in signalling messages for a typical IP telephony call.
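One classic carrier for such covert channels, shown here to illustrate the general idea rather than any specific method from the paper: SIP header field names are case-insensitive (RFC 3261), so a sender can hide bits in their capitalization without affecting a compliant parser. A toy sketch:

```python
# Toy covert channel: the upper/lower-case pattern of SIP header names carries
# one hidden bit per header, invisible to a case-insensitive parser.
# Illustrative only; header values are elided.
HEADERS = ["Via", "Max-Forwards", "From", "To", "Call-ID", "CSeq", "Contact", "Subject"]

def embed(bits):
    """Render one header line per bit: upper-case name = 1, lower-case = 0."""
    assert len(bits) <= len(HEADERS)
    lines = [(name.upper() if bit else name.lower()) + ": ..."
             for name, bit in zip(HEADERS, bits)]
    return "\r\n".join(lines)

def extract(message):
    """Recover the bit pattern from the header-name capitalization."""
    return [int(line.split(":")[0].isupper()) for line in message.split("\r\n")]

msg = embed([1, 0, 1, 1, 0, 0, 1, 0])
print(extract(msg))
```

At one bit per optional header, such a channel carries only a handful of bytes per message, which is why capacity estimation across a whole call, as the paper does, matters.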
NASA Technical Reports Server (NTRS)
Shaklan, Stuart; Pan, Xiaopei
2004-01-01
The Space Interferometry Mission (SIM) is capable of detecting and measuring the mass of terrestrial planets around stars other than our own. It can measure the mass of black holes and the visual orbits of radio and x-ray binary sources. SIM makes possible a new level of understanding of complex astrophysical processes. SIM achieves its high precision in the so-called narrow-angle regime. This is defined by a 1 degree diameter field in which the position of a target star is measured with respect to a set of reference stars. The observation is performed in two parts: first, SIM observes a grid of stars that spans the full sky. After a few years, repeated observations of the grid allow one to determine the orientation of the interferometer baseline. Second, throughout the mission, SIM periodically observes in the narrow-angle mode. Every narrow-angle observation is linked to the grid to determine the precise attitude and length of the baseline. The narrow angle process demands patience. It is not until five years after launch that SIM achieves its ultimate accuracy of 1 microarcsecond. The accuracy is degraded by a factor of approx. 2 at mid-mission. Our work proposes a technique for narrow angle astrometry that does not rely on the measurement of grid stars. This technique, called Gridless Narrow Angle Astrometry (GNAA) can obtain microarcsecond accuracy and can detect extra-solar planets and other exciting objects with a few days of observation. It can be applied as early as during the first six months of in-orbit calibration (IOC). The motivations for doing this are strong. First, and obviously, it is an insurance policy against a catastrophic mid-mission failure. Second, at the start of the mission, with several space-based interferometers in the planning or implementation phase, NASA will be eager to capture the public's imagination with interferometric science. 
Third, early results and a technique that can duplicate those results throughout the mission will give the analysts important experience in the proper use and calibration of SIM.
NASA Astrophysics Data System (ADS)
Hložek, M.; Trojek, T.
2017-08-01
Archaeological surveys and metal detector prospecting yield a great number of coins from the medieval period. Naturally, some of these are counterfeits, which an experienced numismatist can identify without using chemical methods. The production of counterfeit coins in the Middle Ages took place in castles, caves, or other remote areas where waste from this activity can still be found today - copper sheets, technical ceramics, and counterfeit coins. Until recently, it had been assumed that medieval counterfeit coins were made by silver-plating copper blanks using an amalgam. However, the performed analyses reveal that there were many more techniques for counterfeiting coins. Other techniques were based on, e.g., tin-amalgam plating of the blanks or alloying a so-called white metal with a silver-like appearance from which the coins were minted. Current chemical analyses indicate that the coins were often tinned by hot dipping with no amalgamation. Micro-X-ray fluorescence analysis was chosen as a suitable non-destructive method to identify the chemical elements present in the investigated artifacts and to quantify their concentrations. In addition, a quick technique for revealing plating was applied. This technique utilizes the detected Kα/Kβ fluorescence ratio of copper, which is the main ingredient of many historical metallic materials.
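The Kα/Kβ trick works because a plating layer absorbs the softer Cu Kα line (8.05 keV) more strongly than the harder Kβ line (8.90 keV), shifting the measured ratio away from its bulk-copper value. A hedged sketch with hypothetical attenuation coefficients (not measured values):

```python
import math

# Illustrative attenuation coefficients (1/micrometre) of a plating layer at
# the Cu K-alpha and K-beta energies; the numbers are hypothetical.
MU_KALPHA = 0.22
MU_KBETA = 0.16
BULK_RATIO = 7.0  # Kalpha/Kbeta intensity ratio expected from bare copper

def expected_ratio(plating_um):
    """Kalpha/Kbeta ratio after both lines pass through a plating layer.
    The softer Kalpha line is absorbed more strongly, so the ratio drops."""
    return BULK_RATIO * math.exp(-(MU_KALPHA - MU_KBETA) * plating_um)

def looks_plated(measured_ratio, tolerance=0.1):
    """Flag a coin as plated when the ratio deviates from the bulk value."""
    return abs(measured_ratio - BULK_RATIO) / BULK_RATIO > tolerance

print(looks_plated(expected_ratio(0.0)))   # bare copper
print(looks_plated(expected_ratio(10.0)))  # 10-micrometre plating
```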
Harnessing color vision for visual oximetry in central cyanosis.
Changizi, Mark; Rio, Kevin
2010-01-01
Central cyanosis refers to a bluish discoloration of the skin, lips, tongue, nails, and mucous membranes, and is due to poor arterial oxygenation. Although skin color is one of its characteristic properties, it has long been realized that by the time skin color signs become visible, oxygen saturation is dangerously low. Here we investigate the visibility of cyanosis in light of recent discoveries on what color vision evolved for in primates. We elucidate why low arterial oxygenation is visible at all, why it is perceived as blue, and why it can be so difficult to perceive. With a better understanding of the relationship between color vision and blood physiology, we suggest two simple techniques for greatly enhancing the clinician's ability to detect cyanosis and other clinical color changes. The first is called "skin-tone adaptation", wherein sheets, gowns, walls and other materials near a patient have a color close to that of the patient's skin, thereby optimizing a color-normal viewer's ability to sense skin color modulations. The second technique is called "biosensor color tabs", wherein adhesive tabs with a color matching the patient's skin tone are placed in several spots on the skin, and subsequent skin color changes have the effect of making the initially-invisible tabs change color, their hue and saturation indicating the direction and magnitude of the skin color shift.
On the Inference of Functional Circadian Networks Using Granger Causality
Pourzanjani, Arya; Herzog, Erik D.; Petzold, Linda R.
2015-01-01
Being able to infer one-way direct connections in an oscillatory network such as the suprachiasmatic nucleus (SCN) of the mammalian brain using time series data is difficult but crucial to understanding network dynamics. Although techniques have been developed for inferring networks from time series data, there have been no attempts to adapt these techniques to infer directional connections in oscillatory time series while accurately distinguishing between direct and indirect connections. In this paper an adaptation of Granger Causality is proposed that allows for inference of circadian networks and oscillatory networks in general, called Adaptive Frequency Granger Causality (AFGC). Additionally, an extension of this method is proposed to infer networks with large numbers of cells, called LASSO AFGC. The method was validated using simulated data from several different networks. For the smaller networks the method was able to identify all one-way direct connections without identifying connections that were not present. For larger networks of up to twenty cells the method shows excellent performance in identifying true and false connections, quantified by an area under the curve (AUC) of 96.88%. We note that this method, like other Granger Causality-based methods, is based on the detection of high-frequency signals propagating between cell traces. Thus it requires a relatively high sampling rate and a network that can propagate high-frequency signals. PMID:26413748
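The core of any Granger-causality test is comparing a restricted autoregression of the target against one augmented with the source's lagged values; AFGC builds on this idea for oscillatory data. A plain pairwise sketch on simulated traces (this is textbook Granger causality, not the AFGC algorithm itself):

```python
import random

random.seed(1)

def ols_rss(X, y):
    """Residual sum of squares of an ordinary least-squares fit, solving the
    normal equations (X'X) b = X'y by Gaussian elimination."""
    k = len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(len(X))) for q in range(k)]
         for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(len(X))) for p in range(k)]
    for col in range(k):  # elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return sum((y[i] - sum(X[i][c] * beta[c] for c in range(k))) ** 2
               for i in range(len(X)))

# Simulate two traces where y drives x with one step of lag (hypothetical data).
n = 400
x, y = [0.0], [random.gauss(0, 1)]
for _ in range(n - 1):
    y.append(0.7 * y[-1] + random.gauss(0, 1))
    x.append(0.3 * x[-1] + 0.5 * y[-2] + 0.2 * random.gauss(0, 1))

def granger_gain(target, source):
    """Relative RSS reduction from adding the source's lagged values:
    a large gain suggests the source Granger-causes the target."""
    rows = range(1, len(target))
    restricted = ols_rss([[target[t - 1]] for t in rows], [target[t] for t in rows])
    full = ols_rss([[target[t - 1], source[t - 1]] for t in rows],
                   [target[t] for t in rows])
    return 1 - full / restricted

print(granger_gain(x, y) > granger_gain(y, x))
```

The asymmetry of the gain recovers the one-way direction of influence; distinguishing direct from indirect links, as AFGC does, requires conditioning on the remaining nodes rather than pairwise tests alone.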
NASA Astrophysics Data System (ADS)
Clenet, A.; Ravera, L.; Bertrand, B.; den Hartog, R.; Jackson, B.; van Leeuwen, B.-J.; van Loon, D.; Parot, Y.; Pointecouteau, E.; Sournac, A.
2014-11-01
IRAP is developing the readout electronics of the SPICA-SAFARI TES bolometer arrays. Based on the frequency-domain multiplexing technique, the readout electronics provides the AC signals to voltage-bias the detectors, demodulates the data, and computes a feedback to linearize the detection chain. The feedback is computed with a specific technique, the so-called baseband feedback (BBFB), which ensures that the loop is stable even with long propagation and processing delays (i.e., several μs) and with fast signals (i.e., frequency carriers of the order of 5 MHz). To optimize the power consumption we took advantage of the reduced science-signal bandwidth to decouple the signal sampling frequency from the data processing rate. This technique allowed a reduction of the power consumption of the circuit by a factor of 10. Beyond the firmware architecture, the optimization of the instrument concerns the characterization routines and the definition of the optimal parameters. Indeed, to operate a TES array one has to properly define about 21,000 parameters. We defined a set of procedures to automatically characterize these parameters and find the optimal settings.
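The sampling-rate/processing-rate decoupling can be illustrated with a toy digital demodulator: mix the carrier down, then average and decimate so the downstream logic runs far slower than the ADC. The rates and signals below are illustrative, not SAFARI's actual parameters:

```python
import math

FS = 20_000_000       # ADC sampling rate, Hz (illustrative)
CARRIER = 5_000_000   # bias-carrier frequency, Hz
DECIM = 100           # processing-rate reduction after demodulation

# Carrier amplitude-modulated by a slow "science" signal.
n = 20_000
science = [1.0 + 0.1 * math.sin(2 * math.pi * 1000 * t / FS) for t in range(n)]
samples = [science[t] * math.sin(2 * math.pi * CARRIER * t / FS) for t in range(n)]

# Demodulate: mix with the reference carrier, then average blocks of DECIM
# samples. The block average acts as a crude low-pass filter, and everything
# downstream runs at FS / DECIM instead of FS.
mixed = [2 * s * math.sin(2 * math.pi * CARRIER * t / FS)
         for t, s in enumerate(samples)]
demod = [sum(mixed[i:i + DECIM]) / DECIM for i in range(0, n, DECIM)]
print(len(demod))  # 200 demodulated samples from 20,000 raw ones
```

Because the recovered science signal occupies only a narrow band, nothing is lost by processing it at 1/100th of the ADC rate, which is the source of the power saving described above.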
Real-time estimation of wildfire perimeters from curated crowdsourcing
NASA Astrophysics Data System (ADS)
Zhong, Xu; Duckham, Matt; Chong, Derek; Tolhurst, Kevin
2016-04-01
Real-time information about the spatial extents of evolving natural disasters, such as wildfire or flood perimeters, can assist both emergency responders and the general public during an emergency. However, authoritative information sources can suffer from bottlenecks and delays, while user-generated social media data usually lacks the necessary structure and trustworthiness for reliable automated processing. This paper describes and evaluates an automated technique for real-time tracking of wildfire perimeters based on publicly available “curated” crowdsourced data about telephone calls to the emergency services. Our technique is based on established data mining tools, and can be adjusted using a small number of intuitive parameters. Experiments using data from the devastating Black Saturday wildfires (2009) in Victoria, Australia, demonstrate the potential for the technique to detect and track wildfire perimeters automatically, in real time, and with moderate accuracy. Accuracy can be further increased through combination with other authoritative demographic and environmental information, such as population density and dynamic wind fields. These results are also independently validated against data from the more recent 2014 Mickleham-Dalrymple wildfires.
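The pipeline described here - cluster the reported call locations into incidents, then estimate each incident's perimeter - can be sketched with generic data-mining building blocks (naive single-linkage clustering and a convex hull; the paper's actual tools and parameters differ):

```python
import math

def clusters(points, eps):
    """Greedy single-linkage grouping of call locations: two calls belong to
    the same incident if a chain of reports closer than eps connects them."""
    groups = []
    for p in points:
        near = [g for g in groups if any(math.dist(p, q) < eps for q in g)]
        merged = [p] + [q for g in near for q in g]
        groups = [g for g in groups if g not in near] + [merged]
    return groups

def convex_hull(points):
    """Andrew's monotone chain; the hull is a crude perimeter estimate."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def half(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and ((h[-1][0] - h[-2][0]) * (p[1] - h[-2][1]) -
                                   (h[-1][1] - h[-2][1]) * (p[0] - h[-2][0])) <= 0:
                h.pop()
            h.append(p)
        return h[:-1]
    return half(pts) + half(pts[::-1])

# Hypothetical call coordinates (km): one incident near the origin, one far away.
calls = [(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5), (10, 10), (10.5, 10)]
incidents = clusters(calls, eps=2.0)
perimeters = [convex_hull(g) for g in incidents]
print(len(incidents))
```

In a real system the eps-style distance parameter is exactly the kind of intuitive tuning knob the paper mentions, and the crude hull would be refined with wind fields and population density.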
77 FR 56710 - Proposed Information Collection (Call Center Satisfaction Survey): Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-13
... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0744] Proposed Information Collection (Call Center Satisfaction Survey): Comment Request AGENCY: Veterans Benefits Administration, Department of... techniques or the use of other forms of information technology. Title: VBA Call Center Satisfaction Survey...
ToTem: a tool for variant calling pipeline optimization.
Tom, Nikola; Tom, Ondrej; Malcikova, Jitka; Pavlova, Sarka; Kubesova, Blanka; Rausch, Tobias; Kolarik, Miroslav; Benes, Vladimir; Bystry, Vojtech; Pospisilova, Sarka
2018-06-26
High-throughput bioinformatics analyses of next-generation sequencing (NGS) data often require challenging pipeline optimization. The key problem is choosing appropriate tools and selecting the best parameters for optimal precision and recall. Here we introduce ToTem, a tool for automated pipeline optimization. ToTem is a stand-alone web application with a comprehensive graphical user interface (GUI). ToTem is written in Java and PHP with an underlying connection to a MySQL database. Its primary role is to automatically generate, execute, and benchmark different variant calling pipeline settings. Our tool allows an analysis to be started from any level of the process, with the possibility of plugging in almost any tool or code. To prevent overfitting of pipeline parameters, ToTem ensures their reproducibility by using cross-validation techniques that penalize the final precision, recall, and F-measure. The results are interpreted as interactive graphs and tables allowing an optimal pipeline to be selected, based on the user's priorities. Using ToTem, we were able to optimize somatic variant calling from ultra-deep targeted gene sequencing (TGS) data and germline variant detection in whole genome sequencing (WGS) data. ToTem is freely available as a web application at https://totem.software .
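The overfitting safeguard can be illustrated generically: score each candidate parameter setting by its mean F-measure over held-out folds rather than over the full data set. A toy sketch with a single hypothetical quality-cutoff parameter (not ToTem's actual pipeline interface):

```python
import random

random.seed(7)

def f_measure(called, truth):
    """Precision/recall/F of a called variant set against a truth set."""
    tp = len(called & truth)
    p = tp / len(called) if called else 0.0
    r = tp / len(truth) if truth else 0.0
    return 2 * p * r / (p + r) if p + r else 0.0

# Hypothetical truth set and per-position caller scores: true variants tend
# to score higher than noise positions.
positions = list(range(200))
truth = set(random.sample(positions, 60))
score = {i: random.gauss(0.8 if i in truth else 0.3, 0.1) for i in positions}

def call_variants(pos, min_score):
    """One tunable pipeline step: keep positions whose score clears a cutoff."""
    return {i for i in pos if score[i] >= min_score}

def cv_f(min_score, k=5):
    """Benchmark a setting by its mean F over k held-out folds, so a cutoff
    tuned to quirks of one subset of the data is penalized."""
    shuffled = positions[:]
    random.shuffle(shuffled)
    folds = [set(shuffled[i::k]) for i in range(k)]
    return sum(f_measure(call_variants(f, min_score), truth & f) for f in folds) / k

grid = [0.30, 0.45, 0.55, 0.65, 0.80]
best = max(grid, key=cv_f)
print(best)
```

ToTem applies the same idea across entire tool chains and parameter grids, then lets the user weight precision against recall when picking the winner.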
Liu, Ruxiu; Wang, Ningquan; Kamili, Farhan; Sarioglu, A Fatih
2016-04-21
Numerous biophysical and biochemical assays rely on spatial manipulation of particles/cells as they are processed on lab-on-a-chip devices. Analysis of spatially distributed particles on these devices typically requires microscopy negating the cost and size advantages of microfluidic assays. In this paper, we introduce a scalable electronic sensor technology, called microfluidic CODES, that utilizes resistive pulse sensing to orthogonally detect particles in multiple microfluidic channels from a single electrical output. Combining the techniques from telecommunications and microfluidics, we route three coplanar electrodes on a glass substrate to create multiple Coulter counters producing distinct orthogonal digital codes when they detect particles. We specifically design a digital code set using the mathematical principles of Code Division Multiple Access (CDMA) telecommunication networks and can decode signals from different microfluidic channels with >90% accuracy through computation even if these signals overlap. As a proof of principle, we use this technology to detect human ovarian cancer cells in four different microfluidic channels fabricated using soft lithography. Microfluidic CODES offers a simple, all-electronic interface that is well suited to create integrated, low-cost lab-on-a-chip devices for cell- or particle-based assays in resource-limited settings.
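The CDMA idea can be sketched directly: assign each channel a mutually orthogonal ±1 code word, let simultaneous detections superpose on the shared output, and recover the active channels by correlation. A minimal sketch with Walsh codes (the paper designs its own code set and reports >90% decoding accuracy even for overlapping signals):

```python
def walsh_codes(order):
    """Hadamard construction: 2**order mutually orthogonal +/-1 code words."""
    codes = [[1]]
    for _ in range(order):
        codes = ([c + c for c in codes] +
                 [c + [-b for b in c] for c in codes])
    return codes

codes = walsh_codes(2)[1:]  # drop the all-ones word; one code per channel

def sensor_output(active_channels):
    """Particles in several channels superpose their codes on one output."""
    return [sum(codes[i][k] for i in active_channels)
            for k in range(len(codes[0]))]

def decode(signal):
    """Correlate against each code; orthogonality isolates every channel
    even when contributions overlap in time."""
    n = len(signal)
    return [i for i, c in enumerate(codes)
            if sum(a * b for a, b in zip(signal, c)) / n > 0.5]

print(decode(sensor_output([0, 2])))  # recovers [0, 2] from the summed signal
```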
Detection of carbon monoxide (CO) as a furnace byproduct using a rotating mask spectrometer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sinclair, Michael B.; Flemming, Jeb Hunter; Blair, Raymond
2006-02-01
Sandia National Laboratories, in partnership with the Consumer Product Safety Commission (CPSC), has developed an optical-based sensor for the detection of CO in appliances such as residential furnaces. The device is a correlation radiometer based on detection of the difference signal between the transmission spectrum of the sample multiplied by two alternating synthetic spectra (called Eigen spectra). These Eigen spectra are derived from a priori knowledge of the interferents present in the exhaust stream. They may be determined empirically for simple spectra, or using a singular value decomposition algorithm for more complex spectra. Data are presented on the details of the design of the instrument and Eigen spectra, along with results from detection of CO in background N2, and CO in N2 with large quantities of interferent CO2. Results indicate that using the Eigen spectra technique, CO can be measured at levels well below acceptable limits in the presence of strongly interfering species. In addition, a conceptual design is presented for reducing the complexity and cost of the instrument to a level compatible with consumer products.
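The Eigen-spectra principle - project the measured spectrum onto a vector that responds to CO but is blind to the interferents - can be sketched in a few lines. The spectra below are hypothetical five-channel toys, and a single Gram-Schmidt step stands in for the instrument's empirical or SVD-based derivation:

```python
# Toy spectra on a five-channel wavelength grid (hypothetical band shapes).
co  = [0.0, 1.0, 0.8, 0.1, 0.0]   # target (CO) absorption pattern
co2 = [0.5, 0.2, 0.0, 0.6, 0.9]   # interferent (CO2) pattern

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Detection vector: the CO pattern with its CO2 component projected out
# (one Gram-Schmidt step). The instrument's two alternating Eigen spectra
# play the role of this single vector via their difference signal.
scale = dot(co, co2) / dot(co2, co2)
eigen = [c - scale * i for c, i in zip(co, co2)]

def reading(spectrum):
    return dot(eigen, spectrum)

mix = [3.0 * i for i in co2]                 # CO2 alone, any concentration
with_co = [m + c for m, c in zip(mix, co)]   # CO2 plus CO
print(reading(with_co) > reading(mix))       # CO shifts the reading; CO2 does not
```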
Laboratory insights into the detection of surface biosignatures by remote-sensing techniques
NASA Astrophysics Data System (ADS)
Poch, O.; Pommerol, A.; Jost, B.; Roditi, I.; Frey, J.; Thomas, N.
2014-03-01
With the progress of direct imaging techniques, it will be possible in the short- or long-term future to retrieve more efficiently the information carried by the light reflected from rocky exoplanets (Traub et al., 2010). The search for visible-infrared absorption bands of particular gases (O2, CH4, etc.) in this light could give clues to the presence of life (Kaltenegger and Selsis, 2007). Even more uplifting would be the direct detection of life itself, on the surface of an exoplanet. Considering this latter possibility, what is the potential of optical remote-sensing methods to detect surface biosignatures? Reflected light from the surface of the Earth exhibits a strong surface biosignature in the form of an abrupt change of reflectance between the visible and infrared ranges of the spectrum (Seager et al., 2005). This spectral feature, called the "vegetation red-edge", is possibly the consequence of biological evolution selecting chemical structures that enable plants to absorb visible energy while preventing them from overheating by reflecting the infrared more efficiently. Such a red-edge is also found in cyanobacteria, primitive photosynthetic bacteria that colonized the surface of the Earth's oceans and continents billions of years before multicellular plants (Knacke, 2003). If life ever arose on an Earth-like exoplanet, one could hypothesize that some of its surface life evolved into similar photo-active organisms, also exhibiting a red-edge. In this paper, we present our plan and preliminary results for a laboratory study aiming to assess the potential of remote-sensing techniques to detect such surface biosignatures. Using equipment developed in our team for surface photometry studies (Pommerol 2011, Jost 2013, Pommerol 2013), we will investigate the reflectance spectra and bidirectional reflectance function of soils containing bacteria such as cyanobacteria, in various environmental conditions.
We will also present our plan to incorporate polarization measurements, particularly circular polarization, because it can be a marker of homochirality, which is supposed to be a universal property of life. Finally, the analyses of both biotic and abiotic materials will help to assess whether (or under which particular conditions) remote-sensing techniques can discriminate between false positives and strong biomarkers. Ultimately, these laboratory data can serve as reference data to guide and interpret future observations, paving the way for the detection of life on distant exoplanets.
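The red-edge biosignature discussed above reduces to a normalised contrast between near-infrared and red reflectance (the NDVI-style index familiar from Earth remote sensing). A sketch with hypothetical spectra:

```python
def red_edge_strength(reflectance, red_band=(620, 700), nir_band=(760, 900)):
    """Normalised difference between mean near-infrared and mean red
    reflectance: strongly positive for vegetation-like spectra."""
    def band_mean(lo, hi):
        vals = [r for wl, r in reflectance.items() if lo <= wl <= hi]
        return sum(vals) / len(vals)
    red = band_mean(*red_band)
    nir = band_mean(*nir_band)
    return (nir - red) / (nir + red)

# Hypothetical reflectance spectra (wavelength in nm -> reflectance fraction).
vegetation = {650: 0.05, 680: 0.04, 800: 0.45, 850: 0.50}
bare_rock = {650: 0.20, 680: 0.21, 800: 0.23, 850: 0.24}
print(red_edge_strength(vegetation) > 0.5)
print(abs(red_edge_strength(bare_rock)) < 0.2)
```

Laboratory spectra of cyanobacteria-bearing soils, as proposed above, would calibrate where between these two extremes a realistic biotic surface falls.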
Bat detective-Deep learning tools for bat acoustic signal detection.
Mac Aodha, Oisin; Gibb, Rory; Barlow, Kate E; Browning, Ella; Firman, Michael; Freeman, Robin; Harder, Briana; Kinsey, Libby; Mead, Gary R; Newson, Stuart E; Pandourski, Ivan; Parsons, Stuart; Russ, Jon; Szodoray-Paradi, Abigel; Szodoray-Paradi, Farkas; Tilova, Elena; Girolami, Mark; Brostow, Gabriel; Jones, Kate E
2018-03-01
Passive acoustic sensing has emerged as a powerful tool for quantifying anthropogenic impacts on biodiversity, especially for echolocating bat species. To better assess bat population trends there is a critical need for accurate, reliable, and open source tools that allow the detection and classification of bat calls in large collections of audio recordings. The majority of existing tools are commercial or have focused on the species classification task, neglecting the important problem of first localizing echolocation calls in audio, which is particularly difficult in noisy recordings. We developed a convolutional neural network-based open-source pipeline for detecting ultrasonic, full-spectrum, search-phase calls produced by echolocating bats. Our deep learning algorithms were trained on full-spectrum ultrasonic audio collected along road-transects across Europe and labelled by citizen scientists from www.batdetective.org. When compared to other existing algorithms and commercial systems, we show significantly higher detection performance of search-phase echolocation calls with our test sets. As an example application, we ran our detection pipeline on bat monitoring data collected over five years from Jersey (UK), and compared results to a widely-used commercial system. Our detection pipeline can be used for the automatic detection and monitoring of bat populations, and further facilitates their use as indicator species on a large scale. Our proposed pipeline makes only a small number of bat-specific design decisions, and with appropriate training data it could be applied to detecting other species in audio. A crucial novelty of our work is showing that with careful, non-trivial, design and implementation considerations, state-of-the-art deep learning methods can be used for accurate and efficient monitoring in audio.
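As a point of reference for what such a CNN pipeline improves upon, a naive energy-threshold detector over fixed-length frames can be sketched in a few lines (a baseline of our own construction, not the paper's pipeline):

```python
import math

def detect_calls(samples, rate, frame=256, threshold=5.0):
    """Naive baseline: flag frames whose energy exceeds threshold times the
    median frame energy, then merge adjacent flagged frames into events.
    Returns (start_s, end_s) pairs."""
    energies = [sum(s * s for s in samples[i:i + frame])
                for i in range(0, len(samples) - frame + 1, frame)]
    median = sorted(energies)[len(energies) // 2]
    flagged = [e > threshold * median for e in energies]
    events, start = [], None
    for i, f in enumerate(flagged + [False]):  # sentinel flushes the last run
        if f and start is None:
            start = i
        elif not f and start is not None:
            events.append((start * frame / rate, i * frame / rate))
            start = None
    return events

# Synthetic trace: faint background with one loud tone burst (hypothetical).
rate = 10_000
trace = [0.01 * math.sin(0.1 * t) for t in range(10_000)]
for t in range(4000, 4600):
    trace[t] += math.sin(2 * math.pi * 2000 * t / rate)
print(detect_calls(trace, rate))
```

Such an energy detector collapses in the noisy road-transect recordings the paper targets, which is precisely why a learned detector is needed for reliable call localization.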
A proposed technique for vehicle tracking, direction, and speed determination
NASA Astrophysics Data System (ADS)
Fisher, Paul S.; Angaye, Cleopas O.; Fisher, Howard P.
2004-12-01
A technique for recognition of vehicles in terms of direction, distance, and rate of change is presented. This is very early work on the problem, with significant hurdles still to be addressed; these are discussed in the paper. However, preliminary results also show promise for this technique in security and defense environments where the penetration of a perimeter is of concern. The material described herein indicates a process whereby the protection of a barrier could be augmented by computers and installed cameras assisting the individuals charged with this responsibility. The technique we employ is called Finite Inductive Sequences (FI) and is proposed as a means of eliminating data requiring storage and recognition where conventional mathematical models don't eliminate enough and statistical models eliminate too much. FI is a simple idea based upon a symbol push-out technique that allows the order (inductive base) of the model to be set to an a priori value for all derived rules. The rules are obtained from exemplar data sets and are derived by a technique called Factoring, yielding a table of rules called a Ruling. These rules can then be used in pattern recognition applications such as the one described in this paper.
Single Molecule Sensing by Nanopores and Nanopore Devices
Gu, Li-Qun; Shim, Ji Wook
2010-01-01
Molecular-scale pore structures, called nanopores, can be assembled from protein ion channels through genetic engineering or artificially fabricated on solid substrates using modern nanofabrication techniques. When target molecules interact with the functionalized lumen of a nanopore, they characteristically block the ion pathway. The resulting conductance changes allow for identification of single molecules and quantification of target species in a mixture. In this review, we first give an overview of nanopore-based sensing techniques that have been created for the detection of myriad biomedical targets, from metal ions, drug compounds, and cellular second messengers to proteins and DNA. Then we introduce our recent discoveries in nanopore single-molecule detection: (1) using the protein nanopore to study folding/unfolding of the G-quadruplex aptamer; (2) creating a portable and durable biochip integrated with a single-protein pore sensor (this chip is compared with recently developed protein pore sensors based on stabilized bilayers on glass nanopore membranes and droplet interface bilayers); and (3) creating a glass nanopore-terminated probe for single-molecule DNA detection, chiral enantiomer discrimination, and identification of the bioterrorist agent ricin with an aptamer-encoded nanopore. PMID:20174694
A New Random Walk for Replica Detection in WSNs.
Aalsalem, Mohammed Y; Khan, Wazir Zada; Saad, N M; Hossain, Md Shohrab; Atiquzzaman, Mohammed; Khan, Muhammad Khurram
2016-01-01
Wireless Sensor Networks (WSNs) are vulnerable to node replication attacks, or clone attacks. Among the existing clone detection protocols in WSNs, RAWL shows the most promising results by employing a Simple Random Walk (SRW). More recently, RAND has outperformed RAWL by incorporating network division with SRW. Both RAND and RAWL use SRW for random selection of witness nodes, which is problematic because the walk frequently revisits previously passed nodes, leading to longer delays and higher energy expenditure with a lower probability that witness nodes intersect. To circumvent this problem, we propose to employ a new kind of constrained random walk, namely the single stage memory random walk, and present a distributed technique called SSRWND (Single Stage Memory Random Walk with Network Division). In SSRWND, the single stage memory random walk is combined with network division, aiming to decrease the communication and memory costs while keeping the detection probability high. Intensive simulations verify that SSRWND guarantees higher witness-node security with moderate communication and memory overheads. SSRWND is well suited to security-oriented WSN application fields such as military and medical deployments.
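The single stage memory random walk at the heart of SSRWND keeps one step of memory: at each hop it picks a uniformly random neighbour of the current node, excluding the node it just left. A minimal sketch under assumed names (the network-division and witness-selection logic of SSRWND is not reproduced here):

```python
import random

def ss_memory_walk(adj, start, steps, seed=7):
    """Single-stage-memory random walk on an adjacency-list graph.

    Each step picks a uniformly random neighbour of the current node,
    excluding the node we just came from (one step of memory). This
    avoids the immediate backtracking that makes a simple random walk
    frequently revisit nodes. Degree-1 dead ends fall back to the full
    neighbour list so the walk can still retreat.
    """
    rng = random.Random(seed)
    path, prev, cur = [start], None, start
    for _ in range(steps):
        choices = [n for n in adj[cur] if n != prev] or adj[cur]
        prev, cur = cur, rng.choice(choices)
        path.append(cur)
    return path
```

On a cycle graph the walk can never immediately backtrack, which is exactly the behaviour that reduces revisits relative to a simple random walk.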
Jun, Goo; Wing, Mary Kate; Abecasis, Gonçalo R; Kang, Hyun Min
2015-06-01
The analysis of next-generation sequencing data is computationally and statistically challenging because of the massive volume of data and imperfect data quality. We present GotCloud, a pipeline for efficiently detecting and genotyping high-quality variants from large-scale sequencing data. GotCloud automates sequence alignment, sample-level quality control, variant calling, filtering of likely artifacts using machine-learning techniques, and genotype refinement using haplotype information. The pipeline can process thousands of samples in parallel and requires less computational resources than current alternatives. Experiments with whole-genome and exome-targeted sequence data generated by the 1000 Genomes Project show that the pipeline provides effective filtering against false positive variants and high power to detect true variants. Our pipeline has already contributed to variant detection and genotyping in several large-scale sequencing projects, including the 1000 Genomes Project and the NHLBI Exome Sequencing Project. We hope it will now prove useful to many medical sequencing studies. © 2015 Jun et al.; Published by Cold Spring Harbor Laboratory Press.
Anazawa, Takashi; Uchiho, Yuichi; Yokoi, Takahide; Chalkidis, George; Yamazaki, Motohiro
2017-06-27
A five-color fluorescence-detection system for eight-channel plastic-microchip electrophoresis was developed. In the eight channels (with effective electrophoretic lengths of 10 cm), single-stranded DNA fragments were separated (with single-base resolution up to 300 bases within 10 min), and seventeen-locus STR genotyping for forensic human identification was successfully demonstrated. In the system, a side-entry laser beam is passed through the eight channels (eight A channels), with seven alternately arrayed sacrificial channels (seven B channels), by a technique called "side-entry laser-beam zigzag irradiation." Laser-induced fluorescence from the eight A channels and Raman-scattered light from the seven B channels are then simultaneously, uniformly, and spectroscopically detected, in the direction perpendicular to the channel array plane, through a transmission grating and a CCD camera. The system is therefore simple and highly sensitive. Because the microchip is fabricated by plastic-injection molding, it is inexpensive and disposable and thus suitable for actual use in various fields.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babun, Leonardo; Aksu, Hidayet; Uluagac, A. Selcuk
The core vision of the smart grid concept is the realization of reliable two-way communications between smart devices (e.g., IEDs, PLCs, PMUs). The benefits of the smart grid also come with tremendous security risks and new challenges in protecting smart grid systems from cyber threats. In particular, the use of untrusted counterfeit smart grid devices represents a real problem. The consequences of propagating false or malicious data, or of stealing valuable user or smart grid state information through counterfeit devices, are costly. Hence, early detection of counterfeit devices is critical for protecting the smart grid's components and users. To address these concerns, in this poster, we introduce our initial design of a configurable framework that utilizes system call tracing, library interposition, and statistical techniques for monitoring and detection of counterfeit smart grid devices. In our framework, we consider six different counterfeit device scenarios with different smart grid devices and adversarial settings. Our initial results on a realistic testbed utilizing actual smart-grid GOOSE messages with the IEC-61850 communication protocol are very promising. Our framework shows excellent rates of detection of counterfeit smart grid devices and impostors.
Forest Fire Finder - DOAS application to long-range forest fire detection
NASA Astrophysics Data System (ADS)
Valente de Almeida, Rui; Vieira, Pedro
2017-06-01
Fires are an important factor in shaping Earth's ecosystems. Plant and animal life in almost every land habitat is at least partially dependent on the effects of fire. However, their destructive force, which has often proven uncontrollable, is one of our greatest concerns, and has resulted in several policies in the most important industrialised regions of the globe. This paper aims to comprehensively characterise the Forest Fire Finder (FFF), a forest fire detection system based mainly upon a spectroscopic technique called differential optical absorption spectroscopy (DOAS). The system is designed and configured with the goal of detecting higher-than-the-horizon smoke columns by measuring and comparing scattered sunlight spectra. The article covers hardware and software, as well as their interactions and specific algorithms for day-mode operation. An analysis of data retrieved from several installations deployed over the last 5 years is also presented. Finally, this paper features a discussion of the most prominent future improvements planned for the system, as well as its ramifications and adaptations, such as a thermal imaging system for short-range fire seeking or environmental quality control.
Le Bras, Ronan J; Kuzma, Heidi; Sucic, Victor; Bokelmann, Götz
2016-05-01
A notable sequence of calls was encountered, spanning several days in January 2003, in the central part of the Indian Ocean on a hydrophone triplet recording acoustic data at a 250 Hz sampling rate. This paper presents signal processing methods applied to the waveform data to detect and group the recorded signals and to extract amplitude and bearing estimates. An approximate location for the source of the call sequence is inferred from the features extracted from the waveforms. As the source approaches the hydrophone triplet, the source level (SL) of the calls is estimated at 187 ± 6 dB re 1 μPa at 1 m in the 15-60 Hz frequency range. The calls are attributed to a subgroup of blue whales, Balaenoptera musculus, with a characteristic acoustic signature. A Bayesian location method using probabilistic models for bearing and amplitude is demonstrated on the call sequence. The method is applied to the case of detection at a single triad of hydrophones and results in a probability distribution map for the origin of the calls. It can be extended to detections at multiple triads, and because of the Bayesian formulation, additional modeling complexity can be built in as needed.
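The single-triad Bayesian location idea, combining a bearing likelihood with an amplitude likelihood under a propagation model, can be sketched as a grid posterior. This is an illustrative reconstruction, not the authors' code: the spherical-spreading loss (20 log10 r), the Gaussian error sigmas, and all names are assumptions.

```python
import numpy as np

def bearing_amplitude_posterior(sensor_xy, bearing_obs_deg, rl_obs_db, sl_db,
                                xs, ys, sigma_b_deg=5.0, sigma_a_db=3.0):
    """Grid posterior over source position from one bearing and one
    received-level observation at a single sensor (the hydrophone triad
    is treated as a point). Assumes spherical spreading, so the
    predicted received level is SL - 20*log10(r), and independent
    Gaussian errors on bearing and level. Returns a normalized map."""
    gx, gy = np.meshgrid(xs, ys)                 # gx varies with x, gy with y
    dx, dy = gx - sensor_xy[0], gy - sensor_xy[1]
    r = np.hypot(dx, dy)
    bearing = np.degrees(np.arctan2(dx, dy)) % 360.0   # clockwise from north
    db_err = (bearing - bearing_obs_deg + 180.0) % 360.0 - 180.0  # wrap to ±180
    rl_pred = sl_db - 20.0 * np.log10(np.maximum(r, 1.0))
    loglik = (-0.5 * (db_err / sigma_b_deg) ** 2
              - 0.5 * ((rl_pred - rl_obs_db) / sigma_a_db) ** 2)
    post = np.exp(loglik - loglik.max())         # stable exponentiation
    return post / post.sum()
```

The bearing constrains a ray from the sensor and the amplitude constrains a range ring; their product concentrates the posterior near the intersection, mirroring the probability map described in the abstract.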
Adaptive Quadrature Detection for Multicarrier Continuous-Variable Quantum Key Distribution
NASA Astrophysics Data System (ADS)
Gyongyosi, Laszlo; Imre, Sandor
2015-03-01
We propose adaptive quadrature detection for multicarrier continuous-variable quantum key distribution (CVQKD). A multicarrier CVQKD scheme uses Gaussian subcarrier continuous variables to convey information and Gaussian sub-channels for transmission. The proposed multicarrier detection scheme dynamically adapts to the sub-channel conditions using statistics provided by our sub-channel estimation procedure. The sub-channel estimation phase determines the transmittance coefficients of the sub-channels; this information is then used in the adaptive quadrature decoding process. We define a technique called subcarrier spreading to estimate the transmittance conditions of the sub-channels with a theoretical error minimum in the presence of Gaussian noise. We introduce the terms of single and collective adaptive quadrature detection. We also extend the results to a multiuser multicarrier CVQKD scenario. We prove the achievable error probabilities and signal-to-noise ratios, and quantify the attributes of the framework. The adaptive detection scheme makes it possible to utilize the extra resources of multicarrier CVQKD and to maximize the amount of transmittable information. This work was partially supported by the GOP-1.1.1-11-2012-0092 (Secure quantum key distribution between two units on optical fiber network) project sponsored by the EU and European Structural Fund, and by the COST Action MP1006.
Censoring approach to the detection limits in X-ray fluorescence analysis
NASA Astrophysics Data System (ADS)
Pajek, M.; Kubala-Kukuś, A.
2004-10-01
We demonstrate that the effect of detection limits in X-ray fluorescence analysis (XRF), which limits the determination of very low concentrations of trace elements and results in the appearance of so-called "nondetects", can be accounted for using the statistical concept of censoring. More precisely, the results of such measurements can be viewed as left randomly censored data, which can further be analyzed using the Kaplan-Meier method, correcting the data for the presence of nondetects. Using this approach, the measured, detection-limit-censored concentrations can be interpreted in a nonparametric manner including the correction for the nondetects, i.e. the measurements in which the concentrations were found to be below the actual detection limits. Moreover, using the Monte Carlo simulation technique we show that with the Kaplan-Meier approach the corrected mean concentrations for a population of samples can be estimated to within a few percent of the simulated, uncensored values. In practice, this means that the final uncertainties of the estimated mean values are limited by the number of studied samples and not by the correction procedure itself. The discussed random left-censoring approach was applied to analyze XRF detection-limit-censored concentration measurements of trace elements in biomedical samples.
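The standard computational trick for left-censored data is to flip the concentrations about a constant larger than the maximum, which turns nondetects into right-censored observations that the ordinary Kaplan-Meier estimator handles, and then flip the resulting mean back. A minimal numpy sketch (the function name is illustrative, and ties between events and censored values get no special treatment):

```python
import numpy as np

def km_mean_left_censored(values, detected):
    """Kaplan-Meier estimate of the mean for left-censored data.

    values:   measured concentrations; where `detected` is False, the
              entry is the detection limit (a nondetect).
    detected: True for a real measurement, False for a nondetect.
    Left-censored data are flipped (t = M - x) into right-censored
    form, the KM survival curve is accumulated, and the mean is
    recovered as M minus the area under that curve.
    """
    values = np.asarray(values, float)
    detected = np.asarray(detected, bool)
    M = values.max() + 1.0                  # flip constant, > all values
    t = M - values                          # right-censored "times"
    order = np.argsort(t)
    t, d = t[order], detected[order]
    n = len(t)
    S, mean_t, prev_t = 1.0, 0.0, 0.0
    for i in range(n):
        mean_t += S * (t[i] - prev_t)       # area under the survival curve
        prev_t = t[i]
        if d[i]:                            # KM step only at an "event"
            S *= 1.0 - 1.0 / (n - i)
    return M - mean_t                       # flip back to concentration
```

With no nondetects the estimate reduces to the ordinary sample mean, which is a convenient sanity check.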
Chaudhuri, Shomesh E; Merfeld, Daniel M
2013-03-01
Psychophysics generally relies on estimating a subject's ability to perform a specific task as a function of an observed stimulus. For threshold studies, the fitted functions are called psychometric functions. While fitting psychometric functions to data acquired using adaptive sampling procedures (e.g., "staircase" procedures), investigators have encountered a bias in the spread ("slope" or "threshold") parameter that has been attributed to the serial dependency of the adaptive data. Using simulations, we confirm this bias for cumulative Gaussian parametric maximum likelihood fits on data collected via adaptive sampling procedures, and then present a bias-reduced maximum likelihood fit that substantially reduces the bias without reducing the precision of the spread parameter estimate and without reducing the accuracy or precision of the other fit parameters. As a separate topic, we explain how to implement this bias reduction technique using generalized linear model fits as well as other numeric maximum likelihood techniques such as the Nelder-Mead simplex. We then provide a comparison of the iterative bootstrap and observed information matrix techniques for estimating parameter fit variance from adaptive sampling procedure data sets. The iterative bootstrap technique is shown to be slightly more accurate; however, the observed information technique executes in a small fraction (0.005 %) of the time required by the iterative bootstrap technique, which is an advantage when a real-time estimate of parameter fit variance is required.
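A plain (not bias-reduced) parametric maximum-likelihood fit of a cumulative-Gaussian psychometric function can be sketched as a grid search over the location and spread parameters. This toy version illustrates only the baseline fit whose spread bias the paper addresses; the grid ranges and resolution are arbitrary choices, not the authors' procedure.

```python
import numpy as np
from math import erf, sqrt

def fit_psychometric(x, k, n):
    """Maximum-likelihood fit of P(x) = Phi((x - mu) / sigma) to
    binomial data (k correct out of n trials at each stimulus level x)
    by a coarse grid search over (mu, sigma). Returns the best pair."""
    x, k, n = (np.asarray(v, float) for v in (x, k, n))
    # Cumulative Gaussian via math.erf, elementwise.
    phi = lambda z: np.array([0.5 * (1.0 + erf(v / sqrt(2.0))) for v in z])
    best, best_ll = (None, None), -np.inf
    for mu in np.linspace(x.min(), x.max(), 201):
        for sigma in np.linspace(0.05, 5.0, 200):
            p = np.clip(phi((x - mu) / sigma), 1e-9, 1 - 1e-9)
            ll = float(np.sum(k * np.log(p) + (n - k) * np.log(1 - p)))
            if ll > best_ll:
                best, best_ll = (mu, sigma), ll
    return best
```

In practice one would use a numeric optimizer (e.g. Nelder-Mead, as the abstract mentions) rather than a grid; the grid keeps the sketch dependency-free and transparent.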
NASA Astrophysics Data System (ADS)
Mercado, Karla Patricia E.
Tissue engineering holds great promise for the repair or replacement of native tissues and organs. Further advancements in the fabrication of functional engineered tissues are partly dependent on developing new and improved technologies to monitor the properties of engineered tissues volumetrically, quantitatively, noninvasively, and nondestructively over time. Currently, engineered tissues are evaluated during fabrication using histology, biochemical assays, and direct mechanical tests. However, these techniques destroy tissue samples and, therefore, lack the capability for real-time, longitudinal monitoring. The research reported in this thesis developed nondestructive, noninvasive approaches to characterize the structural, biological, and mechanical properties of 3-D engineered tissues using high-frequency quantitative ultrasound and elastography technologies. A quantitative ultrasound technique, using a system-independent parameter known as the integrated backscatter coefficient (IBC), was employed to visualize and quantify structural properties of engineered tissues. Specifically, the IBC was demonstrated to estimate cell concentration and quantitatively detect differences in the microstructure of 3-D collagen hydrogels. Additionally, the feasibility of an ultrasound elastography technique called Single Tracking Location Acoustic Radiation Force Impulse (STL-ARFI) imaging was demonstrated for estimating the shear moduli of 3-D engineered tissues. High-frequency ultrasound techniques can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, these high-frequency quantitative ultrasound techniques can enable noninvasive, volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation.
Lidar Measurements of Tropospheric Wind Profiles with the Double Edge Technique
NASA Technical Reports Server (NTRS)
Gentry, Bruce M.; Li, Steven X.; Korb, C. Laurence; Mathur, Savyasachee; Chen, Huailin
1998-01-01
Research has established the importance of global tropospheric wind measurements for large scale improvements in numerical weather prediction. In addition, global wind measurements provide data that are fundamental to the understanding and prediction of global climate change. These tasks are closely linked with the goals of the NASA Earth Science Enterprise and Global Climate Change programs. NASA Goddard has been actively involved in the development of direct detection Doppler lidar methods and technologies to meet the wind observing needs of the atmospheric science community. A variety of direct detection Doppler wind lidar measurements have recently been reported indicating the growing interest in this area. Our program at Goddard has concentrated on the development of the edge technique for lidar wind measurements. Implementations of the edge technique using either the aerosol or molecular backscatter for the Doppler wind measurement have been described. The basic principles have been verified in lab and atmospheric lidar wind experiments. The lidar measurements were obtained with an aerosol edge technique lidar operating at 1064 nm. These measurements demonstrated high spatial resolution (22 m) and high velocity sensitivity (rms variances of 0.1 m/s) in the planetary boundary layer (PBL). The aerosol backscatter is typically high in the PBL and the effects of the molecular backscatter can often be neglected. However, as was discussed in the original edge technique paper, the molecular contribution to the signal is significant above the boundary layer and a correction for the effects of molecular backscatter is required to make wind measurements. In addition, the molecular signal is a dominant source of noise in regions where the molecular to aerosol ratio is large since the energy monitor channel used in the single edge technique measures the sum of the aerosol and molecular signals. 
To extend the operation of the edge technique into the free troposphere we have developed a variation of the edge technique called the double edge technique. In this paper a ground-based aerosol double edge lidar is described and the first measurements of wind profiles in the free troposphere obtained with this lidar are presented.
Progress in development of neutron energy spectrometer for deuterium plasma operation in KSTAR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tomita, H., E-mail: tomita@nagoya-u.jp; Yamashita, F.; Nakayama, Y.
2014-11-15
Two types of DD neutron energy spectrometer (NES) are under development for deuterium plasma operation in KSTAR to understand behavior of beam ions in the plasma. One is based on the state-of-the-art nuclear emulsion technique. The other is based on a coincidence detection of a recoiled proton and a scattered neutron caused by an elastic scattering of an incident DD neutron, which is called an associated particle coincidence counting-NES. The prototype NES systems were installed at J-port in KSTAR in 2012. During the 2012 and 2013 experimental campaigns, multiple shots-integrated neutron spectra were preliminarily obtained by the nuclear emulsion-based NES system.
Instrumentation For Diffraction Enhanced Imaging Experiments At HASYLAB
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lohmann, M.; Dix, W.-R.; Metge, J.
The new X-ray radiography imaging technique, named diffraction enhanced imaging (DEI), enables almost scatter-free absorption imaging and the production of so-called refraction images of a sample. The images show improved contrast compared to standard imaging applications. At the HASYLAB wiggler beamline W2 at the 2nd-generation storage ring DORIS, a 5 cm wide beam with an adjustable energy between 10 and 70 keV is available. A Si [111] pre-monochromator is used, followed by the main monochromator using the (111) or the (333) reflection. Visualization of fossils, detection of internal pearl structures, monitoring of bone and cartilage, and documentation of implant healing in bone are application examples at HASYLAB.
Time-resolved wide-field optically sectioned fluorescence microscopy
NASA Astrophysics Data System (ADS)
Dupuis, Guillaume; Benabdallah, Nadia; Chopinaud, Aurélien; Mayet, Céline; Lévêque-Fort, Sandrine
2013-02-01
We present the implementation of a fast wide-field optical sectioning technique called HiLo microscopy on a fluorescence lifetime imaging microscope. HiLo microscopy is based on the fusion of two images, one with structured illumination and another with uniform illumination. Optically sectioned images are then digitally generated thanks to a fusion algorithm. HiLo images are comparable in quality with confocal images but they can be acquired faster over larger fields of view. We obtain 4D imaging by combining HiLo optical sectioning, time-gated detection, and z-displacement. We characterize the performances of this set-up in terms of 3D spatial resolution and time-resolved capabilities in both fixed- and live-cell imaging modes.
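The HiLo fusion step itself is simple: low spatial frequencies come from the optically sectioned estimate derived from the structured-illumination image, high spatial frequencies come from the uniform-illumination image, and the two are summed. A numpy sketch of just that merge (the demodulation that produces the sectioned low-frequency estimate is assumed to happen upstream, and the cutoff value is illustrative):

```python
import numpy as np

def hilo_merge(uniform, sectioned_lo, cutoff=0.1):
    """Fuse a uniform-illumination image and an optically sectioned
    low-frequency estimate into one HiLo image: complementary low-pass
    and high-pass filters (sharp radial cutoff in cycles/pixel) are
    applied in the Fourier domain and the results are summed."""
    fy, fx = np.meshgrid(np.fft.fftfreq(uniform.shape[0]),
                         np.fft.fftfreq(uniform.shape[1]), indexing="ij")
    lowpass = (np.hypot(fx, fy) <= cutoff).astype(float)
    lo = np.fft.ifft2(np.fft.fft2(sectioned_lo) * lowpass).real
    hi = np.fft.ifft2(np.fft.fft2(uniform) * (1.0 - lowpass)).real
    return lo + hi
```

Feeding the same image into both inputs returns that image unchanged, a useful sanity check since the two filters sum to an all-pass.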
Progress in development of neutron energy spectrometer for deuterium plasma operation in KSTAR
NASA Astrophysics Data System (ADS)
Tomita, H.; Yamashita, F.; Nakayama, Y.; Morishima, K.; Yamamoto, Y.; Sakai, Y.; Cheon, M. S.; Isobe, M.; Ogawa, K.; Hayashi, S.; Kawarabayashi, J.; Iguchi, T.
2014-11-01
Two types of DD neutron energy spectrometer (NES) are under development for deuterium plasma operation in KSTAR to understand behavior of beam ions in the plasma. One is based on the state-of-the-art nuclear emulsion technique. The other is based on a coincidence detection of a recoiled proton and a scattered neutron caused by an elastic scattering of an incident DD neutron, which is called an associated particle coincidence counting-NES. The prototype NES systems were installed at J-port in KSTAR in 2012. During the 2012 and 2013 experimental campaigns, multiple shots-integrated neutron spectra were preliminarily obtained by the nuclear emulsion-based NES system.
Fire safety concerns in space operations
NASA Technical Reports Server (NTRS)
Friedman, Robert
1987-01-01
This paper reviews the state of the art in fire control techniques and identifies important issues for continuing research, technology, and standards. For the future permanent orbiting facility, the space station, fire prevention and control calls not only for more stringent fire safety, owing to the long-term and complex missions, but also for simplified and flexible safety rules to accommodate the variety of users. Future research must address a better understanding of the microgravity space environment as it influences fire propagation and extinction, and the application of the technology of fire detection, extinguishment, and material assessment. Spacecraft fire safety should also consider the adaptation of methods and concepts derived from aircraft and undersea experience.
EMGAN: A computer program for time and frequency domain reduction of electromyographic data
NASA Technical Reports Server (NTRS)
Hursta, W. N.
1975-01-01
An experiment in electromyography utilizing surface electrode techniques was developed for the Apollo-Soyuz test project. This report describes the computer program, EMGAN, which was written to provide first order data reduction for the experiment. EMG signals are produced by the membrane depolarization of muscle fibers during a muscle contraction. Surface electrodes detect a spatially summated signal from a large number of muscle fibers commonly called an interference pattern. An interference pattern is usually so complex that analysis through signal morphology is extremely difficult if not impossible. It has become common to process EMG interference patterns in the frequency domain. Muscle fatigue and certain myopathic conditions are recognized through changes in muscle frequency spectra.
DOT National Transportation Integrated Search
2003-04-01
Introduction Motorist aid call boxes are used to provide motorist assistance, improve safety, and can serve as an incident detection tool. More recently, Intelligent Transportation Systems (ITS) applications have been added to call box systems to enh...
Rodríguez, M T Torres; Andrade, L Cristóbal; Bugallo, P M Bello; Long, J J Casares
2011-09-15
Life cycle thinking (LCT) is one of the philosophies that has recently appeared in the context of sustainable development. Some of the already existing tools and methods, as well as some of the recently emerged ones, which seek to understand, interpret and design the life of a product, can be included in the scope of the LCT philosophy. That is the case of material and energy flow analysis (MEFA), a tool derived from the industrial metabolism definition. This paper proposes a methodology combining MEFA with another technique derived from sustainable development which also fits the LCT philosophy, the BAT (best available techniques) analysis. This methodology, applied to an industrial process, seeks to identify the so-called improvable flows by MEFA, so that the appropriate candidate BAT can be selected by BAT analysis. Material and energy inputs, outputs and internal flows are quantified, and sustainable solutions are provided on the basis of industrial metabolism. The methodology has been applied to an exemplary roof tile manufacturing plant for validation. 14 improvable flows were identified and 7 candidate BAT were proposed with the aim of reducing these flows. The proposed methodology provides a way to detect improvable material or energy flows in a process and selects the most sustainable options to enhance them. Solutions are proposed for the detected improvable flows, taking into account their effectiveness in improving such flows. Copyright © 2011 Elsevier B.V. All rights reserved.
Jarujamrus, Purim; Meelapsom, Rattapol; Pencharee, Somkid; Obma, Apinya; Amatatongchai, Maliwan; Ditcharoen, Nadh; Chairam, Sanoe; Tamuang, Suparb
2018-01-01
A smartphone application, called CAnal, was developed as a colorimetric analyzer in paper-based devices for sensitive and selective determination of mercury(II) in water samples. Measurement on the double layer of a microfluidic paper-based analytical device (μPAD), fabricated by an alkyl ketene dimer (AKD) inkjet-printing technique with a special design and doped with unmodified silver nanoparticles (AgNPs) on the detection zones, was performed by monitoring the gray intensity in the blue channel of the AgNPs, which disintegrate when exposed to mercury(II) on the μPAD. Under the optimized conditions, the developed approach showed high sensitivity, a low limit of detection (0.003 mg L⁻¹, calculated as 3 × SD of the blank divided by the slope of the calibration curve), small sample volume uptake (2 × 2 μL), and short analysis time. The technique was linear from 0.01 to 10 mg L⁻¹ (r² = 0.993). Furthermore, practical analysis of various water samples demonstrated acceptable performance in agreement with data from cold vapor atomic absorption spectrophotometry (CV-AAS), a conventional method. The proposed technique allows rapid, simple (instant report of the final mercury(II) concentration via the smartphone display), sensitive, selective, and on-site analysis with high sample throughput (48 samples h⁻¹, n = 3) of trace mercury(II) in water samples, and is suitable for end users who are unskilled in mercury(II) analysis.
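The calibration arithmetic stated in the abstract, a linear response curve plus a detection limit of 3 × SD(blank)/slope, can be sketched directly. The function and the numbers fed to it below are illustrative, not the paper's data:

```python
import numpy as np

def calibrate(conc, intensity, blank_readings):
    """Linear calibration of colour-channel intensity vs. concentration.

    Fits intensity = slope * conc + intercept by least squares, computes
    the limit of detection as 3 * SD(blank) / |slope| (the 3-sigma rule
    quoted in the abstract), and returns a predictor that inverts the
    calibration line for an unknown sample's reading.
    """
    slope, intercept = np.polyfit(conc, intensity, 1)
    lod = 3.0 * np.std(blank_readings, ddof=1) / abs(slope)
    def predict(reading):
        return (reading - intercept) / slope
    return slope, intercept, lod, predict
```

In a real workflow the intensities would come from the smartphone's blue-channel gray values at the μPAD detection zones; here they are plain numbers.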
SCADA Protocol Anomaly Detection Utilizing Compression (SPADUC) 2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gordon Rueff; Lyle Roybal; Denis Vollmer
2013-01-01
There is a significant need to protect the nation’s energy infrastructures from malicious actors using cyber methods. Supervisory, Control, and Data Acquisition (SCADA) systems may be vulnerable due to the insufficient security implemented during the design and deployment of these control systems. This is particularly true in older legacy SCADA systems that are still commonly in use. The purpose of INL’s research on the SCADA Protocol Anomaly Detection Utilizing Compression (SPADUC) project was to determine if and how data compression techniques could be used to identify and protect SCADA systems from cyber attacks. Initially, the concept was centered on how to train a compression algorithm to recognize normal control system traffic versus hostile network traffic. Because large portions of the TCP/IP message traffic (called packets) are repetitive, the concept of using compression techniques to differentiate “non-normal” traffic was proposed. In this manner, malicious SCADA traffic could be identified at the packet level prior to completing its payload. Previous research has shown that SCADA network traffic has traits desirable for compression analysis. This work investigated three different approaches to identify malicious SCADA network traffic using compression techniques. The preliminary analyses and results presented herein are clearly able to differentiate normal from malicious network traffic at the packet level at a very high confidence level for the conditions tested. Additionally, the master dictionary approach used in this research appears to initially provide a meaningful way to categorize and compare packets within a communication channel.
AI in CALL--Artificially Inflated or Almost Imminent?
ERIC Educational Resources Information Center
Schulze, Mathias
2008-01-01
The application of techniques from artificial intelligence (AI) to CALL has commonly been referred to as intelligent CALL (ICALL). ICALL is only slightly older than the "CALICO Journal", and this paper looks back at a quarter century of published research mainly in North America and by North American scholars. This "inventory…
NASA Astrophysics Data System (ADS)
Vega, David; Kiekens, Kelli C.; Syson, Nikolas C.; Romano, Gabriella; Baker, Tressa; Barton, Jennifer K.
2018-02-01
While Optical Coherence Microscopy (OCM), Multiphoton Microscopy (MPM), and narrowband imaging are powerful imaging techniques that can be used to detect cancer, each imaging technique has limitations when used by itself. Combining them into an endoscope to work in synergy can help achieve high sensitivity and specificity for diagnosis at the point of care. Such complex endoscopes have an elevated risk of failure, and performing proper modelling ensures functionality and minimizes risk. We present full 2D and 3D models of a multimodality optical micro-endoscope, called a salpingoscope, that provides real-time detection of carcinomas. The models evaluate the endoscope's illumination and light collection capabilities for the various modalities. The design features two optical paths with different numerical apertures (NA) through a single lens system with a scanning optical fiber. The dual path is achieved using dichroic coatings embedded in a triplet. A high-NA optical path is designed to perform OCM and MPM, while a low-NA optical path is designed for the visible spectrum, both to navigate the endoscope to areas of interest and to perform narrowband imaging. Tests such as measuring the reflectance profile of homogeneous epithelial tissue were performed to adjust the models properly. Light collection models for the different modalities were created and tested for efficiency. While it is challenging to evaluate the efficiency of multimodality endoscopes, the models ensure that the system is designed for the expected light collection levels and will provide a detectable signal for the intended imaging.
NASA Astrophysics Data System (ADS)
Doc Richardson, C.; Hinman, Nancy W.; Scott, Jill R.
2009-10-01
With the discovery of Na-sulphate minerals on Mars and Europa, recent studies using these minerals have focused on their ability to assist in the detection of bio/organic signatures. This study further investigates the ability of thenardite (Na2SO4) to effectively facilitate the ionization and identification of aromatic amino acids (phenylalanine, tyrosine and tryptophan) using a technique called geomatrix-assisted laser desorption/ionization in conjunction with Fourier transform ion cyclotron resonance mass spectrometry. This technique is based on the ability of a mineral host to facilitate desorption and ionization of bio/organic molecules for detection. Spectra obtained from each aromatic amino acid alone and in combination with thenardite show differences in ionization mechanisms and fragmentation patterns. These differences are due to chemical and structural differences between the aromatic side chains of the respective amino acids. Tyrosine and tryptophan, when combined with thenardite, were observed to undergo cation attachment ([M+Na]+), due to the high alkali ion affinity of their aromatic side chains. In addition, substitution of the carboxyl group hydrogen by sodium led to formation of [M-H+Na]Na+ peaks. In contrast, phenylalanine mixed with thenardite showed no evidence of Na+ attachment. Understanding how co-deposition of amino acids with thenardite can affect the observed mass spectra is important for future exploration missions that are likely to use laser desorption mass spectrometry to search for bio/organic compounds in extraterrestrial environments.
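The sodiated species reported for tyrosine can be checked arithmetically from standard monoisotopic atomic masses (electron mass neglected; values are routine reference data, not from the paper):

```python
# Standard monoisotopic atomic masses (Da).
C, H, N, O, Na = 12.0, 1.00783, 14.00307, 15.99491, 22.98977

M_tyr = 9 * C + 11 * H + 1 * N + 3 * O   # tyrosine, C9H11NO3

m_plus_na = M_tyr + Na                   # [M+Na]+  (cation attachment)
m_minus_h_2na = M_tyr - H + 2 * Na       # [M-H+Na]Na+ (H replaced by Na, plus Na+)
print(round(M_tyr, 3), round(m_plus_na, 3), round(m_minus_h_2na, 3))
```

These nominal m/z values (about 204 and 226 for tyrosine) are the peak positions one would look for in such spectra.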
Shabangu, Fannie W.; Yemane, Dawit; Stafford, Kathleen M.; Ensor, Paul; Findlay, Ken P.
2017-01-01
Harvested to perilously low numbers by commercial whaling during the past century, the large scale response of Antarctic blue whales Balaenoptera musculus intermedia to environmental variability is poorly understood. This study uses acoustic data collected from 586 sonobuoys deployed in the austral summers of 1997 through 2009, south of 38°S, coupled with visual observations of blue whales during the IWC SOWER line-transect surveys. The characteristic Z-call and D-call of Antarctic blue whales were detected using an automated detection template and visual verification method. Using a random forest model, we characterized the environmental preference patterns, spatial occurrence and acoustic behaviour of Antarctic blue whales. Distance to the southern boundary of the Antarctic Circumpolar Current (SBACC), latitude and distance from the nearest Antarctic shores were the main geographic predictors of blue whale call occurrence. Satellite-derived sea surface height, sea surface temperature, and productivity (chlorophyll-a) were the most important environmental predictors of blue whale call occurrence. Call rates of D-calls were strongly predicted by the location of the SBACC, latitude and the visually detected number of whales in an area, while call rates of Z-calls were predicted by the SBACC, latitude and longitude. Satellite-derived sea surface height, wind stress, wind direction, water depth, sea surface temperatures, chlorophyll-a and wind speed were important environmental predictors of blue whale call rates in the Southern Ocean. Blue whale call occurrence and call rates varied significantly in response to inter-annual and long-term variability of those environmental predictors. Our results identify the response of Antarctic blue whales to inter-annual variability in environmental conditions and highlight potential suitable habitats for this population.
Such emerging knowledge about the acoustic behaviour, environmental and habitat preferences of Antarctic blue whales is important in improving the management and conservation of this highly depleted species. PMID:28222124
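A minimal sketch of the random-forest step, with synthetic stand-ins for three of the predictors named above (scikit-learn assumed; data invented, not the study's):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 600

# Synthetic stand-ins for predictors (illustrative only): distance to the
# SBACC (km), latitude (deg), sea surface temperature (deg C).
dist_sbacc = rng.uniform(0, 2000, n)
latitude = rng.uniform(-70, -38, n)
sst = rng.uniform(-2, 15, n)

# Simulated call-occurrence label: calls more likely near the SBACC in cold
# water; latitude is deliberately uninformative in this toy setup.
p = 1 / (1 + np.exp(0.002 * dist_sbacc + 0.3 * sst - 4))
y = (rng.random(n) < p).astype(int)

X = np.column_stack([dist_sbacc, latitude, sst])
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Feature importances rank the predictors of call occurrence, as in the study.
imps = dict(zip(["dist_sbacc", "latitude", "sst"], model.feature_importances_))
print(imps)
```

On this toy data the importances single out the two predictors actually wired into the label, which is how the study ranked its geographic and environmental variables.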
Structural Damage Detection Using Changes in Natural Frequencies: Theory and Applications
NASA Astrophysics Data System (ADS)
He, K.; Zhu, W. D.
2011-07-01
A vibration-based method that uses changes in natural frequencies of a structure to detect damage has advantages over conventional nondestructive tests in detecting various types of damage, including loosening of bolted joints, using minimum measurement data. Two major challenges associated with applications of the vibration-based damage detection method to engineering structures are addressed: accurate modeling of structures and the development of a robust inverse algorithm to detect damage, which are defined as the forward and inverse problems, respectively. To resolve the forward problem, new physics-based finite element modeling techniques are developed for fillets in thin-walled beams and for bolted joints, so that complex structures can be accurately modeled with a reasonable model size. To resolve the inverse problem, a logistical function transformation is introduced to convert the constrained optimization problem to an unconstrained one, and a robust iterative algorithm using a trust-region method, called the Levenberg-Marquardt method, is developed to accurately detect the locations and extent of damage. The new methodology can ensure global convergence of the iterative algorithm in solving under-determined system equations and deal with damage detection problems with relatively large modeling error and measurement noise. The vibration-based damage detection method is applied to various structures including lightning masts, a space frame structure and one of its components, and a pipeline. The exact locations and extent of damage can be detected in the numerical simulation where there is no modeling error and measurement noise. The locations and extent of damage can be successfully detected in experimental damage detection.
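A toy version of the inverse step, assuming a hypothetical 3-DOF spring-mass chain rather than the paper's finite element models; the logistic transformation and Levenberg-Marquardt iteration mirror the abstract's description:

```python
import numpy as np
from scipy.optimize import least_squares

# Toy forward model (an assumption, not the paper's FE model): squared natural
# frequencies of a 3-DOF fixed-free spring-mass chain with unit masses, as a
# function of per-element stiffness factors.
def eigvals(stiffness):
    k1, k2, k3 = stiffness
    K = np.array([[k1 + k2, -k2, 0.0],
                  [-k2, k2 + k3, -k3],
                  [0.0, -k3, k3]])
    return np.linalg.eigvalsh(K)  # sorted squared frequencies

# Logistic transformation: unconstrained u -> stiffness factor in (0, 1),
# converting the constrained optimization problem to an unconstrained one.
def to_stiffness(u):
    return 1.0 / (1.0 + np.exp(-u))

true_stiffness = np.array([0.9, 0.6, 0.8])   # element 2 carries the most damage
measured = eigvals(true_stiffness)           # stand-in for measured frequencies

def residual(u):
    return eigvals(to_stiffness(u)) - measured

# Levenberg-Marquardt iteration on the unconstrained variables.
sol = least_squares(residual, x0=np.zeros(3), method="lm")
print(np.round(to_stiffness(sol.x), 3))
```

The recovered stiffness factors locate the damaged element (the middle spring) and its extent, the same question the paper's inverse algorithm answers for real structures.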
Ebai, Tonge; Souza de Oliveira, Felipe Marques; Löf, Liza; Wik, Lotta; Schweiger, Caroline; Larsson, Anders; Keilholtz, Ulrich; Haybaeck, Johannes; Landegren, Ulf; Kamali-Moghaddam, Masood
2017-09-01
Detecting proteins at low concentrations in plasma is crucial for early diagnosis. Current techniques in clinical routine, such as sandwich ELISA, provide sensitive protein detection because of a dependence on target recognition by pairs of antibodies, but detection of still lower protein concentrations is often called for. Proximity ligation assay with rolling circle amplification (PLARCA) is a modified proximity ligation assay (PLA) for analytically specific and sensitive protein detection via binding of target proteins by 3 antibodies, and signal amplification via rolling circle amplification (RCA) in microtiter wells, easily adapted to instrumentation in use in hospitals. Proteins captured by immobilized antibodies were detected using a pair of oligonucleotide-conjugated antibodies. Upon target recognition these PLA probes guided oligonucleotide ligation, followed by amplification via RCA of circular DNA strands that formed in the reaction. The RCA products were detected by horseradish peroxidase-labeled oligonucleotides to generate colorimetric reaction products with readout in an absorbance microplate reader. We compared detection of interleukin (IL)-4, IL-6, IL-8, p53, and growth differentiation factor 15 (GDF-15) by PLARCA and conventional sandwich ELISA or immuno-RCA. PLARCA detected lower concentrations of proteins and exhibited a broader dynamic range compared to ELISA and iRCA using the same antibodies. IL-4 and IL-6 were detected in clinical samples at femtomolar concentrations, considerably lower than for ELISA. PLARCA offers detection of lower protein levels and increased dynamic ranges compared to ELISA. The PLARCA procedure may be adapted to routine instrumentation available in hospitals and research laboratories. © 2017 American Association for Clinical Chemistry.
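The absorbance readout implies a calibration-curve fit; a four-parameter logistic (4PL) fit, a standard immunoassay calibration model rather than a protocol from the paper, can be sketched as follows (all readings invented):

```python
import numpy as np
from scipy.optimize import curve_fit

# Four-parameter logistic (4PL) curve, a standard immunoassay calibration model.
def four_pl(x, bottom, top, ec50, hill):
    return bottom + (top - bottom) / (1.0 + (x / ec50) ** hill)

# Illustrative absorbance readings vs. analyte concentration (fM); made-up numbers.
conc = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)
absorb = np.array([0.08, 0.12, 0.25, 0.55, 0.95, 1.30, 1.45])

params, _ = curve_fit(four_pl, conc, absorb, p0=[0.05, 1.5, 50.0, -1.0], maxfev=10000)
bottom, top, ec50, hill = params
print(round(ec50, 1))
```

The span between `bottom` and `top`, and how far the curve stays above blank noise at low concentrations, is what the abstract's "broader dynamic range" claim refers to.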
Enshaeifar, Shirin; Zoha, Ahmed; Markides, Andreas; Skillman, Severin; Acton, Sahr Thomas; Elsaleh, Tarek; Hassanpour, Masoud; Ahrabian, Alireza; Kenny, Mark; Klein, Stuart; Rostill, Helen; Nilforooshan, Ramin; Barnaghi, Payam
2018-01-01
The number of people diagnosed with dementia is expected to rise in the coming years. Given that there is currently no definite cure for dementia and the cost of care for this condition soars dramatically, slowing the decline and maintaining independent living are important goals for supporting people with dementia. This paper discusses a study that is called Technology Integrated Health Management (TIHM). TIHM is a technology assisted monitoring system that uses Internet of Things (IoT) enabled solutions for continuous monitoring of people with dementia in their own homes. We have developed machine learning algorithms to analyse the correlation between environmental data collected by IoT technologies in TIHM in order to monitor and facilitate the physical well-being of people with dementia. The algorithms are developed with different temporal granularity to process the data for long-term and short-term analysis. We extract higher-level activity patterns which are then used to detect any change in patients’ routines. We have also developed a hierarchical information fusion approach for detecting agitation, irritability and aggression. We have conducted evaluations using sensory data collected from homes of people with dementia. The proposed techniques are able to recognise agitation and unusual patterns with an accuracy of up to 80%. PMID:29723236
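A minimal sketch of routine-deviation detection on a single sensor stream: a z-score rule, far simpler than TIHM's algorithms, with invented data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative data: daily counts of kitchen motion-sensor events over four
# weeks of normal routine (made-up numbers, not TIHM data).
baseline = rng.poisson(lam=30, size=28).astype(float)

def is_unusual(todays_count, history, z_threshold=3.0):
    """Flag a day whose activity deviates strongly from the learned routine."""
    mu, sigma = history.mean(), history.std(ddof=1)
    return bool(abs(todays_count - mu) / sigma > z_threshold)

print(is_unusual(31, baseline), is_unusual(2, baseline))
```

A day with almost no kitchen activity is flagged while a typical day is not; TIHM layers richer pattern extraction and information fusion on top of this basic idea.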
Klemm, Matthias; Blum, Johannes; Link, Dietmar; Hammer, Martin; Haueisen, Jens; Schweitzer, Dietrich
2016-01-01
Fluorescence lifetime imaging ophthalmoscopy (FLIO) is a new technique to detect changes in the human retina. The autofluorescence decay over time, generated by endogenous fluorophores, is measured in vivo. The strong autofluorescence of the crystalline lens, however, superimposes the intensity decay of the retina fluorescence, as the confocal principle is not able to suppress it sufficiently. Thus, the crystalline lens autofluorescence causes artifacts in the retinal fluorescence lifetimes determined from the intensity decays. Here, we present a new technique to suppress the autofluorescence of the crystalline lens by introducing an annular stop into the detection light path, which we call Schweitzer’s principle. The efficacy of annular stops with an outer diameter of 7 mm and inner diameters of 1 to 5 mm are analyzed in an experimental setup using a model eye based on fluorescent dyes. Compared to the confocal principle, Schweitzer’s principle with an inner diameter of 3 mm is able to reduce the simulated crystalline lens fluorescence to 4%, while 42% of the simulated retina fluorescence is preserved. Thus, we recommend the implementation of Schweitzer’s principle in scanning laser ophthalmoscopes used for fundus autofluorescence measurements, especially the FLIO device, for improved image quality. PMID:27699092
NASA Astrophysics Data System (ADS)
Kannan, Rohit; Tangirala, Arun K.
2014-06-01
Identification of directional influences in multivariate systems is of prime importance in several applications of engineering and sciences such as plant topology reconstruction, fault detection and diagnosis, and neurosciences. A spectrum of related directionality measures, ranging from linear measures such as partial directed coherence (PDC) to nonlinear measures such as transfer entropy, have emerged over the past two decades. The PDC-based technique is simple and effective, but being a linear directionality measure has limited applicability. On the other hand, transfer entropy, despite being a robust nonlinear measure, is computationally intensive and practically implementable only for bivariate processes. The objective of this work is to develop a nonlinear directionality measure, termed KPDC, that possesses the simplicity of PDC but is still applicable to nonlinear processes. The technique is founded on a nonlinear measure called correntropy, a recently proposed generalized correlation measure. The proposed method is equivalent to constructing PDC in a kernel space where the PDC is estimated using a vector autoregressive model built on correntropy. A consistent estimator of the KPDC is developed and important theoretical results are established. A permutation scheme combined with the sequential Bonferroni procedure is proposed for testing the hypothesis of absence of causality. It is demonstrated through several case studies that the proposed methodology effectively detects Granger causality in nonlinear processes.
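The correntropy measure at the core of KPDC can be estimated directly; the sketch below shows only the sample estimator with a Gaussian kernel, not the kernel-space VAR construction:

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample estimator of correntropy V(X, Y) with a Gaussian kernel."""
    d = x - y
    return np.mean(np.exp(-d**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma))

rng = np.random.default_rng(0)
n = 2000
x = rng.standard_normal(n)
y_linked = np.tanh(x) + 0.1 * rng.standard_normal(n)   # nonlinearly coupled
y_indep = rng.standard_normal(n)                        # independent of x

# Correntropy is larger for the nonlinearly coupled pair than for the
# independent pair, which a plain linear correlation of this kind cannot
# guarantee for arbitrary nonlinear couplings.
print(correntropy(x, y_linked) > correntropy(x, y_indep))
```

KPDC builds a vector autoregressive model on this generalized correlation and then forms PDC from it.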
Screening mail for powders using terahertz technology
NASA Astrophysics Data System (ADS)
Kemp, Mike
2011-11-01
Following the 2001 Anthrax letter attacks in the USA, there has been a continuing interest in techniques that can detect or identify so-called 'white powder' concealed in envelopes. Electromagnetic waves (wavelengths 100-500 μm) in the terahertz frequency range penetrate paper and have short enough wavelengths to provide good resolution images; some materials also have spectroscopic signatures in the terahertz region. We report on an experimental study into the use of terahertz imaging and spectroscopy for mail screening. Spectroscopic signatures of target powders were measured and, using a specially designed test rig, a number of imaging methods based on reflection, transmission and scattering were investigated. It was found that, contrary to some previous reports, bacterial spores do not appear to have any strong spectroscopic signatures which would enable them to be identified. Imaging techniques based on reflection imaging and scattering are ineffective in this application, due to the similarities in optical properties between powders of interest and paper. However, transmission imaging using time-of-flight of terahertz pulses was found to be a very simple and sensitive method of detecting small quantities (25 mg) of powder, even in quite thick envelopes. An initial feasibility study indicates that this method could be used as the basis of a practical mail screening system.
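Time-of-flight detection reduces to estimating the extra delay a powder layer imposes on the transmitted pulse. A sketch with an idealized pulse (all parameters illustrative):

```python
import numpy as np

fs = 1e13                          # 0.1 ps sampling (illustrative)
t = np.arange(0, 40e-12, 1 / fs)   # 40 ps time window

def pulse(delay):
    # Simple Gaussian transient as a stand-in for a terahertz pulse.
    return np.exp(-((t - delay) / 1e-12) ** 2)

ref = pulse(10e-12)                # pulse through an empty envelope
sample = 0.8 * pulse(13e-12)       # powder adds delay and attenuation

# Time-of-flight estimate via the cross-correlation peak.
lag = np.argmax(np.correlate(sample, ref, mode="full")) - (len(t) - 1)
delay_ps = lag / fs * 1e12
print(delay_ps)
```

The recovered delay (3 ps here) is the observable that flags a small powder quantity inside the envelope.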
Crossing Fibers Detection with an Analytical High Order Tensor Decomposition
Megherbi, T.; Kachouane, M.; Oulebsir-Boumghar, F.; Deriche, R.
2014-01-01
Diffusion magnetic resonance imaging (dMRI) is the only technique to probe in vivo and noninvasively the fiber structure of human brain white matter. Detecting the crossing of neuronal fibers remains an exciting challenge with an important impact in tractography. In this work, we tackle this challenging problem and propose an original and efficient technique to extract all crossing fibers from diffusion signals. To this end, we start by estimating, from the dMRI signal, the so-called Cartesian tensor fiber orientation distribution (CT-FOD) function, whose maxima correspond exactly to the orientations of the fibers. The fourth order symmetric positive definite tensor that represents the CT-FOD is then analytically decomposed via the application of a new theoretical approach and this decomposition is used to accurately extract all the fibers orientations. Our proposed high order tensor decomposition based approach is minimal and allows recovering the whole crossing fibers without any a priori information on the total number of fibers. Various experiments performed on noisy synthetic data, on phantom diffusion data and on human brain data validate our approach and clearly demonstrate that it is efficient, robust to noise and performs favorably in terms of angular resolution and accuracy when compared to some classical and state-of-the-art approaches. PMID:25246940
A Study of Lane Detection Algorithm for Personal Vehicle
NASA Astrophysics Data System (ADS)
Kobayashi, Kazuyuki; Watanabe, Kajiro; Ohkubo, Tomoyuki; Kurihara, Yosuke
By the term “personal vehicle”, we mean a simple and lightweight vehicle expected to emerge as a personal ground transportation device. The motorcycle, electric wheelchair, and motor-powered bicycle are examples of personal vehicles and have been developed as useful means of personal transportation. Recently, a new type of intelligent personal vehicle called the Segway has been developed, which is controlled and stabilized using on-board intelligent multiple sensors. The demand for such personal vehicles is increasing, in order to 1) enhance human mobility, 2) support mobility for elderly persons, and 3) reduce environmental burdens. As the personal vehicle market grows rapidly, the number of accidents caused by human error is also increasing; these accidents are related to driving ability. To enhance or support driving ability, as well as to prevent accidents, intelligent assistance is necessary. One of the most important elemental functions for a personal vehicle is robust lane detection. In this paper, we develop a robust lane detection method for personal vehicles in outdoor environments. The proposed method employs a 360-degree omnidirectional camera and a unique robust image-processing algorithm. In order to detect lanes, a combination of template matching and the Hough transform is employed. The validity of the proposed lane detection algorithm is confirmed with an actual developed vehicle under various outdoor sunlight conditions.
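The Hough-transform stage can be sketched on a synthetic binary edge image; the omnidirectional unwarping and template-matching stages of the paper's pipeline are omitted:

```python
import numpy as np

# Synthetic binary edge image containing one straight "lane" edge at 45 degrees.
img = np.zeros((100, 100), dtype=bool)
for x in range(100):
    img[x, x] = True   # the line y = x

# Standard Hough transform: each edge pixel votes for all (rho, theta) pairs
# of lines passing through it, rho = x*cos(theta) + y*sin(theta).
thetas = np.deg2rad(np.arange(0, 180))
diag = int(np.ceil(np.hypot(*img.shape)))
accumulator = np.zeros((2 * diag + 1, len(thetas)), dtype=int)

ys, xs = np.nonzero(img)
for y, x in zip(ys, xs):
    rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
    accumulator[rhos + diag, np.arange(len(thetas))] += 1

# The accumulator peak identifies the dominant line.
rho_idx, theta_idx = np.unravel_index(accumulator.argmax(), accumulator.shape)
theta_deg = np.degrees(thetas[theta_idx])
print(theta_deg)   # normal angle of the detected line
```

All 100 collinear pixels vote into the same (rho, theta) bin, so the peak cleanly recovers the line; on real road images the same voting makes the detection robust to broken or partially occluded lane markings.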
Bat detective—Deep learning tools for bat acoustic signal detection
Barlow, Kate E.; Firman, Michael; Freeman, Robin; Harder, Briana; Kinsey, Libby; Mead, Gary R.; Newson, Stuart E.; Pandourski, Ivan; Russ, Jon; Szodoray-Paradi, Abigel; Tilova, Elena; Girolami, Mark; Jones, Kate E.
2018-01-01
Passive acoustic sensing has emerged as a powerful tool for quantifying anthropogenic impacts on biodiversity, especially for echolocating bat species. To better assess bat population trends there is a critical need for accurate, reliable, and open source tools that allow the detection and classification of bat calls in large collections of audio recordings. The majority of existing tools are commercial or have focused on the species classification task, neglecting the important problem of first localizing echolocation calls in audio which is particularly problematic in noisy recordings. We developed a convolutional neural network based open-source pipeline for detecting ultrasonic, full-spectrum, search-phase calls produced by echolocating bats. Our deep learning algorithms were trained on full-spectrum ultrasonic audio collected along road-transects across Europe and labelled by citizen scientists from www.batdetective.org. When compared to other existing algorithms and commercial systems, we show significantly higher detection performance of search-phase echolocation calls with our test sets. As an example application, we ran our detection pipeline on bat monitoring data collected over five years from Jersey (UK), and compared results to a widely-used commercial system. Our detection pipeline can be used for the automatic detection and monitoring of bat populations, and further facilitates their use as indicator species on a large scale. Our proposed pipeline makes only a small number of bat specific design decisions, and with appropriate training data it could be applied to detecting other species in audio. A crucial novelty of our work is showing that with careful, non-trivial, design and implementation considerations, state-of-the-art deep learning methods can be used for accurate and efficient monitoring in audio. PMID:29518076
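As a much simpler stand-in for the paper's CNN detector, the call-localization task can be illustrated with a spectrogram band-energy detector on synthetic audio (all signal parameters invented):

```python
import numpy as np
from scipy.signal import spectrogram

fs = 250_000                          # full-spectrum ultrasonic sampling rate
t = np.arange(0, 1.0, 1 / fs)
audio = 0.01 * np.random.default_rng(0).standard_normal(t.size)

# Inject a synthetic 5 ms downward-sweeping "search-phase call" at 0.5 s
# (50 -> 25 kHz chirp); parameters are illustrative, not from the paper.
m = (t >= 0.5) & (t < 0.505)
tau = t[m] - 0.5
audio[m] += np.sin(2 * np.pi * (50_000 * tau - 2_500_000 * tau ** 2))

# Localize call energy in time with a spectrogram and a band-energy threshold
# (a deliberately crude stand-in for the CNN's learned detector).
f, frames, S = spectrogram(audio, fs=fs, nperseg=512, noverlap=256)
band_energy = S[(f >= 20_000) & (f <= 60_000)].sum(axis=0)

detections = frames[band_energy > band_energy.mean() + 5 * band_energy.std()]
print(detections)
```

The detector recovers the call's time position; the paper's contribution is doing this reliably in noisy, real-world recordings where such fixed thresholds fail.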
Blue-Whale Calls Detected at the Pioneer Seamount Underwater Observatory
NASA Astrophysics Data System (ADS)
Hoffman, M. D.; Vuosalo, C. O.; Bland, R. W.; Garfield, N.
2002-12-01
In September of 2001 a cabled vertical linear array (VLA) of hydrophones was deployed on Pioneer Seamount, 90 km off the California coast near Half Moon Bay, by the NOAA-PMEL and University of Washington-APL. The array of 4 hydrophones is at a depth of 950 m, and the four signals are digitized at the shore end of the cable at 1000 Hz. The data are archived by PMEL, and are available to the public over the internet. Spectrograms of all of the data are accessible on the SFSU web site. A large number of blue-whale calls are evident in the spectrograms. We have employed spectrogram correlation [Mellinger 2000] and a matched-filter detection scheme [Stafford 1998] to automatically identify these whale calls in three months of data. Results on the frequency of calls and their variability will be presented. Mellinger, David K., and Christopher W. Clark [2000], "Recognizing transient low-frequency whale sounds by spectrogram correlation," J. Acoust. Soc. Am. 107 (3518). Stafford, Kathleen M., Christopher G. Fox, and Davis S. Clark [1998], "Long-range acoustic detection and localization of blue whale calls in the northeast Pacific Ocean," J. Acoust. Soc. Am. 104 (3616).
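The matched-filter scheme [Stafford 1998] amounts to correlating the record against a call template. A sketch with a synthetic low-frequency downsweep (not an actual blue whale call shape) buried in noise:

```python
import numpy as np

fs = 1000                               # 1 kHz digitization, as in the VLA data
tau = np.arange(0, 1.0, 1 / fs)

# Synthetic 1 s downsweep (20 -> 10 Hz) as a stand-in call template;
# purely illustrative, not a measured call.
template = np.sin(2 * np.pi * (20 * tau - 5 * tau ** 2))

rng = np.random.default_rng(0)
record = rng.standard_normal(10 * fs)   # 10 s of sensor noise
record[4000:5000] += template           # call buried at t = 4 s

# Matched filter: slide the template along the record and score the overlap.
mf = np.correlate(record, template, mode="valid")
detect_time = np.argmax(mf) / fs
print(detect_time)
```

The correlation peak recovers the call onset even though the call is invisible in the raw waveform; spectrogram correlation [Mellinger 2000] applies the same idea to a time-frequency template.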
Garnier, A; Poncet, F; Billette De Villemeur, A; Exbrayat, C; Bon, M F; Chevalier, A; Salicru, B; Tournegros, J M
2009-06-01
The screening program guidelines specify that the call back rate of women for additional imaging (positive mammogram) should not exceed 7% at initial screening, and 5% at subsequent screening. Materials and methods. Results in the Isere region (12%) have prompted a review of the correlation between the call back rate and indicators of quality (detection rate, sensitivity, specificity, positive predictive value) for the radiologists providing interpretations during that time period. Three groups of radiologists were identified: the group with call back rate of 10% achieved the best results (sensitivity: 92%, detection rate: 0.53%, specificity: 90%). The group with lowest call back rate (7.7%) showed insufficient sensitivity (58%). The last group with call back rate of 18.3%, showed no improvement in sensitivity (82%) and detection rate (0.53%), but showed reduced specificity (82%). The protocol update in 2001 does not resolve this problematic situation and national results continue to demonstrate a high percentage of positive screening mammograms. A significant increase in the number of positive screening examinations compared to recommended guidelines is not advantageous and leads to an overall decrease in the quality of the screening.
Fluorescence Characterization of Clinically-Important Bacteria
Dartnell, Lewis R.; Roberts, Tom A.; Moore, Ginny; Ward, John M.; Muller, Jan-Peter
2013-01-01
Healthcare-associated infections (HCAI/HAI) represent a substantial threat to patient health during hospitalization and incur billions of dollars additional cost for subsequent treatment. One promising method for the detection of bacterial contamination in a clinical setting before an HAI outbreak occurs is to exploit native fluorescence of cellular molecules for a hand-held, rapid-sweep surveillance instrument. Previous studies have shown fluorescence-based detection to be sensitive and effective for food-borne and environmental microorganisms, and even to be able to distinguish between cell types, but this powerful technique has not yet been deployed on the macroscale for the primary surveillance of contamination in healthcare facilities to prevent HAI. Here we report experimental data for the specification and design of such a fluorescence-based detection instrument. We have characterized the complete fluorescence response of eleven clinically-relevant bacteria by generating excitation-emission matrices (EEMs) over broad wavelength ranges. Furthermore, a number of surfaces and items of equipment commonly present on a ward, and potentially responsible for pathogen transfer, have been analyzed for potential issues of background fluorescence masking the signal from contaminant bacteria. These include bedside handrails, nurse call button, blood pressure cuff and ward computer keyboard, as well as disinfectant cleaning products and microfiber cloth. All examined bacterial strains exhibited a distinctive double-peak fluorescence feature associated with tryptophan with no other cellular fluorophore detected. Thus, this fluorescence survey found that an emission peak of 340 nm, from an excitation source at 280 nm, was the cellular fluorescence signal to target for detection of bacterial contamination. The majority of materials analysed offer a spectral window through which bacterial contamination could indeed be detected. 
A few instances were found of potential problems of background fluorescence masking that of bacteria, but in the case of the microfiber cleaning cloth, imaging techniques could morphologically distinguish between stray strands and bacterial contamination. PMID:24098687
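Reading a target signature off an EEM reduces to locating its maximum; a toy EEM with a tryptophan-like peak at the 280 nm excitation / 340 nm emission coordinates named above (grid and peak shape invented):

```python
import numpy as np

# Toy excitation-emission matrix (EEM): a Gaussian tryptophan-like peak at
# excitation 280 nm / emission 340 nm on a flat background (illustrative).
excitation = np.arange(250, 401, 5)     # nm
emission = np.arange(300, 501, 5)       # nm
ex, em = np.meshgrid(excitation, emission, indexing="ij")
eem = 0.05 + np.exp(-((ex - 280) ** 2 / 200 + (em - 340) ** 2 / 800))

# Locate the dominant fluorophore signature by its EEM maximum.
i, j = np.unravel_index(eem.argmax(), eem.shape)
print(excitation[i], emission[j])
```

An instrument built to the abstract's specification would excite near this maximum and read emission through a matched filter band.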
Fluorescence characterization of clinically-important bacteria.
Dartnell, Lewis R; Roberts, Tom A; Moore, Ginny; Ward, John M; Muller, Jan-Peter
2013-01-01
Healthcare-associated infections (HCAI/HAI) represent a substantial threat to patient health during hospitalization and incur billions of dollars in additional costs for subsequent treatment. One promising method for detecting bacterial contamination in a clinical setting before an HAI outbreak occurs is to exploit the native fluorescence of cellular molecules with a hand-held, rapid-sweep surveillance instrument. Previous studies have shown fluorescence-based detection to be sensitive and effective for food-borne and environmental microorganisms, and even able to distinguish between cell types, but this powerful technique has not yet been deployed on the macroscale for the primary surveillance of contamination in healthcare facilities to prevent HAI. Here we report experimental data for the specification and design of such a fluorescence-based detection instrument. We have characterized the complete fluorescence response of eleven clinically relevant bacteria by generating excitation-emission matrices (EEMs) over broad wavelength ranges. Furthermore, a number of surfaces and items of equipment commonly present on a ward, and potentially responsible for pathogen transfer, have been analyzed for potential issues of background fluorescence masking the signal from contaminant bacteria. These include bedside handrails, nurse call buttons, blood pressure cuffs, and ward computer keyboards, as well as disinfectant cleaning products and microfiber cloths. All examined bacterial strains exhibited a distinctive double-peak fluorescence feature associated with tryptophan, with no other cellular fluorophore detected. Thus, this fluorescence survey found that an emission peak at 340 nm, from an excitation source at 280 nm, was the cellular fluorescence signal to target for detection of bacterial contamination. The majority of materials analyzed offer a spectral window through which bacterial contamination could indeed be detected.
A few instances were found of potential problems of background fluorescence masking that of bacteria, but in the case of the microfiber cleaning cloth, imaging techniques could morphologically distinguish between stray strands and bacterial contamination. PMID:24098687
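The survey's conclusion, that the detector should target emission at 340 nm under 280 nm excitation, amounts to locating the global maximum of an excitation-emission matrix. A minimal NumPy sketch on a synthetic, tryptophan-like EEM (the wavelength grids and peak widths here are assumptions for illustration, not the paper's data):

```python
import numpy as np

# Hypothetical wavelength grids (nm); the paper scanned broad ranges.
excitation = np.arange(250, 401, 5)   # 250-400 nm
emission = np.arange(300, 501, 5)     # 300-500 nm

# Synthetic EEM: a Gaussian tryptophan-like peak near ex 280 / em 340 nm.
ex_grid, em_grid = np.meshgrid(excitation, emission, indexing="ij")
eem = np.exp(-((ex_grid - 280) ** 2) / (2 * 15 ** 2)
             - ((em_grid - 340) ** 2) / (2 * 25 ** 2))

# Target signal = the (excitation, emission) pair of maximum intensity.
i, j = np.unravel_index(np.argmax(eem), eem.shape)
print(excitation[i], emission[j])  # -> 280 340
```

On measured EEMs the same argmax step would be preceded by scatter-line removal and background subtraction.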
NASA Astrophysics Data System (ADS)
Yang, G.; Lin, Y.; Bhattacharya, P.
2007-12-01
To achieve effective and safe operation of systems in which humans and machines interact, the machine needs to understand the human's state, especially the cognitive state, when the operation task demands intensive cognitive activity. Because human cognitive states, behaviors, and expressions or cues are highly uncertain, the recent trend in inferring human state is to consider multimodal features of the human operator. In this paper, we present a method for multimodal inference of human cognitive states that integrates neuro-fuzzy network and information fusion techniques. To demonstrate the effectiveness of this method, we take driver fatigue detection as an example. The proposed method has, in particular, the following new features. First, human expressions are classified into four categories: (i) casual or contextual features, (ii) contact features, (iii) contactless features, and (iv) performance features. Second, a fuzzy neural network technique, in particular the Takagi-Sugeno-Kang (TSK) model, is employed to cope with uncertain behaviors. Third, a sensor fusion technique, in particular ordered weighted aggregation (OWA), is integrated with the TSK model in such a way that cues are taken as inputs to the TSK model, and the outputs of the TSK model are then fused by the OWA operator, which gives outputs corresponding to the particular cognitive states of interest (e.g., fatigue). We call this method TSK-OWA. Validation of TSK-OWA, performed in the Northeastern University vehicle drive simulator, has shown that the proposed method is promising as a general tool for inferring human cognitive states and as a special tool for driver fatigue detection.
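The OWA fusion stage described above has a simple closed form: the weights are applied to the inputs after sorting them in descending order, rather than to particular inputs. A minimal sketch (the cue scores and weight vector below are invented for illustration; the paper's actual TSK outputs and weight choices are not given in the abstract):

```python
import numpy as np

def owa(values, weights):
    """Ordered Weighted Averaging: weights apply to the values
    sorted in descending order, not to particular inputs."""
    v = np.sort(np.asarray(values, dtype=float))[::-1]
    w = np.asarray(weights, dtype=float)
    assert np.isclose(w.sum(), 1.0), "OWA weights must sum to 1"
    return float(v @ w)

# Hypothetical fuzzy-inference outputs (fatigue scores in [0, 1]) from
# the four cue categories: contextual, contact, contactless, performance.
scores = [0.9, 0.4, 0.7, 0.6]

# Descending weights emphasise the strongest evidence ("or-like" fusion).
print(owa(scores, [0.4, 0.3, 0.2, 0.1]))  # 0.9*0.4 + 0.7*0.3 + 0.6*0.2 + 0.4*0.1 = 0.73
```

Choosing more uniform weights would move the operator toward a plain mean, i.e. more "and-like" fusion of the cues.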
Defending networks against denial-of-service attacks
NASA Astrophysics Data System (ADS)
Gelenbe, Erol; Gellman, Michael; Loukas, George
2004-11-01
Denial of service attacks, viruses and worms are common tools for malicious adversarial behavior in networks. Experience shows that over the last few years several of these techniques have probably been used by governments to impair the Internet communications of various entities, and we can expect that these and other information warfare tools will be used increasingly as part of hostile behavior, either independently or in conjunction with other forms of attack in conventional or asymmetric warfare, as well as in other forms of malicious behavior. In this paper we concentrate on Distributed Denial of Service (DDoS) attacks, where one or more attackers generate flooding traffic and direct it from multiple sources towards a set of selected nodes or IP addresses in the Internet. We first briefly survey the literature on the subject and discuss some examples of DDoS incidents. We then present a technique that can be used for DDoS protection based on creating islands of protection around a critical information infrastructure. This technique, which we call CPN-DoS-DT (Cognitive Packet Networks DoS Defence Technique), creates a self-monitoring sub-network surrounding each critical infrastructure node. CPN-DoS-DT is triggered by a DDoS detection scheme and generates control traffic from the objects of the DDoS attack to the islands of protection, where DDoS packet flows are destroyed before they reach the critical infrastructure. We use mathematical modelling, simulation and experiments on our test-bed to show the positive and negative outcomes that may result from both the attack and the CPN-DoS-DT protection mechanism, due to imperfect detection and false alarms.
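The abstract does not specify the DDoS detection scheme that triggers CPN-DoS-DT. As a hedged stand-in for a trigger of that general kind, flooding can be flagged with a sliding-window rate threshold; every parameter below (window, factor, baseline rate) is an assumption for illustration:

```python
from collections import deque

class FloodDetector:
    """Toy DDoS trigger: alarm when packet arrivals in a sliding
    window exceed a multiple of the expected baseline rate.
    (A stand-in for the paper's unspecified detection scheme.)"""

    def __init__(self, window_s=1.0, factor=5.0, baseline_pps=100.0):
        self.window_s = window_s
        self.threshold = factor * baseline_pps * window_s
        self.arrivals = deque()

    def packet(self, t):
        """Record a packet arrival at time t (seconds); return True
        if the current window's count crosses the flood threshold."""
        self.arrivals.append(t)
        while self.arrivals and t - self.arrivals[0] > self.window_s:
            self.arrivals.popleft()
        return len(self.arrivals) > self.threshold

det = FloodDetector()
# Normal traffic: ~100 packets spread over 1 s -> no alarm.
print(any(det.packet(i / 100.0) for i in range(100)))        # False
# Flood: 1000 packets arriving within 0.1 s -> alarm fires.
print(any(det.packet(1.0 + i / 10000.0) for i in range(1000)))  # True
```

A real deployment would estimate the baseline adaptively and trade the threshold factor off against the false-alarm rate the paper analyses.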
White, Nathan S.; McDonald, Carrie; Farid, Niky; Kuperman, Josh; Karow, David; Schenker-Ahmed, Natalie M.; Bartsch, Hauke; Rakow-Penner, Rebecca; Holland, Dominic; Shabaik, Ahmed; Bjørnerud, Atle; Hope, Tuva; Hattangadi-Gluth, Jona; Liss, Michael; Parsons, J. Kellogg; Chen, Clark C.; Raman, Steve; Margolis, Daniel; Reiter, Robert E.; Marks, Leonard; Kesari, Santosh; Mundt, Arno J.; Kane, Chris J.; Carter, Bob S.; Bradley, William G.; Dale, Anders M.
2014-01-01
Diffusion weighted imaging (DWI) has been at the forefront of cancer imaging since the early 2000s. Prior to its application in clinical oncology, this powerful technique had already achieved widespread recognition due to its utility in the diagnosis of cerebral infarction. Following this initial success, the ability of DWI to detect inherent tissue contrast began to be exploited in the field of oncology. Although the initial oncologic applications for tumor detection and characterization, assessing treatment response, and predicting survival were primarily in the field of neuro-oncology, the scope of DWI has since broadened to include oncologic imaging of the prostate gland, breast, and liver. Despite its growing success and application, misconceptions as to the underlying physical basis of the DWI signal exist among researchers and clinicians alike. In this review, we provide a detailed explanation of the biophysical basis of diffusion contrast, emphasizing the difference between hindered and restricted diffusion, and elucidating how diffusion parameters in tissue are derived from the measurements via the diffusion model. We describe one advanced DWI modeling technique, called Restriction Spectrum Imaging (RSI). This technique offers a more direct in vivo measure of tumor cells, due to its ability to distinguish separable pools of water within tissue based on their intrinsic diffusion characteristics. Using RSI as an example, we then highlight the ability of advanced DWI techniques to address key clinical challenges in neuro-oncology, including improved tumor conspicuity, distinguishing actual response to therapy from pseudoresponse, and delineation of white matter tracts in regions of peritumoral edema. We also discuss how RSI, combined with new methods for correction of spatial distortions inherent in diffusion MRI scans, may enable more precise spatial targeting of lesions, with implications for radiation oncology and surgical planning. PMID:25183788
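For readers unfamiliar with how "diffusion parameters in tissue are derived from the measurements via the diffusion model," the simplest case is the standard monoexponential model S(b) = S0·exp(-b·ADC), from which the apparent diffusion coefficient follows directly. This is background to, not a sketch of, RSI itself, which fits multiple diffusion compartments; the voxel signal values below are hypothetical:

```python
import numpy as np

def adc(s0, s_b, b):
    """Apparent diffusion coefficient from two acquisitions under the
    monoexponential model S(b) = S0 * exp(-b * ADC).
    b in s/mm^2; signals in arbitrary (consistent) units."""
    return np.log(s0 / s_b) / b

# Hypothetical voxel signals at b=0 and b=1000 s/mm^2.
s0, s1000 = 1000.0, 368.0
print(adc(s0, s1000, 1000.0))  # ~1.0e-3 mm^2/s, typical of brain tissue
```

Restricted water (e.g. intracellular water in dense tumor) departs from this single-exponential decay, which is precisely the behavior multi-compartment models such as RSI are designed to separate.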
A comparison in Colorado of three methods to monitor breeding amphibians
Corn, P.S.; Muths, E.; Iko, W.M.
2000-01-01
We surveyed amphibians at 4 montane and 2 plains lentic sites in northern Colorado using 3 techniques: standardized call surveys, automated recording devices (frog-loggers), and intensive surveys including capture-recapture techniques. Amphibians were observed at 5 sites. Species richness varied from 0 to 4 species at each site. Richness scores, the sums of species richness among sites, were similar among methods: 8 for call surveys, 10 for frog-loggers, and 11 for intensive surveys (9 if the non-vocal salamander Ambystoma tigrinum is excluded). The frog-logger at 1 site recorded Spea bombifrons which was not active during the times when call and intensive surveys were conducted. Relative abundance scores from call surveys failed to reflect a relatively large population of Bufo woodhousii at 1 site and only weakly differentiated among different-sized populations of Pseudacris maculata at 3 other sites. For extensive applications, call surveys have the lowest costs and fewest requirements for highly trained personnel. However, for a variety of reasons, call surveys cannot be used with equal effectiveness in all parts of North America.
NASA Astrophysics Data System (ADS)
Salehi, Mohammad; Schneider, Lilli; Ströbel, Philipp; Marx, Alexander; Packeisen, Jens; Schlücker, Sebastian
2014-01-01
SERS microscopy is a novel staining technique in immunohistochemistry, which is based on antibodies labeled with functionalized noble metal colloids called SERS labels or nanotags for optical detection. Conventional covalent bioconjugation of these SERS labels cannot prevent blocking of the antigen recognition sites of the antibody. We present a rational chemical design for SERS label-antibody conjugates which addresses this issue. Highly sensitive, silica-coated gold nanoparticle clusters as SERS labels are non-covalently conjugated to primary antibodies by using the chimeric protein A/G, which selectively recognizes the Fc part of antibodies and therefore prevents blocking of the antigen recognition sites. In proof-of-concept two-color imaging experiments for the co-localization of p63 and PSA on non-neoplastic prostate tissue FFPE specimens, we demonstrate the specificity and signal brightness of these rationally designed primary antibody-protein A/G-gold nanocluster conjugates.
Electronic supplementary information (ESI) available. See DOI: 10.1039/c3nr05890e
NASA Astrophysics Data System (ADS)
Berchok, Catherine L.
During four field seasons from 1998-2001, 115 hours of acoustic recordings were made in the presence of the well-studied St. Lawrence population of blue whales. The primary field site for this study was the estuary region of the St. Lawrence River (Quebec, Canada), with most recordings made between mid-August and late October. Effort was concentrated in the daylight hours, although occasionally extending past nightfall. An inexpensive and portable recording system was built that was easy to deploy and provided quality recordings in a variety of sea conditions. It consisted of a calibrated omni-directional hydrophone with a flat (±3 dB) response from 5 Hz to 800 Hz, and a surface isolation buoy to minimize the vertical movement of the sensor. During the recording sessions detailed field notes were taken on all blue whales within sight, with individual identities confirmed through photo-identification work between sessions. Notes were also taken on all other species sighted during the recording sessions. Characterization of the more than one thousand blue whale calls detected during this study revealed that the St. Lawrence repertoire is much more extensive than previously reported. Three infrasonic (<20 Hz) and four audible-range (30-200 Hz) call types were detected in this study, with much time/frequency variation seen within each type. The infrasonic calls were long (5-30 s) in duration and arranged into regularly patterned series. These calls were similar in call characteristics and spacing to those detected in the North Atlantic, but had much shorter and more variable patterned series. The audible call types were much shorter (1-4 s), and occurred singly or in irregularly spaced clusters, although a special patterning was seen that contained both regularly and irregularly spaced components.
Comparison of the daily, seasonal, and spatial distributions of calling behavior with those of several biological parameters revealed interesting differences between the three call types examined. The trends seen suggest a migratory, reproductive, or foraging context for the infrasonic calls. A closer-range social context is suggested for the audible downsweeps, which have been detected in foraging situations as well as in courtship displays. The audible mixed-pattern call type appears to have a primarily reproductive context.
NASA Astrophysics Data System (ADS)
Markus, Charles R.; McCollum, Jefferson E.; Hodges, James Neil; Perry, Adam J.; McCall, Benjamin J.
2017-06-01
Molecular ions are challenging to study with conventional spectroscopic methods. Laboratory discharges produce ions in trace quantities which can be obscured by the abundant neutral molecules present. The technique Noise Immune Cavity Enhanced Optical Heterodyne Velocity Modulation Spectroscopy (NICE-OHVMS) overcomes these challenges by combining the ion-neutral discrimination of velocity modulation spectroscopy with the sensitivity of Noise-Immune Cavity-Enhanced Optical Heterodyne Molecular Spectroscopy (NICE-OHMS), and has been able to determine transition frequencies of molecular ions in the mid-infrared (mid-IR) with sub-MHz uncertainties when calibrated with an optical frequency comb. However, the extent of these studies was limited by the presence of fringes due to parasitic etalons and by the speed and noise characteristics of mid-IR detectors. Recently, we have overcome these limitations by implementing up-conversion detection and dithered optics. We performed up-conversion using periodically poled lithium niobate to convert light from the mid-IR to the visible, within the coverage of sensitive and fast silicon detectors, while maintaining our heterodyne and velocity modulation signals. The parasitic etalons were removed by rapidly rotating CaF2 windows with galvanometers, a technique known as a Brewster-plate spoiler, which averaged out the fringes in detection. Together, these improvements increased the sensitivity by more than an order of magnitude and have enabled extended spectroscopic surveys of molecular ions in the mid-IR. J. N. Hodges, A. J. Perry, P. A. Jenkins II, B. M. Siller, and B. J. McCall, J. Chem. Phys. (2013), 139, 164201. C. R. Webster, J. Opt. Soc. Am. B (1985), 2, 1464. C. R. Markus, A. J. Perry, J. N. Hodges, and B. J. McCall, Opt. Express (2017), 25, 3709-3721.
Planetary mass function and planetary systems
NASA Astrophysics Data System (ADS)
Dominik, M.
2011-02-01
With planets orbiting stars, a planetary mass function should not be seen as a low-mass extension of the stellar mass function; rather, a proper formalism needs to account for the fact that the statistical properties of planet populations are linked to the properties of their respective host stars. This can be accounted for by describing planet populations by means of a differential planetary mass-radius-orbit function, which together with the fraction of stars with given properties that are orbited by planets and the stellar mass function allows the derivation of all statistics for any considered sample. These fundamental functions provide a framework for comparing statistics that result from different observing techniques and campaigns, which all have their very specific selection procedures and detection efficiencies. Moreover, recent results both from gravitational microlensing campaigns and radial-velocity surveys of stars indicate that planets tend to cluster in systems rather than being the lonely child of their respective parent star. While planetary multiplicity in an observed system becomes obvious with the detection of several planets, its quantitative assessment, however, comes with the challenge to exclude the presence of further planets. Current exoplanet samples begin to give us first hints at the population statistics, whereas pictures of planet parameter space in its full complexity call for samples that are 2-4 orders of magnitude larger. In order to derive meaningful statistics, however, planet detection campaigns need to be designed in such a way that well-defined, fully deterministic target selection, monitoring and detection criteria are applied. The probabilistic nature of gravitational microlensing makes this technique an illustrative example of all the encountered challenges and uncertainties.
NASA Astrophysics Data System (ADS)
Cahoy, Kerri; Fischer, Debra; Spronck, Julien; DeMille, David
2010-07-01
Exoplanets can be detected from a time series of stellar spectra by looking for small, periodic shifts in the absorption features that are consistent with Doppler shifts caused by the presence of an exoplanet, or multiple exoplanets, in the system. While hundreds of large exoplanets have already been discovered with the Doppler technique (also called radial velocity), our goal is to improve the measurement precision so that many Earth-like planets can be detected. The smaller mass and longer period of true Earth analogues require the ability to detect a reflex velocity of ~10 cm/s over long time periods. Currently, typical astronomical spectrographs calibrate using either Iodine absorption cells or Thorium-Argon lamps and achieve ~10 m/s precision, with the most stable spectrographs pushing down to ~2 m/s. High velocity precision is currently achieved at HARPS by controlling the thermal and pressure environment of the spectrograph. These environmental controls increase the cost of the spectrograph, and it is not feasible to simply retrofit existing spectrometers. We propose a fiber-fed high precision spectrograph design that combines the existing ~5000-6000 Å Iodine calibration system with a high-precision Laser Frequency Comb (LFC) system from ~6000-7000 Å that just meets the redward side of the Iodine lines. The scientific motivation for such a system includes: a 1000 Å span in the red is currently achievable with LFC systems, combining the two calibration methods increases the wavelength range by a factor of two, and moving redward decreases the "noise" from starspots. The proposed LFC system design employs a fiber laser, tunable serial Fabry-Perot cavity filters to match the resolution of the LFC system to that of standard astronomical spectrographs, and terminal ultrasonic vibration of the multimode fiber for a stable point spread function.
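To make the ~10 cm/s requirement concrete, the non-relativistic Doppler relation v ≈ c·Δλ/λ shows how small a line shift the spectrograph must resolve. The 600 nm reference wavelength below is taken from the proposed Iodine/LFC range; the rest is straightforward arithmetic:

```python
# Doppler reflex velocity from a wavelength shift: v ≈ c * Δλ / λ
# (non-relativistic, valid for the cm/s-to-m/s shifts discussed).
C = 299_792_458.0  # speed of light, m/s

def reflex_velocity(lambda_rest_nm, delta_lambda_nm):
    """Radial velocity (m/s) implied by a line shift at a given rest wavelength."""
    return C * delta_lambda_nm / lambda_rest_nm

# How small a shift corresponds to an Earth-like 10 cm/s reflex motion
# at 600 nm (6000 Å)?
v = 0.10                # m/s
delta = v * 600.0 / C   # nm
print(delta)            # ~2e-7 nm, i.e. a sub-femtometre line displacement
```

Shifts this far below the pixel scale are why the calibration source (Iodine lines or an LFC), not the detector, sets the achievable precision.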
Esfandyarpour, Rahim; Esfandyarpour, Hesaam; Harris, James S; Davis, Ronald W
2013-11-22
Biosensors are used for the detection of biochemical molecules such as proteins and nucleic acids. Traditional techniques, such as enzyme-linked immuno-sorbent assay (ELISA), are sensitive but require several hours to yield a result and usually require the attachment of a fluorophore molecule to the target molecule. Micromachined biosensors that employ electrical detection are now being developed. Here we describe one such device, which is ultrasensitive, real-time, label free and localized. It is called the nanoneedle biosensor and shows promise to overcome some of the current limitations of biosensors. The key element of this device is a 10 nm wide annular gap at the end of the needle, which is the sensitive part of the sensor. The total diameter of the sensor is about 100 nm. Any change in the population of molecules in this gap results in a change of impedance across the gap. Single molecule detection should be possible because the sensory part of the sensor is in the range of bio-molecules of interest. To increase throughput we can flow the solution containing the target molecules over an array of such structures, each with its own integrated read-out circuitry to allow 'real-time' detection (i.e. several minutes) of label free molecules without sacrificing sensitivity. To fabricate the arrays we used electron beam lithography together with associated pattern transfer techniques. Preliminary measurements on individual needle structures in water are consistent with the design. Since the proposed sensor has a rigid nano-structure, this technology, once fully developed, could ultimately be used to directly monitor protein quantities within a single living cell, an application that would have significant utility for drug screening and studying various intracellular signaling pathways.
NASA Astrophysics Data System (ADS)
Esfandyarpour, Rahim; Esfandyarpour, Hesaam; Harris, James S.; Davis, Ronald W.
2013-11-01
Biosensors are used for the detection of biochemical molecules such as proteins and nucleic acids. Traditional techniques, such as enzyme-linked immuno-sorbent assay (ELISA), are sensitive but require several hours to yield a result and usually require the attachment of a fluorophore molecule to the target molecule. Micromachined biosensors that employ electrical detection are now being developed. Here we describe one such device, which is ultrasensitive, real-time, label free and localized. It is called the nanoneedle biosensor and shows promise to overcome some of the current limitations of biosensors. The key element of this device is a 10 nm wide annular gap at the end of the needle, which is the sensitive part of the sensor. The total diameter of the sensor is about 100 nm. Any change in the population of molecules in this gap results in a change of impedance across the gap. Single molecule detection should be possible because the sensory part of the sensor is in the range of bio-molecules of interest. To increase throughput we can flow the solution containing the target molecules over an array of such structures, each with its own integrated read-out circuitry to allow ‘real-time’ detection (i.e. several minutes) of label free molecules without sacrificing sensitivity. To fabricate the arrays we used electron beam lithography together with associated pattern transfer techniques. Preliminary measurements on individual needle structures in water are consistent with the design. Since the proposed sensor has a rigid nano-structure, this technology, once fully developed, could ultimately be used to directly monitor protein quantities within a single living cell, an application that would have significant utility for drug screening and studying various intracellular signaling pathways.
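The sensing principle above, a change in impedance across the annular gap whenever the molecular population in the gap changes, can be caricatured as change detection on an impedance time series. All values below (baseline impedance, step size, noise level, window lengths) are invented for illustration and are not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated gap-impedance trace (ohms): noisy baseline, then a step
# when target molecules occupy the 10 nm annular gap (hypothetical values).
baseline = 1e6 + rng.normal(0, 1e3, 500)
bound = 1.05e6 + rng.normal(0, 1e3, 500)
trace = np.concatenate([baseline, bound])

# Flag a binding event when impedance leaves the baseline band
# (mean ± 6 sigma, estimated from an initial calibration window).
mu, sigma = trace[:200].mean(), trace[:200].std()
event_idx = int(np.argmax(np.abs(trace - mu) > 6 * sigma))
print(event_idx)  # first flagged sample, at the step (index 500)
```

Real read-out would work on demodulated impedance magnitude/phase per array element, but the thresholding logic for "real-time" event flagging is of this general shape.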
Yeo, Zhen Xuan; Wong, Joshua Chee Leong; Rozen, Steven G; Lee, Ann Siew Gek
2014-06-24
The Ion Torrent PGM is a popular benchtop sequencer that shows promise in replacing conventional Sanger sequencing as the gold standard for mutation detection. Despite the PGM's reported high accuracy in calling single nucleotide variations, it tends to generate many false positive calls when detecting insertions and deletions (indels), which may hinder its utility for clinical genetic testing. Recently, the proprietary analytical workflow for the Ion Torrent sequencer, Torrent Suite (TS), underwent a series of upgrades. We evaluated three major upgrades of TS by calling indels in the BRCA1 and BRCA2 genes. Our analysis revealed that false negative indels could be generated by TS under both default calling parameters and parameters adjusted for maximum sensitivity. However, indel calling with the same data using the open-source variant callers GATK and SAMtools showed that false negatives could be minimised with appropriate bioinformatics analysis. Furthermore, we identified two variant calling measures, Quality-by-Depth (QD) and VARiation of the Width of gaps and inserts (VARW), which substantially reduced false positive indels, including non-homopolymer-associated errors, without compromising sensitivity. In our best case scenario, which involved the TMAP aligner and SAMtools, we achieved 100% sensitivity, 99.99% specificity, and 29% False Discovery Rate (FDR) in indel calling from all 23 samples, which is good performance for mutation screening using the PGM. New versions of TS, BWA and GATK have shown improvements in indel calling sensitivity and specificity over their older counterparts. However, the variant caller of TS exhibits lower sensitivity than GATK and SAMtools.
Our findings demonstrate that although indel calling from PGM sequences may appear to be noisy at first glance, proper computational indel calling analysis is able to maximize both the sensitivity and specificity at the single base level, paving the way for the usage of this technology for future clinical genetic testing.
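A Quality-by-Depth hard filter of the kind discussed above is straightforward to apply to raw VCF data lines. The sketch below is in the spirit of that measure (quality normalized by depth); the threshold, coordinates, and calls are illustrative, and the Ion-Torrent-specific VARW measure is omitted:

```python
# Minimal VCF hard-filter sketch for indel calls: keep a call only if
# its quality-per-unit-depth clears a threshold (illustrative value).
def passes_qd(vcf_line, min_qd=2.0):
    """Return True if QUAL / DP >= min_qd for a tab-separated VCF data
    line whose INFO field contains a 'DP=...' entry."""
    fields = vcf_line.rstrip("\n").split("\t")
    qual = float(fields[5])
    info = dict(kv.split("=", 1) for kv in fields[7].split(";") if "=" in kv)
    depth = float(info["DP"])
    return qual / depth >= min_qd

# Two hypothetical calls: a well-supported insertion, and a likely
# homopolymer artifact (high depth but low quality).
good = "chr17\t41245466\t.\tG\tGA\t900.0\tPASS\tDP=100"
bad = "chr17\t41245600\t.\tAT\tA\t30.0\tPASS\tDP=400"
print(passes_qd(good), passes_qd(bad))  # True False
```

In practice such filters are applied with the caller's own annotation (e.g. GATK emits a QD field directly), but the depth-normalization idea is the same.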
A 3D Model for Eddy Current Inspection in Aeronautics: Application to Riveted Structures
NASA Astrophysics Data System (ADS)
Paillard, S.; Pichenot, G.; Lambert, M.; Voillaume, H.; Dominguez, N.
2007-03-01
The eddy current technique is currently an operational tool for fastener inspection, an important issue in the maintenance of aircraft structures. The industry calls for faster, more sensitive, and more reliable NDT techniques for the detection and characterization of potential flaws near rivets. In order to reduce development time and to optimize the design and performance assessment of an inspection procedure, the CEA and EADS have started a collaborative effort aimed at extending the modeling features of the CIVA non-destructive simulation platform to handle the configuration of a layered planar structure with a rivet and an embedded flaw nearby. To this end, an approach based on the Volume Integral Method using the Green dyadic formalism, which greatly increases computational efficiency, has been developed. The first step, modeling the rivet without a flaw as a hole in a multi-stratified structure, has been reached and validated in several configurations against experimental data.
NASA Astrophysics Data System (ADS)
Wiebe, S.; Rhoades, G.; Wei, Z.; Rosenberg, A.; Belev, G.; Chapman, D.
2013-05-01
Refraction x-ray contrast is an imaging modality used primarily in a research setting at synchrotron facilities that have a biomedical imaging research program. The most common method for exploiting refraction contrast is a technique called Diffraction Enhanced Imaging (DEI). The DEI apparatus allows the detection of refraction between two materials and produces a unique "edge enhanced" contrast appearance, very different from the traditional absorption x-ray imaging used in clinical radiology. In this paper we aim to explain the features of x-ray refraction contrast in terms a typical clinical radiologist would understand. We then discuss what needs to be considered in the interpretation of the refraction image. Finally, we discuss the limitations of planar refraction imaging and the potential of DEI Computed Tomography. This is an original work that has not been submitted to any other source for publication. The authors have no commercial interests or conflicts of interest to disclose.
Nadeau, C.P.; Conway, C.J.; Smith, B.S.; Lewis, T.E.
2008-01-01
We conducted 262 call-broadcast point-count surveys (1-6 replicate surveys on each of 62 points) using standardized North American Marsh Bird Monitoring Protocols between 31 May and 7 July 2006 on St. Vincent National Wildlife Refuge, an island off the northwest coast of Florida. We conducted double-blind multiple-observer surveys, paired morning and evening surveys, and paired morning and night surveys to examine the influence of call-broadcast and time of day on detection probability. Observer detection probability for all species pooled was 75% and was similar between passive (69%) and call-broadcast (65%) periods. Detection probability was higher on morning than evening (t = 3.0, P = 0.030) or night (t = 3.4, P = 0.042) surveys when we pooled all species. Detection probability was higher (but not significant for all species) on morning compared to evening or night surveys for all five focal species detected on surveys: Least Bittern (Ixobrychus exilis), Clapper Rail (Rallus longirostris), Purple Gallinule (Porphyrula martinica), Common Moorhen (Gallinula chloropus), and American Coot (Fulica americana). We detected more Least Bitterns (t = 2.4, P = 0.064) and Common Moorhens (t = 2.8, P = 0.026) on morning than evening surveys, and more Clapper Rails (t = 5.1, P = 0.014) on morning than night surveys.
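The double-blind multiple-observer design above supports direct estimation of detection probability. The paper's exact estimator is not given in the abstract; a common approach for independent observers uses Lincoln-Petersen-style logic, sketched here with invented tallies:

```python
# Detection probability from independent double-observer counts
# (Lincoln-Petersen logic; illustrative, not the paper's exact estimator).
def detection_probs(both, only_1, only_2):
    """both: birds detected by both observers;
    only_1 / only_2: birds detected by exactly one observer."""
    p1 = both / (both + only_2)          # P(observer 1 detects | present)
    p2 = both / (both + only_1)          # P(observer 2 detects | present)
    p_either = 1 - (1 - p1) * (1 - p2)   # P(at least one observer detects)
    return p1, p2, p_either

# Hypothetical tallies pooled over paired surveys.
p1, p2, p = detection_probs(both=60, only_1=20, only_2=20)
print(round(p1, 2), round(p2, 2), round(p, 2))  # 0.75 0.75 0.94
```

The same logic extends to comparing passive versus call-broadcast periods by tallying each period separately.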
Automated surveillance of 911 call data for detection of possible water contamination incidents.
Haas, Adam J; Gibbons, Darcy; Dangel, Chrissy; Allgeier, Steve
2011-03-30
Drinking water contamination, with the capability to affect large populations, poses a significant risk to public health. In recent water contamination events, the impact of contamination on public health appeared in data streams monitoring health-seeking behavior. While public health surveillance has traditionally focused on the detection of pathogens, developing methods for detecting illness from fast-acting chemicals has not been an emphasis. An automated surveillance system was implemented for Cincinnati's drinking water contamination warning system to monitor health-related 911 calls in the city of Cincinnati. Incident codes indicative of possible water contamination were filtered from all 911 calls for analysis. The 911 surveillance system uses a space-time scan statistic to detect potential water contamination incidents. The frequency and characteristics of the 911 alarms over a 2.5-year period were studied. During the evaluation, 85 alarms occurred, although most occurred prior to the implementation of an additional alerting constraint in May 2009. Data were available for analysis approximately 48 minutes after calls, indicating alarms may be generated 1-2 hours after a rapid increase in call volume. Most alerts occurred in areas of high population density. The average alarm area was 9.22 square kilometers. The average number of cases in an alarm was nine calls. The 911 surveillance system provides timely notification of possible public health events, but did have limitations. While the alarms contained incident codes and the location of the caller, additional information such as medical status was not available to assist in validating the cause of an alarm. Furthermore, users indicated that a better understanding of 911 system functionality is necessary to understand how it would behave in an actual water contamination event.
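The space-time scan statistic at the core of the 911 system scores each candidate cluster (a spatial zone over a time interval) with a Poisson log-likelihood ratio and alarms on the highest-scoring one. A minimal sketch of the per-cylinder score in Kulldorff's form, with invented counts:

```python
import math

def llr(c, e, C):
    """Poisson scan-statistic log-likelihood ratio for one candidate
    space-time cylinder: c observed cases inside, e expected inside,
    C total cases. Zero unless the cylinder is in excess (c > e)."""
    if c <= e:
        return 0.0
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))

# Hypothetical day: 9 of the city's 50 filtered 911 calls fall in one
# small cylinder where only 2 would be expected from the baseline.
score = llr(c=9, e=2.0, C=50)
print(round(score, 2))  # ~7.07
```

In a full implementation this score is maximized over many cylinders and its significance is assessed by Monte Carlo replication under the null, which is what makes alarms comparable across zone sizes.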
Damage detection of an in-service condensation pipeline joint
NASA Astrophysics Data System (ADS)
Briand, Julie; Rezaei, Davood; Taheri, Farid
2010-04-01
The early detection of damage in structural or mechanical systems is of vital importance. With early detection, the damage may be repaired before the integrity of the system is jeopardized, averting monetary losses, loss of life or limb, and environmental impacts. Among the various types of structural health monitoring techniques, vibration-based methods are of significant interest since the damage location does not need to be known beforehand, making them a more versatile approach. The non-destructive damage detection method used for the experiments herein is a novel vibration-based method which uses an index called the EMD Energy Damage Index, developed with the aim of providing improved qualitative results compared to those methods currently available. As part of an effort to establish the integrity and limitations of this novel damage detection method, field testing was completed on a mechanical pipe joint on a condensation line, located in the physical plant of Dalhousie University. Piezoceramic sensors, placed at various locations around the joint, were used to monitor the free vibration of the pipe induced by an impulse hammer. Multiple damage progression scenarios were completed, each having a healthy state and multiple damage cases. Subsequently, the recorded signals from the healthy and damaged joint were processed through the EMD Energy Damage Index developed in-house in an effort to detect the inflicted damage. The proposed methodology successfully detected the inflicted damages. In this paper, the effects of impact location, sensor location, frequency bandwidth, intrinsic mode functions, and boundary conditions are discussed.
Improved Spectroscopy of Molecular Ions in the Mid-Infrared with Up-Conversion Detection
NASA Astrophysics Data System (ADS)
Markus, Charles R.; Perry, Adam J.; Hodges, James N.; McCall, Benjamin J.
2016-06-01
Heterodyne detection, velocity modulation, and cavity enhancement are useful tools for observing rovibrational transitions of important molecular ions. We have utilized these methods to investigate a number of molecular ions, such as H_3^+, CH_5^+, HeH^+, and OH^+. In the past, parasitic etalons and the lack of fast and sensitive detectors in the mid-infrared have limited the number of transitions we could measure with MHz-level precision. Recently, we have significantly reduced the amplitude of unwanted interference fringes with a Brewster-plate spoiler. We have also developed a detection scheme which up-converts the mid-infrared light with difference frequency generation which allows the use of a faster and more sensitive avalanche photodetector. The higher detection bandwidth allows for optimized heterodyne detection at higher modulation frequencies. The overall gain in signal-to-noise from both improvements will enable extensive high-precision line lists of molecular ions and searches for previously unobserved transitions. K.N. Crabtree, J.N. Hodges, B.M. Siller, A.J. Perry, J.E. Kelly, P.A. Jenkins II, and B.J. McCall, Chem. Phys. Lett. 551 (2012) 1-6. A.J. Perry, J.N. Hodges, C.R. Markus, G.S. Kocheril, and B.J. McCall, J. Mol. Spec. 317 (2015) 71-73. J.N. Hodges, A.J. Perry, P.A. Jenkins II, B.M. Siller, and B.J. McCall, J. Chem. Phys. 139 (2013) 164291. A.J. Perry, J.N. Hodges, C.R. Markus, G.S. Kocheril, and B.J. McCall. 2014, J. Chem. Phys. 141, 101101 C.R. Markus, J.N. Hodges, A.J. Perry, G.S. Kocheril, H.S.P. Muller, and B.J. McCall, Astrophys. J. 817 (2016) 138.
Ontology Alignment Repair through Modularization and Confidence-Based Heuristics
Santos, Emanuel; Faria, Daniel; Pesquita, Catia; Couto, Francisco M.
2015-01-01
Ontology Matching aims at identifying a set of semantic correspondences, called an alignment, between related ontologies. In recent years, there has been a growing interest in efficient and effective matching methods for large ontologies. However, alignments produced for large ontologies are often logically incoherent. It was only recently that the use of repair techniques to improve the coherence of ontology alignments began to be explored. This paper presents a novel modularization technique for ontology alignment repair which extracts fragments of the input ontologies that contain only the classes and relations necessary to resolve all detectable incoherences. The paper also presents an alignment repair algorithm that uses a global repair strategy to minimize both the degree of incoherence and the number of mappings removed from the alignment, while overcoming the scalability problem by employing the proposed modularization technique. Our evaluation shows that our modularization technique produces significantly smaller fragments of the ontologies and that our repair algorithm produces more complete alignments than other current alignment repair systems, while obtaining an equivalent degree of incoherence. Additionally, we also present a variant of our repair algorithm that makes use of the confidence values of the mappings to improve alignment repair. Our repair algorithm was implemented as part of AgreementMakerLight, a free and open-source ontology matching system. PMID:26710335
A removal model for estimating detection probabilities from point-count surveys
Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.
2002-01-01
Use of point-count surveys is a popular method for collecting data on abundance and distribution of birds. However, analyses of such data often ignore potential differences in detection probability. We adapted a removal model to directly estimate detection probability during point-count surveys. The model assumes that singing frequency is a major factor influencing probability of detection when birds are surveyed using point counts. This may be appropriate for surveys in which most detections are by sound. The model requires counts to be divided into several time intervals. Point counts are often conducted for 10 min, where the number of birds recorded is divided into those first observed in the first 3 min, the subsequent 2 min, and the last 5 min. We developed a maximum-likelihood estimator for the detectability of birds recorded during counts divided into those intervals. This technique can easily be adapted to point counts divided into intervals of any length. We applied this method to unlimited-radius counts conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. We found differences in detection probability among species. Species that sing frequently such as Winter Wren (Troglodytes troglodytes) and Acadian Flycatcher (Empidonax virescens) had high detection probabilities (∼90%) and species that call infrequently such as Pileated Woodpecker (Dryocopus pileatus) had low detection probability (36%). We also found detection probabilities varied with the time of day for some species (e.g. thrushes) and between observers for other species. We used the same approach to estimate detection probability and density for a subset of the observations with limited-radius point counts.
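The removal model above has a simple likelihood. A minimal sketch under one common parameterization (constant per-minute detection rate and exponential time to first detection, fit to the 3/2/5-minute interval counts by maximum likelihood); the counts below are synthetic and the plain ternary-search optimizer is an illustration, not the authors' estimator:

```python
import math

INTERVALS = [(0, 3), (3, 5), (5, 10)]   # minutes, matching the 3/2/5-min split

def log_lik(rate, counts):
    """Conditional multinomial log-likelihood of first-detection counts,
    given an exponential time-to-first-detection with the stated rate."""
    masses = [math.exp(-rate * a) - math.exp(-rate * b) for a, b in INTERVALS]
    total = 1.0 - math.exp(-rate * INTERVALS[-1][1])   # P(detected at all)
    return sum(n * math.log(m / total) for n, m in zip(counts, masses))

def estimate_detection_probability(counts):
    """Ternary-search the MLE of the detection rate, then return the
    probability a bird present for the whole count is detected at least once."""
    lo, hi = 1e-4, 5.0
    for _ in range(200):
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if log_lik(m1, counts) < log_lik(m2, counts):
            lo = m1
        else:
            hi = m2
    rate = (lo + hi) / 2
    return 1.0 - math.exp(-rate * INTERVALS[-1][1])

# Synthetic counts generated from a true rate of 0.2 detections/min,
# for which the overall detection probability is 1 - exp(-2) ~ 0.865.
p_hat = estimate_detection_probability([522, 209, 269])
```

The same machinery extends directly to intervals of any length by editing INTERVALS, as the abstract notes.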
Leveraging disjoint communities for detecting overlapping community structure
NASA Astrophysics Data System (ADS)
Chakraborty, Tanmoy
2015-05-01
Network communities represent mesoscopic structure for understanding the organization of real-world networks, where nodes often belong to multiple communities and form overlapping community structure in the network. Because the exact boundaries of such overlapping communities are non-trivial to find, the problem is challenging, and considerable effort has been devoted to detecting overlapping communities in networks. In this paper, we present PVOC (Permanence based Vertex-replication algorithm for Overlapping Community detection), a two-stage framework to detect overlapping community structure. We build on a novel observation that the non-overlapping community structure detected by a standard disjoint community detection algorithm closely resembles the network's actual overlapping community structure, except for the overlapping part. Based on this observation, we posit that there is perhaps no need to build yet another overlapping community finding algorithm; instead, one can efficiently manipulate the output of any existing disjoint community finding algorithm to obtain the required overlapping structure. We propose a new post-processing technique that, combined with any existing disjoint community detection algorithm, processes each vertex using a new vertex-based metric, called permanence, and thereby finds overlapping candidates with their community memberships. Experimental results on both synthetic and large real-world networks show that PVOC significantly outperforms six state-of-the-art overlapping community detection algorithms in terms of similarity of the output with the ground-truth structure. Thus our framework not only finds meaningful overlapping communities in the network, but also allows us to put an end to the constant effort of building yet another overlapping community detection algorithm.
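The permanence metric scores how firmly a vertex sits in its assigned community. A small sketch of one published formulation, Perm(v) = (I(v)/E_max(v)) * (1/D(v)) - (1 - C_in(v)), where I(v) counts internal neighbours, E_max(v) is the largest number of connections into any single external community, D(v) is the degree, and C_in(v) is the clustering coefficient among internal neighbours; treating E_max as 1 for purely internal vertices is a convention assumed here:

```python
from itertools import combinations
from collections import Counter

def permanence(v, adj, community):
    """Permanence of vertex v given an adjacency dict (vertex -> set of
    neighbours) and a community assignment dict (vertex -> label)."""
    neigh = adj[v]
    internal = [u for u in neigh if community[u] == community[v]]
    external = [u for u in neigh if community[u] != community[v]]
    deg = len(neigh)
    # E_max: max connections into any single external community (1 if none).
    e_max = max(Counter(community[u] for u in external).values(), default=1)
    # C_in: fraction of pairs of internal neighbours that are themselves linked.
    if len(internal) < 2:
        c_in = 0.0
    else:
        pairs = list(combinations(internal, 2))
        c_in = sum(1 for a, b in pairs if b in adj[a]) / len(pairs)
    return (len(internal) / e_max) * (1.0 / deg) - (1.0 - c_in)

# Toy graph: triangle {0,1,2} in community A, edge 2-3 bridging to B = {3,4}.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4}, 4: {3}}
community = {0: "A", 1: "A", 2: "A", 3: "B", 4: "B"}
```

Vertex 2 has two internal neighbours forming a closed triangle and one external tie, giving permanence (2/1)*(1/3) - (1 - 1) = 2/3.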
First Polarized Power Spectra from HERA-19 Commissioning Data: Comparison with Simulations
NASA Astrophysics Data System (ADS)
Igarashi, Amy; Chichura, Paul; Fox Fortino, Austin; Kohn, Saul; Aguirre, James; HERA Collaboration, CHAMP
2018-01-01
The Hydrogen Epoch of Reionization Array (HERA) is a radio telescope whose primary goal is the detection of redshifted 21-cm line radiation produced from the spin-flip transition of HI during the Epoch of Reionization (EoR). HERA is currently under construction in South Africa, and will eventually be an array of 350 14-m antennas. HERA aims for a statistical detection of the power spectrum of this emission, using the so-called delay spectrum technique (Parsons et al 2012). We examine a first season of commissioning data from the first 19 elements (HERA-19) to characterize Galactic and extragalactic foregrounds. We compare the delay spectrum for HERA-19 constructed from data to those constructed from simulations done using a detailed instrument electromagnetic model and using the unpolarized Global Sky Model (GSM2008). We compare the data and simulations to explore the effects of Stokes-I to Q and U leakage, and further examine whether statistical models of polarization match the observed polarized power spectra.
SSME fault monitoring and diagnosis expert system
NASA Technical Reports Server (NTRS)
Ali, Moonis; Norman, Arnold M.; Gupta, U. K.
1989-01-01
An expert system, called LEADER, has been designed and implemented for automatic learning, detection, identification, verification, and correction of anomalous propulsion system operations in real time. LEADER employs a set of sensors to monitor engine component performance and to detect, identify, and validate abnormalities with respect to varying engine dynamics and behavior. Two diagnostic approaches are adopted in the architecture of LEADER. In the first approach fault diagnosis is performed through learning and identifying engine behavior patterns. LEADER, utilizing this approach, generates few hypotheses about the possible abnormalities. These hypotheses are then validated based on the SSME design and functional knowledge. The second approach directs the processing of engine sensory data and performs reasoning based on the SSME design, functional knowledge, and the deep-level knowledge, i.e., the first principles (physics and mechanics) of SSME subsystems and components. This paper describes LEADER's architecture which integrates a design based reasoning approach with neural network-based fault pattern matching techniques. The fault diagnosis results obtained through the analyses of SSME ground test data are presented and discussed.
Bi-model processing for early detection of breast tumor in CAD system
NASA Astrophysics Data System (ADS)
Mughal, Bushra; Sharif, Muhammad; Muhammad, Nazeer
2017-06-01
Early screening of suspicious masses in mammograms may reduce the mortality rate among women. This rate can be further reduced by developing computer-aided diagnosis systems that make fewer false assumptions. The method presented here targets early tumor detection in digitized mammograms. To improve the performance of such a system, a novel bi-model processing algorithm is introduced. It divides the region of interest into two parts: the first is called the pre-segmented region (breast parenchyma) and the other the post-segmented region (suspicious region). The system follows a preprocessing scheme of contrast enhancement that can be utilized to segment and extract the desired features of a given mammogram. In the next phase, a hybrid feature block is presented to show the effective performance of computer-aided diagnosis. To assess the effectiveness of the proposed method, a database provided by a mammographic image society is tested. Our experimental outcomes on this database exhibit the usefulness and robustness of the proposed method.
An inequality for detecting financial fraud, derived from the Markowitz Optimal Portfolio Theory
NASA Astrophysics Data System (ADS)
Bard, Gregory V.
2016-12-01
The Markowitz Optimal Portfolio Theory, published in 1952, is well known and was often taught because it blends Lagrange multipliers, matrices, statistics, and mathematical finance. However, the theory faded from prominence in American investing as business departments at US universities shifted from techniques based on mathematics, finance, and statistics to focus instead on leadership, public speaking, interpersonal skills, advertising, and so on. The author proposes a new application of Markowitz's theory: the detection of a fairly broad category of financial fraud (called "Ponzi schemes" in American newspapers) by examining a particular inequality, derived from the Markowitz Optimal Portfolio Theory, relating volatility and expected rate of return. For example, one recent Ponzi scheme was that of Bernard Madoff, uncovered in December 2008, which comprised fraud totaling 64,800,000,000 US dollars [23]. The objective is to compare investments with the "efficient frontier" predicted by Markowitz's theory. Violations of the inequality should be impossible in theory; therefore, in practice, violations might indicate fraud.
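The inequality can be made concrete. For asset mean vector m and covariance S, the unconstrained minimum-variance frontier gives the lowest attainable variance at a target return r as sigma^2(r) = (a r^2 - 2 b r + c) / (a c - b^2), with a = 1'S^-1 1, b = 1'S^-1 m, c = m'S^-1 m. A reported track record whose volatility lies below this bound violates the inequality. A minimal sketch; the two-asset universe and the claimed figures are illustrative, not taken from the paper:

```python
import numpy as np

def frontier_sigma(target_return, means, cov):
    """Minimum attainable volatility at a target return, from the
    closed-form (unconstrained) Markowitz minimum-variance frontier."""
    s_inv = np.linalg.inv(cov)
    ones = np.ones(len(means))
    a = ones @ s_inv @ ones
    b = ones @ s_inv @ means
    c = means @ s_inv @ means
    var = (a * target_return**2 - 2 * b * target_return + c) / (a * c - b**2)
    return float(np.sqrt(var))

def looks_fraudulent(reported_return, reported_sigma, means, cov):
    """Flag a track record whose volatility lies below the efficient
    frontier: impossible in theory, suspicious in practice."""
    return reported_sigma < frontier_sigma(reported_return, means, cov)

# Illustrative universe: two uncorrelated assets (5%, 10% mean; 20%, 30% vol).
means = np.array([0.05, 0.10])
cov = np.diag([0.04, 0.09])
```

With this universe the frontier volatility at an 8% return is about 0.197, so a claim of steady 8% returns at 5% volatility would be flagged.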
Munding, J; Ziebarth, W; Belyaev, O; Uhl, W; Tannapfel, A
2013-12-01
The surveillance of patients with Barrett mucosa in the distal oesophagus has led to an increase in patients diagnosed with early, only superficially infiltrating cancer of the oesophagogastric junction and stomach. As in Asian countries, where screening of at-risk patients is recommended because of the high incidence of gastric cancer, endoscopic resection of early cancer of the stomach and distal oesophagus is increasing. Irrespective of the particular endoscopic technique, the resected specimen must meet several requirements to ensure its exact pathohistological evaluation. This is necessary to determine the exact depth of infiltration and the resection margins. An exact pathohistological diagnosis is important for further therapeutic decisions and prognosis. Advanced carcinomas of the oesophagus and stomach require multimodal treatment with radiation and chemotherapy. This treatment affects the tumour and leads to pathohistologically detectable changes, which are assessed by so-called regression grading. Georg Thieme Verlag KG Stuttgart · New York.
Document segmentation for high-quality printing
NASA Astrophysics Data System (ADS)
Ancin, Hakan
1997-04-01
A technique to segment dark text on the light background of mixed-mode color documents is presented. This process does not perceptually change graphics and photo regions. Color documents are scanned and printed from various media which usually do not have a clean background. This is especially the case for printouts generated from thin magazine samples; these printouts usually include text and figures from the back of the page, which is called bleeding. Removal of bleeding artifacts improves the perceptual quality of the printed document and reduces color ink usage. By detecting the light background of the document, these artifacts are removed from background regions. Detection of dark text regions also enables the halftoning algorithms to use true black ink for black text pixels instead of composite black. The processed document contains sharp black text on a white background, resulting in improved perceptual quality and better ink utilization. The described method is memory efficient and requires a small number of scan lines of high-resolution color documents during processing.
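The core idea (estimate the paper brightness, push near-background pixels, including faint bleed-through, to pure white, and keep dark pixels as true-black text) can be caricatured on a grayscale page. A toy sketch; the thresholds and the histogram-mode background estimate are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def remove_bleed(gray, background_margin=40, text_threshold=80):
    """Toy background cleaning for a grayscale page (0 = black, 255 = white).

    Pixels close to the estimated paper brightness (including faint
    bleed-through from the reverse side) become pure white; dark pixels
    are clamped to black so a halftoner can use true black ink.
    """
    # Estimate paper brightness as the most common bright gray level.
    hist = np.bincount(gray.ravel(), minlength=256)
    paper = int(np.argmax(hist[128:]) + 128)
    cleaned = gray.copy()
    cleaned[gray >= paper - background_margin] = 255   # background + bleed
    cleaned[gray <= text_threshold] = 0                # crisp black text
    return cleaned

# Mostly-white page (230) with one faint bleed pixel (200) and one text pixel (30).
page = np.full((10, 10), 230, dtype=np.uint8)
page[2, 2] = 200
page[5, 5] = 30
out = remove_bleed(page)
```

Pixels between the two thresholds are left untouched, which is where graphics and photo regions would survive unmodified.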
Comparative Study of Speckle Filtering Methods in PolSAR Radar Images
NASA Astrophysics Data System (ADS)
Boutarfa, S.; Bouchemakh, L.; Smara, Y.
2015-04-01
Images acquired by polarimetric SAR (PolSAR) radar systems are characterized by the presence of a noise called speckle. This noise has a multiplicative nature and corrupts both the amplitude and phase images, which complicates data interpretation, degrades segmentation performance and reduces the detectability of targets. Hence the need to preprocess the images with adapted filtering methods before analysis. In this paper, we present a comparative study of implemented methods for reducing speckle in PolSAR images. The developed filters are: the refined Lee filter, based on minimum mean square error (MMSE) estimation; an improved Sigma filter with detection of strong scatterers, based on the calculation of the coherency matrix to detect the different scatterers in order to preserve the polarization signature and maintain structures that are necessary for image interpretation; filtering by the stationary wavelet transform (SWT), using multi-scale edge detection and a technique for improving the wavelet coefficients called SSC (sum of squared coefficients); and a Turbo filter, which is a combination of two complementary filters, the refined Lee filter and the SWT, where one filter can boost the results of the other. The originality of our work lies in the application of these methods to several types of images (amplitude, intensity and complex, from a satellite or an airborne radar) and in the optimization of wavelet filtering by adding a parameter to the calculation of the threshold.
This parameter controls the filtering effect and achieves a good compromise between smoothing homogeneous areas and preserving linear structures. The methods are applied to fully polarimetric RADARSAT-2 images (HH, HV, VH, VV) acquired over Algiers, Algeria, in C-band and to three polarimetric E-SAR images (HH, HV, VV) acquired over the Oberpfaffenhofen area near Munich, Germany, in P-band. To evaluate the performance of each filter, we used the following criteria: smoothing of homogeneous areas, preservation of edges, and preservation of polarimetric information. Experimental results are included to illustrate the different implemented methods.
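The MMSE idea underlying the refined Lee filter can be sketched briefly: each pixel is replaced by mean + k*(x - mean), where k compares the local coefficient of variation with the theoretical speckle coefficient Cu = 1/sqrt(L) for L looks. This is the basic (non-refined) Lee filter, without the edge-aligned windows of the refined version:

```python
import numpy as np

def lee_filter(img, win=3, looks=1.0):
    """Basic Lee MMSE speckle filter sketch for an intensity image.

    out = mean + k * (x - mean), where k shrinks toward the local mean in
    homogeneous areas (local variation explained by speckle alone) and
    toward the observed pixel near edges and strong scatterers.
    """
    pad = win // 2
    padded = np.pad(img.astype(float), pad, mode="reflect")
    # Stack every window offset to get per-pixel local mean and variance.
    views = [padded[i:i + img.shape[0], j:j + img.shape[1]]
             for i in range(win) for j in range(win)]
    stack = np.stack(views)
    mean = stack.mean(axis=0)
    var = stack.var(axis=0)
    cu2 = 1.0 / looks                                  # squared speckle coeff.
    ci2 = np.where(mean > 0, var / mean**2, 0.0)       # squared local variation
    k = np.clip(1.0 - cu2 / np.maximum(ci2, 1e-12), 0.0, 1.0)
    return mean + k * (img - mean)

rng = np.random.default_rng(0)
noisy = rng.uniform(0.5, 1.5, (16, 16))   # synthetic homogeneous patch
smoothed = lee_filter(noisy)
```

On a homogeneous patch the local variation stays below the speckle coefficient, so k collapses to zero and the filter returns the local mean, which is exactly the smoothing behaviour evaluated in the paper's first criterion.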
Using Puppets to Teach Schoolchildren to Detect Stroke and Call 911.
Sharkey, Sonya; Denke, Linda; Herbert, Morley A
2016-08-01
To overcome barriers to improved outcomes, we undertook an intervention to teach schoolchildren how to detect a stroke and call emergency medical services (EMS). We obtained permission from parents and guardians to use an 8-min puppet show to instruct fourth, fifth, and sixth graders about stroke detection, symptomatology, and calling EMS. A pretest and three posttests (one immediately following the presentation, one at 3 months, and a third at 6 months) were administered. Responses from 282 students were evaluable. Significant improvements (p < .001) in knowledge were found through all posttests in identifying what parts of the body stroke affected, and through the first two posttests in recognizing symptoms stroke victims experienced. Students demonstrated at pretest a high awareness of EMS and 911 (97.5%) and showed slight, but not significant, improvement over time. © The Author(s) 2016.
Multidisciplinary Responses to the Sexual Victimization of Children: Use of Control Phone Calls.
Canavan, J William; Borowski, Christine; Essex, Stacy; Perkowski, Stefan
2017-10-01
This descriptive study addresses the question of the value of one-party consent phone calls regarding the sexual victimization of children. The authors reviewed 4 years of experience with children between the ages of 3 and 18 years selected for the control phone calls after a forensic interview by the New York State Police forensic interviewer. The forensic interviewer identified appropriate cases for control phone calls considering New York State law, the child's capacity to make the call, the presence of another person to make the call and a supportive residence. The control phone call process has been extremely effective forensically. Offenders choose to avoid trial by taking a plea bargain thereby dramatically speeding up the criminal judicial and family court processes. An additional outcome of the control phone call is the alleged offender's own words saved the child from the trauma of testifying in court. The control phone call reduced the need for children to repeat their stories to various interviewers. A successful control phone call gives the child a sense of vindication. This technique is the only technique that preserves the actual communication pattern between the alleged victim and the alleged offender. This can be of great value to the mental health professionals working with both the child and the alleged offender. Cautions must be considered regarding potential serious adverse effects on the child. The multidisciplinary team members must work together in the control phone call. The descriptive nature of this study did not allow the authors adequate demographic data, a subject that should be addressed in future prospective study.
Density-based parallel skin lesion border detection with webCL.
Lemon, James; Kockara, Sinan; Halic, Tansel; Mete, Mutlu
2015-01-01
Dermoscopy is a highly effective and noninvasive imaging technique used in the diagnosis of melanoma and other pigmented skin lesions. Many aspects of the lesion under consideration are defined in relation to the lesion border. This makes border detection one of the most important steps in dermoscopic image analysis. In current practice, dermatologists often delineate borders through a hand-drawn representation based upon visual inspection. Due to the subjective nature of this technique, intra- and inter-observer variations are common. Because of this, the automated assessment of lesion borders in dermoscopic images has become an important area of study. A fast density-based skin lesion border detection method has been implemented in parallel with a new parallel technology called WebCL. WebCL utilizes client-side computing capabilities to use available hardware resources such as multiple cores and GPUs. The developed WebCL-parallel density-based skin lesion border detection method runs efficiently from web browsers. Previous research indicates that one of the highest accuracy rates can be achieved using density-based clustering techniques for skin lesion border detection. While these algorithms have unfavorable time complexities, this effect can be mitigated when they are implemented in parallel. In this study, a density-based clustering technique for skin lesion border detection is parallelized and redesigned to run very efficiently on heterogeneous platforms (e.g. tablets, smartphones, multi-core CPUs, GPUs, and fully-integrated Accelerated Processing Units) by transforming the technique into a series of independent concurrent operations. Heterogeneous computing is adopted to support accessibility, portability and multi-device use in clinical settings. For this, we used WebCL, an emerging technology that enables an HTML5 Web browser to execute code in parallel on heterogeneous platforms. We describe WebCL and our parallel algorithm design.
In addition, we tested the parallel code on 100 dermoscopy images and measured the execution speedups with respect to the serial version. Results indicate that the parallel (WebCL) version and the serial version of the density-based lesion border detection method generate the same accuracy rates for the 100 dermoscopy images: a mean border error of 6.94%, a mean recall of 76.66%, and a mean precision of 99.29%. Moreover, the WebCL version's speedup factor for lesion border detection on the 100 dermoscopy images averages around ~491.2. Given the large number of high-resolution dermoscopy images handled in a typical clinical setting, and the critical importance of detecting and diagnosing melanoma before metastasis, the importance of fast processing of dermoscopy images is obvious. In this paper, we introduce WebCL and its use for biomedical image processing applications. WebCL is a JavaScript binding of OpenCL that takes advantage of GPU computing from a web browser. Therefore, the WebCL-parallel version of density-based skin lesion border detection introduced in this study can supplement expert dermatologists and aid them in the early diagnosis of skin lesions. While WebCL is currently an emerging technology, full adoption of WebCL into the HTML5 standard would allow this implementation to run on a very large set of hardware and software systems. WebCL takes full advantage of parallel computational resources, including multi-cores and GPUs on a local machine, and allows compiled code to run directly from the Web browser.
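The serial baseline being parallelized is a density-based clustering in the DBSCAN family. A minimal sequential sketch on 2-D points; the paper's actual pixel features, parameters, and WebCL kernel decomposition are not reproduced here:

```python
from collections import deque

def dbscan(points, eps, min_pts):
    """Plain serial DBSCAN; labels[i] is a cluster id, or -1 for noise."""
    def neighbors(i):
        xi, yi = points[i]
        return [j for j, (xj, yj) in enumerate(points)
                if (xi - xj) ** 2 + (yi - yj) ** 2 <= eps ** 2]

    labels = [None] * len(points)
    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1                 # provisionally noise
            continue
        labels[i] = cluster
        queue = deque(seeds)
        while queue:
            j = queue.popleft()
            if labels[j] == -1:
                labels[j] = cluster        # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nn = neighbors(j)
            if len(nn) >= min_pts:         # j is a core point: keep expanding
                queue.extend(nn)
        cluster += 1
    return labels

# Two tight blobs plus one isolated noise point.
pts = [(0, 0), (0, 1), (1, 0), (1, 1),
       (10, 10), (10, 11), (11, 10), (11, 11), (5, 5)]
labels = dbscan(pts, eps=1.5, min_pts=3)
```

The neighborhood queries dominate the cost, which is why the per-point independence of this step maps well onto the WebCL kernels the paper describes.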
Using accelerometers to determine the calling behavior of tagged baleen whales.
Goldbogen, J A; Stimpert, A K; DeRuiter, S L; Calambokidis, J; Friedlaender, A S; Schorr, G S; Moretti, D J; Tyack, P L; Southall, B L
2014-07-15
Low-frequency acoustic signals generated by baleen whales can propagate over vast distances, making the assignment of calls to specific individuals problematic. Here, we report the novel use of acoustic recording tags equipped with high-resolution accelerometers to detect vibrations from the surfaces of two tagged fin whales that directly match the timing of recorded acoustic signals. A tag deployed on a buoy in the vicinity of calling fin whales, and a recording from a tag that had just fallen off a whale, detected calls acoustically but did not record the corresponding accelerometer signals measured on calling individuals. Across the hundreds of calls measured on the two tagged fin whales, the accelerometer response was generally anisotropic across all three axes, appeared to depend on tag placement, and increased with the level of received sound. These data demonstrate that high-sample-rate accelerometry can provide important insights into the acoustic behavior of baleen whales that communicate at low frequencies. This method helps identify vocalizing whales, which in turn enables the quantification of call rates, a fundamental component of models used to estimate baleen whale abundance and distribution from passive acoustic monitoring. © 2014. Published by The Company of Biologists Ltd.
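The attribution logic described above can be sketched as a simple cross-check: a call is assigned to the tagged whale only when an acoustic detection is accompanied, within a short time window, by an accelerometer transient. The sample rate, threshold, window, and peak test below are all invented stand-ins, not the study's actual signal processing.

```python
def accel_transients(accel, fs, threshold):
    """Return times (s) where accelerometer magnitude crosses threshold upward."""
    return [i / fs for i in range(1, len(accel))
            if accel[i] >= threshold > accel[i - 1]]

def attribute_calls(call_times, accel, fs, threshold, window=0.1):
    """For each acoustic call time, report whether a matching accelerometer
    transient occurs within +/- window seconds (i.e., the tagged whale called)."""
    transients = accel_transients(accel, fs, threshold)
    return [any(abs(t - c) <= window for t in transients) for c in call_times]

fs = 100.0                          # accelerometer sample rate (Hz), illustrative
accel = [0.0] * 1000
accel[250] = 5.0                    # vibration transient at t = 2.5 s
calls = [2.5, 7.0]                  # acoustic detections at 2.5 s and 7.0 s
matches = attribute_calls(calls, accel, fs, threshold=1.0)
print(matches)   # [True, False]: only the first call is attributed to the tagged whale
```

The second call is heard acoustically but lacks a vibration signature, mirroring the buoy-tag control case in the abstract.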
Allele-specific copy-number discovery from whole-genome and whole-exome sequencing
Wang, WeiBo; Wang, Wei; Sun, Wei; Crowley, James J.; Szatkiewicz, Jin P.
2015-01-01
Copy-number variants (CNVs) are a major form of genetic variation and a risk factor for various human diseases, so it is crucial to accurately detect and characterize them. It is conceivable that allele-specific reads from high-throughput sequencing data could be leveraged to both enhance CNV detection and produce allele-specific copy number (ASCN) calls. Although statistical methods have been developed to detect CNVs using whole-genome sequence (WGS) and/or whole-exome sequence (WES) data, information from allele-specific read counts has not yet been adequately exploited. In this paper, we develop an integrated method, called AS-GENSENG, which incorporates allele-specific read counts in CNV detection and estimates ASCN using either WGS or WES data. To evaluate the performance of AS-GENSENG, we conducted extensive simulations, generated empirical data using existing WGS and WES data sets and validated predicted CNVs using an independent methodology. We conclude that AS-GENSENG not only predicts accurate ASCN calls but also improves the accuracy of total copy number calls, owing to its unique ability to exploit information from both total and allele-specific read counts while accounting for various experimental biases in sequence data. Our novel, user-friendly and computationally efficient method and a complete analytic protocol are freely available at https://sourceforge.net/projects/asgenseng/. PMID:25883151
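To illustrate why allele-specific read counts help, here is a toy calculation (not the AS-GENSENG model, which is a far richer statistical method): given a total copy number and reference/alternate read counts at a heterozygous site, choose the (ref, alt) copy split whose expected alternate-allele fraction best matches the observed one.

```python
def ascn_call(total_cn, ref_reads, alt_reads):
    """Return (ref_copies, alt_copies) whose expected alt-allele fraction
    best matches the observed read fraction. Toy maximum-agreement rule."""
    obs = alt_reads / (ref_reads + alt_reads)   # observed alt read fraction
    return min(
        ((total_cn - a, a) for a in range(total_cn + 1)),
        key=lambda split: abs(split[1] / total_cn - obs),
    )

# A duplication (total CN = 3) with ~2/3 of reads carrying the alt allele
# suggests one reference copy and two alternate copies.
dup_call = ascn_call(3, 35, 68)
print(dup_call)   # (1, 2)
```

Total read depth alone cannot distinguish the (1, 2) split from (2, 1); the allele-specific counts resolve it, which is the intuition behind combining both signals.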
NASA Technical Reports Server (NTRS)
Dhople, Arvind M.
1994-01-01
According to ominous projections issued by both the U.S. Public Health Service and the World Health Organization, the epidemic of HIV infection will continue to rise more rapidly worldwide than predicted earlier. AIDS patients are susceptible to diseases called opportunistic infections, of which tuberculosis and Mycobacterium avium complex (MAC) infection are the most common. This has created an urgent need to uncover new drugs for the treatment of these infections. In the seventies, NASA scientists at Goddard Space Flight Center, Greenbelt, MD, adopted a biochemical indicator, adenosine triphosphate (ATP), to detect the presence of life in extraterrestrial space. We proposed to develop an ATP assay technique to determine the activity of antibacterial compounds against MAC and M. tuberculosis.
Generation of dark hollow beam via coherent combination based on adaptive optics.
Zheng, Yi; Wang, Xiaohua; Shen, Feng; Li, Xinyang
2010-12-20
A novel method for generating a dark hollow beam (DHB) is proposed and studied both theoretically and experimentally. A coherent combination technique for laser arrays is implemented based on adaptive optics (AO). A beam-arraying structure and an active segmented mirror are designed and described. Piston errors are extracted by a zero-order interference detection system with the help of a custom-made photodetector array. An algorithm called the extremum approach is adopted to calculate the feedback control signals. A dynamic piston error is introduced via LiNbO3 to test the capability of the AO servo. In closed loop, a stable and clear DHB is obtained. The experimental results confirm the feasibility of the concept.
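The abstract names the extremum approach but does not spell it out; the following is a hedged sketch of a generic extremum-seeking (hill-climbing) control loop of the kind used to drive piston corrections in a coherent-combining AO servo: perturb each actuator and keep the perturbation if the quality metric improves. The quadratic metric, step size, and iteration count are invented stand-ins for the real interference-based piston measurement.

```python
def extremum_seek(metric, pistons, step, iterations):
    """Greedy coordinate search maximizing metric(pistons).
    Each pass perturbs each actuator by +/- step and keeps improvements."""
    pistons = list(pistons)
    for _ in range(iterations):
        for i in range(len(pistons)):
            base = metric(pistons)
            for delta in (step, -step):
                trial = list(pistons)
                trial[i] += delta
                if metric(trial) > base:
                    pistons = trial
                    break
    return pistons

# Toy metric: peaks when all piston errors are zero (perfect coherent combination).
metric = lambda p: -sum(x ** 2 for x in p)
result = extremum_seek(metric, [0.4, -0.6, 0.2], step=0.2, iterations=10)
print(result)   # all piston errors driven toward zero
```

In the real servo the "metric" would come from the zero-order interference detection system rather than a known analytic function.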
Technology and the Environment
NASA Technical Reports Server (NTRS)
1977-01-01
Forest Service officials consulted NASA and found a solution in the application of laser technology originally developed for satellites. NASA/Goddard built a system called a "laser range pole," a portable, battery-operated, backpack-carried device that allows direct sightings no matter how rough the intervening terrain or how thick the forest. The equipment consists of a laser transmitter and a receiver. From a given property marker, the transmitter pulses a laser beam vertically, several thousand feet in some cases. At a second surveying point about a mile away, the receiver detects the laser pulse high above the trees and locks in on the exact direction. Thus provided with a bearing between the two points, a ground crew can extend the border line back to the sending point by conventional surveying techniques.
Toward the detection of gravitational waves under non-Gaussian noises I. Locally optimal statistic.
Yokoyama, Jun'ichi
2014-01-01
After reviewing the standard hypothesis test and the matched-filter technique for identifying gravitational waves under Gaussian noises, we introduce two methods to deal with non-Gaussian stationary noises. We formulate the likelihood-ratio function under weakly non-Gaussian noises through the Edgeworth expansion, and under strongly non-Gaussian noises in terms of a new method we call Gaussian mapping, in which the observed marginal distribution and the two-body correlation function are fully taken into account. We then apply these two approaches to Student's t-distribution, which has larger tails than a Gaussian. It is shown that while both methods work well when the non-Gaussianity is small, only the latter works well in the highly non-Gaussian case.
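The Gaussian-noise baseline reviewed above, the matched filter, reduces to correlating the data stream with a known signal template and looking for a peak. The sketch below assumes white noise and a toy sinusoidal template; the paper's non-Gaussian treatments (Edgeworth expansion, Gaussian mapping) modify the likelihood, not this basic correlation structure.

```python
import math, random

def matched_filter(data, template):
    """Correlation of data with template at each lag (white-noise case,
    where the matched-filter statistic is a plain sliding inner product)."""
    n, m = len(data), len(template)
    return [sum(data[i + j] * template[j] for j in range(m))
            for i in range(n - m + 1)]

random.seed(0)
template = [math.sin(2 * math.pi * 0.1 * t) for t in range(50)]
data = [random.gauss(0, 0.5) for _ in range(400)]
for j, s in enumerate(template):     # inject the signal at sample 200
    data[200 + j] += s

stat = matched_filter(data, template)
peak = max(range(len(stat)), key=lambda i: stat[i])
print(peak)   # close to the injection point, sample 200
```

The peak of the statistic recovers the arrival time of the buried signal even though it is invisible by eye in the noisy series; heavy-tailed (e.g., Student's t) noise degrades exactly this statistic, motivating the paper's two corrections.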
Estimating the number of double-strand breaks formed during meiosis from partial observation.
Toyoizumi, Hiroshi; Tsubouchi, Hideo
2012-12-01
Analyzing the basic mechanism of DNA double-strand break (DSB) formation during meiosis is important for understanding sexual reproduction and genetic diversity. The location and amount of meiotic DSBs can be examined by using a common molecular biological technique called Southern blotting, but only a subset of the total DSBs can be observed; only DSB fragments still carrying the region recognized by a Southern blot probe are detected. With the assumption that DSB formation follows a nonhomogeneous Poisson process, we propose two estimators of the total number of DSBs on a chromosome: (1) an estimator based on the Nelson-Aalen estimator, and (2) an estimator based on a record-value process. Further, we compare their asymptotic accuracy.
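For readers unfamiliar with the first ingredient: the Nelson-Aalen estimator accumulates the cumulative hazard as d_i / n_i over distinct event times, where d_i is the number of events at time t_i and n_i the number of subjects still at risk. The sketch below uses invented event times (not meiotic DSB data) and assumes no censoring for simplicity.

```python
def nelson_aalen(event_times, n_at_start):
    """Return (time, cumulative hazard) pairs for the distinct event times,
    assuming no censoring: at each time, add (events d) / (at risk n)."""
    at_risk = n_at_start
    cum_hazard = 0.0
    out = []
    for t in sorted(set(event_times)):
        d = event_times.count(t)
        cum_hazard += d / at_risk
        out.append((t, cum_hazard))
        at_risk -= d
    return out

na = nelson_aalen([1, 2, 2, 4], n_at_start=10)
print(na)   # [(1, 0.1), (2, 0.322...), (4, 0.465...)]
```

The paper's contribution is to turn such a partial-observation hazard estimate into an estimate of the total DSB count; this block only shows the underlying estimator.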
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Der Luut, R.; Khan, P.M.; Van Leeuwen, C.
Familial adenomatous polyposis (FAP) is usually associated with protein-truncating mutations in the adenomatous polyposis coli (APC) gene. APC mutations are known to play a major role in colorectal carcinogenesis. For the identification of protein-truncating mutations of the APC gene, the authors developed a rapid, sensitive, and direct screening procedure. The technique is based on the in vitro transcription and translation of genomic PCR products and is called the protein truncation test. Samples of DNA from individual FAP patients, members of a FAP family, colorectal tumors, and colorectal tumor-derived cell lines were used to show the effectiveness of this method. 9 refs., 2 figs.
Correlated evolution between hearing sensitivity and social calls in bats
Bohn, Kirsten M; Moss, Cynthia F; Wilkinson, Gerald S
2006-01-01
Echolocating bats are auditory specialists, with exquisite hearing that spans several octaves. In the ultrasonic range, bat audiograms typically show highest sensitivity in the spectral region of their species-specific echolocation calls. Well-developed hearing in the audible range has been commonly attributed to a need to detect sounds produced by prey. However, bat pups often emit isolation calls with low-frequency components that facilitate mother–young reunions. In this study, we examine whether low-frequency hearing in bats exhibits correlated evolution with (i) body size; (ii) high-frequency hearing sensitivity or (iii) pup isolation call frequency. Using published audiograms, we found that low-frequency hearing sensitivity is not dependent on body size but is related to high-frequency hearing. After controlling for high-frequency hearing, we found that low-frequency hearing exhibits correlated evolution with isolation call frequency. We infer that detection and discrimination of isolation calls have favoured enhanced low-frequency hearing because accurate parental investment is critical: bats have low reproductive rates, non-volant altricial young and must often identify their pups within large crèches. PMID:17148288
Bi-spectrum based-EMD applied to the non-stationary vibration signals for bearing faults diagnosis.
Saidi, Lotfi; Ali, Jaouher Ben; Fnaiech, Farhat
2014-09-01
Empirical mode decomposition (EMD) has been widely applied to analyze vibration signal behavior for bearing failure detection. Vibration signals are almost always non-stationary, since bearings are inherently dynamic (e.g., speed and load conditions change over time). Using EMD, the complicated non-stationary vibration signal is decomposed into a number of stationary intrinsic mode functions (IMFs) based on the local characteristic time scale of the signal. The bi-spectrum, a third-order statistic, helps to identify phase-coupling effects. The bi-spectrum is theoretically zero for Gaussian noise and flat for non-Gaussian white noise; consequently, bi-spectrum analysis is insensitive to random noise, which is useful for detecting faults in induction machines. Utilizing the advantages of EMD and the bi-spectrum, this article proposes a joint fault-detection method called bi-spectrum-based EMD (BSEMD). First, original vibration signals collected from accelerometers are decomposed by EMD and a set of IMFs is produced. Then, the IMF signals are analyzed via the bi-spectrum to detect outer-race bearing defects. The procedure is illustrated with experimental bearing vibration data. The experimental results show that the BSEMD technique can effectively diagnose bearing failures. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
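The phase-coupling property of the bi-spectrum can be demonstrated with a toy estimate: B(f1, f2) = E[X(f1) X(f2) X*(f1+f2)], averaged over segments. The pure-Python DFT, segment length, and test frequencies below are illustrative choices, not the paper's settings (and the EMD step is omitted entirely).

```python
import cmath, math

def dft(x):
    """Naive discrete Fourier transform (O(n^2), fine for short segments)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def bispectrum(x, seg_len, f1, f2):
    """Average X[f1] * X[f2] * conj(X[f1+f2]) over non-overlapping segments."""
    segs = [x[i:i + seg_len] for i in range(0, len(x) - seg_len + 1, seg_len)]
    acc = 0j
    for seg in segs:
        X = dft(seg)
        acc += X[f1] * X[f2] * X[f1 + f2].conjugate()
    return acc / len(segs)

# Quadratic phase coupling: components at bins 2 and 3 plus their sum, bin 5.
n, seg = 256, 32
x = [math.cos(2 * math.pi * 2 * t / seg) + math.cos(2 * math.pi * 3 * t / seg)
     + math.cos(2 * math.pi * 5 * t / seg) for t in range(n)]
b_coupled = abs(bispectrum(x, seg, 2, 3))    # large: phases are coupled
b_uncoupled = abs(bispectrum(x, seg, 2, 4))  # near zero: no component at bin 4
print(b_coupled > 10 * b_uncoupled)   # True
```

A defective bearing generates such coupled harmonics in the IMFs, which is why the bi-spectrum peak pattern serves as a fault signature while remaining blind to Gaussian noise.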
Real-time detection of dental calculus by blue-LED-induced fluorescence spectroscopy.
Qin, Y L; Luan, X L; Bi, L J; Lü, Z; Sheng, Y Q; Somesfalean, G; Zhou, C N; Zhang, Z G
2007-05-25
Successful periodontal therapy requires sensitive techniques to discriminate dental calculus from healthy teeth. The aim of the present study was to develop a fluorescence-based procedure to enable real-time detection and quantification of dental calculus. Thirty human teeth--15 teeth with sub- and supragingival calculus and 15 healthy teeth--covered with a layer of physiological saline solution or blood were illuminated by a focused blue LED light source of 405 nm. Autofluorescence spectra recorded along a randomly selected line stretching over the crown-neck-root area of each tooth were utilized to evaluate a so-called calculus parameter R, which was selected to define a relationship between the integrated intensities specific for healthy teeth and for calculus in the 477-497 nm (S(A)) and 628-685 nm (S(B)) wavelength regions, respectively. Statistical analysis was performed, and a cut-off threshold of R=0.2 was found to distinguish dental calculus from healthy teeth with 100% sensitivity and specificity under various experimental conditions. The results of the spectral evaluation were confirmed by clinical and histological findings. Automated real-time detection and diagnostics for clinical use were implemented by a corresponding software program written in Visual Basic. The method enables cost-effective and reliable calculus detection, and can be further developed for imaging applications.
Encoding techniques for complex information structures in connectionist systems
NASA Technical Reports Server (NTRS)
Barnden, John; Srinivas, Kankanahalli
1990-01-01
Two general information-encoding techniques, called relative position encoding and pattern similarity association, are presented. They are claimed to be a convenient basis for the connectionist implementation of the complex, short-term information processing needed in common-sense reasoning, semantic/pragmatic interpretation of natural language utterances, and other types of high-level cognitive processing. The relationships of the techniques to other connectionist information-structuring methods, and also to methods used in computers, are discussed in detail, and the rich inter-relationships among these other connectionist and computer methods are clarified. The particular, simple forms that the relative position encoding and pattern similarity association techniques take in the author's own connectionist system, called Conposit, are discussed in order to clarify some issues and to provide evidence that the techniques are indeed useful in practice.
An effective method on pornographic images realtime recognition
NASA Astrophysics Data System (ADS)
Wang, Baosong; Lv, Xueqiang; Wang, Tao; Wang, Chengrui
2013-03-01
In this paper, skin detection, texture filtering, and face detection are used to extract features from an image library, which are then used to train a decision-tree classifier to distinguish unknown images. In experiments based on more than twenty thousand images, the precision rate reaches 76.21% when testing on 13,025 pornographic images, with an elapsed time of less than 0.2 s; this suggests the method generalizes well. Among the steps mentioned above, we propose a new skin detection model, called the irregular polygon region skin detection model, based on the YCbCr color space. This model lowers the false detection rate of skin detection. A new method, called sequence region labeling, computes features on binary connected areas; it is faster and needs less memory than recursive methods.
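To make the YCbCr skin-detection step concrete: the paper's model carves an irregular polygon region out of the Cb-Cr plane, but as a simplification the sketch below uses a rectangular Cb-Cr box with commonly cited skin ranges (77 <= Cb <= 127, 133 <= Cr <= 173). The polygon model refines exactly this kind of box to cut false detections; the ranges here are not the paper's.

```python
def rgb_to_ycbcr(r, g, b):
    """ITU-R BT.601 full-range RGB -> YCbCr conversion."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b):
    """Classify a pixel as skin if its chroma falls in the Cb-Cr box."""
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return 77 <= cb <= 127 and 133 <= cr <= 173

print(is_skin(224, 172, 150))  # a typical skin tone -> True
print(is_skin(30, 80, 200))    # saturated blue -> False
```

Working in Cb-Cr rather than RGB decouples the decision from brightness (Y), which is why skin detectors conventionally threshold only the two chroma axes.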
Automatic archaeological feature extraction from satellite VHR images
NASA Astrophysics Data System (ADS)
Jahjah, Munzer; Ulivieri, Carlo
2010-05-01
Archaeological applications need a methodological approach on a variable scale, able to satisfy both intra-site (excavation) and inter-site (survey, environmental research) needs. The increased availability of high-resolution and micro-scale data has substantially favoured archaeological applications and the consequent use of GIS platforms for the reconstruction of archaeological landscapes from remotely sensed data. Feature extraction from multispectral remote sensing images is an important task before any further processing. High-resolution remote sensing data, especially panchromatic imagery, is an important input for the analysis of various types of image characteristics; it plays an important role in visual systems for the recognition and interpretation of given data. The proposed methods rely on an object-oriented approach based on a theory for the analysis of spatial structures called mathematical morphology. The term "morphology" stems from the fact that the theory aims at analysing object shapes and forms; it is mathematical in the sense that the analysis is based on set theory, integral geometry, and lattice algebra. Mathematical morphology has proven to be a powerful image analysis technique: two-dimensional grey-tone images are seen as three-dimensional sets by associating each image pixel with an elevation proportional to its intensity level. An object of known shape and size, called the structuring element, is then used to investigate the morphology of the input set. This is achieved by positioning the origin of the structuring element at every possible position of the space and testing, for each position, whether the structuring element either is included in or has a non-empty intersection with the studied set. The shape and size of the structuring element must be selected according to the morphology of the searched image structures. Two other feature extraction techniques, eCognition and the ENVI SW module, were used in order to compare the results.
These techniques were applied to different archaeological sites in Turkmenistan (Nisa) and in Iraq (Babylon); a further change detection analysis was applied to the Babylon site using two HR images acquired before and after the second Gulf War. The techniques produced different results, since the operative scale of the sensed data determines the final result of the elaboration and the quality of the output information, and each technique was sensitive to specific shapes in the input image. We mapped linear and nonlinear objects, updated the archaeological cartography, and carried out automatic change detection analysis for the Babylon site. The discussion of these techniques aims to provide the archaeological team with new instruments for the orientation and planning of a remote sensing application.
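The two basic morphological operations described above can be sketched directly from their set definitions: binary erosion keeps a pixel only if the structuring element fits entirely inside the set ("is included"), and dilation turns a pixel on if the element intersects the set ("non-empty intersection"). The 3x3 square element and tiny binary image below are illustrative; real feature extraction composes these into openings, closings, and top-hats at scales matched to the target structures.

```python
def erode(img, se):
    """img: 2D 0/1 grid; se: (dy, dx) offsets of the structuring element.
    A pixel survives only if every offset lands on a 1 inside the image."""
    h, w = len(img), len(img[0])
    def on(y, x, dy, dx):
        yy, xx = y + dy, x + dx
        return 0 <= yy < h and 0 <= xx < w and img[yy][xx] == 1
    return [[1 if all(on(y, x, dy, dx) for dy, dx in se) else 0
             for x in range(w)] for y in range(h)]

def dilate(img, se):
    """A pixel turns on if any offset of the (symmetric) element hits a 1."""
    h, w = len(img), len(img[0])
    def on(y, x, dy, dx):
        yy, xx = y + dy, x + dx
        return 0 <= yy < h and 0 <= xx < w and img[yy][xx] == 1
    return [[1 if any(on(y, x, dy, dx) for dy, dx in se) else 0
             for x in range(w)] for y in range(h)]

se = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]  # 3x3 square
img = [[0, 0, 0, 0, 0],
       [0, 1, 1, 1, 0],
       [0, 1, 1, 1, 0],
       [0, 1, 1, 1, 0],
       [0, 0, 0, 0, 0]]
eroded = erode(img, se)       # only the centre pixel of the 3x3 block survives
opened = dilate(eroded, se)   # opening = erosion followed by dilation
print(sum(map(sum, eroded)), sum(map(sum, opened)))   # 1 9
```

Choosing the element's shape and size to match the searched structures, as the abstract notes, is what turns these primitives into a feature extractor for walls, ditches, and other linear archaeological traces.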
Splitting a colon geometry with multiplanar clipping
NASA Astrophysics Data System (ADS)
Ahn, David K.; Vining, David J.; Ge, Yaorong; Stelts, David R.
1998-06-01
Virtual colonoscopy, a recent three-dimensional (3D) visualization technique, has provided radiologists with a unique diagnostic tool. Using this technique, a radiologist can examine the internal morphology of a patient's colon by navigating through a surface-rendered model that is constructed from helical computed tomography image data. Virtual colonoscopy can be used to detect early forms of colon cancer in a way that is less invasive and less expensive than conventional endoscopy. However, the common approach of 'flying' through the colon lumen to visually search for polyps is tedious and time-consuming, especially when a radiologist loses his or her orientation within the colon. Furthermore, a radiologist's field of view is often limited by the 3D camera position inside the colon lumen. We have developed a new technique, called multi-planar geometry clipping, that addresses these problems. Our algorithm divides a complex colon anatomy into several smaller segments and then splits each of these segments in half for display on a static medium. Multi-planar geometry clipping eliminates virtual colonoscopy's dependence upon expensive, real-time graphics workstations by enabling radiologists to globally inspect the entire internal surface of the colon from a single viewpoint.
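The geometric core of splitting a segment in half is classifying the surface model against a clipping plane by signed distance. The toy below partitions vertices only; a full implementation (and presumably the paper's) must also split triangles that straddle the plane, which is omitted here, and the "colon" vertices are invented coordinates.

```python
def signed_distance(p, plane_point, normal):
    """Signed distance of point p from the plane through plane_point with
    the given normal (unnormalized normals just scale the sign uniformly)."""
    return sum(n * (a - b) for n, a, b in zip(normal, p, plane_point))

def split_vertices(vertices, plane_point, normal):
    """Partition vertices into the two half-spaces defined by the plane,
    so each half can be rendered open-faced from a static viewpoint."""
    front, back = [], []
    for v in vertices:
        (front if signed_distance(v, plane_point, normal) >= 0 else back).append(v)
    return front, back

# Hypothetical surface vertices split by the plane z = 1.
verts = [(0, 0, 0), (1, 0, 2), (0, 1, 3), (2, 2, 0.5)]
front, back = split_vertices(verts, plane_point=(0, 0, 1), normal=(0, 0, 1))
print(len(front), len(back))   # 2 2
```

Splitting each colon segment with its own best-fit plane is what lets the whole internal surface be laid out flat on a static medium instead of requiring interactive fly-through rendering.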
Recognition of human activity characteristics based on state transitions modeling technique
NASA Astrophysics Data System (ADS)
Elangovan, Vinayak; Shirkhodaie, Amir
2012-06-01
Human Activity Discovery and Recognition (HADR) is a complex, diverse, and challenging task, yet an active area of ongoing research in the Department of Defense. By detecting, tracking, and characterizing cohesive human interactional activity patterns, potential threats can be identified, which can significantly improve situation awareness, particularly in Persistent Surveillance Systems (PSS). Understanding the nature of such dynamic activities inevitably involves interpreting a collection of spatiotemporally correlated activities with respect to a known context. In this paper, we present a state transition model for recognizing the characteristics of human activities with a link to a prior context-based ontology. Modeling the state transitions between successive evidential events determines an activity's temperament. The proposed state transition model comprises six categories of state transitions: object handling, visibility, entity-entity relation, human posture, human kinematics, and distance to target. The model generates semantic annotations describing the human interactional activities via a technique called Casual Event State Inference (CESI). The proposed approach uses a low-cost Kinect depth camera for indoor monitoring and a conventional optical camera for outdoor monitoring. Experimental results are presented to demonstrate the effectiveness and efficiency of the proposed technique.
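An illustrative sketch of the state-transition idea: successive evidential events update per-category states, and the resulting state determines the semantic annotation. The six categories are the ones listed in the abstract, but the event values, the final-state rule, and the annotation string are all invented for illustration and are not the paper's CESI inference.

```python
CATEGORIES = ["object_handling", "visibility", "entity_relation",
              "posture", "kinematics", "distance_to_target"]

def update_state(state, event):
    """Apply an observed evidential event (category, value) to the state."""
    category, value = event
    assert category in CATEGORIES, "unknown transition category"
    new_state = dict(state)
    new_state[category] = value
    return new_state

def annotate(transitions):
    """Toy annotation rule: flag an approach-and-pickup pattern."""
    state = {c: None for c in CATEGORIES}
    for ev in transitions:
        state = update_state(state, ev)
    if (state["distance_to_target"] == "near"
            and state["object_handling"] == "holding"):
        return "person approached target and picked up object"
    return "no threat pattern recognized"

events = [("kinematics", "walking"),
          ("distance_to_target", "near"),
          ("posture", "bending"),
          ("object_handling", "holding")]
print(annotate(events))   # person approached target and picked up object
```

The value of the state-machine framing is that the same event stream, reordered or truncated, yields a different terminal state and hence a different semantic annotation.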
Introduction to the CEA family: structure, function and secretion.
Von Kleist, S
1992-01-01
Owing to the phenomenal progress in the field of tumor immunology over the last twenty years, we now have at our disposal highly specific and sensitive techniques and reagents, such as monoclonal antibodies (MAbs). In this context, the discovery in human carcinomas of tumor-associated antigens, such as CEA, was of primary importance, especially since the latter was found to have clinical relevance as a tumor marker. Based on animal models, a new in vivo technology for the detection of tumors and metastases was developed in recent years that uses anti-CEA MAbs, or fragments of them, coupled to radio-isotopes. This technique, called radio-immunodetection (RAID), also paved the way for immunotherapeutic procedures, in which CEA again served as the target antigen. This new technique holds great promise, provided the epitope specificity of the MAbs is well controlled: it has been shown that CEA belongs to a large gene family of at least 22 members, which can be subdivided into two subgroups (i.e., the CEA and PSG subgroups) and which in turn belongs to the immunoglobulin supergene family. Great structural similarities render the distinction of the various cross-reactive molecules by immunological means rather difficult.