Vogel, Anja; Fester, Thomas; Eisenhauer, Nico; Scherer-Lorenzen, Michael; Schmid, Bernhard; Weisser, Wolfgang W.; Weigelt, Alexandra
2013-01-01
1. Given the predictions of increased drought probabilities under various climate change scenarios, there have been numerous experimental field studies simulating drought using transparent roofs in different ecosystems and regions. Such roofs may, however, have unknown side effects, called artifacts, on the measured variables, potentially confounding the experimental results. A roofed control, which is lacking in most experiments, allows the quantification of potential artifacts. 2. We conducted a drought experiment in experimental grasslands to study artifacts of transparent roofs and the resulting effects of artifacts on ecosystems relative to drought on three response variables (aboveground biomass, litter decomposition and plant metabolite profiles). We established three drought treatments, using (1) transparent roofs to exclude rainfall, (2) an unroofed control treatment receiving natural rainfall and (3) a roofed control, nested in the drought treatment but with rain water reapplied according to ambient conditions. 3. Roofs had a slight impact on air (+0.14°C during night) and soil temperatures (−0.45°C on warm days, +0.25°C on cold nights), while photosynthetically active radiation was decreased significantly (−16%). Aboveground plant community biomass was reduced in the drought treatment (−41%), but there was no significant difference between the roofed and unroofed controls, i.e., there were no measurable roof artifact effects. 4. Compared to the unroofed control, litter decomposition was decreased significantly both in the drought treatment (−26%) and in the roofed control treatment (−18%), suggesting artifact effects of the transparent roofs. Moreover, aboveground metabolite profiles in the model plant species Medicago x varia were different from the unroofed control in both the drought and roofed control treatments, and roof artifact effects were of comparable magnitude to drought effects. 5. Our results stress the need for roofed control treatments when using transparent roofs for studying drought effects, because roofs can cause significant side effects. PMID:23936480
Recchia, Gabriel L; Louwerse, Max M
2016-11-01
Computational techniques comparing co-occurrences of city names in texts allow the relative longitudes and latitudes of cities to be estimated algorithmically. However, these techniques have not been applied to estimate the provenance of artifacts with unknown origins. Here, we estimate the geographic origin of artifacts from the Indus Valley Civilization, applying methods commonly used in cognitive science to the Indus script. We show that these methods can accurately predict the relative locations of archeological sites on the basis of artifacts of known provenance, and we further apply these techniques to determine the most probable excavation sites of four sealings of unknown provenance. These findings suggest that inscription statistics reflect historical interactions among locations in the Indus Valley region, and they illustrate how computational methods can help localize inscribed archeological artifacts of unknown origin. The success of this method offers opportunities for the cognitive sciences in general and for computational anthropology specifically. Copyright © 2015 Cognitive Science Society, Inc.
ERIC Educational Resources Information Center
Recchia, Gabriel L.; Louwerse, Max M.
2016-01-01
Computational techniques comparing co-occurrences of city names in texts allow the relative longitudes and latitudes of cities to be estimated algorithmically. However, these techniques have not been applied to estimate the provenance of artifacts with unknown origins. Here, we estimate the geographic origin of artifacts from the Indus Valley…
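As a purely illustrative sketch of the general idea above (not the authors' pipeline): co-occurrence counts of site names across inscriptions can be turned into dissimilarities and embedded in two dimensions with multidimensional scaling, so that relative positions mimic geography. The toy inscriptions, site names, the inverse-count dissimilarity, and the use of scikit-learn's MDS are all assumptions made for this example.

```python
# Illustrative sketch: estimate relative site positions from co-occurrence
# statistics (hypothetical toy data; not the authors' actual method or corpus).
import numpy as np
from sklearn.manifold import MDS

# Hypothetical "inscriptions", each listing the site names it mentions.
inscriptions = [
    ["Harappa", "Mohenjo-daro"],
    ["Harappa", "Lothal"],
    ["Mohenjo-daro", "Dholavira", "Lothal"],
    ["Harappa", "Mohenjo-daro", "Dholavira"],
]
sites = sorted({s for doc in inscriptions for s in doc})
index = {s: i for i, s in enumerate(sites)}

# Count how often each pair of sites is mentioned together.
cooc = np.zeros((len(sites), len(sites)))
for doc in inscriptions:
    for a in doc:
        for b in doc:
            if a != b:
                cooc[index[a], index[b]] += 1

# Convert co-occurrence counts to dissimilarities (more co-mentions -> closer).
dissim = 1.0 / (1.0 + cooc)
np.fill_diagonal(dissim, 0.0)

# Embed sites in 2D; relative positions (up to rotation/scale) mimic geography.
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissim)
for site, (x, y) in zip(sites, coords):
    print(f"{site}: ({x:.2f}, {y:.2f})")
```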
Desired Accuracy Estimation of Noise Function from ECG Signal by Fuzzy Approach
Vahabi, Zahra; Kermani, Saeed
2012-01-01
Unknown noise and artifacts present in medical signals are estimated and then removed with a non-linear fuzzy filter. An adaptive neuro-fuzzy inference system, which has a non-linear structure, is presented for predicting the noise function from preceding samples. This paper describes a neuro-fuzzy method for estimating the unknown noise of the electrocardiogram signal: an adaptive neural network is combined with a fuzzy system to construct a fuzzy predictor. The system parameters, such as the number and type of membership functions for each input and output, the number of training epochs, and the learning algorithm, are determined from the training data. Finally, simulated experimental results are presented for validation. PMID:23717810
NASA Astrophysics Data System (ADS)
Allman, Derek; Reiter, Austin; Bell, Muyinatu
2018-02-01
We previously proposed a method of removing reflection artifacts in photoacoustic images that uses deep learning. Our approach generally relies on using simulated photoacoustic channel data to train a convolutional neural network (CNN) that is capable of distinguishing sources from artifacts based on unique differences in their spatial impulse responses (manifested as depth-based differences in wavefront shapes). In this paper, we directly compare a CNN trained with our previous continuous transducer model to a CNN trained with an updated discrete acoustic receiver model that more closely matches an experimental ultrasound transducer. These two CNNs were trained with simulated data and tested on experimental data. The CNN trained using the continuous receiver model correctly classified 100% of sources and 70.3% of artifacts in the experimental data. In contrast, the CNN trained using the discrete receiver model correctly classified 100% of sources and 89.7% of artifacts in the experimental images. The 19.4% increase in artifact classification accuracy indicates that an acoustic receiver model that closely mimics the experimental transducer plays an important role in improving the classification of artifacts in experimental photoacoustic data. These results are promising for developing a display method that uses the CNN output to remove artifacts, rather than only displaying network-identified sources as previously proposed.
Form Follows Function: Learning about Function Helps Children Learn about Shape
ERIC Educational Resources Information Center
Ware, Elizabeth A.; Booth, Amy E.
2010-01-01
Object functions help young children to organize new artifact categories. However, the scope of their influence is unknown. We explore whether functions highlight property dimensions that are relevant to artifact categories in general. Specifically, using a longitudinal training procedure, we assessed whether experience with functions highlights…
Artifact removal from EEG data with empirical mode decomposition
NASA Astrophysics Data System (ADS)
Grubov, Vadim V.; Runnova, Anastasiya E.; Efremova, Tatyana Yu.; Hramov, Alexander E.
2017-03-01
In this paper we propose a novel method for dealing with the physiological artifacts caused by intensive activity of facial and neck muscles and other movements in experimental human EEG recordings. The method is based on analysis of EEG signals with empirical mode decomposition (the Hilbert-Huang transform). We introduce the mathematical algorithm of the method with the following steps: empirical mode decomposition of the EEG signal, identification of the empirical modes containing artifacts, removal of these modes, and reconstruction of the EEG signal from the remaining modes. We test the method by filtering movement artifacts out of experimental human EEG signals and show its high efficiency.
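A rough sketch of the decompose–select–reconstruct scheme described above is given below; it is not the authors' implementation. It assumes the third-party PyEMD package and a hypothetical artifact reference signal (e.g., a simultaneously recorded EMG channel) for deciding which modes to drop, and uses a simple correlation threshold for that decision.

```python
# Minimal sketch of EMD-based artifact removal for one EEG channel.
# Assumes the third-party PyEMD package (pip install EMD-signal) and a
# hypothetical artifact reference signal (e.g., EMG) recorded alongside the EEG.
import numpy as np
from PyEMD import EMD

def remove_artifact_modes(eeg, reference, corr_threshold=0.5):
    """Decompose eeg into IMFs, drop IMFs correlated with the reference,
    and reconstruct the signal from the remaining modes."""
    imfs = EMD().emd(eeg)                      # shape: (n_imfs, n_samples)
    keep = []
    for imf in imfs:
        r = np.corrcoef(imf, reference)[0, 1]  # correlation with artifact channel
        if abs(r) < corr_threshold:            # keep modes weakly related to artifact
            keep.append(imf)
    return np.sum(keep, axis=0) if keep else np.zeros_like(eeg)

# Toy usage with synthetic data.
t = np.linspace(0, 2, 1000)
clean = np.sin(2 * np.pi * 10 * t)             # stand-in for cortical activity
artifact = 0.8 * np.sin(2 * np.pi * 60 * t)    # stand-in for muscle artifact
eeg = clean + artifact
cleaned = remove_artifact_modes(eeg, artifact)
```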
Filtration of human EEG recordings from physiological artifacts with empirical mode method
NASA Astrophysics Data System (ADS)
Grubov, Vadim V.; Runnova, Anastasiya E.; Khramova, Marina V.
2017-03-01
In this paper we propose a new method for dealing with noise and physiological artifacts in experimental human EEG recordings. The method is based on analysis of EEG signals with empirical mode decomposition (the Hilbert-Huang transform). We consider noise and physiological artifacts in the EEG as specific oscillatory patterns that cause problems during EEG analysis and can be detected with additional signals recorded simultaneously with the EEG (ECG, EMG, EOG, etc.). We introduce the algorithm of the method with the following steps: empirical mode decomposition of the EEG signal, identification of the empirical modes containing artifacts, removal of these modes, and reconstruction of the EEG signal from the remaining modes. We test the method by filtering eye-movement artifacts out of experimental human EEG signals and show its high efficiency.
NASA Astrophysics Data System (ADS)
Grubov, V. V.; Runnova, A. E.; Hramov, A. E.
2018-05-01
A new method for adaptive filtration of experimental EEG signals in humans and for removal of different physiological artifacts has been proposed. The algorithm of the method includes empirical mode decomposition of the EEG, determination of the number of empirical modes that are considered, analysis of the empirical modes and search for modes that contain artifacts, removal of these modes, and reconstruction of the EEG signal. The method was tested on experimental human EEG signals and demonstrated high efficiency in the removal of different types of physiological EEG artifacts.
Reduction of variable-truncation artifacts from beam occlusion during in situ x-ray tomography
NASA Astrophysics Data System (ADS)
Borg, Leise; Jørgensen, Jakob S.; Frikel, Jürgen; Sporring, Jon
2017-12-01
Many in situ x-ray tomography studies require experimental rigs which may partially occlude the beam and cause parts of the projection data to be missing. In a study of fluid flow in porous chalk using a percolation cell with four metal bars, drastic streak artifacts arise in the filtered backprojection (FBP) reconstruction at certain orientations. Projections with non-trivial variable truncation caused by the metal bars are the source of these variable-truncation artifacts. To understand the artifacts, a mathematical model of variable-truncation data as a function of metal bar radius and distance to sample is derived and verified numerically and with experimental data. The model accurately describes the arising variable-truncation artifacts across simulated variations of the experimental setup. Three variable-truncation artifact-reduction methods are proposed, all aimed at addressing sinogram discontinuities that are shown to be the source of the streaks. The ‘reduction to limited angle’ (RLA) method simply keeps only non-truncated projections; the ‘detector-directed smoothing’ (DDS) method smooths the discontinuities; while the ‘reflexive boundary condition’ (RBC) method enforces a zero derivative at the discontinuities. Experimental results using both simulated and real data show that the proposed methods effectively reduce variable-truncation artifacts. The RBC method is found to provide the best artifact reduction and preservation of image features using both visual and quantitative assessment. The analysis and artifact-reduction methods are designed in the context of FBP reconstruction, motivated by the computational efficiency needed for large, real synchrotron data. While a specific variable-truncation case is considered, the proposed methods can be applied to general data cut-offs arising in different in situ x-ray tomography experiments.
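One way to read the zero-derivative (RBC-style) boundary treatment is sketched below on a toy sinogram: each variably truncated projection is extended outward with its last measured value before FBP. The array shapes, the occlusion mask, and the padding rule are assumptions for illustration and do not reproduce the paper's exact method.

```python
# Sketch: fill variably truncated detector bins in a sinogram by extending the
# last valid value outward (zero derivative at the cut), one projection at a time.
# Shapes and the occlusion mask are hypothetical, not the paper's exact RBC code.
import numpy as np

def pad_truncated_projections(sinogram, valid_mask):
    """sinogram: (n_angles, n_bins); valid_mask: True where data were measured."""
    padded = sinogram.copy()
    n_angles, n_bins = sinogram.shape
    for i in range(n_angles):
        valid = np.flatnonzero(valid_mask[i])
        if valid.size == 0:
            continue                          # fully occluded view: leave as-is
        lo, hi = valid[0], valid[-1]
        padded[i, :lo] = sinogram[i, lo]      # extend left boundary value
        padded[i, hi + 1:] = sinogram[i, hi]  # extend right boundary value
    return padded

# Toy usage: a constant sinogram with bins occluded on one side for some angles.
sino = np.ones((180, 64))
mask = np.ones_like(sino, dtype=bool)
mask[40:60, :10] = False                      # hypothetical metal-bar occlusion
sino[~mask] = 0.0
fixed = pad_truncated_projections(sino, mask)
```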
Lithic Scatters that Blow: Wind as an Agent of Secondary Deposition of Lithic Artifacts
USDA-ARS?s Scientific Manuscript database
Artifact presence or absence is frequently the only criterion used to define the horizontal extent of archaeological sites. Artifact transport by natural agents such as water and gravity is known to move artifacts from their primary context, though experimental simulated wind conditions demonstrate t...
Mauldin, F William; Owen, Kevin; Tiouririne, Mohamed; Hossack, John A
2012-06-01
The portability, low cost, and non-ionizing radiation associated with medical ultrasound suggest that it has potential as a superior alternative to X-ray for bone imaging. However, when conventional ultrasound imaging systems are used for bone imaging, clinical acceptance is frequently limited by artifacts derived from reflections occurring away from the main axis of the acoustic beam. In this paper, the physical source of off-axis artifacts and the effect of transducer geometry on these artifacts are investigated in simulation and experimental studies. In agreement with diffraction theory, the sampled linear-array geometry possessed increased off-axis energy compared with single-element piston geometry, and therefore, exhibited greater levels of artifact signal. Simulation and experimental results demonstrated that the linear-array geometry exhibited increased artifact signal when the center frequency increased, when energy off-axis to the main acoustic beam (i.e., grating lobes) was perpendicularly incident upon off-axis surfaces, and when off-axis surfaces were specular rather than diffusive. The simulation model used to simulate specular reflections was validated experimentally and a correlation coefficient of 0.97 between experimental and simulated peak reflection contrast was observed. In ex vivo experiments, the piston geometry yielded 4 and 6.2 dB average contrast improvement compared with the linear array when imaging the spinous process and interlaminar space of an animal spine, respectively. This work indicates that off-axis reflections are a major source of ultrasound image artifacts, particularly in environments comprising specular reflecting (i.e., bone or bone-like) objects. Transducer geometries with reduced sensitivity to off-axis surface reflections, such as a piston transducer geometry, yield significant reductions in image artifact.
[Quantitative Evaluation of Metal Artifacts on CT Images on the Basis of Statistics of Extremes].
Kitaguchi, Shigetoshi; Imai, Kuniharu; Ueda, Suguru; Hashimoto, Naomi; Hattori, Shouta; Saika, Takahiro; Ono, Yoshifumi
2016-05-01
It is well-known that metal artifacts have a harmful effect on the image quality of computed tomography (CT) images. However, their physical properties remain poorly understood. In this study, we investigated the relationship between metal artifacts and tube currents using statistics of extremes. A commercially available CT dose index phantom 160 mm in diameter was prepared and a brass rod 13 mm in diameter was placed at the centerline of the phantom. This phantom was used as a target object to evaluate metal artifacts and was scanned using an area detector CT scanner with various tube currents under a constant tube voltage of 120 kV. Sixty parallel line segments with a length of 100 pixels were placed to cross metal artifacts on CT images and the largest difference between two adjacent CT values in each of 60 CT value profiles of these line segments was employed as a feature variable for measuring metal artifacts; these feature variables were analyzed on the basis of extreme value theory. The CT value variation induced by metal artifacts was statistically characterized by the Gumbel distribution, one of the extreme value distributions; namely, metal artifacts have the same statistical characteristic as streak artifacts. Therefore, the Gumbel evaluation method makes it possible to analyze not only streak artifacts but also metal artifacts. Furthermore, the location parameter of the Gumbel distribution was shown to be inversely proportional to the square root of the tube current. This result suggested that metal artifacts have the same dose dependence as image noise.
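The extreme-value analysis described above maps onto a short SciPy sketch: take the largest adjacent CT-value difference along each line profile and fit a Gumbel distribution to those maxima. The synthetic profiles below are placeholders; only the use of scipy.stats.gumbel_r to estimate the location and scale parameters follows the described analysis.

```python
# Sketch of the statistics-of-extremes evaluation: per-profile maxima of adjacent
# CT-value differences fitted with a Gumbel distribution (synthetic profiles here).
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(0)
profiles = rng.normal(40.0, 20.0, size=(60, 100))   # 60 line profiles, 100 pixels each

# Feature variable: largest absolute difference between adjacent CT values per profile.
maxima = np.abs(np.diff(profiles, axis=1)).max(axis=1)

# Fit the Gumbel (extreme value type I) distribution to the 60 maxima.
loc, scale = gumbel_r.fit(maxima)
print(f"Gumbel location = {loc:.1f} HU, scale = {scale:.1f} HU")
```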
Yuki, I; Kambayashi, Y; Ikemura, A; Abe, Y; Kan, I; Mohamed, A; Dahmani, C; Suzuki, T; Ishibashi, T; Takao, H; Urashima, M; Murayama, Y
2016-02-01
Combination of high-resolution C-arm CT and novel metal artifact reduction software may contribute to the assessment of aneurysms treated with stent-assisted coil embolization. This study aimed to evaluate the efficacy of a novel Metal Artifact Reduction prototype software combined with the currently available high spatial-resolution C-arm CT prototype implementation by using an experimental aneurysm model treated with stent-assisted coil embolization. Eight experimental aneurysms were created in 6 swine. Coil embolization of each aneurysm was performed by using a stent-assisted technique. High-resolution C-arm CT with intra-arterial contrast injection was performed immediately after the treatment. The obtained images were processed with Metal Artifact Reduction. Five neurointerventional specialists reviewed the image quality before and after Metal Artifact Reduction. Observational and quantitative analyses (via image analysis software) were performed. Every aneurysm was successfully created and treated with stent-assisted coil embolization. Before Metal Artifact Reduction, coil loops protruding through the stent lumen were not visualized due to the prominent metal artifacts produced by the coils. These became visible after Metal Artifact Reduction processing. Contrast filling in the residual aneurysm was also visualized after Metal Artifact Reduction in every aneurysm. Both the observational (P < .0001) and quantitative (P < .001) analyses showed significant reduction of the metal artifacts after application of the Metal Artifact Reduction prototype software. The combination of high-resolution C-arm CT and Metal Artifact Reduction enables differentiation of the coil mass, stent, and contrast material on the same image by significantly reducing the metal artifacts produced by the platinum coils. This novel image technique may improve the assessment of aneurysms treated with stent-assisted coil embolization. © 2016 by American Journal of Neuroradiology.
Stone, David B.; Tamburro, Gabriella; Fiedler, Patrique; Haueisen, Jens; Comani, Silvia
2018-01-01
Data contamination due to physiological artifacts such as those generated by eyeblinks, eye movements, and muscle activity continues to be a central concern in the acquisition and analysis of electroencephalographic (EEG) data. This issue is further compounded in EEG sports science applications where the presence of artifacts is notoriously difficult to control because behaviors that generate these interferences are often the behaviors under investigation. Therefore, there is a need to develop effective and efficient methods to identify physiological artifacts in EEG recordings during sports applications so that they can be isolated from cerebral activity related to the activities of interest. We have developed an EEG artifact detection model, the Fingerprint Method, which identifies different spatial, temporal, spectral, and statistical features indicative of physiological artifacts and uses these features to automatically classify artifactual independent components in EEG based on a machine learning approach. Here, we optimized our method using artifact-rich training data and a procedure to determine which features were best suited to identify eyeblinks, eye movements, and muscle artifacts. We then applied our model to an experimental dataset collected during endurance cycling. Results reveal that unique sets of features are suitable for the detection of distinct types of artifacts and that the Optimized Fingerprint Method was able to correctly identify over 90% of the artifactual components with physiological origin present in the experimental data. These results represent a significant advancement in the search for effective means to address artifact contamination in EEG sports science applications. PMID:29618975
Removal of ring artifacts in microtomography by characterization of scintillator variations.
Vågberg, William; Larsson, Jakob C; Hertz, Hans M
2017-09-18
Ring artifacts reduce image quality in tomography and arise from faulty detector calibration. In microtomography, we have identified that ring artifacts can arise due to high-spatial-frequency variations in the scintillator thickness. Such variations are normally removed by a flat-field correction. However, as the spectrum changes, e.g. due to beam hardening, the detector response varies non-uniformly, introducing ring artifacts that persist after flat-field correction. In this paper, we present a method to correct for ring artifacts from variations in scintillator thickness by using a simple method to characterize the local scintillator response. The method addresses the actual physical cause of the ring artifacts, in contrast to many other ring artifact removal methods which rely only on image post-processing. By applying the technique to a tomographic scan of an experimental phantom, we show that ring artifacts are strongly reduced compared to only making a flat-field correction.
NASA Astrophysics Data System (ADS)
Shirai, Tomohiro; Friberg, Ari T.
2018-04-01
Dispersion-canceled optical coherence tomography (OCT) based on spectral intensity interferometry was devised as a classical counterpart of quantum OCT to enhance the basic performance of conventional OCT. In this paper, we demonstrate experimentally that an alternative method of realizing this kind of OCT by means of two optical fiber couplers and a single spectrometer is a more practical and reliable option than the existing methods proposed previously. Furthermore, we develop a recipe for reducing multiple artifacts simultaneously on the basis of simple averaging and verify experimentally that it works successfully in the sense that all the artifacts are mitigated effectively and only the true signals carrying structural information about the sample survive.
The effect of heat acclimation on sweat microminerals: Artifact of surface contamination
USDA-ARS?s Scientific Manuscript database
Heat acclimation (HA) reportedly conveys conservation in sweat micromineral concentrations when sampled from arm sweat, but the time course is unknown. The observation that comprehensive cleaning of the skin surface negates sweat micromineral reductions during prolonged sweating raises the question of w...
Roth, Bradley J.
2002-09-01
Insidious experimental artifacts and invalid theoretical assumptions complicate the comparison of numerical predictions and observed data. Such difficulties are particularly troublesome when studying electrical stimulation of the heart. During unipolar stimulation of cardiac tissue, the artifacts include nonlinearity of membrane dyes, optical signals blocked by the stimulating electrode, averaging of optical signals with depth, lateral averaging of optical signals, limitations of the current source, and the use of excitation-contraction uncouplers. The assumptions involve electroporation, membrane models, electrode size, the perfusing bath, incorrect model parameters, the applicability of a continuum model, and tissue damage. Comparisons of theory and experiment during far-field stimulation are limited by many of these same factors, plus artifacts from plunge and epicardial recording electrodes and assumptions about the fiber angle at an insulating boundary. These pitfalls must be overcome in order to understand quantitatively how the heart responds to an electrical stimulus. (c) 2002 American Institute of Physics.
Ernstberger, T; Buchhorn, G; Heidrich, G
2010-03-01
Intervertebral spacers are made of different materials, which can affect the postfusion magnetic resonance imaging (MRI) scans. Susceptibility artifacts, especially for metallic implants, can decrease the image quality. This study aimed to determine whether magnesium as a lightweight and biocompatible metal is suitable as a biomaterial for spinal implants based on its MRI artifacting behavior. To compare artifacting behaviors, we implanted into one porcine cadaveric spine different test spacers made of magnesium, titanium, and CFRP. All test spacers were scanned using two T1-TSE MRI sequences. The artifact dimensions were traced on all scans and statistically analyzed. The total artifact volume and median artifact area of the titanium spacers were statistically significantly larger than magnesium spacers (P < 0.001), while magnesium and CFRP spacers produced almost identical artifacting behaviors (P > 0.05). Our results suggest that spinal implants made with magnesium alloys will behave more like CFRP devices in MRI scans.
Contribution of computed tomography to the investigation of La Tene culture iron artefacts
NASA Astrophysics Data System (ADS)
Vopálenský, M.; Sankot, P.; Fořt, M.; Kumpová, I.; Vavřík, D.
2017-07-01
An X-ray tomographic study was carried out in addition to standard X-ray radiography to support new conservation work on the La Tene culture iron artifacts from the collections of the National Museum in Prague. These artifacts are heavily damaged by corrosion, which prevents effective visual examination. The work shows that even details that are shallow compared to the artifact thickness, and therefore not detectable in standard radiographic images, can be made visible in 3D models obtained tomographically. The tomographic data acquisition was performed with the unique TORATOM device, equipped with a large-area X-ray detector with a Gadox scintillator. The tomographic reconstruction revealed insufficiencies in the earlier conservation processes of the La Tene culture swords, as well as previously unknown details, such as the exact sword shapes and their decoration. These new findings allowed better classification of the artifacts. Tomography also helped in visualizing details of iron clips that are completely hidden under the rust, thus making the technology of the clip formation clearly observable. This work demonstrates that tomography can provide valuable new information compared to the standard X-ray radiography commonly used in the investigation of iron archeological artifacts.
A level set method for cupping artifact correction in cone-beam CT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Shipeng; Li, Haibo; Ge, Qi
2015-08-15
Purpose: To reduce cupping artifacts and improve the contrast-to-noise ratio in cone-beam computed tomography (CBCT). Methods: A level set method is proposed to reduce cupping artifacts in the reconstructed image of CBCT. The authors derive a local intensity clustering property of the CBCT image and define a local clustering criterion function of the image intensities in a neighborhood of each point. This criterion function defines an energy in terms of the level set functions, which represent a segmentation result and the cupping artifacts. The cupping artifacts are estimated as a result of minimizing this energy. Results: The cupping artifacts in CBCT are reduced by an average of 90%. The results indicate that the level set-based algorithm is practical and effective for reducing the cupping artifacts and preserving the quality of the reconstructed image. Conclusions: The proposed method focuses on the reconstructed image without requiring any additional physical equipment, is easily implemented, and provides cupping correction through a single-scan acquisition. The experimental results demonstrate that the proposed method successfully reduces the cupping artifacts.
Chu, Mei-Lan; Chang, Hing-Chiu; Chung, Hsiao-Wen; Truong, Trong-Kha; Bashir, Mustafa R.; Chen, Nan-kuei
2014-01-01
Purpose: A projection onto convex sets reconstruction of multiplexed sensitivity encoded MRI (POCSMUSE) is developed to reduce motion-related artifacts, including respiration artifacts in abdominal imaging and aliasing artifacts in interleaved diffusion weighted imaging (DWI). Theory: Images with reduced artifacts are reconstructed with an iterative POCS procedure that uses the coil sensitivity profile as a constraint. This method can be applied to data obtained with different pulse sequences and k-space trajectories. In addition, various constraints can be incorporated to stabilize the reconstruction of ill-conditioned matrices. Methods: The POCSMUSE technique was applied to abdominal fast spin-echo imaging data, and its effectiveness in respiratory-triggered scans was evaluated. The POCSMUSE method was also applied to reduce aliasing artifacts due to shot-to-shot phase variations in interleaved DWI data corresponding to different k-space trajectories and matrix condition numbers. Results: Experimental results show that the POCSMUSE technique can effectively reduce motion-related artifacts in data obtained with different pulse sequences, k-space trajectories and contrasts. Conclusion: POCSMUSE is a general post-processing algorithm for reduction of motion-related artifacts. It is compatible with different pulse sequences, and can also be used to further reduce residual artifacts in data produced by existing motion artifact reduction methods. PMID:25394325
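A stripped-down, single-coil POCS skeleton is sketched below to illustrate the alternating-projection idea only; it is not POCSMUSE, and the image-domain support constraint merely stands in for the coil-sensitivity constraint used in the paper. The sampling mask, object, and iteration count are assumptions.

```python
# Generic POCS sketch (single coil, support constraint as a stand-in for the
# coil-sensitivity constraint described in the paper). Data are synthetic.
import numpy as np

def pocs(kspace, sample_mask, support, n_iter=50):
    """Alternate projections: (1) keep acquired k-space samples, (2) enforce
    an image-domain support constraint."""
    img = np.zeros(kspace.shape, dtype=complex)
    for _ in range(n_iter):
        k = np.fft.fft2(img)
        k[sample_mask] = kspace[sample_mask]      # projection 1: data consistency
        img = np.fft.ifft2(k)
        img = img * support                       # projection 2: constraint set
    return img

# Toy usage: undersampled k-space of a disc-shaped object with known support.
n = 64
y, x = np.mgrid[:n, :n]
obj = ((x - n / 2) ** 2 + (y - n / 2) ** 2 < (n / 4) ** 2).astype(float)
full_k = np.fft.fft2(obj)
mask = np.random.default_rng(1).random((n, n)) < 0.4   # random 40% sampling
recon = pocs(full_k * mask, mask, support=(obj > 0.5))
```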
Demand artifact: objectively detecting biased participants in advertising research.
Miller, Felicia; Schertzer, Susan
2014-12-01
Detecting and reducing the effect of biased participants continues to be an important task for researchers. However, the lack of objective measures to assess demand artifact has made it difficult to effectively address this issue. This paper reports two experiments that apply a theory-based post-experimental inquiry that can systematically identify biased participants in consumer research. The results demonstrate how easily and effectively researchers can incorporate this tool into experimental studies of all types and reduce the likelihood of systematic error.
Isolating gait-related movement artifacts in electroencephalography during human walking
Kline, Julia E.; Huang, Helen J.; Snyder, Kristine L.; Ferris, Daniel P.
2016-01-01
Objective: High-density electroencephalography (EEG) can provide insight into human brain function during real-world activities such as walking. Some recent studies have used EEG to characterize brain activity during walking, but the relative contributions of movement artifact and electrocortical activity have been difficult to quantify. We aimed to characterize movement artifact recorded by EEG electrodes at a range of walking speeds and to test the efficacy of artifact removal methods. We also quantified the similarity between movement artifact recorded by EEG electrodes and a head-mounted accelerometer. Approach: We used a novel experimental method to isolate and record movement artifact with EEG electrodes during walking. We blocked electrophysiological signals using a nonconductive layer (silicone swim cap) and simulated an electrically conductive scalp on top of the swim cap using a wig coated with conductive gel. We recorded motion artifact EEG data from nine young human subjects walking on a treadmill at speeds from 0.4–1.6 m/s. We then tested artifact removal methods including moving average and wavelet-based techniques. Main Results: Movement artifact recorded with EEG electrodes varied considerably across speed, subject, and electrode location. The movement artifact measured with EEG electrodes did not correlate well with head acceleration. All of the tested artifact removal methods attenuated low-frequency noise but did not completely remove movement artifact. The spectral power fluctuations in the movement artifact data resembled data from some previously published studies of EEG during walking. Significance: Our results suggest that EEG data recorded during walking likely contains substantial movement artifact that: cannot be explained by head accelerations; varies across speed, subject, and channel; and cannot be removed using traditional signal processing methods. Future studies should focus on more sophisticated methods for removal of EEG movement artifact to advance the field. PMID:26083595
Isolating gait-related movement artifacts in electroencephalography during human walking.
Kline, Julia E; Huang, Helen J; Snyder, Kristine L; Ferris, Daniel P
2015-08-01
High-density electroencephalography (EEG) can provide insight into human brain function during real-world activities such as walking. Some recent studies have used EEG to characterize brain activity during walking, but the relative contributions of movement artifact and electrocortical activity have been difficult to quantify. We aimed to characterize movement artifact recorded by EEG electrodes at a range of walking speeds and to test the efficacy of artifact removal methods. We also quantified the similarity between movement artifact recorded by EEG electrodes and a head-mounted accelerometer. We used a novel experimental method to isolate and record movement artifact with EEG electrodes during walking. We blocked electrophysiological signals using a nonconductive layer (silicone swim cap) and simulated an electrically conductive scalp on top of the swim cap using a wig coated with conductive gel. We recorded motion artifact EEG data from nine young human subjects walking on a treadmill at speeds from 0.4 to 1.6 m s⁻¹. We then tested artifact removal methods including moving average and wavelet-based techniques. Movement artifact recorded with EEG electrodes varied considerably across speed, subject, and electrode location. The movement artifact measured with EEG electrodes did not correlate well with head acceleration. All of the tested artifact removal methods attenuated low-frequency noise but did not completely remove movement artifact. The spectral power fluctuations in the movement artifact data resembled data from some previously published studies of EEG during walking. Our results suggest that EEG data recorded during walking likely contains substantial movement artifact that: cannot be explained by head accelerations; varies across speed, subject, and channel; and cannot be removed using traditional signal processing methods. Future studies should focus on more sophisticated methods for removal of EEG movement artifact to advance the field.
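One of the simple removal approaches mentioned in these studies, moving-average-based attenuation of slow movement-locked drift, can be sketched as follows; the window length, sampling rate, and synthetic data are assumptions, and the exact variant used by the authors may differ.

```python
# Sketch of a moving-average artifact attenuation step for multichannel EEG:
# estimate the slow, movement-locked drift with a moving average and subtract it.
# Window length, sampling rate, and the synthetic data are illustrative assumptions.
import numpy as np

def moving_average_detrend(eeg, fs, window_s=1.0):
    """eeg: (n_channels, n_samples); subtract a centered moving average per channel."""
    win = max(1, int(window_s * fs))
    kernel = np.ones(win) / win
    baseline = np.apply_along_axis(
        lambda ch: np.convolve(ch, kernel, mode="same"), 1, eeg)
    return eeg - baseline

fs = 500.0
t = np.arange(0, 10, 1 / fs)
gait_artifact = 30.0 * np.sin(2 * np.pi * 1.0 * t)       # ~1 Hz stepping artifact
cortical = 5.0 * np.sin(2 * np.pi * 10.0 * t)            # alpha-band stand-in
eeg = np.vstack([cortical + gait_artifact, gait_artifact])
cleaned = moving_average_detrend(eeg, fs)
```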
Text Signals Influence Team Artifacts
ERIC Educational Resources Information Center
Clariana, Roy B.; Rysavy, Monica D.; Taricani, Ellen
2015-01-01
This exploratory quasi-experimental investigation describes the influence of text signals on team visual map artifacts. In two course sections, four-member teams were given one of two print-based text passage versions on the course-related topic "Social influence in groups" downloaded from Wikipedia; this text had two paragraphs, each…
Effects of Filtering on Experimental Blast Overpressure Measurements.
Alphonse, Vanessa D; Kemper, Andrew R; Duma, Stefan M
2015-01-01
When access to live-fire test facilities is limited, experimental studies of blast-related injuries necessitate the use of a shock tube or Advanced Blast Simulator (ABS) to mimic free-field blast overpressure. However, modeling blast overpressure in a laboratory setting potentially introduces experimental artifacts in measured responses. Due to the high sampling rates required to capture a blast overpressure event, proximity to alternating current (AC) powered electronics and poorly strain-relieved or unshielded wires can result in artifacts in the recorded overpressure trace. Data in this study were collected for tests conducted on an empty ABS (Empty Tube) using high frequency pressure sensors specifically designed for blast loading rates (n=5). Additionally, intraocular overpressure (IOP) data were collected for porcine eyes potted inside synthetic orbits located inside the ABS using an unshielded miniature pressure sensor (n=3). All tests were conducted at a 30 psi static overpressure level. A 4th-order phaseless low-pass Butterworth software filter was applied to the data. Various cutoff frequencies were examined to determine if the raw shock wave parameter values could be preserved while eliminating noise and artifacts. A Fast Fourier Transform (FFT) was applied to each test to examine the frequency spectra of the raw and filtered signals. Shock wave parameters (time of arrival, peak overpressure, positive duration, and positive impulse) were quantified using a custom MATLAB® script. Lower cutoff frequencies attenuated the raw signal, effectively decreasing the peak overpressure and increasing the positive duration. Rise time was not preserved in the filtered data. A CFC 6000 filter preserved the remaining shock wave parameters within ±2.5% of the average raw values for the Empty Tube test data. A CFC 7000 filter removed experimental high-frequency artifacts and preserved the remaining shock wave parameters within ±2.5% of the average raw values for the IOP test data. Though the region of interest of the signals examined in the current study did not contain extremely high frequency content, it is possible that live-fire testing may produce shock waves with higher frequency content. While post-processing filtering can remove experimental artifacts, special care should be taken to minimize or eliminate the possibility of recording these artifacts in the first place.
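The described post-processing filter corresponds to a short SciPy sketch: a low-pass Butterworth design applied forward and backward with scipy.signal.filtfilt to obtain a zero-phase (phaseless) response. The sampling rate, the cutoff chosen for a given CFC class, and the synthetic trace are assumptions; the exact CFC-to-cutoff mapping (e.g., per SAE J211) and the order convention are implementation details not specified here.

```python
# Sketch of a phaseless low-pass Butterworth filter for an overpressure trace.
# A 2nd-order design run forward and backward with filtfilt gives a combined
# 4th-order, zero-phase response; fs, cutoff, and the synthetic trace are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

def phaseless_lowpass(trace, fs, cutoff_hz):
    b, a = butter(2, cutoff_hz, btype="low", fs=fs)
    return filtfilt(b, a, trace)

fs = 1_000_000.0                        # 1 MHz sampling of the pressure sensor
t = np.arange(0, 0.01, 1 / fs)
# Idealized Friedlander-like pulse plus high-frequency noise as a stand-in trace.
pulse = 30.0 * np.exp(-t / 0.002) * (1 - t / 0.002) * (t < 0.006)
noisy = pulse + 0.5 * np.random.default_rng(0).standard_normal(t.size)
filtered = phaseless_lowpass(noisy, fs, cutoff_hz=10_000.0)  # roughly the CFC 6000 range
```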
Gaussian diffusion sinogram inpainting for X-ray CT metal artifact reduction.
Peng, Chengtao; Qiu, Bensheng; Li, Ming; Guan, Yihui; Zhang, Cheng; Wu, Zhongyi; Zheng, Jian
2017-01-05
Metal objects implanted in the bodies of patients usually generate severe streaking artifacts in reconstructed images of X-ray computed tomography, which degrade the image quality and affect the diagnosis of disease. Therefore, it is essential to reduce these artifacts to meet the clinical demands. In this work, we propose a Gaussian diffusion sinogram inpainting metal artifact reduction algorithm based on prior images to reduce these artifacts for fan-beam computed tomography reconstruction. In this algorithm, prior information that originated from a tissue-classified prior image is used for the inpainting of metal-corrupted projections, and it is incorporated into a Gaussian diffusion function. The prior knowledge is particularly designed to locate the diffusion position and improve the sparsity of the subtraction sinogram, which is obtained by subtracting the prior sinogram of the metal regions from the original sinogram. The sinogram inpainting algorithm is implemented through an approach of diffusing prior energy and is then solved by gradient descent. The performance of the proposed metal artifact reduction algorithm is compared with two conventional metal artifact reduction algorithms, namely the interpolation metal artifact reduction algorithm and normalized metal artifact reduction algorithm. The experimental datasets used included both simulated and clinical datasets. Subjective evaluation shows that the proposed metal artifact reduction algorithm causes fewer secondary artifacts than the two conventional metal artifact reduction algorithms, which lead to severe secondary artifacts resulting from inappropriate interpolation and normalization. Additionally, the objective evaluation shows the proposed approach has the smallest normalized mean absolute deviation and the highest signal-to-noise ratio, indicating that the proposed method has produced the image with the best quality. For both the simulated and clinical datasets, the proposed algorithm clearly reduces the metal artifacts.
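For context, the conventional interpolation MAR baseline mentioned above (not the proposed Gaussian-diffusion method) can be sketched in a few lines: forward-project a metal mask to find the metal trace, linearly interpolate across it in the sinogram, and reconstruct with FBP. The phantom, threshold, and geometry below are illustrative assumptions.

```python
# Sketch of the conventional interpolation MAR baseline (not the proposed
# Gaussian-diffusion method): forward-project a metal mask, linearly interpolate
# across the metal trace in the sinogram, and reconstruct with FBP.
import numpy as np
from skimage.transform import radon, iradon

theta = np.linspace(0.0, 180.0, 180, endpoint=False)

# Toy phantom: soft-tissue disc with a small, highly attenuating metal insert.
n = 128
y, x = np.mgrid[:n, :n]
phantom = ((x - 64) ** 2 + (y - 64) ** 2 < 50 ** 2) * 0.2
phantom[60:68, 90:98] = 5.0                     # "metal"

sino = radon(phantom, theta=theta)
metal_trace = radon((phantom > 2.0).astype(float), theta=theta) > 0.0

# Linear interpolation across the metal trace, one projection angle at a time.
corrected = sino.copy()
bins = np.arange(sino.shape[0])
for j in range(sino.shape[1]):
    bad = metal_trace[:, j]
    if bad.any() and (~bad).any():
        corrected[bad, j] = np.interp(bins[bad], bins[~bad], sino[~bad, j])

recon = iradon(corrected, theta=theta)
```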
Focal volume optics and experimental artifacts in confocal fluorescence correlation spectroscopy.
Hess, Samuel T; Webb, Watt W
2002-01-01
Fluorescence correlation spectroscopy (FCS) can provide a wealth of information about biological and chemical systems on a broad range of time scales (<1 micros to >1 s). Numerical modeling of the FCS observation volume combined with measurements has revealed, however, that the standard assumption of a three-dimensional Gaussian FCS observation volume is not a valid approximation under many common measurement conditions. As a result, the FCS autocorrelation will contain significant, systematic artifacts that are most severe with confocal optics when using a large detector aperture and aperture-limited illumination. These optical artifacts manifest themselves in the fluorescence correlation as an apparent additional exponential component or diffusing species with significant (>30%) amplitude that can imply extraneous kinetics, shift the measured diffusion time by as much as approximately 80%, and cause the axial ratio to diverge. Artifacts can be minimized or virtually eliminated by using a small confocal detector aperture, underfilled objective back-aperture, or two-photon excitation. However, using a detector aperture that is smaller or larger than the optimal value (approximately 4.5 optical units) greatly reduces both the count rate per molecule and the signal-to-noise ratio. Thus, there is a tradeoff between optimizing signal-to-noise and reducing experimental artifacts in one-photon FCS. PMID:12324447
An indoor navigation system for the visually impaired.
Guerrero, Luis A; Vasquez, Francisco; Ochoa, Sergio F
2012-01-01
Navigation in indoor environments is highly challenging for the severely visually impaired, particularly in spaces visited for the first time. Several solutions have been proposed to deal with this challenge. Although some of them have been shown to be useful in real scenarios, they involve a substantial deployment effort or use artifacts that are not natural for blind users. This paper presents an indoor navigation system that was designed taking into consideration usability as the quality requirement to be maximized. This solution identifies the position of a person and calculates the velocity and direction of their movements. Using this information, the system determines the user's trajectory, locates possible obstacles in that route, and offers navigation information to the user. The solution has been evaluated using two experimental scenarios. Although the results are not yet sufficient to support strong conclusions, they indicate that the system is suitable to guide visually impaired people through an unknown built environment.
Mannan, Malik M Naeem; Kim, Shinjung; Jeong, Myung Yung; Kamran, M Ahmad
2016-02-19
Contamination by eye movement and blink artifacts in electroencephalogram (EEG) recordings makes the analysis of EEG data more difficult and could result in misleading findings. Efficient removal of these artifacts from EEG data is an essential step in improving classification accuracy to develop the brain-computer interface (BCI). In this paper, we propose an automatic framework based on independent component analysis (ICA) and system identification to identify and remove ocular artifacts from EEG data by using a hybrid EEG and eye tracker system. The performance of the proposed algorithm is illustrated using experimental and standard EEG datasets. The proposed algorithm not only removes the ocular artifacts from the artifactual zone but also preserves the neuronal-activity-related EEG signals in the non-artifactual zone. The comparison with two state-of-the-art techniques, namely ADJUST-based ICA and REGICA, reveals the significantly improved performance of the proposed algorithm for removing eye movement and blink artifacts from EEG data. Additionally, results demonstrate that the proposed algorithm can achieve lower relative error and higher mutual information values between corrected EEG and artifact-free EEG data. PMID:26907276
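A compact sketch of the general ICA-plus-reference idea (not the authors' full framework, which also involves system identification): unmix the EEG with FastICA, flag components that correlate strongly with an eye-movement reference trace, zero them, and remix. The data, correlation threshold, and channel layout are assumptions.

```python
# Sketch: remove ocular components from EEG using ICA and an eye-movement
# reference channel. Data, threshold, and channel layout are illustrative.
import numpy as np
from sklearn.decomposition import FastICA

def remove_ocular_components(eeg, eye_ref, corr_threshold=0.7):
    """eeg: (n_samples, n_channels); eye_ref: (n_samples,) reference trace."""
    ica = FastICA(n_components=eeg.shape[1], random_state=0)
    sources = ica.fit_transform(eeg)                   # (n_samples, n_components)
    for k in range(sources.shape[1]):
        r = np.corrcoef(sources[:, k], eye_ref)[0, 1]
        if abs(r) > corr_threshold:                    # ocular component -> zero it
            sources[:, k] = 0.0
    return ica.inverse_transform(sources)

# Toy usage: three channels contaminated by a shared slow "blink" waveform.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 5000)
blink = (np.sin(2 * np.pi * 0.3 * t) > 0.95).astype(float) * 50.0
brain = rng.standard_normal((t.size, 3))
eeg = brain + np.outer(blink, [1.0, 0.6, 0.3])
cleaned = remove_ocular_components(eeg, eye_ref=blink)
```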
Psycho-physiological effects of visual artifacts by stereoscopic display systems
NASA Astrophysics Data System (ADS)
Kim, Sanghyun; Yoshitake, Junki; Morikawa, Hiroyuki; Kawai, Takashi; Yamada, Osamu; Iguchi, Akihiko
2011-03-01
The methods available for delivering stereoscopic (3D) display using glasses can be classified as time-multiplexing and spatial-multiplexing. With both methods, intrinsic visual artifacts result from the generation of the 3D image pair on a flat panel display device. In the case of the time-multiplexing method, an observer perceives three artifacts: flicker, the Mach-Dvorak effect, and a phantom array. Flicker appears under all conditions, whereas the Mach-Dvorak effect occurs only during smooth pursuit eye movements (SPM) and a phantom array only during saccadic eye movements (saccades). With spatial-multiplexing, the artifacts are temporal-parallax (due to the interlaced video signal), binocular rivalry, and reduced spatial resolution. These artifacts are considered one of the major impediments to the safety and comfort of 3D display users. In this study, the implications of the artifacts for safety and comfort are evaluated by examining the psychological changes they cause, through subjective symptoms of fatigue and the depth sensation. Physiological changes are also measured as objective responses based on analysis of heart and brain activation by visual artifacts. Further, to understand the characteristics of each artifact and the combined effects of the artifacts, four experimental conditions are developed and tested. The results show that perception of artifacts differs according to the visual environment and the display method. Furthermore, visual fatigue and the depth sensation are influenced by the individual characteristics of each artifact. Similarly, heart rate variability and regional cerebral oxygenation changed with the perception of artifacts across conditions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ophus, Colin; Ciston, Jim; Nelson, Chris T.
2015-12-10
Unwanted motion of the probe with respect to the sample is a ubiquitous problem in scanning probe and scanning transmission electron microscopies, causing both linear and nonlinear artifacts in experimental images. We have designed a procedure to correct these artifacts by using orthogonal scan pairs to align each measurement line-by-line along the slow scan direction, by fitting contrast variation along the lines. We demonstrate the accuracy of our algorithm on both synthetic and experimental data and provide an implementation of our method.
Chang, Hing-Chiu; Chen, Nan-kuei
2016-01-01
Diffusion-weighted imaging (DWI) obtained with an interleaved echo-planar imaging (EPI) pulse sequence has great potential for characterizing brain tissue properties at high spatial resolution. However, interleaved EPI-based DWI data may be corrupted by various types of aliasing artifacts. First, inconsistencies in k-space data obtained with opposite readout gradient polarities result in Nyquist artifact, which is usually reduced with 1D phase correction in post-processing. When there exist eddy current cross terms (e.g., in oblique-plane EPI), 2D phase correction is needed to effectively reduce Nyquist artifact. Second, minuscule motion-induced phase inconsistencies in interleaved DWI scans result in image-domain aliasing artifact, which can be removed with reconstruction procedures that take shot-to-shot phase variations into consideration. In existing interleaved DWI reconstruction procedures, Nyquist artifact and minuscule motion-induced aliasing artifact are typically removed sequentially in two stages. Although the two-stage phase correction generally performs well for non-oblique plane EPI data obtained from a well-calibrated system, the residual artifacts may still be pronounced in oblique-plane EPI data or when there exist eddy current cross terms. To address this challenge, here we report a new composite 2D phase correction procedure, which effectively removes Nyquist artifact and minuscule motion-induced aliasing artifact jointly in a single step. Our experimental results demonstrate that the new 2D phase correction method can much more effectively reduce artifacts in interleaved EPI-based DWI data as compared with the existing two-stage artifact correction procedures. The new method robustly enables high-resolution DWI, and should prove highly valuable for clinical uses and research studies of DWI. PMID:27114342
EEG Artifact Removal Using a Wavelet Neural Network
NASA Technical Reports Server (NTRS)
Nguyen, Hoang-Anh T.; Musson, John; Li, Jiang; McKenzie, Frederick; Zhang, Guangfan; Xu, Roger; Richey, Carl; Schnell, Tom
2011-01-01
In this paper we developed a wavelet neural network (WNN) algorithm for electroencephalogram (EEG) artifact removal without electrooculographic (EOG) recordings. The algorithm combines the universal approximation characteristics of neural networks and the time/frequency properties of wavelets. We compared the WNN algorithm with the ICA technique and a wavelet thresholding method, which was realized by using Stein's unbiased risk estimate (SURE) with an adaptive gradient-based optimal threshold. Experimental results on a driving test data set show that WNN can remove EEG artifacts effectively without diminishing useful EEG information even for very noisy data.
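The wavelet-thresholding baseline used in the comparison can be sketched with PyWavelets; the package, the wavelet and decomposition level, and the universal threshold used below in place of the SURE-based adaptive threshold are all assumptions for illustration.

```python
# Sketch of a wavelet-thresholding baseline (PyWavelets assumed). A universal
# threshold is used here for simplicity instead of the SURE-based adaptive threshold.
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise scale estimated from the finest detail coefficients (median rule).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(signal)))      # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

rng = np.random.default_rng(0)
t = np.linspace(0, 2, 2000)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
denoised = wavelet_denoise(eeg)
```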
An improved artifact removal in exposure fusion with local linear constraints
NASA Astrophysics Data System (ADS)
Zhang, Hai; Yu, Mali
2018-04-01
In exposure fusion, it is challenging to remove artifacts because of camera motion and moving objects in the scene. An improved artifact removal method is proposed in this paper, which performs local linear adjustment during the artifact removal process. After determining a reference image, we first perform high-dynamic-range (HDR) deghosting to generate an intermediate image stack from the input image stack. Then, a linear Intensity Mapping Function (IMF) is extracted in each window based on the intensities of the intermediate and reference images and on the intensity mean and variance of the reference image. Finally, with the extracted local linear constraints, we reconstruct a target image stack, which can be directly used for fusing a single HDR-like image. Experimental results demonstrate that the proposed method is robust and effective in removing artifacts, especially in the saturated regions of the reference image.
Ring artifact reduction in synchrotron x-ray tomography through helical acquisition
NASA Astrophysics Data System (ADS)
Pelt, Daniël M.; Parkinson, Dilworth Y.
2018-03-01
In synchrotron x-ray tomography, systematic defects in certain detector elements can result in arc-shaped artifacts in the final reconstructed image of the scanned sample. These ring artifacts are commonly found in many applications of synchrotron tomography, and can make it difficult or impossible to use the reconstructed image in further analyses. The severity of ring artifacts is often reduced in practice by applying pre-processing on the acquired data, or post-processing on the reconstructed image. However, such additional processing steps can introduce additional artifacts as well, and rely on specific choices of hyperparameter values. In this paper, a different approach to reducing the severity of ring artifacts is introduced: a helical acquisition mode. By moving the sample parallel to the rotation axis during the experiment, the sample is detected at different detector positions in each projection, reducing the effect of systematic errors in detector elements. Alternatively, helical acquisition can be viewed as a way to transform ring artifacts to helix-like artifacts in the reconstructed volume, reducing their severity. We show that data acquired with the proposed mode can be transformed to data acquired with a virtual circular trajectory, enabling further processing of the data with existing software packages for circular data. Results for both simulated data and experimental data show that the proposed method is able to significantly reduce ring artifacts in practice, even compared with popular existing methods, without introducing additional artifacts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Dong Sik; Lee, Sanggyun
2013-06-15
Purpose: Grid artifacts arise when an antiscatter grid is used to obtain digital x-ray images. In this paper, research on grid artifact reduction techniques is conducted, especially for direct detectors, which are based on amorphous selenium. Methods: In order to analyze and reduce the grid artifacts, the authors consider a multiplicative grid image model and propose a homomorphic filtering technique. For minimal damage due to filters, which are used to suppress the grid artifacts, rotated grids with respect to the sampling direction are employed, and min-max optimization problems for searching optimal grid frequencies and angles for given sampling frequencies are established. The authors then propose algorithms for grid artifact reduction based on band-stop filters as well as low-pass filters. Results: The proposed algorithms are experimentally tested for digital x-ray images, which are obtained from direct detectors with the rotated grids, and are compared with other algorithms. It is shown that the proposed algorithms can successfully reduce the grid artifacts for direct detectors. Conclusions: By employing the homomorphic filtering technique, the authors can considerably suppress the strong grid artifacts with relatively narrow-bandwidth filters compared to the normal filtering case. Using rotated grids also significantly reduces the ringing artifact. Furthermore, for specific grid frequencies and angles, the authors can use simple homomorphic low-pass filters in the spatial domain, and thus alleviate the grid artifacts with very low implementation complexity.
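The homomorphic filtering idea summarized above can be sketched in a few lines of NumPy: take the logarithm so the multiplicative grid pattern becomes additive, suppress a narrow band around the grid frequency along the grid direction, and exponentiate back. The grid frequency, notch width, and toy image are assumptions; the paper's optimized band-stop and low-pass designs for rotated grids are not reproduced here.

```python
# Sketch of homomorphic grid-artifact suppression: log -> 1D notch filter along
# the grid direction -> exp. Grid frequency, notch width, and image are assumptions.
import numpy as np

def homomorphic_notch(image, grid_freq_cpp, notch_halfwidth_cpp=0.01):
    """image: 2D array (>0); grid_freq_cpp: grid frequency in cycles/pixel along axis 0."""
    log_img = np.log(image)
    spec = np.fft.rfft(log_img, axis=0)
    freqs = np.fft.rfftfreq(image.shape[0])          # cycles per pixel
    notch = np.abs(freqs - grid_freq_cpp) < notch_halfwidth_cpp
    spec[notch, :] = 0.0                             # band-stop at the grid frequency
    return np.exp(np.fft.irfft(spec, n=image.shape[0], axis=0))

# Toy usage: a flat field multiplied by a 0.2 cycles/pixel grid pattern.
rows = np.arange(512)
grid = 1.0 + 0.1 * np.cos(2 * np.pi * 0.2 * rows)
image = 1000.0 * np.outer(grid, np.ones(512))
corrected = homomorphic_notch(image, grid_freq_cpp=0.2)
```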
Reduction of metal artifacts in x-ray CT images using a convolutional neural network
NASA Astrophysics Data System (ADS)
Zhang, Yanbo; Chu, Ying; Yu, Hengyong
2017-09-01
Patients often have various metallic implants (e.g. dental fillings, prostheses), which cause severe artifacts in x-ray CT images. Although a large number of metal artifact reduction (MAR) methods have been proposed in the past four decades, MAR is still one of the major problems in clinical x-ray CT. In this work, we develop a convolutional neural network (CNN) based MAR framework, which combines the information from the original and corrected images to suppress artifacts. Before the MAR stage, we generate a set of training data and train a CNN. First, we numerically simulate various metal artifact cases and build a dataset, which includes metal-free images (used as references), metal-inserted images, and images corrected by various MAR methods. Then, ten thousand patches are extracted from the dataset to train the metal artifact reduction CNN. In the MAR stage, the original image and two corrected images are stacked as a three-channel input image for the CNN, and a CNN image is generated with fewer artifacts. The water-equivalent regions in the CNN image are set to a uniform value to yield a CNN prior, whose forward projections are used to replace the metal-affected projections, followed by the FBP reconstruction. Experimental results demonstrate the superior metal artifact reduction capability of the proposed method compared with its competitors.
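To make the three-channel-input idea concrete, here is a minimal stand-in network in PyTorch; the architecture, layer sizes, and training snippet are assumptions and are not the authors' model. The original image and two MAR-corrected images are stacked as input channels and mapped to a single-channel output image.

```python
# Minimal stand-in for a three-channel-in, one-channel-out MAR network (PyTorch).
# Architecture and sizes are illustrative assumptions, not the authors' model.
import torch
import torch.nn as nn

class TinyMARNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),
        )

    def forward(self, x):           # x: (batch, 3, H, W) = original + two corrections
        return self.net(x)          # (batch, 1, H, W) image with fewer artifacts

# Toy forward/backward pass on random patches (stand-ins for the training patches).
model = TinyMARNet()
patches = torch.randn(8, 3, 64, 64)
targets = torch.randn(8, 1, 64, 64)      # metal-free reference patches (synthetic)
loss = nn.functional.mse_loss(model(patches), targets)
loss.backward()
```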
Lan, Gao; Yunmin, Lian; Pu, Wang; Haili, Huai
2016-06-01
This study aimed to observe and evaluate the metallic artifacts produced by metal dental crowns in six 3.0 T MRI sequences. Dental crowns fabricated with four different materials (Co-Gr, Ni-Gr, Ti alloy and pure Ti) were evaluated. A mature crossbreed dog was used as the experimental animal, and crowns were fabricated for its upper right second premolar. Each crown was examined through head MRI (3.0 T) with six sequences, namely, T₁ weighted-imaging of spin echo (T₁W/SE), T₂ weighted-imaging of inversion recovery (T₂W/IR), T₂ star gradient echo (T₂*/GRE), T₂ weighted-imaging of fast spin echo (T₂W/FSE), T₂ weighted-imaging of fluid-attenuated inversion recovery (T₂W/FLAIR), and T₂ weighted-imaging of propeller (T₂W/PROP). The largest area and number of layers of artifacts were assessed and compared. The artifact in the T₂*/GRE sequence was significantly wider than those in the other sequences (P < 0.01), whereas the artifact extents of the other five sequences did not differ significantly from one another (P > 0.05). T₂*/GRE exhibited the strongest influence on the artifact, whereas the five other sequences contributed equally to artifact generation.
Mannan, Malik M. Naeem; Kim, Shinjung; Jeong, Myung Yung; Kamran, M. Ahmad
2016-01-01
Contamination of eye movement and blink artifacts in electroencephalogram (EEG) recordings makes the analysis of EEG data more difficult and could lead to misleading findings. Efficient removal of these artifacts from EEG data is an essential step in improving classification accuracy to develop the brain-computer interface (BCI). In this paper, we propose an automatic framework based on independent component analysis (ICA) and system identification to identify and remove ocular artifacts from EEG data by using a hybrid EEG and eye tracker system. The performance of the proposed algorithm is illustrated using experimental and standard EEG datasets. The proposed algorithm not only removes the ocular artifacts from the artifactual zone but also preserves the neuronal activity related EEG signals in the non-artifactual zone. The comparison with two state-of-the-art techniques, namely ADJUST-based ICA and REGICA, reveals the significantly improved performance of the proposed algorithm for removing eye movement and blink artifacts from EEG data. Additionally, results demonstrate that the proposed algorithm can achieve lower relative error and higher mutual information values between corrected EEG and artifact-free EEG data. PMID:26907276
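A simplified stand-in for the ICA-based ocular artifact removal described here, using scikit-learn's FastICA and a correlation test against an EOG or eye-tracker reference in place of the paper's system-identification step; the correlation threshold is an illustrative choice:

```python
import numpy as np
from sklearn.decomposition import FastICA

def remove_ocular_components(eeg, eog_reference, corr_threshold=0.6):
    """Remove ICA components whose time courses track an ocular reference signal.

    eeg            : array (n_samples, n_channels) of raw EEG
    eog_reference  : array (n_samples,) from an EOG channel or eye tracker
    corr_threshold : illustrative cut-off for flagging ocular components
    """
    ica = FastICA(n_components=eeg.shape[1], whiten='unit-variance', random_state=0)
    sources = ica.fit_transform(eeg)                      # (n_samples, n_components)

    # Flag components that correlate strongly with the ocular reference.
    corrs = [abs(np.corrcoef(sources[:, k], eog_reference)[0, 1])
             for k in range(sources.shape[1])]
    artifact_idx = [k for k, c in enumerate(corrs) if c > corr_threshold]

    # Zero out flagged components and project back to the channel space.
    sources[:, artifact_idx] = 0.0
    return ica.inverse_transform(sources)
```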
Explanation and inference: mechanistic and functional explanations guide property generalization.
Lombrozo, Tania; Gwynne, Nicholas Z
2014-01-01
The ability to generalize from the known to the unknown is central to learning and inference. Two experiments explore the relationship between how a property is explained and how that property is generalized to novel species and artifacts. The experiments contrast the consequences of explaining a property mechanistically, by appeal to parts and processes, with the consequences of explaining the property functionally, by appeal to functions and goals. The findings suggest that properties that are explained functionally are more likely to be generalized on the basis of shared functions, with a weaker relationship between mechanistic explanations and generalization on the basis of shared parts and processes. The influence of explanation type on generalization holds even though all participants are provided with the same mechanistic and functional information, and whether an explanation type is freely generated (Experiment 1), experimentally provided (Experiment 2), or experimentally induced (Experiment 2). The experiments also demonstrate that explanations and generalizations of a particular type (mechanistic or functional) can be experimentally induced by providing sample explanations of that type, with a comparable effect when the sample explanations come from the same domain or from a different domain. These results suggest that explanations serve as a guide to generalization, and contribute to a growing body of work supporting the value of distinguishing mechanistic and functional explanations.
Possible artifacts in inferring seismic properties from X-ray data
NASA Astrophysics Data System (ADS)
Bosak, A.; Krisch, M.; Chumakov, A.; Abrikosov, I. A.; Dubrovinsky, L.
2016-11-01
We consider the experimental and computational artifacts relevant for the extraction of aggregate elastic properties of polycrystalline materials with particular emphasis on the derivation of seismic velocities. We use the case of iron as an example, and show that the improper use of definitions and neglecting the crystalline anisotropy can result in unexpectedly large errors up to a few percent.
BlackOPs: increasing confidence in variant detection through mappability filtering.
Cabanski, Christopher R; Wilkerson, Matthew D; Soloway, Matthew; Parker, Joel S; Liu, Jinze; Prins, Jan F; Marron, J S; Perou, Charles M; Hayes, D Neil
2013-10-01
Identifying variants using high-throughput sequencing data is currently a challenge because true biological variants can be indistinguishable from technical artifacts. One source of technical artifact results from incorrectly aligning experimentally observed sequences to their true genomic origin ('mismapping') and inferring differences in mismapped sequences to be true variants. We developed BlackOPs, an open-source tool that simulates experimental RNA-seq and DNA whole exome sequences derived from the reference genome, aligns these sequences by custom parameters, detects variants and outputs a blacklist of positions and alleles caused by mismapping. Blacklists contain thousands of artifact variants that are indistinguishable from true variants and, for a given sample, are expected to be almost completely false positives. We show that these blacklist positions are specific to the alignment algorithm and read length used, and BlackOPs allows users to generate a blacklist specific to their experimental setup. We queried the dbSNP and COSMIC variant databases and found numerous variants indistinguishable from mapping errors. We demonstrate how filtering against blacklist positions reduces the number of potential false variants using an RNA-seq glioblastoma cell line data set. In summary, accounting for mapping-caused variants tuned to experimental setups reduces false positives and, therefore, improves genome characterization by high-throughput sequencing.
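The blacklist filtering step can be illustrated with a small sketch; the tuple layout of the variant calls and blacklist entries is hypothetical and may differ from the actual BlackOPs output format:

```python
def filter_against_blacklist(variant_calls, blacklist):
    """Separate variant calls into kept and flagged sets using a mappability blacklist.

    variant_calls : iterable of (chrom, pos, ref, alt) tuples from a variant caller
    blacklist     : set of (chrom, pos, alt) tuples generated for the same aligner
                    and read length, as the tool recommends
    """
    kept, flagged = [], []
    for chrom, pos, ref, alt in variant_calls:
        target = flagged if (chrom, pos, alt) in blacklist else kept
        target.append((chrom, pos, ref, alt))
    return kept, flagged

# Toy example with hypothetical coordinates:
calls = [("chr1", 12345, "A", "G"), ("chr1", 99999, "C", "T")]
blacklist = {("chr1", 99999, "T")}
kept, flagged = filter_against_blacklist(calls, blacklist)
```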
Epstein–Barr Virus in Gliomas: Cause, Association, or Artifact?
Akhtar, Saghir; Vranic, Semir; Cyprian, Farhan Sachal; Al Moustafa, Ala-Eddin
2018-01-01
Gliomas are the most common malignant brain tumors and account for around 60% of all primary central nervous system cancers. Glioblastoma multiforme (GBM) is a grade IV glioma associated with a poor outcome despite recent advances in chemotherapy. The etiology of gliomas is unknown, but neurotropic viruses including the Epstein–Barr virus (EBV), which is transmitted via salivary and genital fluids, have been implicated recently. EBV is a member of the gamma herpes simplex family of DNA viruses that is known to cause infectious mononucleosis (glandular fever) and is strongly linked with the oncogenesis of several cancers, including B-cell lymphomas, nasopharyngeal, and gastric carcinomas. The fact that EBV is thought to be the causative agent for primary central nervous system (CNS) lymphomas in immune-deficient patients has led to its investigation in other brain tumors including gliomas. Here, we provide a review of the clinical literature pertaining to EBV in gliomas and discuss the possibilities of this virus being simply associative, causative, or even an experimental artifact. We searched the PubMed/MEDLINE databases using the following key words: glioma(s), glioblastoma multiforme, brain tumors/cancers, EBV, and neurotropic viruses. Our literature analysis indicates conflicting results on the presence and role of EBV in gliomas. Further comprehensive studies are needed to fully implicate EBV in gliomagenesis and oncomodulation. Understanding the role of EBV and other oncoviruses in the etiology of gliomas would likely open up new avenues for the treatment and management of these often fatal CNS tumors. PMID:29732319
NASA Astrophysics Data System (ADS)
Yang, Fanlin; Zhao, Chunxia; Zhang, Kai; Feng, Chengkai; Ma, Yue
2017-07-01
Acoustic seafloor classification with multibeam backscatter measurements is an attractive approach for mapping seafloor properties over a large area. However, artifacts in the multibeam backscatter measurements prevent accurate characterization of the seafloor. In particular, the backscatter level is extremely strong and highly variable in the near-nadir region due to the specular echo phenomenon. Consequently, striped artifacts emerge in the backscatter image, which can degrade the classification accuracy. This study focuses on the striped artifacts in multibeam backscatter images. To this end, a calibration algorithm based on equal mean-variance fitting is developed. By fitting the local shape of the angular response curve, the striped artifacts are compressed and moved according to the relations between the mean and variance in the near-nadir and off-nadir regions. The algorithm utilizes the measured data of the near-nadir region and retains the basic shape of the response curve. The experimental results verify the high performance of the proposed method.
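A much-simplified per-ping version of the equal mean-variance idea might look like the following; the angular cut-off is illustrative, and the paper's fitting of the local angular response curve is not reproduced here:

```python
import numpy as np

def equalize_near_nadir(backscatter, incidence_angle, nadir_limit_deg=15.0):
    """Rescale near-nadir backscatter so its statistics match the off-nadir region.

    backscatter     : 1D array of per-beam backscatter levels (dB) for one ping
    incidence_angle : 1D array of per-beam incidence angles (degrees)
    nadir_limit_deg : illustrative angular cut-off for the specular region
    """
    near = np.abs(incidence_angle) < nadir_limit_deg
    off = ~near

    mu_off, sd_off = backscatter[off].mean(), backscatter[off].std()
    mu_near, sd_near = backscatter[near].mean(), backscatter[near].std()

    corrected = backscatter.astype(float).copy()
    # Shift and scale the near-nadir samples toward the off-nadir mean and variance.
    corrected[near] = (backscatter[near] - mu_near) / (sd_near + 1e-12) * sd_off + mu_off
    return corrected
```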
Correction of data truncation artifacts in differential phase contrast (DPC) tomosynthesis imaging
NASA Astrophysics Data System (ADS)
Garrett, John; Ge, Yongshuai; Li, Ke; Chen, Guang-Hong
2015-10-01
The use of grating based Talbot-Lau interferometry permits the acquisition of differential phase contrast (DPC) imaging with a conventional medical x-ray source and detector. However, due to the limited area of the gratings, limited area of the detector, or both, data truncation image artifacts are often observed in tomographic DPC acquisitions and reconstructions, such as tomosynthesis (limited-angle tomography). When data are truncated in the conventional x-ray absorption tomosynthesis imaging, a variety of methods have been developed to mitigate the truncation artifacts. However, the same strategies used to mitigate absorption truncation artifacts do not yield satisfactory reconstruction results in DPC tomosynthesis reconstruction. In this work, several new methods have been proposed to mitigate data truncation artifacts in a DPC tomosynthesis system. The proposed methods have been validated using experimental data of a mammography accreditation phantom, a bovine udder, as well as several human cadaver breast specimens using a bench-top DPC imaging system at our facility.
Artifacts Of Spectral Analysis Of Instrument Readings
NASA Technical Reports Server (NTRS)
Wise, James H.
1995-01-01
Report presents experimental and theoretical study of some of the artifacts introduced by processing outputs of two nominally identical low-frequency-reading instruments; high-sensitivity servo-accelerometers mounted together and operating, in conjunction with signal-conditioning circuits, as seismometers. Processing involved analog-to-digital conversion with anti-aliasing filtering, followed by digital processing including frequency weighting and computation of different measures of power spectral density (PSD).
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-02
... objects include 1 lot of fragmented mammal bones; 1 charcoal sample; 1 piece of mussel shell; 1 piece of... objects include 2 pieces of burned mammal bone; 1 burned rodent jaw; 28 pieces of debitage; 8 pipe bowl... other unknown burial numbers. The 658 unassociated funerary artifacts include 1 hollowed bone fragment...
Zuo, Chao; Chen, Qian; Li, Hongru; Qu, Weijuan; Asundi, Anand
2014-07-28
Boundary conditions play a crucial role in the solution of the transport of intensity equation (TIE). If not appropriately handled, they can create significant boundary artifacts across the reconstruction result. In a previous paper [Opt. Express 22, 9220 (2014)], we presented a new boundary-artifact-free TIE phase retrieval method with use of the discrete cosine transform (DCT). Here we report its experimental investigation with applications to micro-optics characterization. The experimental setup is based on a tunable lens based 4f system attached to a non-modified inverted bright-field microscope. We establish inhomogeneous Neumann boundary values by placing a rectangular aperture in the intermediate image plane of the microscope. Then the boundary values are applied to solve the TIE with our DCT-based TIE solver. Experimental results on microlenses highlight the importance of boundary conditions that are often overlooked in simplified models, and confirm that our approach effectively avoids boundary error even when objects are located at the image borders. It is further demonstrated that our technique is non-interferometric, accurate, fast, full-field, and flexible, rendering it a promising metrological tool for micro-optics inspection.
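For orientation, a DCT-based solver for the simplified (uniform-intensity) form of the TIE, where the equation reduces to a Poisson problem with natural Neumann boundaries, can be sketched as follows; this omits the inhomogeneous boundary values and aperture handling that the paper describes:

```python
import numpy as np
from scipy.fft import dctn, idctn

def tie_phase_dct(dIdz, I0, wavelength, dx):
    """Recover phase from the TIE in its uniform-intensity (Poisson) form via DCT.

    dIdz       : estimated axial intensity derivative, 2D array
    I0         : in-focus intensity, taken as a scalar in this simplified form
    wavelength : illumination wavelength (same length unit as dx)
    dx         : pixel size
    """
    k = 2.0 * np.pi / wavelength
    rhs = -k / I0 * dIdz                              # Poisson right-hand side

    ny, nx = rhs.shape
    rhs_hat = dctn(rhs, type=2, norm='ortho')

    # Eigenvalues of the discrete Laplacian under Neumann (DCT-II) boundaries.
    wy = (2.0 * np.cos(np.pi * np.arange(ny) / ny) - 2.0) / dx**2
    wx = (2.0 * np.cos(np.pi * np.arange(nx) / nx) - 2.0) / dx**2
    denom = wy[:, None] + wx[None, :]
    denom[0, 0] = 1.0                                 # avoid dividing the DC term by zero

    phi_hat = rhs_hat / denom
    phi_hat[0, 0] = 0.0                               # phase is defined up to a constant
    return idctn(phi_hat, type=2, norm='ortho')
```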
YÜCEL, MERYEM A.; SELB, JULIETTE; COOPER, ROBERT J.; BOAS, DAVID A.
2014-01-01
As near-infrared spectroscopy (NIRS) broadens its application area to different age and disease groups, motion artifacts in the NIRS signal due to subject movement are becoming an important challenge. Motion artifacts generally produce signal fluctuations that are larger than physiological NIRS signals, thus it is crucial to correct for them before obtaining an estimate of stimulus evoked hemodynamic responses. There are various methods for correction such as principal component analysis (PCA), wavelet-based filtering and spline interpolation. Here, we introduce a new approach to motion artifact correction, targeted principal component analysis (tPCA), which incorporates a PCA filter only on the segments of data identified as motion artifacts. It is expected that this will overcome the issues of filtering desired signals that plague standard PCA filtering of entire data sets. We compared the new approach with the most effective motion artifact correction algorithms on a set of data acquired simultaneously with a collodion-fixed probe (low motion artifact content) and a standard Velcro probe (high motion artifact content). Our results show that tPCA gives statistically better results in recovering the hemodynamic response function (HRF) as compared to wavelet-based filtering and spline interpolation for the Velcro probe. It results in a significant reduction in mean-squared error (MSE) and significant enhancement in Pearson's correlation coefficient to the true HRF. The collodion-fixed fiber probe with no motion correction performed better than the Velcro probe corrected for motion artifacts in terms of MSE and Pearson's correlation coefficient. Thus, if the experimental study permits, the use of a collodion-fixed fiber probe may be desirable. If the use of a collodion-fixed probe is not feasible, then we suggest the use of tPCA in the processing of motion artifact contaminated data. PMID:25360181
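The core of the tPCA idea, a PCA filter applied only to samples flagged as motion artifacts, can be sketched as below; the number of removed components and the single stacked-segment treatment are illustrative simplifications of the published method:

```python
import numpy as np

def targeted_pca(nirs, artifact_mask, n_remove=2):
    """Apply a PCA filter only to samples flagged as motion artifacts.

    nirs          : array (n_samples, n_channels) of NIRS time courses
    artifact_mask : boolean array (n_samples,) marking artifact segments
    n_remove      : number of leading components removed inside those segments
    """
    cleaned = nirs.astype(float).copy()
    if not artifact_mask.any():
        return cleaned

    seg = cleaned[artifact_mask]                  # stack all flagged samples
    seg_mean = seg.mean(axis=0)

    # PCA via SVD on the artifact segments only.
    u, s, vt = np.linalg.svd(seg - seg_mean, full_matrices=False)

    # Reconstruct the flagged samples without the leading components.
    s[:n_remove] = 0.0
    cleaned[artifact_mask] = (u * s) @ vt + seg_mean
    return cleaned
```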
Putney, Joy; Hilbert, Douglas; Paskaranandavadivel, Niranchan; Cheng, Leo K.; O'Grady, Greg; Angeli, Timothy R.
2016-01-01
Objective The aim of this study was to develop, validate, and apply a fully automated method for reducing large temporally synchronous artifacts present in electrical recordings made from the gastrointestinal (GI) serosa, which are problematic for properly assessing slow wave dynamics. Such artifacts routinely arise in experimental and clinical settings from motion, switching behavior of medical instruments, or electrode array manipulation. Methods A novel iterative COvariance-Based Reduction of Artifacts (COBRA) algorithm sequentially reduced artifact waveforms using an updating across-channel median as a noise template, scaled and subtracted from each channel based on their covariance. Results Application of COBRA substantially increased the signal-to-artifact ratio (12.8±2.5 dB), while minimally attenuating the energy of the underlying source signal by 7.9% on average (-11.1±3.9 dB). Conclusion COBRA was shown to be highly effective for aiding recovery and accurate marking of slow wave events (sensitivity = 0.90±0.04; positive-predictive value = 0.74±0.08) from large segments of in vivo porcine GI electrical mapping data that would otherwise be lost due to a broad range of contaminating artifact waveforms. Significance Strongly reducing artifacts with COBRA ultimately allowed for rapid production of accurate isochronal activation maps detailing the dynamics of slow wave propagation in the porcine intestine. Such mapping studies can help characterize differences between normal and dysrhythmic events, which have been associated with GI abnormalities, such as intestinal ischemia and gastroparesis. The COBRA method may be generally applicable for removing temporally synchronous artifacts in other biosignal processing domains. PMID:26829772
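One reading of the COBRA update, an across-channel median used as a noise template and scaled per channel by covariance, is sketched below; the iteration count and the exact scaling are illustrative and not taken from the published algorithm:

```python
import numpy as np

def cobra_like_reduction(signals, n_iter=3):
    """Sequentially subtract a covariance-scaled, across-channel median template.

    signals : array (n_channels, n_samples) of serosal recordings
    n_iter  : number of template-subtraction passes (illustrative)
    """
    cleaned = signals.astype(float).copy()
    for _ in range(n_iter):
        template = np.median(cleaned, axis=0)    # across-channel median as noise template
        t = template - template.mean()
        denom = np.dot(t, t)
        if denom == 0:
            break
        for ch in range(cleaned.shape[0]):
            # Scale the template by its covariance with this channel before subtracting.
            scale = np.dot(cleaned[ch] - cleaned[ch].mean(), t) / denom
            cleaned[ch] -= scale * t
    return cleaned
```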
An Indoor Navigation System for the Visually Impaired
Guerrero, Luis A.; Vasquez, Francisco; Ochoa, Sergio F.
2012-01-01
Navigation in indoor environments is highly challenging for the severely visually impaired, particularly in spaces visited for the first time. Several solutions have been proposed to deal with this challenge. Although some of them have shown to be useful in real scenarios, they involve an important deployment effort or use artifacts that are not natural for blind users. This paper presents an indoor navigation system that was designed taking into consideration usability as the quality requirement to be maximized. This solution enables one to identify the position of a person and calculates the velocity and direction of his movements. Using this information, the system determines the user's trajectory, locates possible obstacles in that route, and offers navigation information to the user. The solution has been evaluated using two experimental scenarios. Although the results are still not enough to provide strong conclusions, they indicate that the system is suitable to guide visually impaired people through an unknown built environment. PMID:22969398
Artifacts in time-resolved Kelvin probe force microscopy
Sadewasser, Sascha; Nicoara, Nicoleta; Solares, Santiago D.
2018-04-24
Kelvin probe force microscopy (KPFM) has been used for the characterization of metals, insulators, and semiconducting materials on the nanometer scale. Especially in semiconductors, the charge dynamics are of high interest. Recently, several techniques for time-resolved measurements with time resolution down to picoseconds have been developed, many times using a modulated excitation signal, e.g. light modulation or bias modulation that induces changes in the charge carrier distribution. For fast modulation frequencies, the KPFM controller measures an average surface potential, which contains information about the involved charge carrier dynamics. Here, we show that such measurements are prone to artifacts due to frequency mixing, by performing numerical dynamics simulations of the cantilever oscillation in KPFM subjected to a bias-modulated signal. For square bias pulses, the resulting time-dependent electrostatic forces are very complex and result in intricate mixing of frequencies that may, in some cases, have a component at the detection frequency, leading to falsified KPFM measurements. Additionally, we performed fast Fourier transform (FFT) analyses that match the results of the numerical dynamics simulations. Small differences are observed that can be attributed to transients and higher-order Fourier components, as a consequence of the intricate nature of the cantilever driving forces. These results are corroborated by experimental measurements on a model system. In the experimental case, additional artifacts are observed due to constructive or destructive interference of the bias modulation with the cantilever oscillation. Also, in the case of light modulation, we demonstrate artifacts due to unwanted illumination of the photodetector of the beam deflection detection system. Lastly, guidelines for avoiding such artifacts are given.
Towards the development of high temperature comparison artifacts for radiation thermometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teixeira, R. N.; Machin, G.; Orlando, A.
This paper describes the methodology and first results of the development of high temperature fixed point artifacts of unknown temperature suitable for scale comparison purposes. This study is being undertaken at the Thermal Metrology Division of Inmetro, Brazil, as part of PhD studies. In this initial phase of the study two identical cobalt carbon eutectic cells were constructed and one doped with a known amount of copper. This was an attempt to achieve a controlled change in the transition temperature of the alloy during melting. Copper was chosen due to the relatively simple phase diagram it forms with carbon and cobalt. The cobalt, in powder form, was supplied by Alfa Aesar at 99.998 % purity, and was mixed with carbon powder (1.9 % by weight) of 99.9999 % purity. Complete filling of the crucible took 6 steps and was performed in a vertical furnace with graphite heating elements, in an inert gas atmosphere. The temperature measurements were performed using a KE LP3 radiation thermometer, which was previously evaluated for spectral responsivity, linearity and size-of-source effect (SSE). During these measurements, the thermometer stability was periodically checked using a silver fixed point blackbody maintained in a three zone furnace. The main purpose of the first part of this study is to dope a series of Co-C blackbodies with differing amounts of copper, in order to alter their temperatures whilst still retaining good melting plateau performance. The long-term stability of the adjusted transition temperatures will also be investigated. Other dopants will be studied as the research progresses, and thermochemical modeling will be performed in an attempt to understand the change in temperature with dopant concentration and so help select suitable dopants in the future. The overall objective is to construct comparison artifacts that have good performance, in terms of plateau shape and long-term temperature stability, but with unknown temperatures. These can then be used as comparison artifacts with no participant, except the pilot, knowing the temperature a priori.
Removal of EOG Artifacts from EEG Recordings Using Stationary Subspace Analysis
Zeng, Hong; Song, Aiguo
2014-01-01
An effective approach is proposed in this paper to remove ocular artifacts from the raw EEG recording. The proposed approach first conducts the blind source separation on the raw EEG recording by the stationary subspace analysis (SSA) algorithm. Unlike the classic blind source separation algorithms, SSA is explicitly tailored to the understanding of distribution changes, where both the mean and the covariance matrix are taken into account. In addition, neither independence nor uncorrelatedness of the sources is required by SSA. Thereby, it can concentrate artifacts in fewer components than the representative blind source separation methods. Next, the components that are determined to be related to the ocular artifacts are projected back to be subtracted from EEG signals, producing the clean EEG data eventually. The experimental results on both the artificially contaminated EEG data and real EEG data have demonstrated the effectiveness of the proposed method, in particular for the cases where a limited number of electrodes is used for the recording, as well as when the artifact contaminated signal is highly nonstationary and the underlying sources cannot be assumed to be independent or uncorrelated. PMID:24550696
Methods to mitigate data truncation artifacts in multi-contrast tomosynthesis image reconstructions
NASA Astrophysics Data System (ADS)
Garrett, John; Ge, Yongshuai; Li, Ke; Chen, Guang-Hong
2015-03-01
Differential phase contrast imaging is a promising new image modality that utilizes the refraction rather than the absorption of x-rays to image an object. A Talbot-Lau interferometer may be used to permit differential phase contrast imaging with a conventional medical x-ray source and detector. However, the current size of the gratings fabricated for these interferometers is often relatively small. As a result, data truncation image artifacts are often observed in a tomographic acquisition and reconstruction. When data are truncated in x-ray absorption imaging, methods have been introduced to mitigate the truncation artifacts. However, the same strategy to mitigate absorption truncation artifacts may not be appropriate for differential phase contrast or dark field tomographic imaging. In this work, several new methods to mitigate data truncation artifacts in a multi-contrast imaging system have been proposed and evaluated for tomosynthesis data acquisitions. The proposed methods were validated using experimental data acquired for a bovine udder as well as several cadaver breast specimens using a benchtop system at our facility.
Selecting Models for Measuring Change When True Experimental Conditions Do Not Exist.
ERIC Educational Resources Information Center
Fortune, Jim C.; Hutson, Barbara A.
1984-01-01
Measuring change when true experimental conditions do not exist is a difficult process. This article reviews the artifacts of change measurement in evaluations and quasi-experimental designs, delineates considerations in choosing a model to measure change under nonideal conditions, and suggests ways to organize models to facilitate selection.…
Stimulation artifact correction method for estimation of early cortico-cortical evoked potentials.
Trebaul, Lena; Rudrauf, David; Job, Anne-Sophie; Mălîia, Mihai Dragos; Popa, Irina; Barborica, Andrei; Minotti, Lorella; Mîndruţă, Ioana; Kahane, Philippe; David, Olivier
2016-05-01
Effective connectivity can be explored using direct electrical stimulations in patients suffering from drug-resistant focal epilepsies and investigated with intracranial electrodes. Responses to brief electrical pulses mimic the physiological propagation of signals and manifest as cortico-cortical evoked potentials (CCEP). The first CCEP component is believed to reflect direct connectivity with the stimulated region, but the stimulation artifact, a sharp deflection lasting a few milliseconds, frequently contaminates it. In order to recover the characteristics of early CCEP responses, we developed an artifact correction method based on electrical modeling of the electrode-tissue interface. The biophysically motivated artifact templates are then regressed out of the recorded data as in classical template-matching artifact removal methods. Our approach is able to make the distinction between the physiological responses time-locked to the stimulation pulses and the non-physiological component. We tested the correction on simulated CCEP data in order to quantify its efficiency for different stimulation and recording parameters. We demonstrated the efficiency of the new correction method on simulations of single trial recordings for early responses contaminated with the stimulation artifact. The results highlight the importance of sampling frequency for an accurate analysis of CCEP. We then applied the approach to experimental data. The model-based template removal was compared to a correction based on the subtraction of the averaged artifact. This new correction method of stimulation artifact will enable investigators to better analyze early CCEP components and infer direct effective connectivity in future CCEP studies. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
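The template-regression step can be sketched as a per-trial least-squares fit and subtraction, assuming the biophysically modeled artifact templates are already available (they are simply passed in here):

```python
import numpy as np

def regress_out_templates(epochs, templates):
    """Remove modeled stimulation-artifact templates from single-trial epochs.

    epochs    : array (n_trials, n_samples) of peri-stimulus recordings
    templates : array (n_templates, n_samples) of artifact waveforms; in the paper
                these come from an electrode-tissue interface model, here they are given
    """
    design = templates.T                              # (n_samples, n_templates)
    cleaned = np.empty_like(epochs, dtype=float)
    for i, trial in enumerate(epochs):
        coeffs, *_ = np.linalg.lstsq(design, trial, rcond=None)
        cleaned[i] = trial - design @ coeffs          # subtract the fitted artifact
    return cleaned
```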
Optical nano artifact metrics using silicon random nanostructures
NASA Astrophysics Data System (ADS)
Matsumoto, Tsutomu; Yoshida, Naoki; Nishio, Shumpei; Hoga, Morihisa; Ohyagi, Yasuyuki; Tate, Naoya; Naruse, Makoto
2016-08-01
Nano-artifact metrics exploit unique physical attributes of nanostructured matter for authentication and clone resistance, which is vitally important in the age of Internet-of-Things where securing identities is critical. However, expensive and huge experimental apparatuses, such as scanning electron microscopy, have been required in the former studies. Herein, we demonstrate an optical approach to characterise the nanoscale-precision signatures of silicon random structures towards realising low-cost and high-value information security technology. Unique and versatile silicon nanostructures are generated via resist collapse phenomena, which contains dimensions that are well below the diffraction limit of light. We exploit the nanoscale precision ability of confocal laser microscopy in the height dimension; our experimental results demonstrate that the vertical precision of measurement is essential in satisfying the performances required for artifact metrics. Furthermore, by using state-of-the-art nanostructuring technology, we experimentally fabricate clones from the genuine devices. We demonstrate that the statistical properties of the genuine and clone devices are successfully exploited, showing that the liveness-detection-type approach, which is widely deployed in biometrics, is valid in artificially-constructed solid-state nanostructures. These findings pave the way for reasonable and yet sufficiently secure novel principles for information security based on silicon random nanostructures and optical technologies.
Earliest stone-tipped projectiles from the Ethiopian rift date to >279,000 years ago.
Sahle, Yonatan; Hutchings, W Karl; Braun, David R; Sealy, Judith C; Morgan, Leah E; Negash, Agazi; Atnafu, Balemwal
2013-01-01
Projectile weapons (i.e. those delivered from a distance) enhanced prehistoric hunting efficiency by enabling higher impact delivery and hunting of a broader range of animals while reducing confrontations with dangerous prey species. Projectiles therefore provided a significant advantage over thrusting spears. Composite projectile technologies are considered indicative of complex behavior and pivotal to the successful spread of Homo sapiens. Direct evidence for such projectiles is thus far unknown from >80,000 years ago. Data from velocity-dependent microfracture features, diagnostic damage patterns, and artifact shape reported here indicate that pointed stone artifacts from Ethiopia were used as projectile weapons (in the form of hafted javelin tips) as early as >279,000 years ago. In combination with the existing archaeological, fossil and genetic evidence, these data isolate eastern Africa as a source of modern cultures and biology.
Terahertz Absorption by Cellulose: Application to Ancient Paper Artifacts
NASA Astrophysics Data System (ADS)
Peccianti, M.; Fastampa, R.; Mosca Conte, A.; Pulci, O.; Violante, C.; Łojewska, J.; Clerici, M.; Morandotti, R.; Missori, M.
2017-06-01
Artifacts made of cellulose, such as ancient documents, pose a significant experimental challenge in the terahertz transmission spectra interpretation due to their small optical thickness. In this paper, we describe a method to recover the complex refractive index of cellulose fibers from the terahertz transmission data obtained on single freely standing paper sheets in the (0.2-3.5)-THz range. By using our technique, we eliminate Fabry-Perot effects and recover the absorption coefficient of the cellulose fibers. The obtained terahertz absorption spectra are explained in terms of absorption peaks of the cellulose crystalline phase superimposed to a background contribution due to a disordered hydrogen-bond network. The comparison between the experimental spectra with terahertz vibrational properties simulated by density-functional-theory calculations confirms this interpretation. In addition, evident changes in the terahertz absorption spectra are produced by natural and artificial aging on paper samples, whose final stage is characterized by a spectral profile with only two peaks at about 2.1 and 3.1 THz. These results can be used to provide a quantitative assessment of the state of preservation of cellulose artifacts.
Preventing probe induced topography correlated artifacts in Kelvin Probe Force Microscopy.
Polak, Leo; Wijngaarden, Rinke J
2016-12-01
Kelvin Probe Force Microscopy (KPFM) on samples with rough surface topography can be hindered by topography correlated artifacts. We show that, with the proper experimental configuration and using homogeneously metal coated probes, we are able to obtain amplitude modulation (AM) KPFM results on a gold coated sample with rough topography that are free from such artifacts. By inducing tip inhomogeneity through contact with the sample, clear potential variations appear in the KPFM image, which correlate with the surface topography and, thus, are probe induced artifacts. We find that switching to frequency modulation (FM) KPFM with such altered probes does not remove these artifacts. We also find that the induced tip inhomogeneity causes a lift height dependence of the KPFM measurement, which can therefore be used as a check for the presence of probe induced topography correlated artifacts. We attribute the observed effects to a work function difference between the tip and the rest of the probe and describe a model for such inhomogeneous probes that predicts lift height dependence and topography correlated artifacts for both AM and FM-KPFM methods. This work demonstrates that using a probe with a homogeneous work function and preventing tip changes is essential for KPFM on non-flat samples. From the three investigated probe coatings, PtIr, Au and TiN, the latter appears to be the most suitable, because of its better resistance against coating damage. Copyright © 2016 Elsevier B.V. All rights reserved.
Single Channel EEG Artifact Identification Using Two-Dimensional Multi-Resolution Analysis.
Taherisadr, Mojtaba; Dehzangi, Omid; Parsaei, Hossein
2017-12-13
As a diagnostic monitoring approach, electroencephalogram (EEG) signals can be decoded by signal processing methodologies for various health monitoring purposes. However, EEG recordings are contaminated by other interferences, particularly facial and ocular artifacts generated by the user. This is specifically an issue during continuous EEG recording sessions, and identifying such artifacts among useful EEG components is therefore a key step in using EEG signals for either physiological monitoring and diagnosis or brain-computer interfaces. In this study, we aim to design a new generic framework in order to process and characterize EEG recordings as a multi-component and non-stationary signal with the aim of localizing and identifying its components (e.g., artifacts). In the proposed method, we gather three complementary algorithms together to enhance the efficiency of the system. The algorithms include time-frequency (TF) analysis and representation, two-dimensional multi-resolution analysis (2D MRA), and feature extraction and classification. Then, a combination of spectro-temporal and geometric features are extracted by combining key instantaneous TF space descriptors, which enables the system to characterize the non-stationarities in the EEG dynamics. We fit a curvelet transform (as a MRA method) to the 2D TF representation of EEG segments to decompose the given space to various levels of resolution. Such a decomposition efficiently improves the analysis of TF spaces with different characteristics (e.g., resolution). Our experimental results demonstrate that the combination of expansion to TF space, analysis using MRA, and extracting a set of suitable features and applying a proper predictive model is effective in enhancing the EEG artifact identification performance. We also compare the performance of the designed system with another common EEG signal processing technique, namely the 1D wavelet transform. Our experimental results reveal that the proposed method outperforms the 1D wavelet approach.
Scene-based nonuniformity correction with reduced ghosting using a gated LMS algorithm.
Hardie, Russell C; Baxley, Frank; Brys, Brandon; Hytla, Patrick
2009-08-17
In this paper, we present a scene-based nonuniformity correction (NUC) method using a modified adaptive least mean square (LMS) algorithm with a novel gating operation on the updates. The gating is designed to significantly reduce ghosting artifacts produced by many scene-based NUC algorithms by halting updates when temporal variation is lacking. We define the algorithm and present a number of experimental results to demonstrate the efficacy of the proposed method in comparison to several previously published methods including other LMS and constant statistics based methods. The experimental results include simulated imagery and a real infrared image sequence. We show that the proposed method significantly reduces ghosting artifacts, but has a slightly longer convergence time. (c) 2009 Optical Society of America
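A toy version of a gated LMS nonuniformity correction, using a box-blurred corrected frame as the desired response and halting per-pixel updates when temporal variation is small, is sketched below; the learning rate, blur size and gating threshold are illustrative, not the paper's settings:

```python
import numpy as np

def box_blur(img, size=3):
    """Separable box blur used as the LMS desired response."""
    kernel = np.ones(size) / size
    tmp = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode='same'), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode='same'), 0, tmp)

def gated_lms_nuc(frames, lr=1e-3, motion_threshold=2.0):
    """Scene-based NUC with per-pixel gain/offset LMS updates gated by temporal change.

    frames           : array (n_frames, H, W) of raw detector frames
    lr               : LMS learning rate (illustrative)
    motion_threshold : per-pixel frame-difference level below which updates are halted
    """
    gain = np.ones(frames.shape[1:])
    offset = np.zeros(frames.shape[1:])
    corrected = np.empty_like(frames, dtype=float)

    prev = frames[0].astype(float)
    for n, raw in enumerate(frames.astype(float)):
        y = gain * raw + offset                   # corrected frame
        err = box_blur(y) - y                     # error w.r.t. a smoothed desired response

        # Gate: only update pixels that changed enough since the previous frame.
        gate = np.abs(raw - prev) > motion_threshold
        gain += lr * err * raw * gate
        offset += lr * err * gate

        corrected[n] = y
        prev = raw
    return corrected
```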
A leaded apron for use in panoramic dental radiography.
Whitcher, B L; Gratt, B M; Sickles, E A
1980-05-01
The leaded aprons currently available for use during dental radiography do not protect the thyroid gland from radiation. Conventional aprons may produce artifacts when used with panoramic dental x-ray units. This study measures the dose reduction obtained with an experimental leaded apron designed for use with panoramic dental x-ray units. Skin exposures measured at the thyroid and at the sternum were reduced with the use of the apron. Films produced during the study were free from apron artifacts.
Mina, Faten; Attina, Virginie; Duroc, Yvan; Veuillet, Evelyne; Truy, Eric; Thai-Van, Hung
2017-01-01
Auditory steady state responses (ASSRs) in cochlear implant (CI) patients are contaminated by the spread of a continuous CI electrical stimulation artifact. The aim of this work was to model the electrophysiological mixture of the CI artifact and the corresponding evoked potentials on scalp electrodes in order to evaluate the performance of denoising algorithms in eliminating the CI artifact in a controlled environment. The basis of the proposed computational framework is a neural mass model representing the nodes of the auditory pathways. Six main contributors to auditory evoked potentials from the cochlear level and up to the auditory cortex were taken into consideration. The simulated dynamics were then projected into a 3-layer realistic head model. 32-channel scalp recordings of the CI artifact-response were then generated by solving the electromagnetic forward problem. As an application, the framework’s simulated 32-channel datasets were used to compare the performance of 4 commonly used Independent Component Analysis (ICA) algorithms: infomax, extended infomax, jade and fastICA in eliminating the CI artifact. As expected, two major components were detectable in the simulated datasets, a low frequency component at the modulation frequency and a pulsatile high frequency component related to the stimulation frequency. The first can be attributed to the phase-locked ASSR and the second to the stimulation artifact. Among the ICA algorithms tested, simulations showed that infomax was the most efficient and reliable in denoising the CI artifact-response mixture. Denoising algorithms can induce undesirable deformation of the signal of interest in real CI patient recordings. The proposed framework is a valuable tool for evaluating these algorithms in a controllable environment ahead of experimental or clinical applications. PMID:28350887
NASA Astrophysics Data System (ADS)
Kim, Juhye; Nam, Haewon; Lee, Rena
2015-07-01
In CT (computed tomography) images, metal materials such as tooth supplements or surgical clips can cause metal artifacts and degrade image quality. In severe cases, this may lead to misdiagnosis. In this research, we developed a new MAR (metal artifact reduction) algorithm by using an edge preserving filter and the MATLAB program (Mathworks, version R2012a). The proposed algorithm consists of 6 steps: image reconstruction from projection data, metal segmentation, forward projection, interpolation, application of an edge preserving smoothing filter, and new image reconstruction. For an evaluation of the proposed algorithm, we obtained both numerical simulation data and data for a Rando phantom. In the numerical simulation data, four metal regions were added into the Shepp Logan phantom for metal artifacts. The projection data of the metal-inserted Rando phantom were obtained by using a prototype CBCT scanner manufactured by the medical engineering and medical physics (MEMP) laboratory research group in medical science at Ewha Womans University. After these data had been obtained, the proposed algorithm was applied, and the results were compared with the original image (with metal artifact without correction) and with a corrected image based on linear interpolation. Both visual and quantitative evaluations were done. Compared with the original image with metal artifacts and with the image corrected by using linear interpolation, both the numerical and the experimental phantom data demonstrated that the proposed algorithm reduced the metal artifact. In conclusion, the evaluation in this research showed that the proposed algorithm outperformed the interpolation-based MAR algorithm. If an optimization and a stability evaluation of the proposed algorithm can be performed, the developed algorithm is expected to be an effective tool for eliminating metal artifacts even in commercial CT systems.
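A generic linear-interpolation MAR sketch in the spirit of the steps listed above (metal segmentation, trace detection, sinogram interpolation, re-reconstruction) is given below; the edge-preserving smoothing step of the proposed algorithm is omitted, and the threshold is illustrative:

```python
import numpy as np
from skimage.transform import radon, iradon

def interpolation_mar(sinogram, recon, metal_threshold, theta):
    """Linear-interpolation MAR: segment metal, bridge its sinogram trace, re-reconstruct.

    sinogram        : measured sinogram with the same geometry as skimage's radon output
    recon           : initial reconstruction used for metal segmentation
    metal_threshold : intensity/HU threshold defining metal (illustrative)
    theta           : projection angles in degrees
    """
    metal_mask = recon > metal_threshold
    trace = radon(metal_mask.astype(float), theta=theta) > 0

    completed = sinogram.copy()
    rows = np.arange(sinogram.shape[0])
    for j in range(sinogram.shape[1]):                # per projection angle
        bad = trace[:, j]
        if bad.any() and not bad.all():
            completed[bad, j] = np.interp(rows[bad], rows[~bad], sinogram[~bad, j])

    corrected = iradon(completed, theta=theta, filter_name='ramp',
                       output_size=recon.shape[0])
    corrected[metal_mask] = recon[metal_mask]         # reinsert the metal for visibility
    return corrected
```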
NASA Astrophysics Data System (ADS)
Chen, Siyu; Zhang, Hanming; Li, Lei; Xi, Xiaoqi; Han, Yu; Yan, Bin
2016-10-01
X-ray computed tomography (CT) has been extensively applied in industrial non-destructive testing (NDT). However, in practical applications, the X-ray beam polychromaticity often results in beam hardening problems for image reconstruction. The beam hardening artifacts, which manifest as cupping, streaks and flares, not only degrade the image quality, but also disturb the subsequent analyses. Unfortunately, conventional CT scanning requires that the scanned object be completely covered by the field of view (FOV); the state-of-the-art beam hardening correction methods only consider this ideal scanning configuration and often suffer problems for interior tomography due to the projection truncation. Aiming at this problem, this paper proposes a beam hardening correction method based on radon inversion transform for interior tomography. Experimental results show that, compared to conventional correction algorithms, the proposed approach achieves excellent performance in both beam hardening artifact reduction and truncation artifact suppression. Therefore, the presented method is of both theoretical and practical significance for artifact correction in industrial CT.
NASA Astrophysics Data System (ADS)
Dietlicher, Isabelle; Casiraghi, Margherita; Ares, Carmen; Bolsi, Alessandra; Weber, Damien C.; Lomax, Antony J.; Albertini, Francesca
2014-12-01
To investigate the effect of metal implants in proton radiotherapy, dose distributions of different, clinically relevant treatment plans have been measured in an anthropomorphic phantom and compared to treatment planning predictions. The anthropomorphic phantom, which is sliced into four segments in the cranio-caudal direction, is composed of tissue equivalent materials and contains a titanium implant in a vertebral body in the cervical region. GafChromic® films were laid between the different segments to measure the 2D delivered dose. Three different four-field plans have then been applied: a Single-Field-Uniform-Dose (SFUD) plan, both with and without artifact correction implemented, and an Intensity-Modulated-Proton-Therapy (IMPT) plan with the artifacts corrected. For corrections, the artifacts were manually outlined and the Hounsfield Units manually set to an average value for soft tissue. Results show a surprisingly good agreement between prescribed and delivered dose distributions when artifacts have been corrected, with > 97% and 98% of points fulfilling the gamma criterion of 3%/3 mm for both SFUD and the IMPT plans, respectively. In contrast, without artifact corrections, up to 18% of measured points fail the gamma criterion of 3%/3 mm for the SFUD plan. These measurements indicate that correcting manually for the reconstruction artifacts resulting from metal implants substantially improves the accuracy of the calculated dose distribution.
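The manual artifact correction described here amounts to overriding the contoured artifact voxels with an average soft-tissue value; a minimal sketch, with the 40 HU value chosen only for illustration and not taken from the paper:

```python
import numpy as np

def override_artifact_hu(ct_volume, artifact_mask, soft_tissue_hu=40.0):
    """Set manually contoured artifact voxels to an average soft-tissue HU value.

    ct_volume      : CT array in Hounsfield units
    artifact_mask  : boolean array of the outlined streak-artifact region
    soft_tissue_hu : override value; 40 HU is an illustrative choice, not the paper's
    """
    corrected = ct_volume.copy()
    corrected[artifact_mask] = soft_tissue_hu
    return corrected
```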
Characterization of pigments and colors used in ancient Egyptian boat models
NASA Astrophysics Data System (ADS)
Hühnerfuß, Katja; von Bohlen, Alex; Kurth, Dieter
2006-11-01
The analyses of pigments originating from well dated ancient boat models found in Egyptian graves were used for characterization and for dating tasks of unknown objects. A nearly destruction-free sampling technique using cotton buds was applied to these valuable artifacts for a subsequent Total Reflection X-Ray Fluorescence Spectrometry (TXRF) analysis. Two relevant collections of Egyptian objects of art were at our disposal, one from the Ägyptisches Museum Berlin and the second from the British Museum London. Three groups of colors were studied; they originate from white, red and blue/green paints, respectively. The results of the analyses performed on micro-amounts of paints (< 1 μg) show that some artifacts were misclassified and belong to other epochs. Some others were retouched with modern colors. In general, it can be stated that results obtained by Total Reflection X-Ray Fluorescence Spectrometry may dissipate some uncertainties when applying classical archaeological dating methods.
Earliest Stone-Tipped Projectiles from the Ethiopian Rift Date to >279,000 Years Ago
Sahle, Yonatan; Hutchings, W. Karl; Braun, David R.; Sealy, Judith C.; Morgan, Leah E.; Negash, Agazi; Atnafu, Balemwal
2013-01-01
Projectile weapons (i.e. those delivered from a distance) enhanced prehistoric hunting efficiency by enabling higher impact delivery and hunting of a broader range of animals while reducing confrontations with dangerous prey species. Projectiles therefore provided a significant advantage over thrusting spears. Composite projectile technologies are considered indicative of complex behavior and pivotal to the successful spread of Homo sapiens. Direct evidence for such projectiles is thus far unknown from >80,000 years ago. Data from velocity-dependent microfracture features, diagnostic damage patterns, and artifact shape reported here indicate that pointed stone artifacts from Ethiopia were used as projectile weapons (in the form of hafted javelin tips) as early as >279,000 years ago. In combination with the existing archaeological, fossil and genetic evidence, these data isolate eastern Africa as a source of modern cultures and biology. PMID:24236011
A robust adaptive denoising framework for real-time artifact removal in scalp EEG measurements
NASA Astrophysics Data System (ADS)
Kilicarslan, Atilla; Grossman, Robert G.; Contreras-Vidal, Jose Luis
2016-04-01
Objective. Non-invasive measurement of human neural activity based on the scalp electroencephalogram (EEG) allows for the development of biomedical devices that interface with the nervous system for scientific, diagnostic, therapeutic, or restorative purposes. However, EEG recordings are often considered prone to physiological and non-physiological artifacts of different types and frequency characteristics. Among them, ocular artifacts and signal drifts represent major sources of EEG contamination, particularly in real-time closed-loop brain-machine interface (BMI) applications, which require effective handling of these artifacts across sessions and in natural settings. Approach. We extend the usage of a robust adaptive noise cancelling (ANC) scheme (H∞ filtering) for removal of eye blinks, eye motions, amplitude drifts and recording biases simultaneously. We also characterize the volume conduction by estimating the signal propagation levels across all EEG scalp recording areas due to ocular artifact generators. We find that the amplitude and spatial distribution of ocular artifacts vary greatly depending on the electrode location. Therefore, fixed filtering parameters for all recording areas would naturally hinder the true overall performance of an ANC scheme for artifact removal. We treat each electrode as a separate sub-system to be filtered, and without the loss of generality, they are assumed to be uncorrelated and uncoupled. Main results. Our results show over 95-99.9% correlation between the raw and processed signals at non-ocular artifact regions, and depending on the contamination profile, 40-70% correlation when ocular artifacts are dominant. We also compare our results with the offline independent component analysis and artifact subspace reconstruction methods, and show that some local quantities are handled better by our sample-adaptive real-time framework. Decoding performance is also compared with multi-day experimental data from 2 subjects, totaling 19 sessions, with and without H∞ filtering of the raw data. Significance. The proposed method allows real-time adaptive artifact removal for EEG-based closed-loop BMI applications and mobile EEG studies in general, thereby increasing the range of tasks that can be studied in action and context while reducing the need for discarding data due to artifacts. The significant increase in decoding performance also justifies the effectiveness of the method for use in real-time closed-loop BMI applications.
Evidence Based Assessment of Public Health Planning: A Case Study of the 2014 Crisis in Ukraine
2015-06-12
David C. Logan, "Known Knowns, Known Unknowns, Unknown Unknowns and the Propagation of Scientific Enquiry," Journal of Experimental Botany 60, no. 3 (March 2009): 712-714. Markel, Howard. "Facing Tuberculosis," When Germs Travel: Six Major Epidemics That Have Invaded America...
Raw data normalization for a multi source inverse geometry CT system
Baek, Jongduk; De Man, Bruno; Harrison, Daniel; Pelc, Norbert J.
2015-01-01
A multi-source inverse-geometry CT (MS-IGCT) system consists of a small 2D detector array and multiple x-ray sources. During data acquisition, each source is activated sequentially, and may have random source intensity fluctuations relative to its nominal intensity. While a conventional 3rd generation CT system uses a reference channel to monitor the source intensity fluctuation, each source of the MS-IGCT system illuminates only a small portion of the entire field-of-view (FOV). Therefore, it is difficult for all sources to illuminate the reference channel, and the projection data computed by standard normalization using flat field data of each source contain errors and can cause significant artifacts. In this work, we present a raw data normalization algorithm to reduce the image artifacts caused by source intensity fluctuation. The proposed method was tested using computer simulations with a uniform water phantom and a Shepp-Logan phantom, and experimental data of an ice-filled PMMA phantom and a rabbit. The effect on image resolution and the robustness to noise were tested using the MTF and the standard deviation of the reconstructed noise image. With the intensity fluctuation and no correction, reconstructed images from simulation and experimental data show high frequency artifacts and ring artifacts, which are removed effectively using the proposed method. It is also observed that the proposed method does not degrade the image resolution and is very robust to the presence of noise. PMID:25837090
Clutter Mitigation in Echocardiography Using Sparse Signal Separation
Yavneh, Irad
2015-01-01
In ultrasound imaging, clutter artifacts degrade images and may cause inaccurate diagnosis. In this paper, we apply a method called Morphological Component Analysis (MCA) for sparse signal separation with the objective of reducing such clutter artifacts. The MCA approach assumes that the two signals in the additive mix each have a sparse representation under some dictionary of atoms (a matrix), and separation is achieved by finding these sparse representations. In our work, an adaptive approach is used for learning the dictionary from the echo data. MCA is compared to Singular Value Filtering (SVF), a Principal Component Analysis- (PCA-) based filtering technique, and to a high-pass Finite Impulse Response (FIR) filter. Each filter is applied to a simulated hypoechoic lesion sequence, as well as experimental cardiac ultrasound data. MCA is demonstrated in both cases to outperform the FIR filter and obtain results comparable to the SVF method in terms of contrast-to-noise ratio (CNR). Furthermore, MCA shows a lower impact on tissue sections while removing the clutter artifacts. In experimental heart data, MCA achieves clutter mitigation with an average CNR improvement of 1.33 dB. PMID:26199622
Artifact suppression and analysis of brain activities with electroencephalography signals.
Rashed-Al-Mahfuz, Md; Islam, Md Rabiul; Hirose, Keikichi; Molla, Md Khademul Islam
2013-06-05
A brain-computer interface is a communication system that connects the brain with a computer (or other device) but is not dependent on the normal output pathways of the brain (i.e., peripheral nerves and muscles). The electro-oculogram is a dominant artifact which has a significant negative influence on further analysis of real electroencephalography data. This paper presents a data-adaptive technique for artifact suppression and brain wave extraction from electroencephalography signals to detect regional brain activities. An empirical mode decomposition based adaptive thresholding approach was employed to suppress the electro-oculogram artifact. Fractional Gaussian noise was used to determine the threshold level, derived from the analyzed data without any training. The purified electroencephalography signal was composed of brain waves, also called rhythmic components, which represent brain activities. The rhythmic components were extracted from each electroencephalography channel using an adaptive Wiener filter at the original scale. The regional brain activities were mapped on the basis of the spatial distribution of the rhythmic components, and the results showed that different regions of the brain are activated in response to different stimuli. This research analyzed the activities of a single rhythmic component, alpha, with respect to different motor imagery tasks. The experimental results showed that the proposed method is very efficient in artifact suppression and in identifying individual motor imagery based on the activities of the alpha component.
Negligible Motion Artifacts in Scalp Electroencephalography (EEG) During Treadmill Walking.
Nathan, Kevin; Contreras-Vidal, Jose L
2015-01-01
Recent mobile brain/body imaging (MoBI) techniques based on active-electrode scalp electroencephalography (EEG) allow the acquisition and real-time analysis of brain dynamics during active unrestrained motor behavior involving whole body movements such as treadmill walking, over-ground walking, and other locomotive and non-locomotive tasks. Unfortunately, MoBI protocols are prone to physiological and non-physiological artifacts, including motion artifacts that may contaminate the EEG recordings. A few attempts have been made to quantify these artifacts during locomotion tasks but with inconclusive results due in part to methodological pitfalls. In this paper, we investigate the potential contributions of motion artifacts in scalp EEG during treadmill walking at three different speeds (1.5, 3.0, and 4.5 km/h) using a wireless 64-channel active EEG system and a wireless inertial sensor attached to the subject's head. The experimental setup was designed according to good measurement practices using state-of-the-art commercially available instruments, and the measurements were analyzed using Fourier analysis and wavelet coherence approaches. Contrary to prior claims, the subjects' motion did not significantly affect their EEG during treadmill walking, although caution should be taken when gait speeds approach 4.5 km/h. Overall, these findings suggest that MoBI methods may be safely deployed in neural, cognitive, and rehabilitation engineering applications.
An adaptive singular spectrum analysis method for extracting brain rhythms of electroencephalography
Hu, Hai; Guo, Shengxin; Liu, Ran
2017-01-01
Artifact removal and rhythm extraction from electroencephalography (EEG) signals are important for portable and wearable EEG recording devices. Incorporating a novel grouping rule, we proposed an adaptive singular spectrum analysis (SSA) method for artifact removal and rhythm extraction. Based on the EEG signal amplitude, the grouping rule adaptively designates the first one or two SSA reconstructed components as artifacts and removes them. The remaining reconstructed components are then grouped based on their peak frequencies in the Fourier transform to extract the desired rhythms. The grouping rule thus enables SSA to adapt to EEG signals containing different levels of artifacts and rhythms. Simulated EEG data based on the Markov Process Amplitude (MPA) EEG model and experimental EEG data recorded in the eyes-open and eyes-closed states were used to verify the adaptive SSA method. Results showed better performance in artifact removal and rhythm extraction compared with wavelet decomposition (WDec) and two recently reported SSA methods. Features of the alpha rhythms extracted using adaptive SSA were calculated to distinguish between the eyes-open and eyes-closed states. Results showed a higher accuracy (95.8%) than those of the WDec method (79.2%) and the infinite impulse response (IIR) filtering method (83.3%). PMID:28674650
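The sketch below outlines basic SSA (trajectory-matrix embedding, SVD, diagonal averaging) with a simple grouping heuristic: drop the leading reconstructed component as artifact and sum the remainder into bands by spectral peak. The window length, band edges, and the rule of discarding the first component are illustrative assumptions, not the paper's exact adaptive grouping rule.

```python
import numpy as np

def ssa_components(x, L=125):
    """Decompose a 1-D signal x into SSA reconstructed components (RCs)."""
    N = len(x)
    K = N - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])    # trajectory matrix (L x K)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    rcs = []
    for i in range(len(s)):
        Xi = s[i] * np.outer(U[:, i], Vt[i])               # elementary matrix
        # Diagonal averaging (Hankelization) back to a series of length N
        rc = np.array([np.mean(np.diag(Xi[:, ::-1], K - 1 - t)) for t in range(N)])
        rcs.append(rc)
    return np.array(rcs)

def group_by_band(rcs, fs=250.0,
                  bands={'delta': (1, 4), 'theta': (4, 8),
                         'alpha': (8, 13), 'beta': (13, 30)}):
    """Assign each RC to the band containing its spectral peak; sum per band."""
    n = rcs.shape[1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    rhythms = {name: np.zeros(n) for name in bands}
    for rc in rcs:
        peak = freqs[np.argmax(np.abs(np.fft.rfft(rc)))]
        for name, (lo, hi) in bands.items():
            if lo <= peak < hi:
                rhythms[name] += rc
                break
    return rhythms

# Illustrative use: treat the first, largest-amplitude RC as artifact and discard it.
# rcs = ssa_components(eeg_channel, L=125)
# rhythms = group_by_band(rcs[1:], fs=250.0)
```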
Quality evaluation of motion-compensated edge artifacts in compressed video.
Leontaris, Athanasios; Cosman, Pamela C; Reibman, Amy R
2007-04-01
Little attention has been paid to an impairment common in motion-compensated video compression: the addition of high-frequency (HF) energy as motion compensation displaces blocking artifacts off block boundaries. In this paper, we employ an energy-based approach to measure this motion-compensated edge artifact, using both compressed bitstream information and decoded pixels. We evaluate the performance of our proposed metric, along with several blocking and blurring metrics, on compressed video in two ways. First, ordinal scales are evaluated through a series of expectations that a good quality metric should satisfy: the objective evaluation. Then, the best performing metrics are subjectively evaluated. The same subjective data set is finally used to obtain interval scales to gain more insight. Experimental results show that we accurately estimate the percentage of the added HF energy in compressed video.
Reducing Interpolation Artifacts for Mutual Information Based Image Registration
Soleimani, H.; Khosravifard, M.A.
2011-01-01
Medical image registration methods which use mutual information as a similarity measure have been improved in recent decades. Mutual information is a basic concept of information theory that quantifies the dependency between two random variables (or two images). In order to evaluate the mutual information of two images, their joint probability distribution is required. Several interpolation methods, such as Partial Volume (PV) and bilinear, are used to estimate the joint probability distribution. Both of these methods introduce artifacts into the mutual information function. The Partial Volume-Hanning window (PVH) and Generalized Partial Volume (GPV) methods were introduced to remove such artifacts. In this paper we show that the acceptable performance of these methods is not due to their kernel function; it is due to the number of pixels involved in the interpolation. Since using more pixels requires a more complex and time-consuming interpolation process, we propose a new interpolation method which uses only four pixels (the same as PV and bilinear interpolation) and removes most of the artifacts. Experimental results on the registration of Computed Tomography (CT) images show the superiority of the proposed scheme. PMID:22606673
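As an illustration of partial-volume interpolation, the sketch below fills a joint histogram by spreading fractional weights over the four neighbouring intensity pairs for a sub-pixel translation and then evaluates mutual information; the bin count and normalization are illustrative choices rather than the paper's settings.

```python
import numpy as np

def mutual_information_pv(fixed, moving, tx, ty, bins=64):
    """MI between `fixed` and `moving` shifted by a sub-pixel offset (tx, ty),
    using partial-volume interpolation to fill the joint histogram."""
    h, w = fixed.shape
    # Quantize intensities to histogram bin indices.
    fbin = np.clip((fixed / fixed.max() * (bins - 1)).astype(int), 0, bins - 1)
    mbin = np.clip((moving / moving.max() * (bins - 1)).astype(int), 0, bins - 1)
    joint = np.zeros((bins, bins))
    x0, y0 = int(np.floor(tx)), int(np.floor(ty))
    fx, fy = tx - x0, ty - y0
    # PV interpolation: each fixed-image pixel spreads fractional weights over
    # the four moving-image neighbours instead of interpolating intensities.
    for dy, dx, wgt in [(0, 0, (1 - fx) * (1 - fy)), (0, 1, fx * (1 - fy)),
                        (1, 0, (1 - fx) * fy),       (1, 1, fx * fy)]:
        ys = np.arange(h) + y0 + dy
        xs = np.arange(w) + x0 + dx
        valid_y = (ys >= 0) & (ys < h)
        valid_x = (xs >= 0) & (xs < w)
        f_idx = fbin[np.ix_(valid_y, valid_x)]
        m_idx = mbin[np.ix_(ys[valid_y], xs[valid_x])]
        np.add.at(joint, (f_idx.ravel(), m_idx.ravel()), wgt)
    pxy = joint / joint.sum()
    px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```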
Kanawati, Basem; Bader, Theresa M; Wanczek, Karl-Peter; Li, Yan; Schmitt-Kopplin, Philippe
2017-10-15
Peak picking algorithms in mass spectrometry face the challenge of picking the correct signals from a mass spectrum. In some cases signal wiggles (side lobes) are also included in the produced mass list as if they were real signals. Constraints defined in such algorithms do not always guarantee wiggle-free, accurate mass list generation from raw mass spectra. This problem intensifies for acquisitions with longer transients; it is therefore a contemporary issue that grows with modern high-memory digitizers and affects both MS and MS/MS spectra. A solariX FTMS mass spectrometer with an Infinity ICR cell (Bruker Daltonics, Bremen, Germany) coupled to a 12 Tesla magnet (Magnex, UK) was used for the experimental study. Time-domain transients of several data lengths (512k, 1M, 2M, 4M, and 8M points) were acquired and Fourier-transformed to obtain frequency spectra that show the effect of transient truncation on sinc wiggle development in FT-ICR-MS. MATLAB simulations were also performed to investigate the origin of the Fourier transform (FT) artifacts. A new filter has been developed to identify and remove FT artifacts (sinc side lobes) from both frequency and mass spectra. The newly developed filter is based on distinguishing between the FWHM of the correct frequency/mass signals and the FWHM of their corresponding wiggles. The filter draws a reliable confidence limit of the resolution range within which a correct frequency/mass signal is identified. The filter is applicable over a wide mass range of metabolic interest (100-1200 amu). The origin of FT artifacts due to time-domain transient truncation was thoroughly investigated both experimentally and by simulations in this study. A new solution for this problem, with automatic recognition and elimination of these FT artifacts (side lobes/wiggles), is provided, which is independent of any intensity thresholds, magnetic field strengths and time-domain transient lengths. Copyright © 2017 John Wiley & Sons, Ltd.
Uncertainty Analysis for Angle Calibrations Using Circle Closure
Estler, W. Tyler
1998-01-01
We analyze two types of full-circle angle calibrations: a simple closure in which a single set of unknown angular segments is sequentially compared with an unknown reference angle, and a dual closure in which two divided circles are simultaneously calibrated by intercomparison. In each case, the constraint of circle closure provides auxiliary information that (1) enables a complete calibration process without reference to separately calibrated reference artifacts, and (2) serves to reduce measurement uncertainty. We derive closed-form expressions for the combined standard uncertainties of angle calibrations, following guidelines published by the International Organization for Standardization (ISO) and NIST. The analysis includes methods for the quantitative evaluation of the standard uncertainty of small angle measurement using electronic autocollimators, including the effects of calibration uncertainty and air turbulence. PMID:28009359
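A minimal numerical illustration of the simple-closure idea, assuming equally weighted comparisons: each unknown segment is compared against the same unknown reference angle, and the closure constraint (segments summing to 360°) recovers the reference without any separately calibrated artifact. This is a schematic of the principle only, not the paper's uncertainty analysis.

```python
import numpy as np

# Simple closure: measurements m_i = a_i - R, where both the segments a_i and the
# reference angle R are unknown.  Closure (sum of a_i = 360 deg) fixes R:
#   sum(m_i) = sum(a_i) - n*R = 360 - n*R   =>   R = (360 - sum(m_i)) / n
def simple_closure(measured_diffs_deg):
    m = np.asarray(measured_diffs_deg, dtype=float)
    n = len(m)
    R = (360.0 - m.sum()) / n          # reference angle recovered from closure
    segments = m + R                   # calibrated segment angles
    return R, segments

# Example: a 12-sided polygon with ~30 deg faces and small measurement noise.
rng = np.random.default_rng(0)
true_segments = 30.0 + rng.normal(0, 0.001, 12)
true_segments *= 360.0 / true_segments.sum()     # enforce exact closure
true_R = 29.998
measured = true_segments - true_R + rng.normal(0, 0.0002, 12)
R_hat, seg_hat = simple_closure(measured)
```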
Parks, Nathan A.
2013-01-01
The simultaneous application of transcranial magnetic stimulation (TMS) with non-invasive neuroimaging provides a powerful method for investigating functional connectivity in the human brain and the causal relationships between areas in distributed brain networks. TMS has been combined with numerous neuroimaging techniques including, electroencephalography (EEG), functional magnetic resonance imaging (fMRI), and positron emission tomography (PET). Recent work has also demonstrated the feasibility and utility of combining TMS with non-invasive near-infrared optical imaging techniques, functional near-infrared spectroscopy (fNIRS) and the event-related optical signal (EROS). Simultaneous TMS and optical imaging affords a number of advantages over other neuroimaging methods but also involves a unique set of methodological challenges and considerations. This paper describes the methodology of concurrently performing optical imaging during the administration of TMS, focusing on experimental design, potential artifacts, and approaches to controlling for these artifacts. PMID:24065911
Kruger, David G; Riederer, Stephen J; Rossman, Phillip J; Mostardi, Petrice M; Madhuranthakam, Ananth J; Hu, Houchun H
2005-09-01
MR images formed using extended FOV continuously moving table data acquisition can have signal falloff and loss of lateral spatial resolution at localized, periodic positions along the direction of table motion. In this work we identify the origin of these artifacts and provide a means for correction. The artifacts are due to a mismatch of the phase of signals acquired from contiguous sampling fields of view and are most pronounced when the central k-space views are being sampled. Correction can be performed using the phase information from a periodically sampled central view to adjust the phase of all other views of that view cycle, making the net phase uniform across each axial plane. Results from experimental phantom and contrast-enhanced peripheral MRA studies show that the correction technique substantially eliminates the artifact for a variety of phase encode orders. Copyright (c) 2005 Wiley-Liss, Inc.
NASA Astrophysics Data System (ADS)
Lee, Daeho; Lee, Seohyung
2017-11-01
We propose an image stitching method that can remove ghost effects and realign the structure misalignments that occur in common image stitching methods. To reduce the artifacts caused by different parallaxes, an optimal seam pair is selected by comparing the cross correlations from multiple seams detected by variable cost weights. Along the optimal seam pair, a histogram of oriented gradients is calculated, and feature points for matching are detected. The homography is refined using the matching points, and the remaining misalignment is eliminated using the propagation of deformation vectors calculated from matching points. In multiband blending, the overlapping regions are determined from a distance between the matching points to remove overlapping artifacts. The experimental results show that the proposed method more robustly eliminates misalignments and overlapping artifacts than the existing method that uses single seam detection and gradient features.
Zou, Ling; Chen, Shuyue; Sun, Yuqiang; Ma, Zhenghua
2010-08-01
In this paper we present a new method of combining Independent Component Analysis (ICA) and a wavelet de-noising algorithm to extract event-related potentials (ERPs). First, the extended Infomax ICA algorithm is used to analyze the EEG signals and obtain the independent components (ICs); then, the WaveShrink (WS) method is applied to the demixed ICs as an intermediate step; the EEG data are rebuilt by applying the inverse ICA to the new ICs; and the ERPs are extracted from the de-noised EEG data after averaging over several trials. The experimental results showed that both the combined method and the ICA method could remove eye artifacts and muscle artifacts mixed into the ERPs, while the combined method could retain the brain neural activity mixed into the noise ICs and could extract the weak ERPs efficiently from strong background artifacts.
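A minimal sketch of the combined pipeline, using scikit-learn's FastICA as a stand-in for the extended Infomax algorithm and PyWavelets soft-thresholding of each IC before reprojection; the wavelet, decomposition level, and universal threshold are illustrative choices.

```python
import numpy as np
import pywt
from sklearn.decomposition import FastICA

def ica_wavelet_denoise(eeg, wavelet='db4', level=4):
    """eeg: (n_samples, n_channels). Returns artifact-attenuated EEG."""
    ica = FastICA(n_components=eeg.shape[1], random_state=0)
    sources = ica.fit_transform(eeg)          # ICs, shape (n_samples, n_components)
    cleaned = np.empty_like(sources)
    for i in range(sources.shape[1]):
        coeffs = pywt.wavedec(sources[:, i], wavelet, level=level)
        # Universal threshold estimated from the finest detail coefficients (MAD).
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thr = sigma * np.sqrt(2 * np.log(sources.shape[0]))
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
        cleaned[:, i] = pywt.waverec(coeffs, wavelet)[:sources.shape[0]]
    # Rebuild channel-space EEG from the shrunken ICs.
    return cleaned @ ica.mixing_.T + ica.mean_
```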
Mannan, Malik M Naeem; Jeong, Myung Y; Kamran, Muhammad A
2016-01-01
Electroencephalography (EEG) is a portable brain-imaging technique with the advantage of high temporal resolution that can be used to record the electrical activity of the brain. However, it is difficult to analyze EEG signals due to contamination by ocular artifacts, which can potentially lead to misleading conclusions. Ocular artifact contamination is also known to reduce the classification accuracy of a brain-computer interface (BCI). It is therefore very important to remove or reduce these artifacts before the analysis of EEG signals for applications such as BCI. In this paper, a hybrid framework that combines independent component analysis (ICA), regression and higher-order statistics is proposed to identify and eliminate artifactual activities from EEG data. We used simulated, experimental and standard EEG signals to evaluate and analyze the effectiveness of the proposed method. Results demonstrate that the proposed method can effectively remove ocular artifacts while preserving the neuronal signals present in the EEG data. A comparison with four methods from the literature, namely ICA, regression analysis, wavelet-ICA (wICA), and regression-ICA (REGICA), confirms the significantly enhanced performance and effectiveness of the proposed method for the removal of ocular activities from EEG, in terms of lower mean square error and mean absolute error values and higher mutual information between the reconstructed and original EEG.
Mannan, Malik M. Naeem; Jeong, Myung Y.; Kamran, Muhammad A.
2016-01-01
Electroencephalography (EEG) is a portable brain-imaging technique with the advantage of high temporal resolution that can be used to record the electrical activity of the brain. However, it is difficult to analyze EEG signals due to contamination by ocular artifacts, which can potentially lead to misleading conclusions. Ocular artifact contamination is also known to reduce the classification accuracy of a brain-computer interface (BCI). It is therefore very important to remove or reduce these artifacts before the analysis of EEG signals for applications such as BCI. In this paper, a hybrid framework that combines independent component analysis (ICA), regression and higher-order statistics is proposed to identify and eliminate artifactual activities from EEG data. We used simulated, experimental and standard EEG signals to evaluate and analyze the effectiveness of the proposed method. Results demonstrate that the proposed method can effectively remove ocular artifacts while preserving the neuronal signals present in the EEG data. A comparison with four methods from the literature, namely ICA, regression analysis, wavelet-ICA (wICA), and regression-ICA (REGICA), confirms the significantly enhanced performance and effectiveness of the proposed method for the removal of ocular activities from EEG, in terms of lower mean square error and mean absolute error values and higher mutual information between the reconstructed and original EEG. PMID:27199714
SVM-Based Spectral Analysis for Heart Rate from Multi-Channel WPPG Sensor Signals.
Xiong, Jiping; Cai, Lisang; Wang, Fei; He, Xiaowei
2017-03-03
Although wrist-type photoplethysmographic (hereafter referred to as WPPG) sensor signals can measure heart rate quite conveniently, the subjects' hand movements can cause strong motion artifacts that heavily contaminate the WPPG signals. Hence, it is challenging to accurately estimate heart rate from WPPG signals during intense physical activities. The WPPG approach has nevertheless attracted increasing attention thanks to the popularity of wrist-worn wearable devices. In this paper, a mixed approach called Mix-SVM is proposed; it uses multi-channel WPPG sensor signals and simultaneous acceleration signals to estimate heart rate. First, we combine principal component analysis and adaptive filtering to remove part of the motion artifacts. Due to the strong correlation between motion artifacts and acceleration signals, the further denoising problem is regarded as a sparse signal reconstruction problem. Then, we use a spectrum subtraction method to eliminate motion artifacts effectively. Finally, the spectral peak corresponding to heart rate is sought by an SVM-based spectral analysis method. On the public PPG database from the 2015 IEEE Signal Processing Cup, the average absolute error was 1.01 beats per minute and the Pearson correlation was 0.9972. These results confirm that the proposed Mix-SVM approach has potential for multi-channel WPPG-based heart rate estimation in the presence of intense physical exercise.
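The sketch below illustrates only the spectrum-subtraction step (not the full Mix-SVM pipeline): Welch spectra of the PPG and accelerometer channels are computed, a scaled motion spectrum is subtracted, and the heart rate is read off the largest remaining peak in a plausible band. The scaling rule and band limits are assumptions for illustration.

```python
import numpy as np
from scipy.signal import welch

def hr_spectrum_subtraction(ppg, acc, fs=125.0, band=(0.7, 3.5)):
    """ppg: (n,) PPG channel; acc: (n, 3) accelerometer. Returns HR in bpm."""
    f, p_ppg = welch(ppg, fs=fs, nperseg=8 * int(fs))
    p_acc = np.zeros_like(p_ppg)
    for axis in range(acc.shape[1]):
        _, pa = welch(acc[:, axis], fs=fs, nperseg=8 * int(fs))
        p_acc += pa
    # Scale the motion spectrum to the PPG spectrum and subtract (floor at zero).
    scale = np.dot(p_ppg, p_acc) / (np.dot(p_acc, p_acc) + 1e-12)
    residual = np.clip(p_ppg - scale * p_acc, 0.0, None)
    mask = (f >= band[0]) & (f <= band[1])          # plausible heart-rate band
    return 60.0 * f[mask][np.argmax(residual[mask])]
```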
Wagner, Franca; Wimmer, Wilhelm; Leidolt, Lars; Vischer, Mattheus; Weder, Stefan; Wiest, Roland; Mantokoudis, Georgios; Caversaccio, Marco D.
2015-01-01
Objective Cochlear implants (CIs) are standard treatment for postlingually deafened individuals and prelingually deafened children. This human cadaver study evaluated diagnostic usefulness, image quality and artifacts in 1.5T and 3T magnetic resonance (MR) brain scans after CI with a removable magnet. Methods Three criteria (diagnostic usefulness, image quality, artifacts) were assessed at 1.5T and 3T in five cadaver heads with CI. The brain magnetic resonance scans were performed with and without the magnet in situ. The criteria were analyzed by two blinded neuroradiologists, with focus on image distortion and limitation of the diagnostic value of the acquired MR images. Results MR images with the magnet in situ were all compromised by artifacts caused by the CI. After removal of the magnet, MR scans showed an unequivocal artifact reduction with significant improvement of the image quality and diagnostic usefulness, both at 1.5T and 3T. Visibility of the brain stem, cerebellopontine angle, and parieto-occipital lobe ipsilateral to the CI increased significantly after magnet removal. Conclusions The results indicate the possible advantages for 1.5T and 3T MR scanning of the brain in CI carriers with removable magnets. Our findings support use of CIs with removable magnets, especially in patients with chronic intracranial pathologies. PMID:26200775
Wagner, Franca; Wimmer, Wilhelm; Leidolt, Lars; Vischer, Mattheus; Weder, Stefan; Wiest, Roland; Mantokoudis, Georgios; Caversaccio, Marco D
2015-01-01
Cochlear implants (CIs) are standard treatment for postlingually deafened individuals and prelingually deafened children. This human cadaver study evaluated diagnostic usefulness, image quality and artifacts in 1.5T and 3T magnetic resonance (MR) brain scans after CI with a removable magnet. Three criteria (diagnostic usefulness, image quality, artifacts) were assessed at 1.5T and 3T in five cadaver heads with CI. The brain magnetic resonance scans were performed with and without the magnet in situ. The criteria were analyzed by two blinded neuroradiologists, with focus on image distortion and limitation of the diagnostic value of the acquired MR images. MR images with the magnet in situ were all compromised by artifacts caused by the CI. After removal of the magnet, MR scans showed an unequivocal artifact reduction with significant improvement of the image quality and diagnostic usefulness, both at 1.5T and 3T. Visibility of the brain stem, cerebellopontine angle, and parieto-occipital lobe ipsilateral to the CI increased significantly after magnet removal. The results indicate the possible advantages for 1.5T and 3T MR scanning of the brain in CI carriers with removable magnets. Our findings support use of CIs with removable magnets, especially in patients with chronic intracranial pathologies.
Supervised Detection of Anomalous Light Curves in Massive Astronomical Catalogs
NASA Astrophysics Data System (ADS)
Nun, Isadora; Pichara, Karim; Protopapas, Pavlos; Kim, Dae-Won
2014-09-01
The development of synoptic sky surveys has led to a massive amount of data for which resources needed for analysis are beyond human capabilities. In order to process this information and to extract all possible knowledge, machine learning techniques become necessary. Here we present a new methodology to automatically discover unknown variable objects in large astronomical catalogs. With the aim of taking full advantage of all the information we have about known objects, our method is based on a supervised algorithm. In particular, we train a random forest classifier using known variability classes of objects and obtain votes for each of the objects in the training set. We then model this voting distribution with a Bayesian network and obtain the joint voting distribution among the training objects. Consequently, an unknown object is considered an outlier insofar as it has a low joint probability. By leaving out one of the classes in the training set, we perform a validity test and show that when the random forest classifier attempts to classify unknown light curves (the class left out), it votes with an unusual distribution among the classes. This rare voting is detected by the Bayesian network and expressed as a low joint probability. Our method is suitable for exploring massive data sets given that the training process is performed offline. We tested our algorithm on 20 million light curves from the MACHO catalog and generated a list of anomalous candidates. After analysis, we divided the candidates into two main classes of outliers: artifacts and intrinsic outliers. Artifacts were principally due to air mass variation, seasonal variation, bad calibration, or instrumental errors and were consequently removed from our outlier list and added to the training set. After retraining, we selected about 4000 objects, which we passed to a post-analysis stage by performing a cross-match with all publicly available catalogs. Within these candidates we identified certain known but rare objects such as eclipsing Cepheids, blue variables, cataclysmic variables, and X-ray sources. For some outliers there was no additional information. Among them we identified three unknown variability types and a few individual outliers that will be followed up in order to perform a deeper analysis.
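A schematic of the voting-based outlier idea, with a kernel density estimate standing in for the paper's Bayesian network over the vote distribution; the feature and label arrays are assumed to be prepared elsewhere, and all hyperparameters are illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KernelDensity

def fit_outlier_scorer(features, labels):
    """Train on known variability classes; return (forest, vote-density model)."""
    forest = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
    forest.fit(features, labels)
    votes = forest.predict_proba(features)           # per-object vote distribution
    density = KernelDensity(bandwidth=0.05).fit(votes)
    return forest, density

def outlier_score(forest, density, new_features):
    """Low probability of the vote vector => anomaly candidate."""
    votes = forest.predict_proba(new_features)
    return -density.score_samples(votes)             # higher = more anomalous
```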
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nun, Isadora; Pichara, Karim; Protopapas, Pavlos
The development of synoptic sky surveys has led to a massive amount of data for which resources needed for analysis are beyond human capabilities. In order to process this information and to extract all possible knowledge, machine learning techniques become necessary. Here we present a new methodology to automatically discover unknown variable objects in large astronomical catalogs. With the aim of taking full advantage of all the information we have about known objects, our method is based on a supervised algorithm. In particular, we train a random forest classifier using known variability classes of objects and obtain votes for each of the objects in the training set. We then model this voting distribution with a Bayesian network and obtain the joint voting distribution among the training objects. Consequently, an unknown object is considered an outlier insofar as it has a low joint probability. By leaving out one of the classes in the training set, we perform a validity test and show that when the random forest classifier attempts to classify unknown light curves (the class left out), it votes with an unusual distribution among the classes. This rare voting is detected by the Bayesian network and expressed as a low joint probability. Our method is suitable for exploring massive data sets given that the training process is performed offline. We tested our algorithm on 20 million light curves from the MACHO catalog and generated a list of anomalous candidates. After analysis, we divided the candidates into two main classes of outliers: artifacts and intrinsic outliers. Artifacts were principally due to air mass variation, seasonal variation, bad calibration, or instrumental errors and were consequently removed from our outlier list and added to the training set. After retraining, we selected about 4000 objects, which we passed to a post-analysis stage by performing a cross-match with all publicly available catalogs. Within these candidates we identified certain known but rare objects such as eclipsing Cepheids, blue variables, cataclysmic variables, and X-ray sources. For some outliers there was no additional information. Among them we identified three unknown variability types and a few individual outliers that will be followed up in order to perform a deeper analysis.
A novel analytical technique suitable for the identification of plastics.
Nečemer, Marijan; Kump, Peter; Sket, Primož; Plavec, Janez; Grdadolnik, Jože; Zvanut, Maja
2013-01-01
The enormous development and production of plastic materials in the last century has resulted in increasing numbers of such objects. A simple and fast technique to classify different types of plastics would be useful in many activities dealing with plastic materials, such as food packaging and the sorting of used plastics, and, if non-destructive, also for the conservation of plastic artifacts in museum collections, a relatively new field of interest since 1990. In our previous paper we introduced a non-destructive technique for fast identification of unknown plastics based on EDXRF spectrometry [1], using as a case study plastic artifacts archived in the Museum in order to show the advantages of non-destructive identification of plastic materials. In order to validate our technique, it was necessary to compare its analyses with those of analytical techniques that are more established and so far rather widely applied for identifying the most common sorts of plastic materials.
NASA Astrophysics Data System (ADS)
Xu, S.; Uneri, A.; Khanna, A. Jay; Siewerdsen, J. H.; Stayman, J. W.
2017-04-01
Metal artifacts can cause substantial image quality issues in computed tomography. This is particularly true in interventional imaging, where surgical tools or metal implants are in the field-of-view. Moreover, the region-of-interest is often near such devices, which is exactly where image quality degradations are largest. Previous work on known-component reconstruction (KCR) has shown that incorporating a physical model (e.g. shape, material composition, etc.) of the metal component into the reconstruction algorithm can significantly reduce artifacts even near the edge of a metal component. However, for such approaches to be effective, they must have an accurate model of the component that includes energy-dependent properties of both the metal device and the CT scanner, placing a burden on system characterization and component material knowledge. In this work, we propose a modified KCR approach that adopts a mixed forward model with a polyenergetic model for the component and a monoenergetic model for the background anatomy. This new approach, called Poly-KCR, jointly estimates a spectral transfer function associated with known components in addition to the background attenuation values. Thus, this approach eliminates both the need to know the component material composition a priori and the requirement for an energy-dependent characterization of the CT scanner. We demonstrate the efficacy of this novel approach and illustrate its improved performance over traditional and model-based iterative reconstruction methods in both simulation studies and in physical data, including an implanted cadaver sample.
Lisciandro, Gregory R; Fosgate, Geoffrey T; Fulton, Robert M
2014-01-01
Lung ultrasound is superior to lung auscultation and supine chest radiography for many respiratory conditions in human patients. Ultrasound diagnoses are based on easily learned patterns of sonographic findings and artifacts in standardized images. By applying the wet lung (ultrasound lung rockets or B-lines, representing interstitial edema) versus dry lung (A-lines with a glide sign) concept, many respiratory conditions can be diagnosed or excluded. The ultrasound probe can be used as a visual stethoscope for the evaluation of human lungs because dry artifacts (A-lines with a glide sign) predominate over wet artifacts (ultrasound lung rockets or B-lines). However, the frequency and number of wet lung ultrasound artifacts in dogs with radiographically normal lungs is unknown. Thus, the primary objective was to determine the baseline frequency and number of ultrasound lung rockets in dogs without clinical signs of respiratory disease and with radiographically normal lung findings using a novel 8-view, regionally based lung ultrasound examination called Vet BLUE. The frequency of ultrasound lung rockets was statistically compared based on signalment, body condition score, investigator, and reasons for radiography. Ten left-sided heart failure dogs were similarly enrolled. The overall frequency of ultrasound lung rockets was 11% (95% confidence interval, 6-19%) in dogs without respiratory disease versus 100% (95% confidence interval, 74-100%) in those with left-sided heart failure. The low frequency and number of ultrasound lung rockets observed in dogs without respiratory disease and with radiographically normal lungs suggests that Vet BLUE will be clinically useful for the identification of canine respiratory conditions. © 2014 American College of Veterinary Radiology.
caCORRECT2: Improving the accuracy and reliability of microarray data in the presence of artifacts
2011-01-01
Background In previous work, we reported the development of caCORRECT, a novel microarray quality control system built to identify and correct spatial artifacts commonly found on Affymetrix arrays. We have made recent improvements to caCORRECT, including the development of a model-based data-replacement strategy and integration with typical microarray workflows via caCORRECT's web portal and caBIG grid services. In this report, we demonstrate that caCORRECT improves the reproducibility and reliability of experimental results across several common Affymetrix microarray platforms. caCORRECT represents an advance over state-of-the-art quality control methods such as Harshlighting, and acts to improve gene expression calculation techniques such as PLIER, RMA and MAS5.0, because it incorporates spatial information into outlier detection as well as outlier information into probe normalization. The ability of caCORRECT to recover accurate gene expressions from low quality probe intensity data is assessed using a combination of real and synthetic artifacts with PCR follow-up confirmation and the affycomp spike-in data. The caCORRECT tool can be accessed at the website: http://cacorrect.bme.gatech.edu. Results We demonstrate that (1) caCORRECT's artifact-aware normalization avoids the undesirable global data warping that happens when any damaged chips are processed without caCORRECT; (2) when used upstream of RMA, PLIER, or MAS5.0, the data imputation of caCORRECT generally improves the accuracy of microarray gene expression in the presence of artifacts more than using Harshlighting or not using any quality control; (3) biomarkers selected from artifactual microarray data which have undergone the quality control procedures of caCORRECT are more likely to be reliable, as shown by both spike-in and PCR validation experiments. Finally, we present a case study of the use of caCORRECT to reliably identify biomarkers for renal cell carcinoma, yielding two diagnostic biomarkers with potential clinical utility, PRKAB1 and NNMT. Conclusions caCORRECT is shown to improve the accuracy of gene expression, and the reproducibility of experimental results in clinical application. This study suggests that caCORRECT will be useful to clean up possible artifacts in new as well as archived microarray data. PMID:21957981
Edlinger, Christoph; Granitz, Marcel; Paar, Vera; Jung, Christian; Pfeil, Alexander; Eder, Sarah; Wernly, Bernhard; Kammler, Jürgen; Hergan, Klaus; Hoppe, Uta C; Steinwender, Clemens; Lichtenauer, Michael; Kypta, Alexander
2018-05-23
Leadless pacemaker systems are an important upcoming device class in clinical rhythmology. Currently two different products are available, with the Micra system (Medtronic) being the most used in the clinical setting to date. The possibility to perform magnetic resonance imaging (MRI) is an important feature of modern pacemaker devices. Even though the Micra system is suitable for MRI, little is yet known about its impact on artifacts within the images. The aim of our ex vivo study was to perform cardiac MRI to quantify the artifacts and to evaluate whether the artifacts limit or inhibit the assessment of the surrounding myocardium. After ex vivo implantation of the leadless pacemaker (LP) in a porcine model, hearts were filled with saline solution and fixed on wooden sticks in a plastic container. The model was examined at 1.5 T and at 3 T using conventional sequences and T2 mapping sequences. In addition, conventional X‑rays and computed tomography (CT) scans were performed. Correct implantation of the LP could be performed in all hearts. In almost all MRI sequences the right ventricle and the septal region surrounding the LP were altered by an artifact and could therefore be assessed only to a limited extent; however, the rest of the myocardium remained free of artifacts and evaluable for common radiologic diagnoses. A characteristic shamrock-shaped artifact was generated, which appeared even more intense in magnitude and brightness at 3 T compared to 1.5 T. The use of the Micra system in cardiac MRI appears to be feasible. In our opinion, it will still be possible to make important clinical cardiac MRI diagnoses (e.g., the detection of major ischemic areas or inflammatory processes) in patients using the Micra system. We suggest the use of 1.5 T as the preferred field strength in clinical practice.
Manns, F; Milne, P J; Gonzalez-Cirre, X; Denham, D B; Parel, J M; Robinson, D S
1998-01-01
The purpose of this work was to quantify the magnitude of an artifact induced by stainless steel thermocouple probes in temperature measurements made in situ during experimental laser interstitial thermo-therapy (LITT). A procedure for correction of this observational error is outlined. A CW Nd:YAG laser system emitting 20W for 25-30 s delivered through a fiber-optic probe was used to create localized heating. The temperature field around the fiber-optic probe during laser irradiation was measured every 0.3 s in air, water, 0.4% intralipid solution, and fatty cadaver pig tissue, with a field of up to fifteen needle thermocouple probes. Direct absorption of Nd:YAG laser radiation by the thermocouple probes induced an overestimation of the temperature, ranging from 1.8 degrees C to 118.6 degrees C in air, 2.2 degrees C to 9.9 degrees C in water, 0.7 C to 4.7 C in intralipid and 0.3 C to 17.9 C in porcine tissue after irradiation at 20W for 30 s and depending on the thermocouple location. The artifact in porcine tissue was removed by applying exponential and linear fits to the measured temperature curves. Light absorption by thermocouple probes can induce a significant artifact in the measurement of laser-induced temperature increases. When the time constant of the thermocouple effect is much smaller than the thermal relaxation time of the surrounding tissue, the artifact can be accurately quantified. During LITT experiments where temperature differences of a few degrees are significant, the thermocouple artifact must be removed in order to be able accurately to predict the treatment outcome.
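A sketch in the spirit of the described correction, assuming the post-irradiation decay can be modeled as a fast exponential (thermocouple self-heating) plus a slow linear tissue term; the model form, initial guesses, and parameter names are illustrative, not the paper's exact fitting procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def decay_model(t, a, tau, b, c):
    """Fast exponential (thermocouple self-heating) plus slow linear tissue term."""
    return a * np.exp(-t / tau) + b * t + c

def thermocouple_artifact(t_post, temp_post):
    """Fit the post-irradiation decay and return the fast-component amplitude.

    t_post, temp_post: time (s) and temperature (deg C) samples recorded
    right after the laser is switched off (e.g., every 0.3 s).
    """
    p0 = [temp_post[0] - temp_post[-1], 1.0, 0.0, temp_post[-1]]
    (a, tau, b, c), _ = curve_fit(decay_model, t_post, temp_post, p0=p0, maxfev=10000)
    # 'a' approximates the artifact present while the laser was on, provided
    # tau is much shorter than the tissue's thermal relaxation time.
    return a, tau

# corrected_peak = measured_peak - a   # subtract the artifact from in-beam readings
```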
Sequentially reweighted TV minimization for CT metal artifact reduction.
Zhang, Xiaomeng; Xing, Lei
2013-07-01
Metal artifact reduction has long been an important topic in x-ray CT image reconstruction. In this work, the authors propose an iterative method that sequentially minimizes a reweighted total variation (TV) of the image and produces substantially artifact-reduced reconstructions. A sequentially reweighted TV minimization algorithm is proposed to fully exploit the sparseness of the image gradients (IG). The authors first formulate a constrained optimization model that minimizes a weighted TV of the image, subject to the constraint that the estimated projection data are within a specified tolerance of the available projection measurements, with image non-negativity enforced. The authors then solve a sequence of weighted TV minimization problems in which the weights used for the next iteration are computed from the current solution. Using the complete projection data, the algorithm first reconstructs an image from which a binary metal image can be extracted. Forward projection of the binary image identifies metal traces in the projection space. The metal-free background image is then reconstructed from the metal-trace-excluded projection data by employing a different set of weights. Each minimization problem is solved using a gradient method that alternates projection-onto-convex-sets and steepest descent. A series of simulation and experimental studies are performed to evaluate the proposed approach. Our study shows that the sequentially reweighted scheme, by altering a single parameter in the weighting function, flexibly controls the sparsity of the IG and reconstructs artifact-free images in a two-stage process. It successfully produces images with significantly reduced streak artifacts, suppressed noise and well-preserved contrast and edge properties. The sequentially reweighted TV minimization provides a systematic approach for suppressing CT metal artifacts. The technique can also be generalized to other "missing data" problems in CT image reconstruction.
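A minimal sketch of the reweighting idea for a 2D image: weights inversely proportional to the current image-gradient magnitude are recomputed after each weighted-TV solve. The inner loop here is a plain descent step on the weighted TV term only; the projection-onto-convex-sets data-consistency step of the actual algorithm is omitted, and all step sizes are illustrative.

```python
import numpy as np

def gradient_mag(img):
    gx = np.diff(img, axis=1, append=img[:, -1:])
    gy = np.diff(img, axis=0, append=img[-1:, :])
    return np.sqrt(gx**2 + gy**2)

def reweighted_tv_smooth(img, n_outer=4, n_inner=50, step=0.1, eps=1e-3):
    """Sequentially reweighted TV smoothing of an image (data term omitted)."""
    x = img.copy()
    for _ in range(n_outer):
        w = 1.0 / (gradient_mag(x) + eps)           # small gradients -> large weights
        for _ in range(n_inner):
            gx = np.diff(x, axis=1, append=x[:, -1:])
            gy = np.diff(x, axis=0, append=x[-1:, :])
            mag = np.sqrt(gx**2 + gy**2) + 1e-8
            # Divergence of the weighted, normalized gradient field
            div = (np.diff(w * gx / mag, axis=1, prepend=0)
                   + np.diff(w * gy / mag, axis=0, prepend=0))
            x += step * div                          # descent on the weighted TV term
    return x
```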
Cassani, Raymundo; Falk, Tiago H.; Fraga, Francisco J.; Kanda, Paulo A. M.; Anghinah, Renato
2014-01-01
Over the last decade, electroencephalography (EEG) has emerged as a reliable tool for the diagnosis of cortical disorders such as Alzheimer's disease (AD). EEG signals, however, are susceptible to several artifacts, such as ocular, muscular, movement, and environmental. To overcome this limitation, existing diagnostic systems commonly depend on experienced clinicians to manually select artifact-free epochs from the collected multi-channel EEG data. Manual selection, however, is a tedious and time-consuming process, rendering the diagnostic system “semi-automated.” Notwithstanding, a number of EEG artifact removal algorithms have been proposed in the literature. The (dis)advantages of using such algorithms in automated AD diagnostic systems, however, have not been documented; this paper aims to fill this gap. Here, we investigate the effects of three state-of-the-art automated artifact removal (AAR) algorithms (both alone and in combination with each other) on AD diagnostic systems based on four different classes of EEG features, namely, spectral, amplitude modulation rate of change, coherence, and phase. The three AAR algorithms tested are statistical artifact rejection (SAR), blind source separation based on second order blind identification and canonical correlation analysis (BSS-SOBI-CCA), and wavelet enhanced independent component analysis (wICA). Experimental results based on 20-channel resting-awake EEG data collected from 59 participants (20 patients with mild AD, 15 with moderate-to-severe AD, and 24 age-matched healthy controls) showed the wICA algorithm alone outperforming other enhancement algorithm combinations across three tasks: diagnosis (control vs. mild vs. moderate), early detection (control vs. mild), and disease progression (mild vs. moderate), thus opening the doors for fully-automated systems that can assist clinicians with early detection of AD, as well as disease severity progression assessment. PMID:24723886
Passive radiation detection using optically active CMOS sensors
NASA Astrophysics Data System (ADS)
Dosiek, Luke; Schalk, Patrick D.
2013-05-01
Recently, there have been a number of small-scale and hobbyist successes in employing commodity CMOS-based camera sensors for radiation detection. For example, several smartphone applications initially developed for use in areas near the Fukushima nuclear disaster are capable of detecting radiation using a cell phone camera, provided opaque tape is placed over the lens. In all current useful implementations, it is required that the sensor not be exposed to visible light. We seek to build a system that does not have this restriction. While building such a system would require sophisticated signal processing, it would nevertheless provide great benefits. In addition to fulfilling their primary function of image capture, cameras would also be able to detect unknown radiation sources even when the danger is considered to be low or non-existent. By experimentally profiling the image artifacts generated by gamma ray and β particle impacts, algorithms are developed to identify the unique features of radiation exposure, while discarding optical interaction and thermal noise effects. Preliminary results focus on achieving this goal in a laboratory setting, without regard to integration time or computational complexity. However, future work will seek to address these additional issues.
Reconstruction algorithm for polychromatic CT imaging: application to beam hardening correction
NASA Technical Reports Server (NTRS)
Yan, C. H.; Whalen, R. T.; Beaupre, G. S.; Yen, S. Y.; Napel, S.
2000-01-01
This paper presents a new reconstruction algorithm for both single- and dual-energy computed tomography (CT) imaging. By incorporating the polychromatic characteristics of the X-ray beam into the reconstruction process, the algorithm is capable of eliminating beam hardening artifacts. The single-energy version of the algorithm assumes that each voxel in the scan field can be expressed as a mixture of two known substances, for example, a mixture of trabecular bone and marrow, or a mixture of fat and flesh. These assumptions are easily satisfied in a quantitative computed tomography (QCT) setting. We have compared our algorithm to three commonly used single-energy correction techniques. Experimental results show that our algorithm is much more robust and accurate. We have also shown that QCT measurements obtained using our algorithm are five times more accurate than those from current QCT systems (using calibration). The dual-energy mode does not require any prior knowledge of the object in the scan field, and can be used to estimate the attenuation coefficient function of unknown materials. We have tested the dual-energy setup to obtain an accurate estimate for the attenuation coefficient function of K2HPO4 solution.
The use of wavelet filters for reducing noise in posterior fossa Computed Tomography images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pita-Machado, Reinado; Perez-Diaz, Marlen, E-mail: mperez@uclv.edu.cu; Lorenzo-Ginori, Juan V., E-mail: mperez@uclv.edu.cu
Wavelet-transform-based de-noising, such as wavelet shrinkage, gives good results in CT. This procedure affects the spatial resolution very little. Some applications are reconstruction methods, while others are a posteriori de-noising methods. De-noising after reconstruction is very difficult because the noise is non-stationary and has an unknown distribution. Methods that work in the sinogram space do not have this problem, because they always operate on a known noise distribution at that point. On the other hand, the posterior fossa in a head CT is a very complex region for physicians, because it is commonly affected by artifacts and noise which are not eliminated during the reconstruction procedure. This can lead to false positive evaluations. The purpose of our present work is to compare different wavelet shrinkage de-noising filters to reduce noise, particularly in images of the posterior fossa within CT scans, working in the sinogram space. This work describes an experimental search for the best wavelets to reduce Poisson noise in Computed Tomography (CT) scans. Results showed that de-noising with wavelet filters improved the quality of the posterior fossa region in terms of an increased CNR, without noticeable structural distortions.
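A minimal sketch of sinogram-space wavelet shrinkage using PyWavelets, assuming a 2D sinogram array; the wavelet, level, and universal soft threshold are illustrative choices. For strongly Poisson-dominated counts, a variance-stabilizing transform would typically be applied before thresholding, which is omitted here.

```python
import numpy as np
import pywt

def shrink_sinogram(sino, wavelet='sym8', level=3):
    """2-D soft-threshold wavelet shrinkage of a sinogram before reconstruction."""
    coeffs = pywt.wavedec2(sino, wavelet, level=level)
    # Noise level estimated from the finest diagonal detail band (MAD estimator).
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(sino.size))
    new_coeffs = [coeffs[0]]
    for (cH, cV, cD) in coeffs[1:]:
        new_coeffs.append(tuple(pywt.threshold(c, thr, mode='soft')
                                for c in (cH, cV, cD)))
    den = pywt.waverec2(new_coeffs, wavelet)
    return den[:sino.shape[0], :sino.shape[1]]      # trim possible padding
```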
NASA Astrophysics Data System (ADS)
Schoenfeld, Andreas A.; Wieker, Soeren; Harder, Dietrich; Poppe, Bjoern
2016-11-01
The optical origin of the lateral response and orientation artifacts, which occur when using EBT3 and EBT-XD radiochromic films together with flatbed scanners, has been reinvestigated by experimental and theoretical means. The common feature of these artifacts is the well-known parabolic increase in the optical density OD(x) = −log10[I(x)/I0(x)] versus offset x from the scanner midline (Poppinga et al 2014 Med. Phys. 41 021707). This holds for landscape and portrait orientations as well as for the three color channels. Dose-independent optical subjects, such as neutral density filters, linear polarizers, the EBT polyester foil and diffusive glass, also present the parabolic lateral artifact when scanned with a flatbed scanner. The curvature parameter c of the parabola function OD(x) = c0 + c·x² is found to be a linear function of the dose, the parameters of which are influenced by the film orientation and film type, EBT3 or EBT-XD. The ubiquitous parabolic shape of function OD(x) is attributed (a) to the optical path-length effect (van Battum et al 2016 Phys. Med. Biol. 61 625-49), due to the increasing obliquity of the optical scanner light associated with increasing offset x from the scanner midline, and (b) and (c) to the partial polarization and scattering of the light leaving the film, which affect the ratio I(x)/I0(x), thus making OD(x) increase with x². The orientation effect results from the changes of effects (b) and (c) associated with turning the film position, and thereby the orientation of the polymer structure of the sensitive film layer. In a comparison of experimental results obtained with selected optical subjects, the relative weights of the contributions of the optical path-length effect and the polarization and scattering of light leaving the films to the lateral response artifact have been estimated to be of the same order of magnitude. Mathematical models of these causes for the parabolic shape of function OD(x) are given as appendices.
Schoenfeld, Andreas A; Wieker, Soeren; Harder, Dietrich; Poppe, Bjoern
2016-11-07
The optical origin of the lateral response and orientation artifacts, which occur when using EBT3 and EBT-XD radiochromic films together with flatbed scanners, has been reinvestigated by experimental and theoretical means. The common feature of these artifacts is the well-known parabolic increase in the optical density OD(x) = −log10[I(x)/I0(x)] versus offset x from the scanner midline (Poppinga et al 2014 Med. Phys. 41 021707). This holds for landscape and portrait orientations as well as for the three color channels. Dose-independent optical subjects, such as neutral density filters, linear polarizers, the EBT polyester foil and diffusive glass, also present the parabolic lateral artifact when scanned with a flatbed scanner. The curvature parameter c of the parabola function OD(x) = c0 + c·x² is found to be a linear function of the dose, the parameters of which are influenced by the film orientation and film type, EBT3 or EBT-XD. The ubiquitous parabolic shape of function OD(x) is attributed (a) to the optical path-length effect (van Battum et al 2016 Phys. Med. Biol. 61 625-49), due to the increasing obliquity of the optical scanner light associated with increasing offset x from the scanner midline, and (b) and (c) to the partial polarization and scattering of the light leaving the film, which affect the ratio I(x)/I0(x), thus making OD(x) increase with x². The orientation effect results from the changes of effects (b) and (c) associated with turning the film position, and thereby the orientation of the polymer structure of the sensitive film layer. In a comparison of experimental results obtained with selected optical subjects, the relative weights of the contributions of the optical path-length effect and the polarization and scattering of light leaving the films to the lateral response artifact have been estimated to be of the same order of magnitude. Mathematical models of these causes for the parabolic shape of function OD(x) are given as appendices.
MS-QI: A Modulation Spectrum-Based ECG Quality Index for Telehealth Applications.
Tobon V, Diana P; Falk, Tiago H; Maier, Martin
2016-08-01
As telehealth applications emerge, the need for accurate and reliable biosignal quality indices has increased. One typical modality used in remote patient monitoring is the electrocardiogram (ECG), which is inherently susceptible to several different noise sources, including environmental (e.g., powerline interference), experimental (e.g., movement artifacts), and physiological (e.g., muscle and breathing artifacts). Accurate measurement of ECG quality can allow for automated decision support systems to make intelligent decisions about patient conditions. This is particularly true for in-home monitoring applications, where the patient is mobile and the ECG signal can be severely corrupted by movement artifacts. In this paper, we propose an innovative ECG quality index based on the so-called modulation spectral signal representation. The representation quantifies the rate of change of ECG spectral components, which are shown to be different from the rate of change of typical ECG noise sources. The proposed modulation spectral-based quality index, MS-QI, was tested on 1) synthetic ECG signals corrupted by varying levels of noise, 2) single-lead recorded data using the Hexoskin garment during three activity levels (sitting, walking, running), 3) 12-lead recorded data using conventional ECG machines (Computing in Cardiology 2011 dataset), and 4) two-lead ambulatory ECG recorded from arrhythmia patients (MIT-BIH Arrhythmia Database). Experimental results showed the proposed index outperforming two conventional benchmark quality measures, particularly in the scenarios involving recorded data in real-world environments.
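A sketch of a modulation-spectrum computation: an STFT of the ECG followed by a second Fourier transform along the time axis of each spectral-magnitude trajectory, with a schematic quality score formed from the energy at heart-beat-rate modulation frequencies. Window sizes and the band used for the score are illustrative and do not reproduce the exact MS-QI definition.

```python
import numpy as np
from scipy.signal import stft

def modulation_spectrum(ecg, fs=250.0, nperseg=256, noverlap=224):
    """Return (acoustic freqs, modulation freqs, |modulation spectrum|)."""
    f, t, Z = stft(ecg, fs=fs, nperseg=nperseg, noverlap=noverlap)
    mag = np.abs(Z)                                    # spectro-temporal magnitude
    frame_rate = fs / (nperseg - noverlap)             # frames per second
    # Second transform along time: how each frequency bin's magnitude fluctuates.
    M = np.abs(np.fft.rfft(mag - mag.mean(axis=1, keepdims=True), axis=1))
    f_mod = np.fft.rfftfreq(mag.shape[1], d=1.0 / frame_rate)
    return f, f_mod, M

def quality_index(ecg, fs=250.0, beat_band=(0.8, 3.0)):
    """Schematic quality score: energy at heart-beat modulation rates / total."""
    _, f_mod, M = modulation_spectrum(ecg, fs)
    in_band = (f_mod >= beat_band[0]) & (f_mod <= beat_band[1])
    return float(M[:, in_band].sum() / (M.sum() + 1e-12))
```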
Benedetti, L. R.; Holder, J. P.; Perkins, M.; ...
2016-02-26
We describe an experimental method to measure the gate profile of an x-ray framing camera and to determine several important functional parameters: relative gain (between strips), relative gain droop (within each strip), gate propagation velocity, gate width, and actual inter-strip timing. Several of these parameters cannot be measured accurately by any other technique. This method is then used to document cross talk-induced gain variations and artifacts created by radiation that arrives before the framing camera is actively amplifying x-rays. Electromagnetic cross talk can cause relative gains to vary significantly as inter-strip timing is varied. This imposes a stringent requirement for gain calibration. If radiation arrives before a framing camera is triggered, it can cause an artifact that manifests as a high-intensity, spatially varying background signal. Furthermore, we have developed a device that can be added to the framing camera head to prevent these artifacts.
Kuc, Roman
2018-04-01
This paper describes phase-sensitive and phase-insensitive processing of monaural echolocation waveforms to generate target maps. Composite waveforms containing both the emission and echoes are processed to estimate the target impulse response using an audible sonar. Phase-sensitive processing yields the composite signal envelope, while phase-insensitive processing that starts with the composite waveform power spectrum yields the envelope of the autocorrelation function. Analysis and experimental verification show that multiple echoes form an autocorrelation function that produces near-range phantom-reflector artifacts. These artifacts interfere with true target echoes when the first true echo occurs at a time that is less than the total duration of the target echoes. Initial comparison of phase-sensitive and phase-insensitive maps indicates that both display important target features, indicating that phase is not vital. A closer comparison illustrates the improved resolution of phase-sensitive processing, the near-range phantom-reflectors produced by phase-insensitive processing, and echo interference and multiple reflection artifacts that were independent of the processing.
Boosting specificity of MEG artifact removal by weighted support vector machine.
Duan, Fang; Phothisonothai, Montri; Kikuchi, Mitsuru; Yoshimura, Yuko; Minabe, Yoshio; Watanabe, Kastumi; Aihara, Kazuyuki
2013-01-01
An automatic artifact removal method for magnetoencephalography (MEG) is presented in this paper. The proposed method is based on independent component analysis (ICA) and a support vector machine (SVM). Unlike previous studies, we consider two factors that influence performance. First, the class imbalance among the independent components (ICs) of MEG is handled by a weighted SVM. Second, instead of simply setting a fixed weight for each class, a re-weighting scheme is used to preserve useful MEG ICs. Experimental results on a manually labeled MEG dataset showed that the proposed method could correctly distinguish the artifacts from the MEG ICs. Meanwhile, 99.72% ± 0.67 of MEG ICs were preserved. The classification accuracy was 97.91% ± 1.39. In addition, it was found that this method is not sensitive to individual differences. The cross-validation (leave-one-subject-out) results showed an average accuracy of 97.41% ± 2.14.
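A minimal example of the class-weighting idea using scikit-learn's SVC; the per-class weights and the IC features are illustrative assumptions, and the paper's iterative re-weighting scheme is only indicated by the class_weight argument.

```python
import numpy as np
from sklearn.svm import SVC

# X: features computed from MEG independent components (e.g., kurtosis, spectral
# ratios); y: 0 = brain IC, 1 = artifact IC.  Misclassifying a brain IC as an
# artifact is penalized more heavily so that neural components are preserved.
def train_weighted_svm(X, y, brain_weight=5.0, artifact_weight=1.0):
    clf = SVC(kernel='rbf', C=1.0, gamma='scale',
              class_weight={0: brain_weight, 1: artifact_weight})
    clf.fit(X, y)
    return clf

# is_artifact = train_weighted_svm(X_train, y_train).predict(X_new)
```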
Benedetti, L R; Holder, J P; Perkins, M; Brown, C G; Anderson, C S; Allen, F V; Petre, R B; Hargrove, D; Glenn, S M; Simanovskaia, N; Bradley, D K; Bell, P
2016-02-01
We describe an experimental method to measure the gate profile of an x-ray framing camera and to determine several important functional parameters: relative gain (between strips), relative gain droop (within each strip), gate propagation velocity, gate width, and actual inter-strip timing. Several of these parameters cannot be measured accurately by any other technique. This method is then used to document cross talk-induced gain variations and artifacts created by radiation that arrives before the framing camera is actively amplifying x-rays. Electromagnetic cross talk can cause relative gains to vary significantly as inter-strip timing is varied. This imposes a stringent requirement for gain calibration. If radiation arrives before a framing camera is triggered, it can cause an artifact that manifests as a high-intensity, spatially varying background signal. We have developed a device that can be added to the framing camera head to prevent these artifacts.
NASA Astrophysics Data System (ADS)
Pua, Rizza; Park, Miran; Wi, Sunhee; Cho, Seungryong
2016-12-01
We propose a hybrid metal artifact reduction (MAR) approach for computed tomography (CT) that is computationally more efficient than a fully iterative reconstruction method, but at the same time achieves superior image quality to the interpolation-based in-painting techniques. Our proposed MAR method, an image-based artifact subtraction approach, utilizes an intermediate prior image reconstructed via PDART to recover the background information underlying the high density objects. For comparison, prior images generated by total-variation minimization (TVM) algorithm, as a realization of fully iterative approach, were also utilized as intermediate images. From the simulation and real experimental results, it has been shown that PDART drastically accelerates the reconstruction to an acceptable quality of prior images. Incorporating PDART-reconstructed prior images in the proposed MAR scheme achieved higher quality images than those by a conventional in-painting method. Furthermore, the results were comparable to the fully iterative MAR that uses high-quality TVM prior images.
Image Corruption Detection in Diffusion Tensor Imaging for Post-Processing and Real-Time Monitoring
Li, Yue; Shea, Steven M.; Lorenz, Christine H.; Jiang, Hangyi; Chou, Ming-Chung; Mori, Susumu
2013-01-01
Due to the high sensitivity of diffusion tensor imaging (DTI) to physiological motion, clinical DTI scans often suffer a significant amount of artifacts. Tensor-fitting-based, post-processing outlier rejection is often used to reduce the influence of motion artifacts. Although it is an effective approach, when there are multiple corrupted data, this method may no longer correctly identify and reject the corrupted data. In this paper, we introduce a new criterion called “corrected Inter-Slice Intensity Discontinuity” (cISID) to detect motion-induced artifacts. We compared the performance of algorithms using cISID and other existing methods with regard to artifact detection. The experimental results show that the integration of cISID into fitting-based methods significantly improves the retrospective detection performance at post-processing analysis. The performance of the cISID criterion, if used alone, was inferior to the fitting-based methods, but cISID could effectively identify severely corrupted images with a rapid calculation time. In the second part of this paper, an outlier rejection scheme was implemented on a scanner for real-time monitoring of image quality and reacquisition of the corrupted data. The real-time monitoring, based on cISID and followed by post-processing, fitting-based outlier rejection, could provide a robust environment for routine DTI studies. PMID:24204551
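A toy version of an inter-slice intensity-discontinuity check conveys the flavor of the cISID criterion, although the paper's actual correction terms are not reproduced; the threshold and the simulated dropout below are illustrative only.

```python
# Toy inter-slice intensity-discontinuity detector (not the paper's cISID formula)
import numpy as np

def flag_corrupted_slices(volume, z_thresh=3.0):
    """volume: 3-D array (slices, rows, cols) of one diffusion-weighted image."""
    means = volume.reshape(volume.shape[0], -1).mean(axis=1)
    # discontinuity of each interior slice relative to its two neighbours
    disc = np.abs(means[1:-1] - 0.5 * (means[:-2] + means[2:]))
    z = (disc - disc.mean()) / (disc.std() + 1e-12)
    return np.where(z > z_thresh)[0] + 1       # indices of flagged slices

rng = np.random.default_rng(1)
vol = rng.normal(100, 2, (40, 64, 64))
vol[17] *= 0.6                                 # simulate a motion/signal-dropout slice
print("flagged slices:", flag_corrupted_slices(vol))
```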
Jones, Ryan J. R.; Shinde, Aniketa; Guevarra, Dan; ...
2015-01-05
Many energy technologies require electrochemical stability or preactivation of functional materials. Due to the long experiment duration required for either electrochemical preactivation or evaluation of operational stability, parallel screening is required to enable high throughput experimentation. We found that imposing operational electrochemical conditions on a library of materials in parallel creates several opportunities for experimental artifacts. We discuss the electrochemical engineering principles and operational parameters that mitigate artifacts in the parallel electrochemical treatment system. We also demonstrate the effects of resistive losses within the planar working electrode through a combination of finite element modeling and illustrative experiments. Operation of the parallel-plate, membrane-separated electrochemical treatment system is demonstrated by exposing a composition library of mixed metal oxides to oxygen evolution conditions in 1 M sulfuric acid for 2 h. This application is particularly important because the electrolysis and photoelectrolysis of water are promising future energy technologies inhibited by the lack of highly active, acid-stable catalysts containing only earth-abundant elements.
Methodological considerations for global analysis of cellular FLIM/FRET measurements
NASA Astrophysics Data System (ADS)
Adbul Rahim, Nur Aida; Pelet, Serge; Kamm, Roger D.; So, Peter T. C.
2012-02-01
Global algorithms can improve the analysis of fluorescence energy transfer (FRET) measurement based on fluorescence lifetime microscopy. However, global analysis of FRET data is also susceptible to experimental artifacts. This work examines several common artifacts and suggests remedial experimental protocols. Specifically, we examined the accuracy of different methods for instrument response extraction and propose an adaptive method based on the mean lifetime of fluorescent proteins. We further examined the effects of image segmentation and a priori constraints on the accuracy of lifetime extraction. Methods to test the applicability of global analysis on cellular data are proposed and demonstrated. The accuracy of global fitting degrades with lower photon count. By systematically tracking the effect of the minimum photon count on lifetime and FRET prefactors when carrying out global analysis, we demonstrate a correction procedure to recover the correct FRET parameters, allowing us to obtain protein interaction information even in dim cellular regions with photon counts as low as 100 per decay curve.
Removal of EMG and ECG artifacts from EEG based on wavelet transform and ICA.
Zhou, Weidong; Gotman, Jean
2004-01-01
In this study, the methods of wavelet threshold de-noising and independent component analysis (ICA) are introduced. ICA is a novel signal processing technique based on higher-order statistics and is used to separate independent components from measurements. The extended ICA algorithm does not need to calculate the higher-order statistics, converges fast, and can be used to separate sub-Gaussian and super-Gaussian sources. A pre-whitening procedure is performed to de-correlate the mixed signals before extracting sources. The experimental results indicate that the electromyogram (EMG) and electrocardiogram (ECG) artifacts in the electroencephalogram (EEG) can be removed by a combination of wavelet threshold de-noising and ICA.
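A minimal sketch of the two building blocks, wavelet threshold de-noising followed by ICA, is given below, assuming PyWavelets and scikit-learn are available; the wavelet, threshold rule, and synthetic signals are generic choices rather than the paper's exact settings.

```python
# Minimal sketch: universal-threshold wavelet de-noising followed by FastICA
import numpy as np
import pywt
from sklearn.decomposition import FastICA

def wavelet_denoise(x, wavelet="db4", level=4):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745            # noise estimate (MAD)
    thr = sigma * np.sqrt(2 * np.log(len(x)))                 # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, "soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

rng = np.random.default_rng(2)
t = np.arange(0, 10, 1 / 250)                                 # 250 Hz EEG (assumed)
eeg = np.sin(2 * np.pi * 10 * t)                              # alpha-like source
ecg = (np.mod(t, 1.0) < 0.04).astype(float)                   # crude QRS train
mixed = np.vstack([eeg + 0.5 * ecg, 0.8 * eeg + ecg]) + 0.05 * rng.normal(size=(2, t.size))

denoised = np.vstack([wavelet_denoise(ch) for ch in mixed])
sources = FastICA(n_components=2, random_state=0).fit_transform(denoised.T).T
print("separated source shapes:", sources.shape)
```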
Obsidian dating and East African archeology.
Michels, J W; Tsong, I S; Nelson, C M
1983-01-28
New experimental procedures have made it possible to establish specific hydration rates for the numerous compositional types of obsidian to be found at archeological sites in Kenya. Two rates are applied to artifacts from the Prospect Farm site, revealing a history of occupation extending back 120,000 years.
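As a rough illustration of how a type-specific hydration rate converts a measured rim thickness into an age via the standard x² = k·t relation, consider the following sketch; the rate and thickness values are hypothetical, not the paper's.

```python
# Toy obsidian hydration dating calculation (hypothetical values)
k = 11.0            # hydration rate for one obsidian type, um^2 per 1000 years (assumed)
x = 36.3            # measured hydration-rim thickness, um (assumed)
age_ka = x**2 / k   # age in thousands of years from x^2 = k * t
print(f"estimated age: {age_ka:.0f} ka")   # ~120 ka for these illustrative values
```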
Anastasiadou, Maria N; Christodoulakis, Manolis; Papathanasiou, Eleftherios S; Papacostas, Savvas S; Mitsis, Georgios D
2017-09-01
This paper proposes supervised and unsupervised algorithms for automatic muscle artifact detection and removal from long-term EEG recordings, which combine canonical correlation analysis (CCA) and wavelets with random forests (RF). The proposed algorithms first perform CCA and continuous wavelet transform of the canonical components to generate a number of features which include component autocorrelation values and wavelet coefficient magnitude values. A subset of the most important features is subsequently selected using RF and labelled observations (supervised case) or synthetic data constructed from the original observations (unsupervised case). The proposed algorithms are evaluated using realistic simulation data as well as 30min epochs of non-invasive EEG recordings obtained from ten patients with epilepsy. We assessed the performance of the proposed algorithms using classification performance and goodness-of-fit values for noisy and noise-free signal windows. In the simulation study, where the ground truth was known, the proposed algorithms yielded almost perfect performance. In the case of experimental data, where expert marking was performed, the results suggest that both the supervised and unsupervised algorithm versions were able to remove artifacts without affecting noise-free channels considerably, outperforming standard CCA, independent component analysis (ICA) and Lagged Auto-Mutual Information Clustering (LAMIC). The proposed algorithms achieved excellent performance for both simulation and experimental data. Importantly, for the first time to our knowledge, we were able to perform entirely unsupervised artifact removal, i.e. without using already marked noisy data segments, achieving performance that is comparable to the supervised case. Overall, the results suggest that the proposed algorithms yield significant future potential for improving EEG signal quality in research or clinical settings without the need for marking by expert neurophysiologists, EMG signal recording and user visual inspection. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
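A heavily simplified sketch of the feature pipeline is given below: canonical components are obtained by CCA between the recording and a one-sample-delayed copy, each component is summarized by its lag-1 autocorrelation, and a random forest separates low-autocorrelation (muscle-like) from high-autocorrelation (brain-like) components. The wavelet-coefficient features, the unsupervised variant, and all parameter choices of the paper are omitted or replaced by toy values.

```python
# Simplified CCA + autocorrelation feature + random forest sketch (toy data and labels)
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
n_ch, n_samp = 8, 2000
brain = np.cumsum(rng.normal(size=(4, n_samp)), axis=1)       # smooth, autocorrelated
muscle = rng.normal(size=(4, n_samp))                         # broadband, white
X = (rng.normal(size=(n_ch, 8)) @ np.vstack([brain, muscle])).T   # samples x channels

# BSS-CCA style: canonical correlation between the data and a delayed copy
cca = CCA(n_components=n_ch).fit(X[:-1], X[1:])
comps, _ = cca.transform(X[:-1], X[1:])

def lag1_autocorr(c):
    c = c - c.mean()
    return float(np.dot(c[:-1], c[1:]) / (np.dot(c, c) + 1e-12))

feats = np.array([[lag1_autocorr(comps[:, k])] for k in range(n_ch)])
labels = (feats[:, 0] < 0.9).astype(int)                      # toy labels for illustration only
rf = RandomForestClassifier(n_estimators=50, random_state=0).fit(feats, labels)
print("component lag-1 autocorrelations:", np.round(feats.ravel(), 2))
```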
NASA Astrophysics Data System (ADS)
Huang, Xiaolei; Dong, Hui; Qiu, Yang; Li, Bo; Tao, Quan; Zhang, Yi; Krause, Hans-Joachim; Offenhäusser, Andreas; Xie, Xiaoming
2018-01-01
Power-line harmonic interference and fixed-frequency noise peaks may cause stripe-artifacts in ultra-low field (ULF) magnetic resonance imaging (MRI) in an unshielded environment and in a conductively shielded room. In this paper we describe an adaptive suppression method to eliminate these artifacts in MRI images. This technique utilizes spatial correlation of the interference from different positions, and is realized by subtracting the outputs of the reference channel(s) from those of the signal channel(s) using wavelet analysis and the least squares method. The adaptive suppression method is first implemented to remove the image artifacts in simulation. We then experimentally demonstrate the feasibility of this technique by adding three orthogonal superconducting quantum interference device (SQUID) magnetometers as reference channels to compensate the output of one 2nd-order gradiometer. The experimental results show great improvement in the imaging quality in both 1D and 2D MRI images at two common imaging frequencies, 1.3 kHz and 4.8 kHz. At both frequencies, the effective compensation bandwidth is as high as 2 kHz. Furthermore, we examine the longitudinal relaxation times of the same sample before and after compensation, and show that the MRI properties of the sample did not change after applying adaptive suppression. This technique can effectively increase the imaging bandwidth and be applied to ULF MRI detected by either SQUIDs or Faraday coil in both an unshielded environment and a conductively shielded room.
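The core of the reference-channel compensation, regressing the reference outputs out of the signal channel by least squares, can be sketched as follows; the wavelet decomposition that makes the fit frequency-dependent in the paper is omitted, and all signals are synthetic.

```python
# Minimal sketch: least-squares reference-channel subtraction (synthetic signals)
import numpy as np

rng = np.random.default_rng(4)
n = 5000
interference = np.sin(2 * np.pi * 50 * np.arange(n) / 10e3)   # power-line-like interference
mri_signal = rng.normal(0, 0.1, n)                            # signal of interest (stand-in)

refs = np.vstack([1.2 * interference + rng.normal(0, 0.01, n),
                  0.7 * interference + rng.normal(0, 0.01, n),
                  0.3 * interference + rng.normal(0, 0.01, n)]).T
signal_ch = mri_signal + 2.0 * interference

# least-squares weights mapping the reference channels onto the signal channel
w, *_ = np.linalg.lstsq(refs, signal_ch, rcond=None)
cleaned = signal_ch - refs @ w
print("residual interference power: %.4f -> %.4f"
      % (np.var(signal_ch - mri_signal), np.var(cleaned - mri_signal)))
```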
Schwitter, Juerg; Gold, Michael R; Al Fagih, Ahmed; Lee, Sung; Peterson, Michael; Ciuffo, Allen; Zhang, Yan; Kristiansen, Nina; Kanal, Emanuel; Sommer, Torsten
2016-05-01
Recently, magnetic resonance (MR)-conditional implantable cardioverter defibrillator (ICD) systems have become available. However, associated cardiac MR image (MRI) quality is unknown. The goal was to evaluate the image quality performance of various cardiac MR sequences in a multicenter trial of patients implanted with an MR-conditional ICD system. The Evera-MRI trial enrolled 275 patients in 42 centers worldwide. There were 263 patients implanted with an Evera-MRI single- or dual-chamber ICD and randomized to controls (n=88) and MRI (n=175), 156 of whom underwent a protocol-required MRI (9-12 weeks post implant). Steady-state-free-precession (SSFP) and fast-gradient-echo (FGE) sequences were acquired in short-axis and horizontal long-axis orientations. Qualitative and quantitative assessment of image quality was performed by using a 7-point scale (grades 1-3: good quality, grades 6-7: nondiagnostic) and measuring ICD- and lead-related artifact size. Good to moderate image quality (grades 1-5) was obtained in 53% and 74% of SSFP and FGE acquisitions, respectively, covering the left ventricle, and in 69% and 84%, respectively, covering the right ventricle. Odds for better image quality were greater for right ventricle versus left ventricle (odds ratio, 1.8; 95% confidence interval, 1.5-2.2; P<0.0001) and greater for FGE versus SSFP (odds ratio, 3.5; 95% confidence interval, 2.5-4.8; P<0.0001). Compared with SSFP, ICD-related artifacts on FGE were smaller (141±65 versus 75±57 mm, respectively; P<0.0001). Lead artifacts were much smaller than ICD artifacts (P<0.0001). FGE yields good to moderate quality in 74% of left ventricle and 84% of right ventricle acquisitions and performs better than SSFP in patients with an MRI-conditional ICD system. In these patients, cardiac MRI can offer diagnostic information in most cases. URL: http://www.clinicaltrials.gov. Unique identifier: NCT02117414. © 2016 American Heart Association, Inc.
An EEMD-ICA Approach to Enhancing Artifact Rejection for Noisy Multivariate Neural Data.
Zeng, Ke; Chen, Dan; Ouyang, Gaoxiang; Wang, Lizhe; Liu, Xianzeng; Li, Xiaoli
2016-06-01
As neural data are generally noisy, artifact rejection is crucial for data preprocessing. It has long been a grand research challenge to develop an approach that is able 1) to remove the artifacts and 2) to avoid loss or disruption of the structural information at the same time, so that the risk of introducing bias into data interpretation is minimized. In this study, an approach (namely EEMD-ICA) was proposed to first decompose possibly noisy multivariate neural data into intrinsic mode functions (IMFs) using ensemble empirical mode decomposition (EEMD). Independent component analysis (ICA) was then applied to the IMFs to separate the artifactual components. The approach was tested against the classical ICA and the automatic wavelet ICA (AWICA) methods, which are dominant methods for artifact rejection. In order to evaluate the effectiveness of the proposed approach in handling neural data with potentially intense noise, experiments on artifact removal were performed using semi-simulated data mixed with a variety of noise types. Experimental results indicate that the proposed approach consistently outperforms the counterparts in terms of both normalized mean square error (NMSE) and Structure SIMilarity (SSIM). The superiority becomes even greater as SNR decreases in all cases; e.g., the SSIM of EEMD-ICA can almost double that of AWICA and triple that of ICA. To further examine the potential of the approach in sophisticated applications, the approach and its counterparts were used to preprocess a real-life epileptic EEG with absence seizures. Experiments were carried out with a focus on characterizing the dynamics of the data after artifact rejection, i.e., distinguishing seizure-free, pre-seizure and seizure states. Using multi-scale permutation entropy to extract features and linear discriminant analysis for classification, EEMD-ICA performed best in classifying the states (87.4%, about 4.1% and 8.7% higher than AWICA and ICA, respectively), which was closest to the results of the manually selected dataset (89.7%).
Causal Relations Drive Young Children's Induction, Naming, and Categorization
ERIC Educational Resources Information Center
Opfer, John E.; Bulloch, Megan J.
2007-01-01
A number of recent models and experiments have suggested that evidence of early category-based induction is an artifact of perceptual cues provided by experimenters. We tested these accounts against the prediction that different relations (causal versus non-causal) determine the types of perceptual similarity by which children generalize. Young…
Is Young Children's Passive Syntax Semantically Constrained? Evidence from Syntactic Priming
ERIC Educational Resources Information Center
Messenger, Katherine; Branigan, Holly P.; McLean, Janet F.; Sorace, Antonella
2012-01-01
Previous research suggests that English-speaking children comprehend agent-patient verb passives earlier than experiencer-theme verb passives (Maratsos, Fox, Becker, & Chalkley, 1985). We report three experiments examining whether such effects reflect delayed acquisition of the passive syntax or instead are an artifact of the experimental task,…
A new algorithm for reliable and general NMR resonance assignment.
Schmidt, Elena; Güntert, Peter
2012-08-01
The new FLYA automated resonance assignment algorithm determines NMR chemical shift assignments on the basis of peak lists from any combination of multidimensional through-bond or through-space NMR experiments for proteins. Backbone and side-chain assignments can be determined. All experimental data are used simultaneously, thereby exploiting optimally the redundancy present in the input peak lists and circumventing potential pitfalls of assignment strategies in which results obtained in a given step remain fixed input data for subsequent steps. Instead of prescribing a specific assignment strategy, the FLYA resonance assignment algorithm requires only experimental peak lists and the primary structure of the protein, from which the peaks expected in a given spectrum can be generated by applying a set of rules, defined in a straightforward way by specifying through-bond or through-space magnetization transfer pathways. The algorithm determines the resonance assignment by finding an optimal mapping between the set of expected peaks that are assigned by definition but have unknown positions and the set of measured peaks in the input peak lists that are initially unassigned but have a known position in the spectrum. Using peak lists obtained by purely automated peak picking from the experimental spectra of three proteins, FLYA assigned correctly 96-99% of the backbone and 90-91% of all resonances that could be assigned manually. Systematic studies quantified the impact of various factors on the assignment accuracy, namely the extent of missing real peaks and the amount of additional artifact peaks in the input peak lists, as well as the accuracy of the peak positions. Comparing the resonance assignments from FLYA with those obtained from two other existing algorithms showed that using identical experimental input data these other algorithms yielded significantly (40-142%) more erroneous assignments than FLYA. The FLYA resonance assignment algorithm thus has the reliability and flexibility to replace most manual and semi-automatic assignment procedures for NMR studies of proteins.
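The central idea of mapping expected peaks (assigned by definition, positions unknown) onto measured peaks (positions known, initially unassigned) can be illustrated as a toy assignment problem; FLYA itself uses a far more elaborate optimization, and the shifts below are hypothetical.

```python
# Toy expected-vs-measured peak mapping posed as a linear assignment (hypothetical shifts)
import numpy as np
from scipy.optimize import linear_sum_assignment

expected_atoms = ["ALA2 H", "ALA2 N", "GLY3 H", "GLY3 N"]
predicted_shift = np.array([8.2, 123.0, 8.4, 109.0])          # statistical averages (assumed)
measured_peaks = np.array([108.7, 8.25, 122.6, 8.46])         # picked peak positions (assumed)

cost = np.abs(predicted_shift[:, None] - measured_peaks[None, :])
rows, cols = linear_sum_assignment(cost)
for r, c in zip(rows, cols):
    print(f"{expected_atoms[r]:8s} -> peak at {measured_peaks[c]:7.2f}")
```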
Sampling Artifacts from Conductive Silicone Tubing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Timko, Michael T.; Yu, Zhenhong; Kroll, Jesse
2009-05-15
We report evidence that carbon-impregnated conductive silicone tubing used in aerosol sampling systems can introduce two types of experimental artifacts: 1) silicone tubing dynamically absorbs carbon dioxide gas, requiring greater than 5 minutes to reach equilibrium, and 2) silicone tubing emits organic contaminants containing siloxane that adsorb onto particles traveling through it and onto downstream quartz fiber filters. The consequence can be substantial for engine exhaust measurements as both artifacts directly impact calculations of particulate mass-based emission indices. The emission of contaminants from the silicone tubing can result in overestimation of organic particle mass concentrations based on real-time aerosol mass spectrometry and the off-line thermal analysis of quartz filters. The adsorption of siloxane contaminants can affect the surface properties of aerosol particles; we observed a marked reduction in the water-affinity of soot particles passed through conductive silicone tubing. These combined observations suggest that the silicone tubing artifacts may have wide consequence for the aerosol community and that such tubing should, therefore, be used with caution. Gentle heating, physical and chemical properties of the particle carriers, exposure to solvents, and tubing age may influence siloxane uptake. The amount of contamination is expected to increase as the tubing surface area increases and as the particle surface area increases. The effect is observed at ambient temperature and enhanced by mild heating (<100 °C). Further evaluation is warranted.
Improved patch-based learning for image deblurring
NASA Astrophysics Data System (ADS)
Dong, Bo; Jiang, Zhiguo; Zhang, Haopeng
2015-05-01
Most recent image deblurring methods only use valid information found in the input image as the clue for restoring the deblurred region. These methods usually suffer from insufficient prior information and relatively poor adaptiveness. The patch-based method not only uses the valid information of the input image itself, but also utilizes the prior information of sample images to improve adaptiveness. However, the cost function of this method is quite time-consuming to evaluate, and the method may also produce ringing artifacts. In this paper, we propose an improved non-blind deblurring algorithm based on learning patch likelihoods. On one hand, we consider the effect of the Gaussian mixture model with different weights and normalize the weight values, which optimizes the cost function and reduces running time. On the other hand, a post-processing method is proposed to remove the ringing artifacts produced by the traditional patch-based method. Extensive experiments are performed, and the results verify that our method can effectively reduce the execution time, suppress the ringing artifacts, and preserve the quality of the deblurred image.
Multi-Class Motor Imagery EEG Decoding for Brain-Computer Interfaces
Wang, Deng; Miao, Duoqian; Blohm, Gunnar
2012-01-01
Recent studies show that scalp electroencephalography (EEG) as a non-invasive interface has great potential for brain-computer interfaces (BCIs). However, one factor that has limited practical applications for EEG-based BCI so far is the difficulty to decode brain signals in a reliable and efficient way. This paper proposes a new robust processing framework for decoding of multi-class motor imagery (MI) that is based on five main processing steps. (i) Raw EEG segmentation without the need of visual artifact inspection. (ii) Considering that EEG recordings are often contaminated not just by electrooculography (EOG) but also other types of artifacts, we propose to first implement an automatic artifact correction method that combines regression analysis with independent component analysis for recovering the original source signals. (iii) The significant difference between frequency components based on event-related (de-) synchronization and sample entropy is then used to find non-contiguous discriminating rhythms. After spectral filtering using the discriminating rhythms, a channel selection algorithm is used to select only relevant channels. (iv) Feature vectors are extracted based on the inter-class diversity and time-varying dynamic characteristics of the signals. (v) Finally, a support vector machine is employed for four-class classification. We tested our proposed algorithm on experimental data that was obtained from dataset 2a of BCI competition IV (2008). The overall four-class kappa values (between 0.41 and 0.80) were comparable to other models but without requiring any artifact-contaminated trial removal. The performance showed that multi-class MI tasks can be reliably discriminated using artifact-contaminated EEG recordings from a few channels. This may be a promising avenue for online robust EEG-based BCI applications. PMID:23087607
Potato respirometer experiment SO61
NASA Technical Reports Server (NTRS)
Taudvin, P. C.; Szpakowski, T. A.
1971-01-01
The design and manufacture of a respirometer for measuring the oxygen consumption rate of a respiring potato sprout in a Skylab experiment is reported. The device monitors low gravity effects on the biorhythmicity of organisms during space flight. Several experimental runs using bench mounted flight hardware units were inconclusive due to room temperature induced artifacts.
Experimental method to account for structural compliance in nanoindentation measurements
Joseph E. Jakes; Charles R. Frihart; James F. Beecher; Robert J. Moon; D. S. Stone
2008-01-01
The standard Oliver–Pharr nanoindentation analysis tacitly assumes that the specimen is structurally rigid and that it is both semi-infinite and homogeneous. Many specimens violate these assumptions. We show that when the specimen flexes or possesses heterogeneities, such as free edges or interfaces between regions of different properties, artifacts arise...
Näsi, Tiina; Mäki, Hanna; Hiltunen, Petri; Heiskala, Juha; Nissilä, Ilkka; Kotilahti, Kalle; Ilmoniemi, Risto J
2013-03-01
The effect of task-related extracerebral circulatory changes on diffuse optical tomography (DOT) of brain activation was evaluated using experimental data from 14 healthy human subjects and computer simulations. Total hemoglobin responses to weekday-recitation, verbal-fluency, and hand-motor tasks were measured with a high-density optode grid placed on the forehead. The tasks caused varying levels of mental and physical stress, eliciting extracerebral circulatory changes that the reconstruction algorithm was unable to fully distinguish from cerebral hemodynamic changes, resulting in artifacts in the brain activation images. Crosstalk between intra- and extracranial layers was confirmed by the simulations. The extracerebral effects were attenuated by superficial signal regression and depended to some extent on the heart rate, thus allowing identification of hemodynamic changes related to brain activation during the verbal-fluency task. During the hand-motor task, the extracerebral component was stronger, making the separation less clear. DOT provides a tool for distinguishing extracerebral components from signals of cerebral origin. Especially in the case of strong task-related extracerebral circulatory changes, however, sophisticated reconstruction methods are needed to eliminate crosstalk artifacts.
The effect of exit beam phase aberrations on parallel beam coherent x-ray reconstructions
NASA Astrophysics Data System (ADS)
Hruszkewycz, S. O.; Harder, R.; Xiao, X.; Fuoss, P. H.
2010-12-01
Diffraction artifacts from imperfect x-ray windows near the sample are an important consideration in the design of coherent x-ray diffraction measurements. In this study, we used simulated and experimental diffraction patterns in two and three dimensions to explore the effect of phase imperfections in a beryllium window (such as a void or inclusion) on the convergence behavior of phasing algorithms and on the ultimate reconstruction. A predictive relationship between beam wavelength, sample size, and window position was derived to explain the dependence of reconstruction quality on beryllium defect size. Defects corresponding to this prediction cause the most damage to the sample exit wave and induce signature error oscillations during phasing that can be used as a fingerprint of experimental x-ray window artifacts. The relationship between x-ray window imperfection size and coherent x-ray diffractive imaging reconstruction quality explored in this work can play an important role in designing high-resolution in situ coherent imaging instrumentation and will help interpret the phasing behavior of coherent diffraction measured in these in situ environments.
The effect of exit beam phase aberrations on parallel beam coherent x-ray reconstructions.
Hruszkewycz, S O; Harder, R; Xiao, X; Fuoss, P H
2010-12-01
Diffraction artifacts from imperfect x-ray windows near the sample are an important consideration in the design of coherent x-ray diffraction measurements. In this study, we used simulated and experimental diffraction patterns in two and three dimensions to explore the effect of phase imperfections in a beryllium window (such as a void or inclusion) on the convergence behavior of phasing algorithms and on the ultimate reconstruction. A predictive relationship between beam wavelength, sample size, and window position was derived to explain the dependence of reconstruction quality on beryllium defect size. Defects corresponding to this prediction cause the most damage to the sample exit wave and induce signature error oscillations during phasing that can be used as a fingerprint of experimental x-ray window artifacts. The relationship between x-ray window imperfection size and coherent x-ray diffractive imaging reconstruction quality explored in this work can play an important role in designing high-resolution in situ coherent imaging instrumentation and will help interpret the phasing behavior of coherent diffraction measured in these in situ environments.
Lin, Hsiu-Hsia; Chiang, Wen-Chung; Lo, Lun-Jou; Sheng-Pin Hsu, Sam; Wang, Chien-Hsuan; Wan, Shu-Yen
2013-11-01
Combining the maxillofacial cone-beam computed tomography (CBCT) model with its corresponding digital dental model enables an integrated 3-dimensional (3D) representation of skeletal structures, teeth, and occlusions. Undesired artifacts, however, introduce difficulties in the superimposition of both models. We have proposed an artifact-resistant surface-based registration method that is robust and clinically applicable and that does not require markers. A CBCT bone model and a laser-scanned dental model obtained from the same patient were used in developing the method and examining the accuracy of the superimposition. Our method included 4 phases. The first phase was to segment the maxilla from the mandible in the CBCT model. The second phase was to conduct an initial registration to bring the digital dental model and the maxilla and mandible sufficiently close to each other. Third, we manually selected at least 3 corresponding regions on both models by smearing patches on the 3D surfaces. The last phase was to superimpose the digital dental model into the maxillofacial model. Each superimposition process was performed twice by 2 operators with the same object to investigate the intra- and interoperator differences. All collected objects were divided into 3 groups with various degrees of artifacts: artifact-free, critical artifacts, and severe artifacts. The mean errors and root-mean-square (RMS) errors were used to evaluate the accuracy of the superimposition results. Repeated measures analysis of variance and the Wilcoxon rank sum test were used to calculate the intraoperator reproducibility and interoperator reliability. Twenty-four maxilla and mandible objects for evaluation were obtained from 14 patients. The experimental results showed that the mean errors between the 2 original models in the residing fused model ranged from 0.10 to 0.43 mm and that the RMS errors ranged from 0.13 to 0.53 mm. These data were consistent with previously used methods and were clinically acceptable. All measurements of the proposed study exhibited desirable intraoperator reproducibility and interoperator reliability. Regarding the intra- and interoperator mean errors and RMS errors in the nonartifact or critical artifact group, no significant difference between the repeated trials or between operators (P < .05) was observed. The results of the present study have shown that the proposed regional surface-based registration can robustly and accurately superimpose a digital dental model into its corresponding CBCT model. Copyright © 2013 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Hanming; Wang, Linyuan; Li, Lei
2016-06-15
Purpose: Metal artifact reduction (MAR) is a major problem and a challenging issue in x-ray computed tomography (CT) examinations. Iterative reconstruction from sinograms unaffected by metals shows promising potential in detail recovery. This reconstruction has been the subject of much research in recent years. However, conventional iterative reconstruction methods easily introduce new artifacts around metal implants because of incomplete data reconstruction and inconsistencies in practical data acquisition. Hence, this work aims at developing a method to suppress newly introduced artifacts and improve the image quality around metal implants for the iterative MAR scheme. Methods: The proposed method consists of two steps based on the general iterative MAR framework. An uncorrected image is initially reconstructed, and the corresponding metal trace is obtained. The iterative reconstruction method is then used to reconstruct images from the unaffected sinogram. In the reconstruction step of this work, an iterative strategy utilizing unmatched projector/backprojector pairs is used. A ramp filter is introduced into the back-projection procedure to restrain the inconsistency components in low frequencies and generate more reliable images of the regions around metals. Furthermore, a constrained total variation (TV) minimization model is also incorporated to enhance efficiency. The proposed strategy is implemented based on an iterative FBP and an alternating direction minimization (ADM) scheme, respectively. The developed algorithms are referred to as “iFBP-TV” and “TV-FADM,” respectively. Two projection-completion-based MAR methods and three iterative MAR methods are performed simultaneously for comparison. Results: The proposed method performs reasonably on both simulation and real CT-scanned datasets. This approach could reduce streak metal artifacts effectively and avoid the mentioned effects in the vicinity of the metals. The improvements are evaluated by inspecting regions of interest and by comparing the root-mean-square errors, normalized mean absolute distance, and universal quality index metrics of the images. Both iFBP-TV and TV-FADM methods outperform other counterparts in all cases. Unlike the conventional iterative methods, the proposed strategy utilizing unmatched projector/backprojector pairs shows excellent performance in detail preservation and prevention of the introduction of new artifacts. Conclusions: Qualitative and quantitative evaluations of experimental results indicate that the developed method outperforms classical MAR algorithms in suppressing streak artifacts and preserving the edge structural information of the object. In particular, structures lying close to metals can be gradually recovered because of the reduction of artifacts caused by inconsistency effects.
Huang, Xiaolei; Dong, Hui; Qiu, Yang; Li, Bo; Tao, Quan; Zhang, Yi; Krause, Hans-Joachim; Offenhäusser, Andreas; Xie, Xiaoming
2018-01-01
Power-line harmonic interference and fixed-frequency noise peaks may cause stripe-artifacts in ultra-low field (ULF) magnetic resonance imaging (MRI) in an unshielded environment and in a conductively shielded room. In this paper we describe an adaptive suppression method to eliminate these artifacts in MRI images. This technique utilizes spatial correlation of the interference from different positions, and is realized by subtracting the outputs of the reference channel(s) from those of the signal channel(s) using wavelet analysis and the least squares method. The adaptive suppression method is first implemented to remove the image artifacts in simulation. We then experimentally demonstrate the feasibility of this technique by adding three orthogonal superconducting quantum interference device (SQUID) magnetometers as reference channels to compensate the output of one 2nd-order gradiometer. The experimental results show great improvement in the imaging quality in both 1D and 2D MRI images at two common imaging frequencies, 1.3 kHz and 4.8 kHz. At both frequencies, the effective compensation bandwidth is as high as 2 kHz. Furthermore, we examine the longitudinal relaxation times of the same sample before and after compensation, and show that the MRI properties of the sample did not change after applying adaptive suppression. This technique can effectively increase the imaging bandwidth and be applied to ULF MRI detected by either SQUIDs or Faraday coil in both an unshielded environment and a conductively shielded room. Copyright © 2017 Elsevier Inc. All rights reserved.
Woody, Michael S; Capitanio, Marco; Ostap, E Michael; Goldman, Yale E
2018-04-30
We characterized experimental artifacts arising from the non-linear response of acousto-optical deflectors (AODs) in an ultra-fast force-clamp optical trap and have shown that using electro-optical deflectors (EODs) instead eliminates these artifacts. We give an example of the effects of these artifacts in our ultra-fast force clamp studies of the interaction of myosin with actin filaments. The experimental setup, based on the concept of Capitanio et al. [Nat. Methods 9, 1013-1019 (2012)] utilizes a bead-actin-bead dumbbell held in two force-clamped optical traps which apply a load to the dumbbell to move it at a constant velocity. When myosin binds to actin, the filament motion stops quickly as the total force from the optical traps is transferred to the actomyosin attachment. We found that in our setup, AODs were unsuitable for beam steering due to non-linear variations in beam intensity and deflection angle as a function of driving frequency, likely caused by low-amplitude standing acoustic waves in the deflectors. These aberrations caused instability in the force feedback loops leading to artifactual jumps in the trap position. We demonstrate that beam steering with EODs improves the performance of our instrument. Combining the superior beam-steering capability of the EODs, force acquisition via back-focal-plane interferometry, and dual high-speed FPGA-based feedback loops, we apply precise and constant loads to study the dynamics of interactions between actin and myosin. The same concept applies to studies of other biomolecular interactions.
Sparsity-based acoustic inversion in cross-sectional multiscale optoacoustic imaging.
Han, Yiyong; Tzoumas, Stratis; Nunes, Antonio; Ntziachristos, Vasilis; Rosenthal, Amir
2015-09-01
With recent advancement in hardware of optoacoustic imaging systems, highly detailed cross-sectional images may be acquired at a single laser shot, thus eliminating motion artifacts. Nonetheless, other sources of artifacts remain due to signal distortion or out-of-plane signals. The purpose of image reconstruction algorithms is to obtain the most accurate images from noisy, distorted projection data. In this paper, the authors use the model-based approach for acoustic inversion, combined with a sparsity-based inversion procedure. Specifically, a cost function is used that includes the L1 norm of the image in sparse representation and a total variation (TV) term. The optimization problem is solved by a numerically efficient implementation of a nonlinear gradient descent algorithm. TV-L1 model-based inversion is tested in the cross section geometry for numerically generated data as well as for in vivo experimental data from an adult mouse. In all cases, model-based TV-L1 inversion showed a better performance over the conventional Tikhonov regularization, TV inversion, and L1 inversion. In the numerical examples, the images reconstructed with TV-L1 inversion were quantitatively more similar to the originating images. In the experimental examples, TV-L1 inversion yielded sharper images and weaker streak artifact. The results herein show that TV-L1 inversion is capable of improving the quality of highly detailed, multiscale optoacoustic images obtained in vivo using cross-sectional imaging systems. As a result of its high fidelity, model-based TV-L1 inversion may be considered as the new standard for image reconstruction in cross-sectional imaging.
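A heavily simplified one-dimensional sketch of the cost structure, a data-fidelity term plus an L1 term and a total-variation term, minimized by plain gradient descent on smoothed surrogates, is shown below; the paper's forward model, sparse representation, and solver are not reproduced.

```python
# Toy 1-D TV-L1 regularized inversion by gradient descent on smoothed surrogates
import numpy as np

rng = np.random.default_rng(5)
n, m = 64, 48
A = rng.normal(size=(m, n)) / np.sqrt(m)                      # toy forward model (assumed)
x_true = np.zeros(n); x_true[20:30] = 1.0                      # piecewise-constant object
y = A @ x_true + 0.01 * rng.normal(size=m)

lam_l1, lam_tv, eps, step = 0.01, 0.05, 1e-2, 0.1
x = np.zeros(n)
for _ in range(1500):
    grad_data = A.T @ (A @ x - y)                              # data-fidelity gradient
    grad_l1 = lam_l1 * x / np.sqrt(x**2 + eps)                 # gradient of smoothed |x|
    dx = np.diff(x)
    g = dx / np.sqrt(dx**2 + eps)                              # gradient of smoothed TV
    grad_tv = lam_tv * (np.concatenate(([0.0], g)) - np.concatenate((g, [0.0])))
    x -= step * (grad_data + grad_l1 + grad_tv)
print("reconstruction error: %.3f" % np.linalg.norm(x - x_true))
```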
Automatic Artifact Removal from Electroencephalogram Data Based on A Priori Artifact Information.
Zhang, Chi; Tong, Li; Zeng, Ying; Jiang, Jingfang; Bu, Haibing; Yan, Bin; Li, Jianxin
2015-01-01
Electroencephalogram (EEG) is susceptible to various nonneural physiological artifacts. Automatic artifact removal from EEG data remains a key challenge for extracting relevant information from brain activities. To adapt to variable subjects and EEG acquisition environments, this paper presents an automatic online artifact removal method based on a priori artifact information. The combination of discrete wavelet transform and independent component analysis (ICA), wavelet-ICA, was utilized to separate artifact components. The artifact components were then automatically identified using a priori artifact information, which was acquired in advance. Subsequently, signal reconstruction without artifact components was performed to obtain artifact-free signals. The results showed that, using this automatic online artifact removal method, there were statistically significant improvements in the classification accuracies in both experiments, namely, motor imagery and emotion recognition.
Automatic Artifact Removal from Electroencephalogram Data Based on A Priori Artifact Information
Zhang, Chi; Tong, Li; Zeng, Ying; Jiang, Jingfang; Bu, Haibing; Li, Jianxin
2015-01-01
Electroencephalogram (EEG) is susceptible to various nonneural physiological artifacts. Automatic artifact removal from EEG data remains a key challenge for extracting relevant information from brain activities. To adapt to variable subjects and EEG acquisition environments, this paper presents an automatic online artifact removal method based on a priori artifact information. The combination of discrete wavelet transform and independent component analysis (ICA), wavelet-ICA, was utilized to separate artifact components. The artifact components were then automatically identified using a priori artifact information, which was acquired in advance. Subsequently, signal reconstruction without artifact components was performed to obtain artifact-free signals. The results showed that, using this automatic online artifact removal method, there were statistically significant improvements in the classification accuracies in both experiments, namely, motor imagery and emotion recognition. PMID:26380294
Wavelet-based edge correlation incorporated iterative reconstruction for undersampled MRI.
Hu, Changwei; Qu, Xiaobo; Guo, Di; Bao, Lijun; Chen, Zhong
2011-09-01
Undersampling k-space is an effective way to decrease acquisition time for MRI. However, aliasing artifacts introduced by undersampling may blur the edges of magnetic resonance images, which often contain important information for clinical diagnosis. Moreover, k-space data is often contaminated by noise signals of unknown intensity. To better preserve the edge features while suppressing the aliasing artifacts and noise, we present a new wavelet-based algorithm for undersampled MRI reconstruction. The algorithm solves the image reconstruction as a standard optimization problem including an ℓ2 data fidelity term and an ℓ1 sparsity regularization term. Rather than manually setting the regularization parameter for the ℓ1 term, which is directly related to the threshold, an automatically estimated threshold adaptive to noise intensity is introduced in our proposed algorithm. In addition, a prior matrix based on edge correlation in the wavelet domain is incorporated into the regularization term. Compared with the nonlinear conjugate gradient descent algorithm, the iterative shrinkage/thresholding algorithm, the fast iterative soft-thresholding algorithm and the iterative thresholding algorithm using an exponentially decreasing threshold, the proposed algorithm yields reconstructions with better edge recovery and noise suppression. Copyright © 2011 Elsevier Inc. All rights reserved.
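The generic iterative shrinkage/thresholding backbone for undersampled Fourier data, wavelet soft-thresholding alternated with data-consistency steps, can be sketched in one dimension as below; the edge-correlation prior matrix and the automatic noise-adaptive threshold that distinguish the proposed algorithm are not reproduced, and the fixed threshold used here is illustrative.

```python
# 1-D ISTA-style sketch for undersampled Fourier data with wavelet soft-thresholding
import numpy as np
import pywt

rng = np.random.default_rng(6)
n = 256
x_true = np.zeros(n); x_true[60:120] = 1.0; x_true[160:170] = 0.5   # edges to preserve
mask = rng.random(n) < 0.4                                          # keep ~40% of k-space
mask[0] = True                                                      # keep the DC sample
y = mask * np.fft.fft(x_true) / np.sqrt(n)                          # unitary-scaled, undersampled FFT

def soft_wavelet(x, thr, wavelet="db4", level=4):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, "soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

x = np.zeros(n)
for _ in range(100):
    resid = y - mask * np.fft.fft(x) / np.sqrt(n)                   # data-consistency residual
    x = x + np.real(np.fft.ifft(resid) * np.sqrt(n))                # adjoint (gradient) step
    x = soft_wavelet(x, thr=0.02)                                   # fixed threshold (illustrative)
print("relative error: %.3f" % (np.linalg.norm(x - x_true) / np.linalg.norm(x_true)))
```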
Animals In Synchrotrons: Overcoming Challenges For High-Resolution, Live, Small-Animal Imaging
NASA Astrophysics Data System (ADS)
Donnelley, Martin; Parsons, David; Morgan, Kaye; Siu, Karen
2010-07-01
Physiological studies in small animals can be complicated, but the complexity is increased dramatically when performing live-animal synchrotron X-ray imaging studies. Our group has extensive experience in high-resolution live-animal imaging at the Japanese SPring-8 synchrotron, primarily examining airways in two-dimensions. These experiments normally image an area of 1.8 mm×1.2 mm at a pixel resolution of 0.45 μm and are performed with live, intact, anaesthetized mice. There are unique challenges in this experimental setting. Importantly, experiments must be performed in an isolated imaging hutch not specifically designed for small-animal imaging. This requires equipment adapted to remotely monitor animals, maintain their anesthesia, and deliver test substances while collecting images. The horizontal synchrotron X-ray beam has a fixed location and orientation that limits experimental flexibility. The extremely high resolution makes locating anatomical regions-of-interest slow and can result in a high radiation dose, and at this level of magnification small animal movements produce motion-artifacts that can render acquired images unusable. Here we describe our experimental techniques and how we have overcome several challenges involved in performing live mouse synchrotron imaging. Experiments have tested different mouse strains, with hairless strains minimizing overlying skin and hair artifacts. Different anesthetics have also been trialed due to the limited choices available at SPring-8. Tracheal-intubation methods have been refined and controlled-ventilation is now possible using a specialized small-animal ventilator. With appropriate animal restraint and respiratory-gating, motion-artifacts have been minimized. The animal orientation (supine vs. head-high) also appears to affect animal physiology, and can alter image quality. Our techniques and image quality at SPring-8 have dramatically improved and in the near future we plan to translate this experience to the Imaging and Medical Beamline at the Australian Synchrotron. Overcoming these challenges has permitted increasingly sophisticated imaging of animals with synchrotron X-rays, and we expect a bright future for these techniques.
Animals In Synchrotrons: Overcoming Challenges For High-Resolution, Live, Small-Animal Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Donnelley, Martin; Parsons, David; Women's and Children's Health Research Institute, Adelaide, South Australia
Physiological studies in small animals can be complicated, but the complexity is increased dramatically when performing live-animal synchrotron X-ray imaging studies. Our group has extensive experience in high-resolution live-animal imaging at the Japanese SPring-8 synchrotron, primarily examining airways in two-dimensions. These experiments normally image an area of 1.8 mm × 1.2 mm at a pixel resolution of 0.45 μm and are performed with live, intact, anaesthetized mice. There are unique challenges in this experimental setting. Importantly, experiments must be performed in an isolated imaging hutch not specifically designed for small-animal imaging. This requires equipment adapted to remotely monitor animals, maintain their anesthesia, and deliver test substances while collecting images. The horizontal synchrotron X-ray beam has a fixed location and orientation that limits experimental flexibility. The extremely high resolution makes locating anatomical regions-of-interest slow and can result in a high radiation dose, and at this level of magnification small animal movements produce motion-artifacts that can render acquired images unusable. Here we describe our experimental techniques and how we have overcome several challenges involved in performing live mouse synchrotron imaging. Experiments have tested different mouse strains, with hairless strains minimizing overlying skin and hair artifacts. Different anesthetics have also been trialed due to the limited choices available at SPring-8. Tracheal-intubation methods have been refined and controlled-ventilation is now possible using a specialized small-animal ventilator. With appropriate animal restraint and respiratory-gating, motion-artifacts have been minimized. The animal orientation (supine vs. head-high) also appears to affect animal physiology, and can alter image quality. Our techniques and image quality at SPring-8 have dramatically improved and in the near future we plan to translate this experience to the Imaging and Medical Beamline at the Australian Synchrotron. Overcoming these challenges has permitted increasingly sophisticated imaging of animals with synchrotron X-rays, and we expect a bright future for these techniques.
Adaptation of laser-Doppler flowmetry to measure cerebral blood flow in the fetal sheep.
Lan, J; Hunter, C J; Murata, T; Power, G G
2000-09-01
The purpose of this study was to devise a means to use laser-Doppler flowmetry to measure cerebral perfusion before birth. The method has not been used previously, largely because of intrauterine movement artifacts. To minimize movement artifacts, a probe holder was molded from epoxy putty to the contour of the fetal skull. A curved 18-gauge needle was embedded in the holder. At surgery, the holder, probe, and skull were fixed together with tissue glue. Residual signals were recorded after fetal death and after maternal death 1 h later. These averaged <5% of baseline flow signals, indicating minimal movement artifact. To test the usefulness of the method, cerebral flow responses were measured during moderate fetal hypoxia induced by giving the ewes approximately 10% oxygen in nitrogen to breathe. As fetal arterial PO2 decreased from 21.1 +/- 0.5 to 10.7 +/- 0.4 Torr during a 30-min period, cerebral perfusion increased progressively to 56 +/- 8% above baseline. Perfusion then returned to baseline levels during a 30-min recovery period. These responses are quantitatively similar to those spot observations that have been recorded earlier using labeled microspheres. We conclude that cerebral perfusion can be successfully measured by using laser-Doppler flowmetry with the unanesthetized, chronically prepared fetal sheep as an experimental model. With this method, relative changes of perfusion from a small volume of the ovine fetal brain can be measured on a continuous basis, and movement artifacts can be reduced to 5% of measured flow values.
Motion-compensated detection of heart rate based on the time registration adaptive filter
NASA Astrophysics Data System (ADS)
Yang, Lei; Zhou, Jinsong; Jing, Juanjuan; Li, Yacan; Wei, Lidong; Feng, Lei; He, Xiaoying; Bu, Meixia; Fu, Xilu
2018-01-01
A non-contact heart rate detection method based on the dual-wavelength technique is proposed and demonstrated experimentally. The heart rate is obtained from the PhotoPlethysmoGraphy (PPG) signal. Each detection module uses a reflection probe composed of an LED and a photodiode. Differences between the circuits of the two detection modules result in different responses to motion artifacts, which causes a time delay between the two signals. This poses a great challenge for compensating the motion artifacts during measurements. To solve this problem, we first apply time registration and translate the signals so that the two signals are consistent in the time domain. An adaptive filter is then used to compensate the motion artifacts. Moreover, the data obtained with this non-contact detection system are compared with those of a conventional finger blood volume pulse (BVP) sensor by simultaneously measuring the heart rate of the subject. During the experiment, the left hand remains stationary and is monitored by the conventional finger BVP sensor, while the moving palm of the right hand is monitored by the proposed system. The data obtained from the proposed non-contact system are consistent and comparable with those of the BVP sensor. This method effectively suppresses the interference caused by the differences between the two circuits and successfully compensates the motion artifacts. This technology can be used in medical and daily heart rate measurement.
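The two stages, time registration of the reference channel followed by adaptive filtering, can be sketched with synthetic signals as follows; the delay, filter order, and step size are illustrative values, and a basic LMS update stands in for the adaptive filter.

```python
# Sketch: cross-correlation time registration followed by LMS motion-artifact cancellation
import numpy as np

rng = np.random.default_rng(7)
fs, n = 100, 3000
t = np.arange(n) / fs
pulse = 0.3 * np.sin(2 * np.pi * 1.2 * t)                     # ~72 bpm pulse wave
motion = np.cumsum(rng.normal(0, 0.05, n))                    # slow motion artifact
delay = 12                                                    # samples of channel mismatch (assumed)
measured = pulse + motion
reference = np.roll(motion, -delay) + 0.01 * rng.normal(size=n)

# time registration: find the shift that maximizes the cross-correlation
lags = np.arange(-50, 51)
xc = [np.dot(measured - measured.mean(),
             np.roll(reference, lag) - reference.mean()) for lag in lags]
best = lags[int(np.argmax(xc))]
reference_aligned = np.roll(reference, best)

# LMS adaptive filter driven by the aligned reference channel
order, mu = 8, 1e-3
w = np.zeros(order)
clean = np.zeros(n)
for i in range(order, n):
    u = reference_aligned[i - order:i]
    e = measured[i] - w @ u                                   # error = cleaned sample
    w += 2 * mu * e * u
    clean[i] = e
print("estimated delay:", best, "samples")
```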
Evolutionary computing based approach for the removal of ECG artifact from the corrupted EEG signal.
Priyadharsini, S Suja; Rajan, S Edward
2014-01-01
Electroencephalogram (EEG) is an important tool for clinical diagnosis of brain-related disorders and problems. However, it is corrupted by various biological artifacts, of which ECG is one that reduces the clinical value of EEG, especially for epileptic patients and patients with short necks. The aim is to remove the ECG artifact from the measured EEG signal using an evolutionary computing approach based on the concept of a Hybrid Adaptive Neuro-Fuzzy Inference System, which helps neurologists in the diagnosis and follow-up of encephalopathy. The proposed hybrid learning methods are ANFIS-MA and ANFIS-GA, which use a Memetic Algorithm (MA) and a Genetic Algorithm (GA), respectively, for individually tuning the antecedent and consequent parts of the ANFIS structure. The performances of the proposed methods are compared with those of ANFIS and an adaptive Recursive Least Squares (RLS) filtering algorithm. The proposed methods are experimentally validated by applying them to simulated data sets subjected to non-linearity conditions and to real polysomnograph data sets. Performance metrics such as sensitivity, specificity and accuracy of the proposed method ANFIS-MA, in terms of correction rate, are found to be 93.8%, 100% and 99%, respectively, which is better than current state-of-the-art approaches. The evaluation process used and the demonstrated effectiveness of the proposed method prove that ANFIS-MA is more effective in suppressing ECG artifacts from corrupted EEG signals than ANFIS-GA, ANFIS and the RLS algorithm.
Vibro-acoustography and multifrequency image compounding.
Urban, Matthew W; Alizad, Azra; Fatemi, Mostafa
2011-08-01
Vibro-acoustography is an ultrasound based imaging modality that can visualize normal and abnormal soft tissue through mapping the acoustic response of the object to a harmonic radiation force at frequency Δf induced by focused ultrasound. In this method, the ultrasound energy is converted from high ultrasound frequencies to a low acoustic frequency (acoustic emission) that is often two orders of magnitude smaller than the ultrasound frequency. The acoustic emission is normally detected by a hydrophone. Depending on the setup, this low frequency sound may reverberate by object boundaries or other structures present in the acoustic paths before it reaches the hydrophone. This effect produces an artifact in the image in the form of gradual variations in image intensity that may compromise image quality. The use of tonebursts with finite length yields acoustic emission at Δf and at sidebands centered about Δf. Multiple images are formed by selectively applying bandpass filters on the acoustic emission at Δf and the associated sidebands. The data at these multiple frequencies are compounded through both coherent and incoherent processes to reduce the acoustic emission reverberation artifacts. Experimental results from a urethane breast phantom are described. The coherent and incoherent compounding of multifrequency data show, both qualitatively and quantitatively, the efficacy of this reverberation reduction method. This paper presents theory describing the physical origin of this artifact and use of image data created using multifrequency vibro-acoustography for reducing reverberation artifacts. Copyright © 2011 Elsevier B.V. All rights reserved.
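A per-pixel sketch of the compounding step is given below with assumed numbers: the recorded emission is bandpass-filtered at the difference frequency Δf and at two toneburst sidebands, each band is reduced to a complex amplitude by demodulation, and the three estimates are combined coherently (complex sum) and incoherently (magnitude sum).

```python
# Sketch: multifrequency compounding of a single pixel's acoustic emission (assumed values)
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 500e3
t = np.arange(0, 5e-3, 1 / fs)
df = 50e3                                                     # difference frequency Δf (assumed)
sidebands = [df - 2e3, df, df + 2e3]                          # toneburst sidebands (assumed spacing)

rng = np.random.default_rng(8)
emission = sum(a * np.cos(2 * np.pi * f * t + p)
               for a, f, p in zip([0.4, 1.0, 0.4], sidebands, [0.3, 0.0, -0.2]))
emission += 0.1 * rng.normal(size=t.size)                     # reverberation/noise stand-in

def complex_amplitude(x, f0, bw=1e3):
    sos = butter(4, [(f0 - bw) / (fs / 2), (f0 + bw) / (fs / 2)], btype="band", output="sos")
    analytic = hilbert(sosfiltfilt(sos, x))
    return (analytic * np.exp(-2j * np.pi * f0 * t)).mean()   # demodulated complex amplitude

amps = [complex_amplitude(emission, f) for f in sidebands]
coherent = np.abs(sum(amps))                                  # phase-preserving (complex) sum
incoherent = sum(np.abs(a) for a in amps)                     # magnitude sum
print("coherent: %.3f  incoherent: %.3f" % (coherent, incoherent))
```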
O-space with high resolution readouts outperforms radial imaging.
Wang, Haifeng; Tam, Leo; Kopanoglu, Emre; Peters, Dana C; Constable, R Todd; Galiana, Gigi
2017-04-01
While O-Space imaging is well known to accelerate image acquisition beyond traditional Cartesian sampling, its advantages compared to undersampled radial imaging, the linear trajectory most akin to O-Space imaging, have not been detailed. In addition, previous studies have focused on ultrafast imaging with very high acceleration factors and relatively low resolution. The purpose of this work is to directly compare O-Space and radial imaging in their potential to deliver highly undersampled images of high resolution and minimal artifacts, as needed for diagnostic applications. We report that the greatest advantages to O-Space imaging are observed with extended data acquisition readouts. A sampling strategy that uses high resolution readouts is presented and applied to compare the potential of radial and O-Space sequences to generate high resolution images at high undersampling factors. Simulations and phantom studies were performed to investigate whether use of extended readout windows in O-Space imaging would increase k-space sampling and improve image quality, compared to radial imaging. Experimental O-Space images acquired with high resolution readouts show fewer artifacts and greater sharpness than radial imaging with equivalent scan parameters. Radial images taken with longer readouts show stronger undersampling artifacts, which can cause small or subtle image features to disappear. These features are preserved in a comparable O-Space image. High resolution O-Space imaging yields highly undersampled images of high resolution and minimal artifacts. The additional nonlinear gradient field improves image quality beyond conventional radial imaging. Copyright © 2016 Elsevier Inc. All rights reserved.
Vibro-acoustography and Multifrequency Image Compounding
Urban, Matthew W.; Alizad, Azra; Fatemi, Mostafa
2011-01-01
Vibro-acoustography is an ultrasound based imaging modality that can visualize normal and abnormal soft tissue through mapping the acoustic response of the object to a harmonic radiation force at frequency Δf induced by focused ultrasound. In this method, the ultrasound energy is converted from high ultrasound frequencies to a low acoustic frequency (acoustic emission) that is often two orders of magnitude smaller than the ultrasound frequency. The acoustic emission is normally detected by a hydrophone. Depending on the setup, this low frequency sound may reverberate by object boundaries or other structures present in the acoustic paths before it reaches the hydrophone. This effect produces an artifact in the image in the form of gradual variations in image intensity that may compromise image quality. The use of tonebursts with finite length yields acoustic emission at Δf and at sidebands centered about Δf. Multiple images are formed by selectively applying bandpass filters on the acoustic emission at Δf and the associated sidebands. The data at these multiple frequencies are compounded through both coherent and incoherent processes to reduce the acoustic emission reverberation artifacts. Experimental results from a urethane breast phantom are described. The coherent and incoherent compounding of multifrequency data show, both qualitatively and quantitatively, the efficacy of this reverberation reduction method. This paper presents theory describing the physical origin of this artifact and use of image data created using multifrequency vibro-acoustography for reducing reverberation artifacts. PMID:21377181
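As a rough illustration of the compounding step described in the abstract above, the following Python sketch contrasts coherent compounding (summing complex images before taking the magnitude) with incoherent compounding (averaging magnitudes). The image stack and frequency count are synthetic assumptions; this is not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): coherent vs. incoherent compounding
# of complex-valued vibro-acoustography images formed at Delta-f and its sidebands.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stack of complex images, one per analysis frequency (Delta-f and
# two sidebands). In practice each would come from bandpass filtering the
# hydrophone signal around the corresponding frequency while scanning.
n_freq, ny, nx = 3, 64, 64
images = rng.standard_normal((n_freq, ny, nx)) + 1j * rng.standard_normal((n_freq, ny, nx))

# Coherent compounding: sum the complex images, then take the magnitude.
# Reverberation patterns whose phase varies across frequencies tend to cancel.
coherent = np.abs(images.sum(axis=0)) / n_freq

# Incoherent compounding: average the magnitudes, which smooths the ripple of
# the reverberation artifact without relying on phase cancellation.
incoherent = np.abs(images).mean(axis=0)

print(coherent.shape, incoherent.shape)
```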
Correction of Bowtie-Filter Normalization and Crescent Artifacts for a Clinical CBCT System.
Zhang, Hong; Kong, Vic; Huang, Ke; Jin, Jian-Yue
2017-02-01
To present our experiences in understanding and minimizing bowtie-filter crescent artifacts and bowtie-filter normalization artifacts in a clinical cone beam computed tomography system. Bowtie-filter position and profile variations during gantry rotation were studied. Two previously proposed strategies (A and B) were applied to the clinical cone beam computed tomography system to correct bowtie-filter crescent artifacts. Physical calibration and analytical approaches were used to minimize the norm phantom misalignment and to correct for bowtie-filter normalization artifacts. A combined procedure to reduce bowtie-filter crescent artifacts and bowtie-filter normalization artifacts was proposed, tested on a norm phantom, a CatPhan, and a patient, and evaluated using the standard deviation of Hounsfield units along a sampling line. The bowtie-filter exhibited not only a translational shift but also an amplitude variation in its projection profile during gantry rotation. Strategy B was slightly better than strategy A in minimizing bowtie-filter crescent artifacts, possibly because it corrected the amplitude variation, suggesting that the amplitude variation plays a role in bowtie-filter crescent artifacts. The physical calibration largely reduced the misalignment-induced bowtie-filter normalization artifacts, and the analytical approach further reduced bowtie-filter normalization artifacts. The combined procedure minimized both bowtie-filter crescent artifacts and bowtie-filter normalization artifacts, with Hounsfield unit standard deviations of 63.2, 45.0, 35.0, and 18.8 HU for no correction, correction of bowtie-filter crescent artifacts, correction of bowtie-filter normalization artifacts, and correction of both, respectively. The combined procedure also demonstrated reduction of bowtie-filter crescent artifacts and bowtie-filter normalization artifacts in a CatPhan and a patient. We have developed a step-by-step procedure that can be directly used in clinical cone beam computed tomography systems to minimize both bowtie-filter crescent artifacts and bowtie-filter normalization artifacts.
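The evaluation metric quoted above, the standard deviation of Hounsfield units along a sampling line, can be sketched as follows. The phantom image, noise level, and line endpoints are illustrative assumptions.

```python
# Minimal sketch of the figure of merit used above: the standard deviation of
# Hounsfield units (HU) sampled along a line through a nominally uniform phantom.
# The phantom, noise level, and line endpoints are illustrative assumptions.
import numpy as np

def hu_std_along_line(image, p0, p1, n_samples=200):
    """Nearest-neighbour sampling of `image` (in HU) between points p0 and p1."""
    rows = np.linspace(p0[0], p1[0], n_samples)
    cols = np.linspace(p0[1], p1[1], n_samples)
    samples = image[np.round(rows).astype(int), np.round(cols).astype(int)]
    return float(np.std(samples))

phantom = np.zeros((256, 256))                                            # uniform water, 0 HU
phantom += 30 * np.random.default_rng(1).standard_normal(phantom.shape)  # residual artifact/noise
print(hu_std_along_line(phantom, (128, 20), (128, 235)))
```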
Comment on "Protein sequences from mastodon and Tyrannosaurus rex revealed by mass spectrometry".
Pevzner, Pavel A; Kim, Sangtae; Ng, Julio
2008-08-22
Asara et al. (Reports, 13 April 2007, p. 280) reported sequencing of Tyrannosaurus rex proteins and used them to establish the evolutionary relationships between birds and dinosaurs. We argue that the reported T. rex peptides may represent statistical artifacts and call for complete data release to enable experimental and computational verification of their findings.
Collaborative Knowledge Building with Wikis: The Impact of Redundancy and Polarity
ERIC Educational Resources Information Center
Moskaliuk, Johannes; Kimmerle, Joachim; Cress, Ulrike
2012-01-01
Wikis as shared digital artifacts may enable users to participate in processes of knowledge building. To what extent and with which quality knowledge building can take place is assumed to depend on the interrelation between people's prior knowledge and the information available in a wiki. In two experimental studies we examined the impact on…
USDA-ARS?s Scientific Manuscript database
In order to determine the likely effect of global warming on agricultural productivity while avoiding experimental artifacts, there is a need to conduct warming research under conditions as representative as possible of future open fields, i.e., temperature free-air controlled enhancement (T-FACE) e...
A Method for Whole Brain Ex Vivo Magnetic Resonance Imaging with Minimal Susceptibility Artifacts
Shatil, Anwar S.; Matsuda, Kant M.; Figley, Chase R.
2016-01-01
Magnetic resonance imaging (MRI) is a non-destructive technique that is capable of localizing pathologies and assessing other anatomical features (e.g., tissue volume, microstructure, and white matter connectivity) in postmortem, ex vivo human brains. However, when brains are removed from the skull and cerebrospinal fluid (i.e., their normal in vivo magnetic environment), air bubbles and air–tissue interfaces typically cause magnetic susceptibility artifacts that severely degrade the quality of ex vivo MRI data. In this report, we describe a relatively simple and cost-effective experimental setup for acquiring artifact-free ex vivo brain images using a clinical MRI system with standard hardware. In particular, we outline the necessary steps, from collecting an ex vivo human brain to the MRI scanner setup, and have also described changing the formalin (as might be necessary in longitudinal postmortem studies). Finally, we share some representative ex vivo MRI images that have been acquired using the proposed setup in order to demonstrate the efficacy of this approach. We hope that this protocol will provide both clinicians and researchers with a straight-forward and cost-effective solution for acquiring ex vivo MRI data from whole postmortem human brains. PMID:27965620
Drill Holes and Predation Traces versus Abrasion-Induced Artifacts Revealed by Tumbling Experiments
Gorzelak, Przemysław; Salamon, Mariusz A.; Trzęsiok, Dawid; Niedźwiedzki, Robert
2013-01-01
Drill holes made by predators in prey shells are widely considered to be the most unambiguous bodies of evidence of predator-prey interactions in the fossil record. However, recognition of traces of predatory origin from those formed by abiotic factors still waits for a rigorous evaluation as a prerequisite to ascertain predation intensity through geologic time and to test macroevolutionary patterns. New experimental data from tumbling various extant shells demonstrate that abrasion may leave holes strongly resembling the traces produced by drilling predators. They typically represent singular, circular to oval penetrations perpendicular to the shell surface. These data provide an alternative explanation to the drilling predation hypothesis for the origin of holes recorded in fossil shells. Although various non-morphological criteria (evaluation of holes for non-random distribution) and morphometric studies (quantification of the drill hole shape) have been employed to separate biological from abiotic traces, these are probably insufficient to exclude abrasion artifacts, consequently leading to overestimate predation intensity. As a result, from now on, we must adopt more rigorous criteria to appropriately distinguish abrasion artifacts from drill holes, such as microstructural identification of micro-rasping traces. PMID:23505530
Zhu, Haitao; Demachi, Kazuyuki; Sekino, Masaki
2011-09-01
Positive contrast imaging methods produce enhanced signal at large magnetic field gradient in magnetic resonance imaging. Several postprocessing algorithms, such as susceptibility gradient mapping and phase gradient mapping methods, have been applied for positive contrast generation to detect the cells targeted by superparamagnetic iron oxide nanoparticles. In the phase gradient mapping methods, smoothness condition has to be satisfied to keep the phase gradient unwrapped. Moreover, there has been no discussion about the truncation artifact associated with the algorithm of differentiation that is performed in k-space by the multiplication with frequency value. In this work, phase gradient methods are discussed by considering the wrapping problem when the smoothness condition is not satisfied. A region-growing unwrapping algorithm is used in the phase gradient image to solve the problem. In order to reduce the truncation artifact, a cosine function is multiplied in the k-space to eliminate the abrupt change at the boundaries. Simulation, phantom and in vivo experimental results demonstrate that the modified phase gradient mapping methods may produce improved positive contrast effects by reducing truncation or wrapping artifacts. Copyright © 2011 Elsevier Inc. All rights reserved.
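A minimal sketch of k-space differentiation with a cosine taper, in the spirit of the modification described above. The test phase map, the exact window shape, and the axis handling are assumptions rather than the authors' implementation.

```python
# Hedged sketch: k-space differentiation of a phase map with a cosine taper that
# removes the abrupt truncation at the k-space boundary (the source of ringing).
import numpy as np

def phase_gradient_x(phase, taper=True):
    ny, nx = phase.shape
    freqs = np.fft.fftfreq(nx)                    # cycles/sample, in [-0.5, 0.5)
    deriv = 2j * np.pi * freqs                    # differentiation operator in k-space
    window = np.cos(np.pi * freqs) if taper else np.ones(nx)  # falls to zero at Nyquist
    spectrum = np.fft.fft(phase, axis=1)
    return np.real(np.fft.ifft(spectrum * deriv * window, axis=1))

phase = np.outer(np.ones(32), np.linspace(0, 4 * np.pi, 128))  # smooth test phase map
print(phase_gradient_x(phase).shape)
```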
Improved image decompression for reduced transform coding artifacts
NASA Technical Reports Server (NTRS)
Orourke, Thomas P.; Stevenson, Robert L.
1994-01-01
The perceived quality of images reconstructed from low bit rate compression is severely degraded by the appearance of transform coding artifacts. This paper proposes a method for producing higher quality reconstructed images based on a stochastic model for the image data. Quantization (scalar or vector) partitions the transform coefficient space and maps all points in a partition cell to a representative reconstruction point, usually taken as the centroid of the cell. The proposed image estimation technique selects the reconstruction point within the quantization partition cell which results in a reconstructed image which best fits a non-Gaussian Markov random field (MRF) image model. This approach results in a convex constrained optimization problem which can be solved iteratively. At each iteration, the gradient projection method is used to update the estimate based on the image model. In the transform domain, the resulting coefficient reconstruction points are projected to the particular quantization partition cells defined by the compressed image. Experimental results will be shown for images compressed using scalar quantization of block DCT and using vector quantization of subband wavelet transform. The proposed image decompression provides a reconstructed image with reduced visibility of transform coding artifacts and superior perceived quality.
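A hedged sketch of the constrained-optimization idea described above: alternate a gradient step on a smoothness prior with projection of the transform coefficients back into their quantization cells. For brevity it uses a whole-image orthonormal DCT, uniform scalar quantization, and a Huber-like prior as stand-ins for the paper's block DCT / subband wavelet transforms and non-Gaussian MRF.

```python
# Hedged sketch of the constrained reconstruction loop: a gradient step on a
# Huber-like smoothness prior (standing in for the non-Gaussian MRF), followed by
# projection of the DCT coefficients back into their quantization cells.
import numpy as np
from scipy.fft import dctn, idctn

def huber_prior_grad(img, t=1.0):
    grad = np.zeros_like(img)
    for axis in (0, 1):
        d = np.diff(img, axis=axis, append=np.take(img, [-1], axis=axis))
        h = np.clip(d, -t, t)                     # derivative of the Huber penalty
        grad += np.roll(h, 1, axis=axis) - h      # back-difference of the clipped diffs
    return grad

def decompress(quantized_coeffs, q_step, n_iter=50, step=0.1):
    img = idctn(quantized_coeffs, norm='ortho')   # start from the centroid decode
    lo, hi = quantized_coeffs - q_step / 2, quantized_coeffs + q_step / 2
    for _ in range(n_iter):
        img = img - step * huber_prior_grad(img)              # image-model step
        coeffs = np.clip(dctn(img, norm='ortho'), lo, hi)      # project into cells
        img = idctn(coeffs, norm='ortho')
    return img
```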
Artifacts in Sonography - Part 3.
Bönhof, Jörg A; McLaughlin, Glen
2018-06-01
As a continuation of parts 1 1 and 2 2, this article discusses artifacts as caused by insufficient temporal resolution, artifacts in color and spectral Doppler sonography, and information regarding artifacts in sonography with contrast agents. There are artifacts that occur in B-mode sonography as well as in Doppler imaging methods and sonography with contrast agents, such as slice thickness artifacts and bow artifacts, shadows, mirroring, and artifacts due to refraction that appear, for example, as double images, because they are based on the same formation mechanisms. In addition, there are artifacts specific to Doppler sonography, such as the twinkling artifact, and method-based motion artifacts, such as aliasing, the ureteric jet, and due to tissue vibration. The artifacts specific to contrast mode include echoes from usually highly reflective structures that are not contrast bubbles ("leakage"). Contrast agent can also change the transmitting signal so that even structures not containing contrast agent are echogenic ("pseudoenhancement"). While artifacts can cause problems regarding differential diagnosis, they can also be useful for determining the diagnosis. Therefore, effective use of sonography requires both profound knowledge and skilled interpretation of artifacts. © Georg Thieme Verlag KG Stuttgart · New York.
SU-E-QI-11: Measurement of Renal Pyruvate-To-Lactate Exchange with Hyperpolarized 13C MRI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adamson, E; Johnson, K; Fain, S
Purpose: Previous work [1] modeling the metabolic flux between hyperpolarized [1-13C]pyruvate and [1-13C]lactate in magnetic resonance spectroscopic imaging (MRSI) experiments failed to account for vascular signal artifacts. Here, we investigate a method to minimize the vascular signal and its impact on the fidelity of metabolic modeling. Methods: MRSI was simulated for renal metabolism in MATLAB both with and without bipolar gradients. The resulting data were fit to a two-site exchange model [1], and the effects of vascular partial volume artifacts on kinetic modeling were assessed. Bipolar gradients were then incorporated into a gradient echo sequence to validate the simulations experimentally. The degree of diffusion weighting (b = 32 s/mm{sup 2}) was determined empirically from 1H imaging of murine renal vascular signal. The method was then tested in vivo using MRSI with bipolar gradients following injection of hyperpolarized [1-{sup 13}C]pyruvate (∼80 mM at 20% polarization). Results: In simulations, vascular signal contaminated the renal metabolic signal at resolutions as high as 2 × 2 mm{sup 2} due to partial volume effects. The apparent exchange rate from pyruvate to lactate (k{sub p}) was underestimated in the presence of these artifacts due to contaminating pyruvate signal. Incorporation of bipolar gradients suppressed vascular signal and improved the accuracy of k{sub p} estimation. Experimentally, the in vivo results supported the ability of bipolar gradients to suppress vascular signal. The in vivo exchange rate increased, as predicted in simulations, from k{sub p} = 0.012 s{sup -1} to k{sub p} = 0.020 s{sup -1} after vascular signal suppression. Conclusion: We have demonstrated the limited accuracy of the two-site exchange model in the presence of vascular partial volume artifacts. The addition of bipolar gradients suppressed vascular signal and improved model accuracy in simulations. Bipolar gradients largely affected k{sub p} estimation in vivo. Currently, slow-flowing spins in small vessels and capillaries are only partially suppressed, so further improvement is possible. Funding support: Seed Grant from the Radiological Society of North America, GE Healthcare, University of Wisconsin Graduate School.
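A minimal sketch of the two-site exchange model referenced above and a least-squares fit of k_p from a simulated lactate curve. The T1 values, time axis, and rate constant are illustrative assumptions, not values from the study.

```python
# Minimal sketch of the two-site exchange model (pyruvate -> lactate) and a
# least-squares fit of k_p from a simulated lactate curve.
import numpy as np
from scipy.optimize import curve_fit

T1P, T1L = 43.0, 33.0                      # assumed 13C T1 values (s)
t = np.linspace(0.0, 60.0, 61)

def lactate(t, kp, p0=1.0):
    """Analytic solution of dL/dt = kp*P(t) - L/T1L with P(t) = p0*exp(-(1/T1P + kp)*t)."""
    a = 1.0 / T1P + kp
    b = 1.0 / T1L
    return kp * p0 / (b - a) * (np.exp(-a * t) - np.exp(-b * t))

true_kp = 0.02                             # s^-1, illustrative
observed = lactate(t, true_kp)
kp_fit, _ = curve_fit(lambda tt, kp: lactate(tt, kp), t, observed, p0=[0.01])
print(true_kp, float(kp_fit[0]))
```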
Partial Deconvolution with Inaccurate Blur Kernel.
Ren, Dongwei; Zuo, Wangmeng; Zhang, David; Xu, Jun; Zhang, Lei
2017-10-17
Most non-blind deconvolution methods are developed under the error-free kernel assumption, and are not robust to inaccurate blur kernel. Unfortunately, despite the great progress in blind deconvolution, estimation error remains inevitable during blur kernel estimation. Consequently, severe artifacts such as ringing effects and distortions are likely to be introduced in the non-blind deconvolution stage. In this paper, we tackle this issue by suggesting: (i) a partial map in the Fourier domain for modeling kernel estimation error, and (ii) a partial deconvolution model for robust deblurring with inaccurate blur kernel. The partial map is constructed by detecting the reliable Fourier entries of estimated blur kernel. And partial deconvolution is applied to wavelet-based and learning-based models to suppress the adverse effect of kernel estimation error. Furthermore, an E-M algorithm is developed for estimating the partial map and recovering the latent sharp image alternatively. Experimental results show that our partial deconvolution model is effective in relieving artifacts caused by inaccurate blur kernel, and can achieve favorable deblurring quality on synthetic and real blurry images.
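A much-simplified sketch of the partial-map idea: mark the Fourier entries where the estimated kernel is reliable, deconvolve only there, and keep the observed spectrum elsewhere. The threshold and the Wiener-style regularizer are assumptions; the paper's wavelet/learning-based models and E-M estimation are not reproduced here.

```python
# Hedged sketch: a simplified "partial" Wiener deconvolution inspired by the
# partial-map idea above. The reliability map M keeps only Fourier entries where
# the estimated kernel has non-negligible magnitude; deconvolution is applied
# there and the blurred observation is kept elsewhere.
import numpy as np

def partial_wiener(blurred, kernel_est, rel_thresh=0.05, nsr=1e-2):
    K = np.fft.fft2(kernel_est, s=blurred.shape)
    B = np.fft.fft2(blurred)
    M = np.abs(K) > rel_thresh * np.abs(K).max()      # partial map of reliable entries
    wiener = np.conj(K) / (np.abs(K) ** 2 + nsr)      # standard Wiener filter
    X = np.where(M, B * wiener, B)                    # deconvolve only where reliable
    return np.real(np.fft.ifft2(X))
```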
Wavelet approach to artifact noise removal from Capacitive coupled Electrocardiograph.
Lee, Seung Min; Kim, Ko Keun; Park, Kwang Suk
2008-01-01
Capacitively coupled electrocardiography (ECG) has been introduced as a non-invasive measurement technology for ubiquitous health care, and its applications are spreading widely. Despite its many merits, capacitively coupled ECG is highly susceptible to motion artifacts because of its non-skin-contact property. Many studies address artifact problems by treating all artifact signals below 0.8 Hz. In our capacitively coupled ECG measurement system, artifacts exist not only below 0.8 Hz but also above 10 Hz. Therefore, an artifact noise removal algorithm using a wavelet method is tested to reject artifact-contaminated components from the measured signals. By calculating the signal power at each decimation step, both low-frequency and high-frequency artifact components are removed. Although some of the original ECG signal is removed together with the artifact signal, the signal quality becomes consistent enough for long-term measurement and yields the best ECG quality we can obtain.
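A hedged sketch of the per-level power test described above, using PyWavelets: decompose the signal, flag decomposition levels whose power is far above the typical level, zero them, and reconstruct. The wavelet, level count, and threshold are assumptions.

```python
# Illustrative sketch: wavelet decomposition of a capacitively coupled ECG,
# suppression of decomposition levels dominated by artifact power, reconstruction.
import numpy as np
import pywt

def remove_artifact_levels(ecg, wavelet='db4', level=6, power_ratio=4.0):
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    powers = np.array([np.mean(c ** 2) for c in coeffs])
    median_power = np.median(powers)
    cleaned = []
    for c, p in zip(coeffs, powers):
        # Zero out levels whose power is far above the typical level power;
        # these are assumed to carry low- or high-frequency artifact energy.
        cleaned.append(np.zeros_like(c) if p > power_ratio * median_power else c)
    return pywt.waverec(cleaned, wavelet)

fs = 250
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t)                         # toy "ECG"
contaminated = ecg + 3.0 * np.sin(2 * np.pi * 0.3 * t)    # slow motion artifact
print(remove_artifact_levels(contaminated).shape)
```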
Protection of Metal Artifacts with the Formation of Metal–Oxalates Complexes by Beauveria bassiana
Joseph, Edith; Cario, Sylvie; Simon, Anaële; Wörle, Marie; Mazzeo, Rocco; Junier, Pilar; Job, Daniel
2012-01-01
Several fungi present high tolerance to toxic metals and some are able to transform metals into metal–oxalate complexes. In this study, the ability of Beauveria bassiana to produce copper oxalates was evaluated. Growth performance was tested on various copper-containing media. B. bassiana proved highly resistant to copper, tolerating concentrations of up to 20 g L−1, and precipitating copper oxalates on all media tested. Chromatographic analyses showed that this species produced oxalic acid as sole metal chelator. The production of metal–oxalates can be used in the restoration and conservation of archeological and modern metal artifacts. The production of copper oxalates was confirmed directly using metallic pieces (both archeological and modern). The conversion of corrosion products into copper oxalates was demonstrated as well. In order to assess whether the capability of B. bassiana to produce metal–oxalates could be applied to other metals, iron and silver were tested as well. Iron appears to be directly sequestered in the wall of the fungal hyphae forming oxalates. However, the formation of a homogeneous layer on the object is not yet optimal. On silver, a co-precipitation of copper and silver oxalates occurred. As this greenish patina would not be acceptable on silver objects, silver reduction was explored as a tarnishing remediation. First experiments showed the transformation of silver nitrate into nanoparticles of elemental silver by an unknown extracellular mechanism. The production of copper oxalates is immediately applicable for the conservation of copper-based artifacts. For iron and silver this is not yet the case. However, the vast ability of B. bassiana to transform toxic metals using different immobilization mechanisms seems to offer considerable possibilities for industrial applications, such as the bioremediation of contaminated soils or the green synthesis of chemicals. PMID:22291684
Paudel, M R; Mackenzie, M; Fallone, B G; Rathee, S
2013-08-01
To evaluate the metal artifacts in kilovoltage computed tomography (kVCT) images that are corrected using a normalized metal artifact reduction (NMAR) method with megavoltage CT (MVCT) prior images. Tissue characterization phantoms containing bilateral steel inserts are used in all experiments. Two MVCT images, one without any metal artifact corrections and the other corrected using a modified iterative maximum likelihood polychromatic algorithm for CT (IMPACT) are translated to pseudo-kVCT images. These are then used as prior images without tissue classification in an NMAR technique for correcting the experimental kVCT image. The IMPACT method in MVCT included an additional model for the pair∕triplet production process and the energy dependent response of the MVCT detectors. An experimental kVCT image, without the metal inserts and reconstructed using the filtered back projection (FBP) method, is artificially patched with the known steel inserts to get a reference image. The regular NMAR image containing the steel inserts that uses tissue classified kVCT prior and the NMAR images reconstructed using MVCT priors are compared with the reference image for metal artifact reduction. The Eclipse treatment planning system is used to calculate radiotherapy dose distributions on the corrected images and on the reference image using the Anisotropic Analytical Algorithm with 6 MV parallel opposed 5×10 cm2 fields passing through the bilateral steel inserts, and the results are compared. Gafchromic film is used to measure the actual dose delivered in a plane perpendicular to the beams at the isocenter. The streaking and shading in the NMAR image using tissue classifications are significantly reduced. However, the structures, including metal, are deformed. Some uniform regions appear to have eroded from one side. There is a large variation of attenuation values inside the metal inserts. Similar results are seen in commercially corrected image. Use of MVCT prior images without tissue classification in NMAR significantly reduces these problems. The radiation dose calculated on the reference image is close to the dose measured using the film. Compared to the reference image, the calculated dose difference in the conventional NMAR image, the corrected images using uncorrected MVCT image, and IMPACT corrected MVCT image as priors is ∼15.5%, ∼5%, and ∼2.7%, respectively, at the isocenter. The deformation and erosion of the structures present in regular NMAR corrected images can be largely reduced by using MVCT priors without tissue segmentation. The attenuation value of metal being incorrect, large dose differences relative to the true value can result when using the conventional NMAR image. This difference can be significantly reduced if MVCT images are used as priors. Reduced tissue deformation, better tissue visualization, and correct information about the electron density of the tissues and metals in the artifact corrected images could help delineate the structures better, as well as calculate radiation dose more correctly, thus enhancing the quality of the radiotherapy treatment planning.
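The NMAR normalization step referenced above can be sketched as follows, with an arbitrary prior image standing in for the MVCT-derived prior and scikit-image's parallel-beam radon/iradon standing in for the clinical geometry. The thresholds and interpolation scheme are illustrative assumptions.

```python
# Hedged sketch of the NMAR core: normalize the sinogram by the forward projection
# of a prior image, interpolate across the metal trace, de-normalize, reconstruct.
import numpy as np
from skimage.transform import radon, iradon

def nmar(sino, prior_img, metal_mask, theta):
    prior_sino = radon(prior_img, theta=theta)
    trace = radon(metal_mask.astype(float), theta=theta) > 1e-6   # metal trace
    norm = sino / np.maximum(prior_sino, 1e-6)                    # normalize by prior
    for j in range(norm.shape[1]):                                # interpolate across
        bad = trace[:, j]                                         # the metal trace,
        if bad.any() and (~bad).any():                            # view by view
            idx = np.arange(norm.shape[0])
            norm[bad, j] = np.interp(idx[bad], idx[~bad], norm[~bad, j])
    corrected_sino = norm * np.maximum(prior_sino, 1e-6)          # de-normalize
    return iradon(corrected_sino, theta=theta)
```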
Sparsity-based acoustic inversion in cross-sectional multiscale optoacoustic imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, Yiyong; Tzoumas, Stratis; Nunes, Antonio
2015-09-15
Purpose: With recent advancement in hardware of optoacoustic imaging systems, highly detailed cross-sectional images may be acquired at a single laser shot, thus eliminating motion artifacts. Nonetheless, other sources of artifacts remain due to signal distortion or out-of-plane signals. The purpose of image reconstruction algorithms is to obtain the most accurate images from noisy, distorted projection data. Methods: In this paper, the authors use the model-based approach for acoustic inversion, combined with a sparsity-based inversion procedure. Specifically, a cost function is used that includes the L1 norm of the image in sparse representation and a total variation (TV) term. The optimization problem is solved by a numerically efficient implementation of a nonlinear gradient descent algorithm. TV–L1 model-based inversion is tested in the cross section geometry for numerically generated data as well as for in vivo experimental data from an adult mouse. Results: In all cases, model-based TV–L1 inversion showed a better performance over the conventional Tikhonov regularization, TV inversion, and L1 inversion. In the numerical examples, the images reconstructed with TV–L1 inversion were quantitatively more similar to the originating images. In the experimental examples, TV–L1 inversion yielded sharper images and weaker streak artifact. Conclusions: The results herein show that TV–L1 inversion is capable of improving the quality of highly detailed, multiscale optoacoustic images obtained in vivo using cross-sectional imaging systems. As a result of its high fidelity, model-based TV–L1 inversion may be considered as the new standard for image reconstruction in cross-sectional imaging.
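A minimal sketch of a TV–L1 regularized model-based inversion of the form min_x ||Ax − y||² + λ₁||x||₁ + λ_TV·TV(x), solved with plain (sub)gradient descent. The forward matrix A is a placeholder and the L1 term is applied directly to x rather than to a sparse (e.g., wavelet) representation, so this is only a schematic of the cost structure, not the authors' numerically efficient algorithm.

```python
# Hedged sketch of TV-L1 model-based inversion via (sub)gradient descent.
import numpy as np

def tv_grad(x2d, eps=1e-8):
    gx = np.diff(x2d, axis=0, append=x2d[-1:, :])
    gy = np.diff(x2d, axis=1, append=x2d[:, -1:])
    mag = np.sqrt(gx**2 + gy**2 + eps)
    u, v = gx / mag, gy / mag
    # Gradient of the (smoothed) total variation: negative divergence of (u, v).
    return (np.roll(u, 1, axis=0) - u) + (np.roll(v, 1, axis=1) - v)

def tv_l1_inversion(A, y, shape, lam1=1e-3, lam_tv=1e-2, step=1e-3, n_iter=200):
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = 2 * A.T @ (A @ x - y)                        # data-fidelity gradient
        grad += lam1 * np.sign(x)                           # L1 subgradient
        grad += lam_tv * tv_grad(x.reshape(shape)).ravel()  # TV gradient
        x -= step * grad
    return x.reshape(shape)
```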
Scatter correction using a primary modulator on a clinical angiography C-arm CT system.
Bier, Bastian; Berger, Martin; Maier, Andreas; Kachelrieß, Marc; Ritschl, Ludwig; Müller, Kerstin; Choi, Jang-Hwan; Fahrig, Rebecca
2017-09-01
Cone beam computed tomography (CBCT) suffers from a large amount of scatter, resulting in severe scatter artifacts in the reconstructions. Recently, a new scatter correction approach, called improved primary modulator scatter estimation (iPMSE), was introduced. That approach utilizes a primary modulator that is inserted between the X-ray source and the object. This modulation enables estimation of the scatter in the projection domain by optimizing an objective function with respect to the scatter estimate. Up to now the approach has not been implemented on a clinical angiography C-arm CT system. In our work, the iPMSE method is transferred to a clinical C-arm CBCT. Additional processing steps are added in order to compensate for the C-arm scanner motion and the automatic X-ray tube current modulation. These challenges were overcome by establishing a reference modulator database and a block-matching algorithm. Experiments with phantom and experimental in vivo data were performed to evaluate the method. We show that scatter correction using primary modulation is possible on a clinical C-arm CBCT. Scatter artifacts in the reconstructions are reduced with the newly extended method. Compared to a scan with a narrow collimation, our approach showed superior results with an improvement of the contrast and the contrast-to-noise ratio for the phantom experiments. In vivo data are evaluated by comparing the results with a scan with a narrow collimation and with a constant scatter correction approach. Scatter correction using primary modulation is possible on a clinical CBCT by compensating for the scanner motion and the tube current modulation. Scatter artifacts could be reduced in the reconstructions of phantom scans and in experimental in vivo data. © 2017 American Association of Physicists in Medicine.
Ott, Sabine; Gölitz, Philipp; Adamek, Edyta; Royalty, Kevin; Doerfler, Arnd; Struffert, Tobias
2015-08-01
We compared flat-detector computed tomography angiography (FD-CTA) to multislice computed tomography angiography (MS-CTA) and digital subtraction angiography (DSA) for the visualization of experimental aneurysms treated with stents, coils or a combination of both. In 20 rabbits, aneurysms were created using the rabbit elastase aneurysm model. Seven aneurysms were treated with coils, seven with coils and stents, and six with self-expandable stents alone. Imaging was performed by DSA, MS-CTA and FD-CTA immediately after treatment. Multiplanar reconstruction (MPR) was performed and two experienced reviewers compared aneurysm/coil package size, aneurysm occlusion, stent diameters and artifacts for each modality. In aneurysms treated with stents alone, the visualization of the aneurysms was identical in all three imaging modalities. Residual aneurysm perfusion was present in two cases and visible in DSA and FD-CTA but not in MS-CTA. The diameter of coil packages was overestimated in MS-CTA by 56% and only by 16% in FD-CTA compared to DSA (p < 0.05). The diameter of stents was identical for DSA and FD-CTA and was significantly overestimated in MS-CTA (p < 0.05). Beam/metal hardening artifacts impaired image quality more severely in MS-CTA compared to FD-CTA. MS-CTA is impaired by blooming and beam/metal hardening artifacts in the visualization of implanted devices. There was no significant difference between measurements made with noninvasive FD-CTA compared to the gold standard of DSA after stenting and after coiling/stent-assisted coiling of aneurysms. FD-CTA may be considered as a non-invasive alternative to the gold standard 2D DSA in selected patients that require follow-up imaging after stenting. © The Author(s) 2015.
The New Kilogram Definition and its Implications for High-Precision Mass Tolerance Classes.
Abbott, Patrick J; Kubarych, Zeina J
2013-01-01
The SI unit of mass, the kilogram, is the only remaining artifact definition in the seven fundamental units of the SI system. It will be redefined in terms of the Planck constant as soon as certain experimental conditions, based on recommendations of the Consultative Committee for Mass and Related Quantities (CCM) are met. To better reflect reality, the redefinition will likely be accompanied by an increase in the uncertainties that National Metrology Institutes (NMIs) pass on to customers via artifact dissemination, which could have an impact on the reference standards that are used by secondary calibration laboratories if certain weight tolerances are adopted for use. This paper will compare the legal metrology requirements for precision mass calibration laboratories after the kilogram is redefined with the current capabilities based on the international prototype kilogram (IPK) realization of the kilogram.
López Garrido, Pedro H; González-Sánchez, J; Escobar Briones, Elva
2015-01-01
Corrosion and biofouling phenomena of cast iron and brass were evaluated under natural conditions to determine the degradation process of archeological artifacts. Field exposure studies of experimental materials were conducted over 15 months at an offshore position in the sea of Campeche in the Gulf of Mexico. Corrosion was determined by gravimetric measurements. The community structure of the benthic assemblage inhabiting the surfaces of both materials was evaluated. A total of 53 species was identified. The community in both cases was composed of a small number of species. Encrusting, attached and erect life forms were dominant on iron. Attached life forms were dominant on brass. Biofouling produced a decrease in the weight loss measurements of cast iron samples. Biofouling provided a beneficial factor for in situ preservation of iron archeological artifacts in wreck sites.
ERIC Educational Resources Information Center
Harvey, Nigel; Reimers, Stian
2013-01-01
People's forecasts from time series underestimate future values for upward trends and overestimate them for downward ones. This trend damping may occur because (a) people anchor on the last data point and make insufficient adjustment to take the trend into account, (b) they adjust toward the average of the trends they have encountered within the…
Code of Federal Regulations, 2012 CFR
2012-07-01
... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0Pollution Prevention 14... fluids. 1.4This method has been designed to show positive contamination for 5% of representative crude....1Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...
Code of Federal Regulations, 2013 CFR
2013-07-01
... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0Pollution Prevention 14... fluids. 1.4This method has been designed to show positive contamination for 5% of representative crude....1Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...
Code of Federal Regulations, 2014 CFR
2014-07-01
... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0Pollution Prevention 14... fluids. 1.4This method has been designed to show positive contamination for 5% of representative crude....1Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...
Code of Federal Regulations, 2011 CFR
2011-07-01
... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0Pollution Prevention 14... oil contamination in drilling fluids. 1.4This method has been designed to show positive contamination....1Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...
Investigation of systematic effects in atmospheric microthermal probe data
NASA Astrophysics Data System (ADS)
Roper, Daniel S.
1992-12-01
The propagation of electromagnetic radiation through the atmosphere is a crucial aspect of laser target acquisition and surveillance systems and is vital to the effective implementation of some Theater Missile Defense systems. Atmospheric turbulence degrades the image or laser beam quality along an optical path. During the past decade, the U.S. Air Force's Geophysics Directorate of Phillips Laboratory collected high speed differential temperature measurements of the atmospheric temperature structure parameter, C_T^2, and the related index of refraction structure parameter, C_n^2. The stratospheric results show a 1-2 order of magnitude increase in day turbulence values compared to night. Resolving whether these results were real or an artifact of solar contamination is a critical Theater Missile Defense issue. This thesis analyzed the thermosonde data from an experimental program conducted by the Geophysics Directorate in December 1990 and found strong evidence of solar induced artifacts in the daytime thermal probe data. In addition, this thesis performed a theoretical analysis of the thermal response versus altitude of fine wire probes being used in a new thermosonde system under development at the Naval Postgraduate School. Experimental wind tunnel measurements were conducted to validate the analytical predictions.
Hurrell, Andrew M
2008-06-01
The interaction of an incident sound wave with an acoustically impenetrable two-layer barrier is considered. Of particular interest is the presence of several acoustic wave components in the shadow region of this barrier. A finite difference model capable of simulating this geometry is validated by comparison to the analytical solution for an idealized, hard-soft barrier. A panel comprising a high air-content closed cell foam backed with an elastic (metal) back plate is then examined. The insertion loss of this panel was found to exceed the dynamic range of the measurement system and was thus acoustically impenetrable. Experimental results from such a panel are shown to contain artifacts not present in the diffraction solution, when acoustic waves are incident upon the soft surface. A finite difference analysis of this experimental configuration replicates the presence of the additional field components. Furthermore, the simulated results allow the additional components to be identified as arising from the S(0) and A(0) Lamb modes traveling in the elastic plate. These Lamb mode artifacts are not found to be present in the shadow region when the acoustic waves are incident upon the elastic surface.
Geometric correction method for 3d in-line X-ray phase contrast image reconstruction
2014-01-01
Background: A mechanical system with imperfect alignment of the X-ray phase contrast imaging (XPCI) components causes projection data to be misplaced, and thus results in reconstructed computed tomography (CT) slice images that are blurred or show edge artifacts. As a result, the features of the biological microstructures under investigation are destroyed unexpectedly, and the spatial resolution of the XPCI image is decreased. This makes data correction an essential pre-processing step for CT reconstruction in XPCI. Methods: To remove unexpected blurs and edge artifacts, a mathematical model for in-line XPCI is built in this paper by considering the primary geometric parameters, namely a rotation angle and a shift. Optimal geometric parameters are obtained by solving a maximization problem. An iterative approach is employed to solve the maximization problem using a two-step scheme, which performs a composite geometric transformation followed by a linear regression process. After applying the geometric transformation with optimal parameters to the projection data, the standard filtered back-projection algorithm is used to reconstruct CT slice images. Results: Numerical experiments were carried out on both synthetic and real in-line XPCI datasets. Experimental results demonstrate that the proposed method improves CT image quality by removing both blurring and edge artifacts at the same time compared to existing correction methods. Conclusions: The method proposed in this paper provides an effective projection data correction scheme and significantly improves image quality by removing both blurring and edge artifacts at the same time for in-line XPCI. It is easy to implement and can also be extended to other XPCI techniques. PMID:25069768
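A hedged sketch of the correction-then-reconstruction step described above: each projection is resampled with a composite transform (a small in-plane rotation plus a shift) and a slice is then reconstructed with standard filtered back-projection. The optimal parameters are assumed to come from the paper's maximization procedure; here they are placeholder inputs.

```python
# Hedged sketch: apply a composite geometric transform to each 2D projection,
# then reconstruct one detector row with standard filtered back-projection.
import numpy as np
from scipy.ndimage import rotate, shift
from skimage.transform import iradon

def correct_projection(proj2d, rot_deg, shift_px):
    """Apply the composite geometric transform to a single 2D projection."""
    p = rotate(proj2d, rot_deg, reshape=False, order=1, mode='nearest')
    return shift(p, shift_px, order=1, mode='nearest')

def reconstruct_slice(projections, theta, rot_deg, shift_px, row):
    """projections: (n_angles, n_det_rows, n_det_cols); theta: angles in degrees."""
    corrected = np.stack([correct_projection(p, rot_deg, shift_px) for p in projections])
    sinogram = corrected[:, row, :].T            # (n_detector, n_angles) for one slice
    return iradon(sinogram, theta=theta)
```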
Zou, Yuan; Nathan, Viswam; Jafari, Roozbeh
2016-01-01
Electroencephalography (EEG) is the recording of electrical activity produced by the firing of neurons within the brain. These activities can be decoded by signal processing techniques. However, EEG recordings are always contaminated with artifacts which hinder the decoding process. Therefore, identifying and removing artifacts is an important step. Researchers often clean EEG recordings with assistance from independent component analysis (ICA), since it can decompose EEG recordings into a number of artifact-related and event-related potential (ERP)-related independent components. However, existing ICA-based artifact identification strategies mostly restrict themselves to a subset of artifacts, e.g., identifying eye movement artifacts only, and have not been shown to reliably identify artifacts caused by nonbiological origins like high-impedance electrodes. In this paper, we propose an automatic algorithm for the identification of general artifacts. The proposed algorithm consists of two parts: 1) an event-related feature-based clustering algorithm used to identify artifacts which have physiological origins; and 2) the electrode-scalp impedance information employed for identifying nonbiological artifacts. The results on EEG data collected from ten subjects show that our algorithm can effectively detect, separate, and remove both physiological and nonbiological artifacts. Qualitative evaluation of the reconstructed EEG signals demonstrates that our proposed method can effectively enhance the signal quality, especially the quality of ERPs, even for those that barely display ERPs in the raw EEG. The performance results also show that our proposed method can effectively identify artifacts and subsequently enhance the classification accuracies compared to four commonly used automatic artifact removal methods.
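An illustrative sketch (not the authors' algorithm) of the two-part strategy described above: ICA decomposition followed by clustering of a simple event-related feature, plus an impedance-based flag for non-biological components. The feature definition, thresholds, and impedance units are assumptions.

```python
# Hedged sketch: (1) cluster ICA components by an event-related consistency
# feature; (2) flag components dominated by high-impedance channels.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.cluster import KMeans

def identify_artifact_components(eeg, epochs, impedances, imp_thresh=50.0):
    """eeg: (n_samples, n_channels); epochs: equal-length (start, stop) index pairs;
    impedances: per-channel electrode-scalp impedance (assumed in kOhm)."""
    ica = FastICA(n_components=eeg.shape[1], random_state=0)
    sources = ica.fit_transform(eeg)                        # (n_samples, n_components)

    # Event-related feature: ratio of trial-averaged power to mean single-trial
    # power. ERP-like components are consistent across trials (high ratio).
    feats = []
    for k in range(sources.shape[1]):
        trials = np.stack([sources[s:e, k] for s, e in epochs])
        feats.append(np.mean(trials.mean(axis=0) ** 2) / (np.mean(trials ** 2) + 1e-12))
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
        np.array(feats).reshape(-1, 1))
    erp_cluster = labels[int(np.argmax(feats))]             # cluster of most ERP-like IC
    physio_artifacts = np.where(labels != erp_cluster)[0]

    # Non-biological artifacts: components loading heavily on high-impedance channels.
    mixing = np.abs(ica.mixing_)                            # (n_channels, n_components)
    bad_chan = impedances > imp_thresh
    nonbio = np.where(mixing[bad_chan, :].sum(axis=0) > 0.5 * mixing.sum(axis=0))[0]
    return physio_artifacts, nonbio
```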
Zou, Yuan; Nathan, Viswam; Jafari, Roozbeh
2017-01-01
Electroencephalography (EEG) is the recording of electrical activity produced by the firing of neurons within the brain. These activities can be decoded by signal processing techniques. However, EEG recordings are always contaminated with artifacts which hinder the decoding process. Therefore, identifying and removing artifacts is an important step. Researchers often clean EEG recordings with assistance from Independent Component Analysis (ICA), since it can decompose EEG recordings into a number of artifact-related and event related potential (ERP)-related independent components (ICs). However, existing ICA-based artifact identification strategies mostly restrict themselves to a subset of artifacts, e.g. identifying eye movement artifacts only, and have not been shown to reliably identify artifacts caused by non-biological origins like high-impedance electrodes. In this paper, we propose an automatic algorithm for the identification of general artifacts. The proposed algorithm consists of two parts: 1) an event-related feature based clustering algorithm used to identify artifacts which have physiological origins and 2) the electrode-scalp impedance information employed for identifying non-biological artifacts. The results on EEG data collected from 10 subjects show that our algorithm can effectively detect, separate, and remove both physiological and non-biological artifacts. Qualitative evaluation of the reconstructed EEG signals demonstrates that our proposed method can effectively enhance the signal quality, especially the quality of ERPs, even for those that barely display ERPs in the raw EEG. The performance results also show that our proposed method can effectively identify artifacts and subsequently enhance the classification accuracies compared to four commonly used automatic artifact removal methods. PMID:25415992
Kiser, Patti K; Löhr, Christiane V; Meritet, Danielle; Spagnoli, Sean T; Milovancev, Milan; Russell, Duncan S
2018-05-01
Although quantitative assessment of margins is recommended for describing excision of cutaneous malignancies, there is poor understanding of limitations associated with this technique. We described and quantified histologic artifacts in inked margins and determined the association between artifacts and variance in histologic tumor-free margin (HTFM) measurements based on a novel grading scheme applied to 50 sections of normal canine skin and 56 radial margins taken from 15 different canine mast cell tumors (MCTs). Three broad categories of artifact were 1) tissue deformation at inked edges, 2) ink-associated artifacts, and 3) sectioning-associated artifacts. The most common artifacts in MCT margins were ink-associated artifacts, specifically ink absent from an edge (mean prevalence: 50%) and inappropriate ink coloring (mean: 45%). The prevalence of other artifacts in MCT skin was 4-50%. In MCT margins, frequency-adjusted kappa statistics found fair or better inter-rater reliability for 9 of 10 artifacts; intra-rater reliability was moderate or better in 9 of 10 artifacts. Digital HTFM measurements by 5 blinded pathologists had a median standard deviation (SD) of 1.9 mm (interquartile range: 0.8-3.6 mm; range: 0-6.2 mm). Intraclass correlation coefficients demonstrated good inter-pathologist reliability in HTFM measurement (κ = 0.81). Spearman rank correlation coefficients found negligible correlation between artifacts and HTFM SDs ( r ≤ 0.3). These data confirm that although histologic artifacts commonly occur in inked margin specimens, artifacts are not meaningfully associated with variation in HTFM measurements. Investigators can use the grading scheme presented herein to identify artifacts associated with tissue processing.
SU-E-I-38: Improved Metal Artifact Correction Using Adaptive Dual Energy Calibration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, X; Elder, E; Roper, J
2015-06-15
Purpose: The empirical dual energy calibration (EDEC) method corrects for beam-hardening artifacts, but shows limited performance on metal artifact correction. In this work, we propose an adaptive dual energy calibration (ADEC) method to correct for metal artifacts. Methods: The empirical dual energy calibration (EDEC) method corrects for beam-hardening artifacts, but shows limited performance on metal artifact correction. In this work, we propose an adaptive dual energy calibration (ADEC) method to correct for metal artifacts. Results: Highly attenuating copper rods cause severe streaking artifacts on standard CT images. EDEC improves the image quality, but cannot eliminate the streaking artifacts. Compared to EDEC, the proposed ADEC method further reduces the streaking resulting from metallic inserts and beam-hardening effects and obtains material decomposition images with significantly improved accuracy. Conclusion: We propose an adaptive dual energy calibration method to correct for metal artifacts. ADEC is evaluated with the Shepp-Logan phantom, and shows superior metal artifact correction performance. In the future, we will further evaluate the performance of the proposed method with phantom and patient data.
Takayanagi, Tomoya; Arai, Takehiro; Amanuma, Makoto; Sano, Tomonari; Ichiba, Masato; Ishizaka, Kazumasa; Sekine, Takako; Matsutani, Hideyuki; Morita, Hitomi; Takase, Shinichi
2017-01-01
Coronary computed tomography angiography (CCTA) in patients with a pacemaker suffers from metallic lead-induced artifacts, which often interfere with accurate assessment of coronary luminal stenosis. The purpose of this study was to assess the frequency of lead-induced artifacts and the artifact-suppression effect of the single energy metal artifact reduction (SEMAR) technique. Forty-one patients with a dual-chamber pacemaker were evaluated using a 320 multi-detector row CT (MDCT). Among them, 22 patients with motion-free full data reconstruction images were the final candidates. Images with and without the SEMAR technique were subjectively compared, and the degree of metallic artifact was assessed. On images without SEMAR, severe metallic artifacts were often observed in the right coronary artery (#1, #2, #3) and distal anterior descending branch (#8). These artifacts were effectively suppressed by SEMAR, and the luminal accessibility was significantly improved in #3 and #8. While pacemaker leads often cause metal-induced artifacts, the SEMAR technique reduced the artifacts and significantly improved the accessibility of the coronary lumen in #3 and #8.
Unknown Gases: Student-Designed Experiments in the Introductory Laboratory.
ERIC Educational Resources Information Center
Hanson, John; Hoyt, Tim
2002-01-01
Introductory students design and carry-out experimental procedures to determine the identity of three unknown gases from a list of eight possibilities: air, nitrogen, oxygen, argon, carbon dioxide, helium, methane, and hydrogen. Students are excited and motivated by the opportunity to come up with their own experimental approach to solving a…
A Multimethod Approach for Investigating Algal Toxicity of Platinum Nanoparticles.
Sørensen, Sara N; Engelbrekt, Christian; Lützhøft, Hans-Christian H; Jiménez-Lamana, Javier; Noori, Jafar S; Alatraktchi, Fatima A; Delgado, Cristina G; Slaveykova, Vera I; Baun, Anders
2016-10-04
The ecotoxicity of platinum nanoparticles (PtNPs), which are widely used in, for example, automotive catalytic converters, is largely unknown. This study employs various characterization techniques and toxicity end points to investigate PtNP toxicity toward the green microalgae Pseudokirchneriella subcapitata and Chlamydomonas reinhardtii. Growth rate inhibition occurred in standard ISO tests (EC50 values of 15-200 mg Pt/L), but also in a double-vial setup, separating cells from PtNPs, thus demonstrating shading as an important artifact for PtNP toxicity. Negligible membrane damage, but substantial oxidative stress, was detected at 0.1-80 mg Pt/L in both algal species using flow cytometry. PtNPs caused growth rate inhibition and oxidative stress in P. subcapitata, beyond what was accounted for by dissolved Pt, indicating NP-specific toxicity of PtNPs. Overall, P. subcapitata was found to be more sensitive toward PtNPs and higher body burdens were measured in this species, possibly due to a favored binding of Pt to the polysaccharide-rich cell wall of this algal species. This study highlights the importance of using multimethod approaches in nanoecotoxicological studies to elucidate toxicity mechanisms, the influence of NP interactions with media/organisms, and ultimately to identify artifacts and appropriate end points for NP-ecotoxicity testing.
3D artifact for calibrating kinematic parameters of articulated arm coordinate measuring machines
NASA Astrophysics Data System (ADS)
Zhao, Huining; Yu, Liandong; Xia, Haojie; Li, Weishi; Jiang, Yizhou; Jia, Huakun
2018-06-01
In this paper, a 3D artifact is proposed to calibrate the kinematic parameters of articulated arm coordinate measuring machines (AACMMs). The artifact is composed of 14 reference points with three different heights, which provides 91 different reference lengths, and a method is proposed to calibrate the artifact with laser tracker multi-stations. Therefore, the kinematic parameters of an AACMM can be calibrated in one setup of the proposed artifact, instead of having to adjust the 1D or 2D artifacts to different positions and orientations in the existing methods. As a result, it saves time to calibrate the AACMM with the proposed artifact in comparison with the traditional 1D or 2D artifacts. The performance of the AACMM calibrated with the proposed artifact is verified with a 600.003 mm gauge block. The result shows that the measurement accuracy of the AACMM is improved effectively through calibration with the proposed artifact.
An EEG Data Investigation Using Only Artifacts
2017-02-22
mediation approach, called artifact separation, was developed to enable the consumer of the EEG data to decide how to handle artifacts. The current ... contaminated. Having the spectral results flagged as containing an artifact means that the consumer of the data has the freedom to decide how to ...
Mesoscale hybrid calibration artifact
Tran, Hy D.; Claudet, Andre A.; Oliver, Andrew D.
2010-09-07
A mesoscale calibration artifact, also called a hybrid artifact, suitable for hybrid dimensional measurement, and the method for making the artifact. The hybrid artifact has structural characteristics that make it suitable for dimensional measurement in both vision-based systems and touch-probe-based systems. The hybrid artifact employs the intersection of bulk-micromachined planes to fabricate edges that are sharp to the nanometer level and intersecting planes with crystal-lattice-defined angles.
Ripple artifact reduction using slice overlap in slice encoding for metal artifact correction.
den Harder, J Chiel; van Yperen, Gert H; Blume, Ulrike A; Bos, Clemens
2015-01-01
Multispectral imaging (MSI) significantly reduces metal artifacts. Yet, especially in techniques that use gradient selection, such as slice encoding for metal artifact correction (SEMAC), a residual ripple artifact may be prominent. Here, an analysis is presented of the ripple artifact and of slice overlap as an approach to reduce the artifact. The ripple artifact was analyzed theoretically to clarify its cause. Slice overlap, conceptually similar to spectral bin overlap in multi-acquisition with variable resonances image combination (MAVRIC), was achieved by reducing the selection gradient and, thus, increasing the slice profile width. Time domain simulations and phantom experiments were performed to validate the analyses and proposed solution. Discontinuities between slices are aggravated by signal displacement in the frequency encoding direction in areas with deviating B0. Specifically, it was demonstrated that ripple artifacts appear only where B0 varies both in-plane and through-plane. Simulations and phantom studies of metal implants confirmed the efficacy of slice overlap to reduce the artifact. The ripple artifact is an important limitation of gradient selection based MSI techniques, and can be understood using the presented simulations. At a scan-time penalty, slice overlap effectively addressed the artifact, thereby improving image quality near metal implants. © 2014 Wiley Periodicals, Inc.
Variable Grid Traveltime Tomography for Near-surface Seismic Imaging
NASA Astrophysics Data System (ADS)
Cai, A.; Zhang, J.
2017-12-01
We present a new traveltime tomography algorithm for imaging the subsurface with grids that automatically vary according to geological structure. Nonlinear traveltime tomography with Tikhonov regularization, solved with the conjugate gradient method, is a conventional approach to near-surface imaging. However, model regularization on a regular, evenly spaced grid assumes uniform resolution. From a geophysical point of view, long-wavelength, large-scale structures can be reliably resolved, whereas details along geological boundaries are difficult to resolve. Therefore, we solve a traveltime tomography problem that automatically identifies large-scale structures and aggregates grid cells within those structures for inversion. As a result, the number of velocity unknowns is reduced significantly, and the inversion concentrates on resolving small-scale structures and the boundaries of large-scale structures. The approach is demonstrated by tests on both synthetic and field data. One synthetic model is a buried basalt model with one horizontal layer. Using the variable grid traveltime tomography, the resulting model is more accurate in the top-layer velocity and the basalt blocks, while using fewer grid cells. The field data were collected in an oil field in China. The survey was performed in an area where the subsurface structures were predominantly layered. The data set includes 476 shots with a 10 meter spacing and 1735 receivers with a 10 meter spacing. The first-arrival traveltimes of the seismograms were picked for tomography. The reciprocal errors of most shots are between 2 ms and 6 ms. Conventional tomography produces fluctuations in the layers and some artifacts in the velocity model. In comparison, the new method with a proper threshold provides a blocky model with a resolved flat layer and fewer artifacts. Moreover, the number of grid cells is reduced from 205,656 to 4,930, and the inversion achieves higher resolution due to fewer unknowns and relatively fine grids in small structures. The variable grid traveltime tomography provides an alternative imaging solution for blocky structures in the subsurface and builds a good starting model for waveform inversion and statics.
Hacke, Uwe G; Venturas, Martin D; MacKinnon, Evan D; Jacobsen, Anna L; Sperry, John S; Pratt, R Brandon
2015-01-01
The standard centrifuge method has been frequently used to measure vulnerability to xylem cavitation. This method has recently been questioned. It was hypothesized that open vessels lead to exponential vulnerability curves, which were thought to be indicative of measurement artifact. We tested this hypothesis in stems of olive (Olea europea) because its long vessels were recently claimed to produce a centrifuge artifact. We evaluated three predictions that followed from the open vessel artifact hypothesis: shorter stems, with more open vessels, would be more vulnerable than longer stems; standard centrifuge-based curves would be more vulnerable than dehydration-based curves; and open vessels would cause an exponential shape of centrifuge-based curves. Experimental evidence did not support these predictions. Centrifuge curves did not vary when the proportion of open vessels was altered. Centrifuge and dehydration curves were similar. At highly negative xylem pressure, centrifuge-based curves slightly overestimated vulnerability compared to the dehydration curve. This divergence was eliminated by centrifuging each stem only once. The standard centrifuge method produced accurate curves of samples containing open vessels, supporting the validity of this technique and confirming its utility in understanding plant hydraulics. Seven recommendations for avoiding artefacts and standardizing vulnerability curve methodology are provided. © 2014 The Authors. New Phytologist © 2014 New Phytologist Trust.
An Additive Manufacturing Test Artifact
Moylan, Shawn; Slotwinski, John; Cooke, April; Jurrens, Kevin; Donmez, M Alkan
2014-01-01
A test artifact, intended for standardization, is proposed for the purpose of evaluating the performance of additive manufacturing (AM) systems. A thorough analysis of previously proposed AM test artifacts as well as experience with machining test artifacts have inspired the design of the proposed test artifact. This new artifact is designed to provide a characterization of the capabilities and limitations of an AM system, as well as to allow system improvement by linking specific errors measured in the test artifact to specific sources in the AM system. The proposed test artifact has been built in multiple materials using multiple AM technologies. The results of several of the builds are discussed, demonstrating how the measurement results can be used to characterize and improve a specific AM system. PMID:26601039
Classification and simulation of stereoscopic artifacts in mobile 3DTV content
NASA Astrophysics Data System (ADS)
Boev, Atanas; Hollosi, Danilo; Gotchev, Atanas; Egiazarian, Karen
2009-02-01
We identify, categorize, and simulate artifacts which might occur during the delivery of stereoscopic video to mobile devices. We consider the stages of the 3D video delivery dataflow: content creation, conversion to the desired format (multiview or source-plus-depth), coding/decoding, transmission, and visualization on a 3D display. Human 3D vision works by assessing various depth cues - accommodation, binocular depth cues, pictorial cues, and motion parallax. As a consequence, any artifact which modifies these cues impairs the quality of a 3D scene. The perceptibility of each artifact can be estimated through subjective tests. The material for such tests needs to contain various artifacts with different amounts of impairment. We present a system for simulation of these artifacts. The artifacts are organized in groups with similar origins, and each group is simulated by a block in a simulation channel. The channel introduces the following groups of artifacts: sensor limitations, geometric distortions caused by camera optics, spatial and temporal misalignments between video channels, spatial and temporal artifacts caused by coding, transmission losses, and visualization artifacts. For the case of the source-plus-depth representation, artifacts caused by format conversion are added as well.
A generic EEG artifact removal algorithm based on the multi-channel Wiener filter
NASA Astrophysics Data System (ADS)
Somers, Ben; Francart, Tom; Bertrand, Alexander
2018-06-01
Objective. The electroencephalogram (EEG) is an essential neuro-monitoring tool for both clinical and research purposes, but is susceptible to a wide variety of undesired artifacts. Removal of these artifacts is often done using blind source separation techniques, relying on a purely data-driven transformation, which may sometimes fail to sufficiently isolate artifacts in only one or a few components. Furthermore, some algorithms perform well for specific artifacts, but not for others. In this paper, we aim to develop a generic EEG artifact removal algorithm, which allows the user to annotate a few artifact segments in the EEG recordings to inform the algorithm. Approach. We propose an algorithm based on the multi-channel Wiener filter (MWF), in which the artifact covariance matrix is replaced by a low-rank approximation based on the generalized eigenvalue decomposition. The algorithm is validated using both hybrid and real EEG data, and is compared to other algorithms frequently used for artifact removal. Main results. The MWF-based algorithm successfully removes a wide variety of artifacts with better performance than current state-of-the-art methods. Significance. Current EEG artifact removal techniques often have limited applicability due to their specificity to one kind of artifact, their complexity, or simply because they are too ‘blind’. This paper demonstrates a fast, robust and generic algorithm for removal of EEG artifacts of various types, i.e. those that were annotated as unwanted by the user.
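As a rough illustration of the approach, the sketch below implements a GEVD-based multi-channel Wiener filter of the kind the abstract describes, assuming the user-annotated artifact segments are given as a boolean mask. The covariance estimation, rank-selection rule, and variable names are assumptions for illustration, not the authors' published code.

```python
# Hedged sketch of a GEVD-based multi-channel Wiener filter (MWF) for EEG
# artifact removal. Assumptions: artifact segments are marked by the user in
# `artifact_mask`; the rank is chosen where artifact power exceeds clean power.
import numpy as np
from scipy.linalg import eigh

def mwf_remove(eeg, artifact_mask, rank=None):
    """eeg: channels x samples; artifact_mask: boolean array over samples."""
    Ryy = np.cov(eeg[:, artifact_mask])          # covariance during artifact segments
    Rvv = np.cov(eeg[:, ~artifact_mask])         # covariance of clean EEG
    w, V = eigh(Ryy, Rvv)                        # GEVD: Ryy v = w Rvv v
    order = np.argsort(w)[::-1]
    w, V = w[order], V[:, order]
    if rank is None:
        rank = int(np.sum(w > 1.0))              # directions where artifact > EEG power
    d = np.zeros_like(w)
    d[:rank] = 1.0 - 1.0 / w[:rank]              # Wiener gains in the GEVD basis
    W = V @ np.diag(d) @ np.linalg.inv(V)        # low-rank MWF in channel space
    artifact_estimate = W.T @ eeg
    return eeg - artifact_estimate
```

In this form the filter reduces to the identity when no direction carries more power during the annotated segments than during clean EEG.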
Gurney-Champion, Oliver J; Bruins Slot, Thijs; Lens, Eelco; van der Horst, Astrid; Klaassen, Remy; van Laarhoven, Hanneke W M; van Tienhoven, Geertjan; van Hooft, Jeanin E; Nederveen, Aart J; Bel, Arjan
2016-10-01
Biliary stents may cause susceptibility artifacts, gradient-induced artifacts, and radio frequency (RF) induced artifacts on magnetic resonance images, which can hinder accurate target volume delineation in radiotherapy. In this study, the authors investigated and quantified the magnitude of these artifacts for stents of different materials. Eight biliary stents made of nitinol, platinum-cored nitinol, stainless steel, or polyethylene from seven vendors, with different lengths (57-98 mm) and diameters (3.0-11.7 mm), were placed in a phantom. To quantify the susceptibility artifacts sequence-independently, ΔB0-maps and T2*-maps were acquired at 1.5 and 3 T. To study the effect of the gradient-induced artifacts at 3 T, signal decay in images obtained with maximum readout gradient-induced artifacts was compared to signal decay in reference scans. To quantify the RF induced artifacts at 3 T, B1-maps were acquired. Finally, ΔB0-maps and T2*-maps were acquired at 3 T in two pancreatic cancer patients who had received platinum-cored nitinol biliary stents. Outside the stent, susceptibility artifacts dominated the other artifacts. The stainless steel stent produced the largest susceptibility artifacts. The other stents caused decreased T2* up to 5.1 mm (1.5 T) and 8.5 mm (3 T) from the edge of the stent. For sequences with a higher bandwidth per voxel (1.5 T: BWvox > 275 Hz/voxel; 3 T: BWvox > 500 Hz/voxel), the B0-related susceptibility artifacts were negligible (<0.2 voxels). The polyethylene stent showed no artifacts. In vivo, the changes in B0 and T2* induced by the stent were larger than typical variations in B0 and T2* induced by anatomy when the stent was at an angle of 30° with the main magnetic field. Susceptibility artifacts dominated the other artifacts. The magnitudes of the susceptibility artifacts were determined sequence-independently. This method makes it possible to include additional safety margins that ensure target irradiation.
Fast digital zooming system using directionally adaptive image interpolation and restoration.
Kang, Wonseok; Jeon, Jaehwan; Yu, Soohwan; Paik, Joonki
2014-01-01
This paper presents a fast digital zooming system for mobile consumer cameras using directionally adaptive image interpolation and restoration methods. The proposed interpolation algorithm performs edge refinement along the initially estimated edge orientation using directionally steerable filters. Either the directionally weighted linear or the adaptive cubic-spline interpolation filter is then selectively used, according to the refined edge orientation, to remove jagged artifacts in slanted edge regions. A novel image restoration algorithm is also presented for removing the blurring artifacts caused by the linear or cubic-spline interpolation, using a directionally adaptive truncated constrained least squares (TCLS) filter. Both the proposed steerable filter-based interpolation and the TCLS-based restoration filters have a finite impulse response (FIR) structure for real-time processing in an image signal processing (ISP) chain. Experimental results show that the proposed digital zooming system provides high-quality magnified images with a fast, FIR filter-based computational structure.
Synthetic biology between technoscience and thing knowledge.
Gelfert, Axel
2013-06-01
Synthetic biology presents a challenge to traditional accounts of biology: Whereas traditional biology emphasizes the evolvability, variability, and heterogeneity of living organisms, synthetic biology envisions a future of homogeneous, humanly engineered biological systems that may be combined in modular fashion. The present paper approaches this challenge from the perspective of the epistemology of technoscience. In particular, it is argued that synthetic-biological artifacts lend themselves to an analysis in terms of what has been called 'thing knowledge'. As such, they should neither be regarded as the simple outcome of applying theoretical knowledge and engineering principles to specific technological problems, nor should they be treated as mere sources of new evidence in the general pursuit of scientific understanding. Instead, synthetic-biological artifacts should be viewed as partly autonomous research objects which, qua their material-biological constitution, embody knowledge about the natural world-knowledge that, in turn, can be accessed via continuous experimental interrogation. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Covarrubias, Ernesto E.; Eshraghi, Mohsen
2018-03-01
Aerospace, automotive, and medical industries use selective laser melting (SLM) to produce complex parts through solidifying successive layers of powder. This additive manufacturing technique has many advantages, but one of the biggest challenges facing this process is the resulting surface quality of the as-built parts. The purpose of this research was to study the surface properties of Inconel 718 alloys fabricated by SLM. The effect of build angle on the surface properties of as-built parts was investigated. Two sets of sample geometries including cube and rectangular artifacts were considered in the study. It was found that, for angles between 15° and 75°, theoretical calculations based on the "stair-step" effect were consistent with the experimental results. Downskin surfaces showed higher average roughness values compared to the upskin surfaces. No significant difference was found between the average roughness values measured from cube and rectangular test artifacts.
Kong, Gang; Dai, Dao-Qing; Zou, Lu-Min
2008-07-01
In order to remove the artifacts of peripheral digital subtraction angiography (DSA), an affine transformation-based automatic image registration algorithm is introduced here. The process is as follows: first, rectangular feature templates are constructed, centered on Harris corners extracted from the mask image, and the motion vectors of these central feature points are estimated by template matching with maximum histogram energy as the similarity measure. The optimal parameters of the affine transformation are then calculated using singular value decomposition (SVD). Finally, bilinear intensity interpolation is applied to the mask according to the estimated affine transformation. More than 30 peripheral DSA registrations were performed with the presented algorithm; motion artifacts were removed with sub-pixel precision, and the computation time is low enough to satisfy clinical requirements. Experimental results show the efficiency and robustness of the algorithm.
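The registration step lends itself to a compact least-squares formulation. The sketch below is illustrative only (it uses a direct least-squares solve rather than the SVD bookkeeping of the paper, and the point sets are synthetic assumptions): an affine transform is estimated from matched feature points, of the kind the Harris-corner template matching would provide, and the mask is warped with bilinear interpolation.

```python
# Hedged sketch: estimate a 2-D affine transform from matched points by least
# squares and warp an image with bilinear interpolation. Synthetic points and
# parameter values are assumptions for the example.
import numpy as np
from scipy import ndimage

def estimate_affine(src, dst):
    """src, dst: (N, 2) matched (row, col) points; returns A (2x2) and t (2,)."""
    X = np.hstack([src, np.ones((len(src), 1))])       # [r, c, 1]
    params, *_ = np.linalg.lstsq(X, dst, rcond=None)   # solve X @ P ~= dst
    return params[:2].T, params[2]

def warp_affine(image, A, t):
    Ainv = np.linalg.inv(A)                            # scipy maps output -> input coords
    return ndimage.affine_transform(image, Ainv, offset=-Ainv @ t, order=1)

rng = np.random.default_rng(0)
src = rng.uniform(10, 200, size=(30, 2))               # Harris-corner surrogates
true_A = np.array([[1.02, 0.01], [-0.01, 0.99]])
true_t = np.array([2.5, -1.0])
dst = src @ true_A.T + true_t                          # matched points after motion
A, t = estimate_affine(src, dst)
print(np.round(A, 3), np.round(t, 2))                  # recovers the simulated motion
```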
Pavlov, A N; Pavlova, O N; Abdurashitov, A S; Sindeeva, O A; Semyachkina-Glushkovskaya, O V; Kurths, J
2018-01-01
The scaling properties of complex processes may be highly influenced by the presence of various artifacts in experimental recordings. Their removal produces changes in the singularity spectra and the Hölder exponents as compared with the original artifacts-free data, and these changes are significantly different for positively correlated and anti-correlated signals. While signals with power-law correlations are nearly insensitive to the loss of significant parts of data, the removal of fragments of anti-correlated signals is more crucial for further data analysis. In this work, we study the ability of characterizing scaling features of chaotic and stochastic processes with distinct correlation properties using a wavelet-based multifractal analysis, and discuss differences between the effect of missed data for synchronous and asynchronous oscillatory regimes. We show that even an extreme data loss allows characterizing physiological processes such as the cerebral blood flow dynamics.
End-to-end learning for digital hologram reconstruction
NASA Astrophysics Data System (ADS)
Xu, Zhimin; Zuo, Si; Lam, Edmund Y.
2018-02-01
Digital holography is a well-known method to perform three-dimensional imaging by recording the light wavefront information originating from the object. Not only the intensity but also the phase distribution of the wavefront can then be computed from the recorded hologram in the numerical reconstruction process. However, reconstructions via traditional methods suffer from various artifacts caused by the twin image, the zero-order term, and image sensor noise. Here we demonstrate that an end-to-end deep neural network (DNN) can learn to perform both intensity and phase recovery directly from an intensity-only hologram. We experimentally show that the artifacts can be effectively suppressed. Meanwhile, our network does not need any preprocessing for initialization and is comparably fast to train and test relative to a recently published learning-based method. In addition, we validate that a further performance improvement can be achieved by introducing a sparsity prior.
Ghost artifact cancellation using phased array processing.
Kellman, P; McVeigh, E R
2001-08-01
In this article, a method for phased array combining is formulated which may be used to cancel ghosts caused by a variety of distortion mechanisms, including space variant distortions such as local flow or off-resonance. This method is based on a constrained optimization, which optimizes SNR subject to the constraint of nulling ghost artifacts at known locations. The resultant technique is similar to the method known as sensitivity encoding (SENSE) used for accelerated imaging; however, in this formulation it is applied to full field-of-view (FOV) images. The method is applied to multishot EPI with noninterleaved phase encode acquisition. A number of benefits, as compared to the conventional interleaved approach, are reduced distortion due to off-resonance, in-plane flow, and EPI delay misalignment, as well as eliminating the need for echo-shifting. Experimental results demonstrate the cancellation for both phantom as well as cardiac imaging examples.
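One way to write the constrained combining described above, with notation assumed here rather than taken from the article: let s_0 be the coil-sensitivity vector at the true pixel location, s_1, ..., s_K the sensitivity vectors at the K known ghost locations, R_n the noise covariance, and e_1 the first unit vector. A sketch of the SNR-optimal, ghost-nulling weights is then:

```latex
% Hedged sketch of the constrained SNR-optimal combiner (notation assumed).
\begin{aligned}
\hat{\mathbf{w}} &= \arg\max_{\mathbf{w}}\;
\frac{\lvert \mathbf{w}^{H}\mathbf{s}_{0}\rvert^{2}}{\mathbf{w}^{H}\mathbf{R}_{n}\mathbf{w}}
\quad\text{subject to}\quad \mathbf{w}^{H}\mathbf{s}_{k}=0,\; k=1,\dots,K,\\[4pt]
\hat{\mathbf{w}} &\propto \mathbf{R}_{n}^{-1}\,\mathbf{S}\,
\bigl(\mathbf{S}^{H}\mathbf{R}_{n}^{-1}\mathbf{S}\bigr)^{-1}\mathbf{e}_{1},
\qquad \mathbf{S}=[\,\mathbf{s}_{0},\mathbf{s}_{1},\dots,\mathbf{s}_{K}\,].
\end{aligned}
```

This is the same linearly constrained form that underlies SENSE-style unfolding; here the constraints are placed at the ghost locations of a full-FOV image rather than at aliased replicates.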
Bain, Paul G; Kashima, Yoshihisa; Haslam, Nick
2006-08-01
Beliefs that may underlie the importance of human values were investigated in 4 studies, drawing on research that distinguishes natural-kind (natural), nominal-kind (conventional), and artifact (functional) beliefs. Values were best characterized by artifact and nominal-kind beliefs, as well as a natural-kind belief specific to the social domain, "human nature" (Studies 1 and 2). The extent to which values were considered central to human nature was associated with value importance in both Australia and Japan (Study 2), and experimentally manipulating human nature beliefs influenced value importance (Study 3). Beyond their association with importance, human nature beliefs predicted participants' reactions to value trade-offs (Study 1) and to value-laden rhetorical statements (Study 4). Human nature beliefs therefore play a central role in the psychology of values.
A Novel Method for Block Size Forensics Based on Morphological Operations
NASA Astrophysics Data System (ADS)
Luo, Weiqi; Huang, Jiwu; Qiu, Guoping
Passive forensics analysis aims to find out how multimedia data is acquired and processed without relying on pre-embedded or pre-registered information. Since most existing compression schemes for digital images are based on block processing, one of the fundamental steps for subsequent forensics analysis is to detect the presence of block artifacts and estimate the block size for a given image. In this paper, we propose a novel method for blind block size estimation. A 2×2 cross-differential filter is first applied to detect all possible block artifact boundaries; morphological operations are then used to remove the boundary effects caused by the edges of the actual image contents; and finally maximum-likelihood estimation (MLE) is employed to estimate the block size. The experimental results, evaluated on over 1300 natural images, show the effectiveness of our proposed method. Compared with an existing gradient-based detection method, our method achieves an accuracy improvement of over 39% on average.
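A toy version of the pipeline can be sketched in a few lines; the threshold, the vertical structuring element, and the simple periodicity score below stand in for the paper's morphological design and MLE step, and are assumptions rather than the published method.

```python
# Hedged sketch of blind block-size estimation: a 2x2 cross-difference
# highlights blocking boundaries, a morphological opening keeps vertically
# elongated responses (candidate block boundaries) and drops isolated content
# edges, and a simple periodicity score replaces the paper's MLE step.
import numpy as np
from scipy import ndimage

def estimate_block_size(img, max_size=32):
    img = img.astype(float)
    # |I(x,y) + I(x+1,y+1) - I(x,y+1) - I(x+1,y)|
    d = np.abs(img[:-1, :-1] + img[1:, 1:] - img[:-1, 1:] - img[1:, :-1])
    edges = d > d.mean() + d.std()
    edges = ndimage.binary_opening(edges, structure=np.ones((5, 1)))
    col_profile = edges.sum(axis=0).astype(float)
    best, best_score = 2, -np.inf
    for b in range(2, max_size + 1):
        on_grid = col_profile[b - 1::b].sum()        # energy on candidate boundary columns
        score = b * on_grid / (col_profile.sum() + 1e-9)
        if score > best_score:
            best, best_score = b, score
    return best

# Assumed usage: estimate_block_size(blocky_image) -> e.g. 8 for JPEG-style blocking.
```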
Comparison of analytic and iterative digital tomosynthesis reconstructions for thin slab objects
NASA Astrophysics Data System (ADS)
Yun, J.; Kim, D. W.; Ha, S.; Kim, H. K.
2017-11-01
For digital x-ray tomosynthesis of thin slab objects, we compare the tomographic imaging performance obtained from the filtered backprojection (FBP) and simultaneous algebraic reconstruction technique (SART) algorithms. The imaging performance includes the in-plane modulation-transfer function (MTF), the signal difference-to-noise ratio (SDNR), and the out-of-plane blur artifact or artifact-spread function (ASF). The MTF is measured using a thin tungsten-wire phantom, and the SDNR and the ASF are measured using a thin aluminum-disc phantom embedded in a plastic cylinder. The FBP shows better MTF performance than the SART. In contrast, the SART outperforms the FBP with regard to the SDNR and ASF performances. Detailed experimental results and their analysis are described in this paper. For more effective use of the digital tomosynthesis technique, this study suggests using a reconstruction algorithm suited to application-specific purposes.
Four-thousand-year-old gold artifacts from the Lake Titicaca basin, southern Peru.
Aldenderfer, Mark; Craig, Nathan M; Speakman, Robert J; Popelka-Filcoff, Rachel
2008-04-01
Artifacts of cold-hammered native gold have been discovered in a secure and undisturbed Terminal Archaic burial context at Jiskairumoko, a multicomponent Late Archaic-Early Formative period site in the southwestern Lake Titicaca basin, Peru. The burial dates to 3776 to 3690 carbon-14 years before the present (2155 to 1936 calendar years B.C.), making this the earliest worked gold recovered to date not only from the Andes, but from the Americas as well. This discovery lends support to the hypothesis that the earliest metalworking in the Andes was experimentation with native gold. The presence of gold in a society of low-level food producers undergoing social and economic transformations coincident with the onset of sedentary life is an indicator of possible early social inequality and aggrandizing behavior and further shows that hereditary elites and a societal capacity to create significant agricultural surpluses are not requisite for the emergence of metalworking traditions.
Zhang, Yuanke; Lu, Hongbing; Rong, Junyan; Meng, Jing; Shang, Junliang; Ren, Pinghong; Zhang, Junying
2017-09-01
The low-dose CT (LDCT) technique can reduce the x-ray radiation exposure to patients at the cost of degraded images with severe noise and artifacts. Non-local means (NLM) filtering has shown its potential for improving LDCT image quality. However, most current NLM-based approaches employ a weighted average operation directly on all neighbor pixels, with a fixed filtering parameter throughout the NLM filtering process, ignoring the non-stationary noise nature of LDCT images. In this paper, an adaptive NLM filtering scheme on local principal-component neighborhoods (PC-NLM) is proposed for structure-preserving noise/artifact reduction in LDCT images. Instead of using neighboring patches directly, in the PC-NLM scheme principal component analysis (PCA) is first applied to the local neighboring patches of the target patch to decompose them into uncorrelated principal components (PCs); NLM filtering is then used to regularize each PC of the target patch, and finally the regularized components are transformed back to the image domain to obtain the target patch. In particular, in the NLM scheme the filtering parameter is estimated adaptively from the local noise level of the neighborhood as well as the signal-to-noise ratio (SNR) of the corresponding PC, which guarantees "weaker" NLM filtering on PCs with higher SNR and "stronger" filtering on PCs with lower SNR. The PC-NLM procedure is performed iteratively several times for better removal of the noise and artifacts, and an adaptive iteration strategy is developed to reduce the computational load by determining whether a patch should be processed in the next round of PC-NLM filtering. The effectiveness of the presented PC-NLM algorithm is validated by experimental phantom studies and clinical studies. The results show that it can achieve a promising gain over some state-of-the-art methods in terms of artifact suppression and structure preservation. With the use of PCA on local neighborhoods to extract principal structural components, together with adaptive NLM filtering on the PCs of the target patch using a filtering parameter estimated from the local noise level and the corresponding SNR, the proposed PC-NLM method shows its efficacy in preserving fine anatomical structures and suppressing noise/artifacts in LDCT images. © 2017 American Association of Physicists in Medicine.
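The per-patch core of the idea can be sketched as follows. This is a heavily simplified illustration: a Wiener-style shrinkage of each principal component, driven by a local noise estimate, stands in for the per-component NLM filtering of the paper, and all names and the noise model are assumptions.

```python
# Hedged, simplified sketch of filtering one patch in a PCA basis built from
# its local neighborhood. A Wiener-like per-component attenuation replaces the
# paper's per-component NLM step; sigma_noise is an assumed local noise level.
import numpy as np

def filter_patch_pca(target, neighbors, sigma_noise):
    """target: (p,) flattened patch; neighbors: (M, p) nearby flattened patches."""
    mean = neighbors.mean(axis=0)
    X = neighbors - mean
    _, s, Vt = np.linalg.svd(X, full_matrices=False)     # PCA of the local neighborhood
    var = s**2 / max(len(neighbors) - 1, 1)              # variance captured by each PC
    coeffs = Vt @ (target - mean)                        # target patch in the PC basis
    snr = np.maximum(var - sigma_noise**2, 0.0) / (sigma_noise**2 + 1e-12)
    shrink = snr / (1.0 + snr)                           # weaker filtering for high-SNR PCs
    return mean + Vt.T @ (shrink * coeffs)
```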
Use of cognitive artifacts in chemistry learning
NASA Astrophysics Data System (ADS)
Yengin, Ilker
In everyday life, we interact with cognitive artifacts to receive and/or manipulate information so as to alter our thinking processes. CHEM/TEAC 869Q is a distance course that includes extensive explicit instruction in the use of a cognitive artifact. This study investigates issues related to the design of that online artifact. In order to understand the design implications and how cognitive artifacts contribute to students' thinking and learning, a qualitative research methodology was employed that utilized think-aloud sessions. Participants described constrained and structured cognitive models while using the artifact. The study was also informed by interviews and the researcher's field notes. A purposeful sampling method led to the selection of participants, four males and two females, who had no prior history of using a course from the 869 series but who had experienced the scientific content covered by the CHEM869Q course. Analysis of the results showed both that a cognitive artifact may lead users' minds in decision making, and that problem-solving processes were affected by the cognitive artifact's design. When there was no design flaw, users generally thought that the cognitive artifact was helpful by simplifying steps, overcoming other limitations, and reducing errors in a reliable, effective, and easy-to-use way. Moreover, results showed that successful implementation of cognitive artifacts into teaching-learning practices depended on user willingness to transfer a task to the artifact. While users may like the idea of benefiting from a cognitive artifact, they may nevertheless tend to limit their usage. They sometimes think that delegating a task to a cognitive artifact makes them dependent, and that they may not learn how to perform the tasks by themselves. They appear more willing to use a cognitive artifact after they have done the task by themselves.
Improved Image Quality in Head and Neck CT Using a 3D Iterative Approach to Reduce Metal Artifact.
Wuest, W; May, M S; Brand, M; Bayerl, N; Krauss, A; Uder, M; Lell, M
2015-10-01
Metal artifacts from dental fillings and other devices degrade image quality and may compromise the detection and evaluation of lesions in the oral cavity and oropharynx by CT. The aim of this study was to evaluate the effect of iterative metal artifact reduction on CT of the oral cavity and oropharynx. Data from 50 consecutive patients with metal artifacts from dental hardware were reconstructed with standard filtered back-projection, linear interpolation metal artifact reduction (LIMAR), and iterative metal artifact reduction. The image quality of sections that contained metal was analyzed for the severity of artifacts and diagnostic value. A total of 455 sections (mean ± standard deviation, 9.1 ± 4.1 sections per patient) contained metal and were evaluated with each reconstruction method. Sections without metal were not affected by the algorithms and demonstrated identical image quality across methods. Of the metal-containing sections, 38% were considered nondiagnostic with filtered back-projection, 31% with LIMAR, and only 7% with iterative metal artifact reduction. Thirty-three percent of the sections had poor image quality with filtered back-projection, 46% with LIMAR, and 10% with iterative metal artifact reduction. Thirteen percent of the sections with filtered back-projection, 17% with LIMAR, and 22% with iterative metal artifact reduction were of moderate image quality; 16% with filtered back-projection, 5% with LIMAR, and 30% with iterative metal artifact reduction were of good image quality; and 1% with LIMAR and 31% with iterative metal artifact reduction were of excellent image quality. Iterative metal artifact reduction yields the highest image quality in comparison with filtered back-projection and linear interpolation metal artifact reduction in patients with metal hardware in the head and neck area. © 2015 by American Journal of Neuroradiology.
Iterative image-domain ring artifact removal in cone-beam CT
NASA Astrophysics Data System (ADS)
Liang, Xiaokun; Zhang, Zhicheng; Niu, Tianye; Yu, Shaode; Wu, Shibin; Li, Zhicheng; Zhang, Huailing; Xie, Yaoqin
2017-07-01
Ring artifacts in cone beam computed tomography (CBCT) images are caused by pixel gain variations in flat-panel detectors and may lead to structured non-uniformities and deterioration of image quality. The purpose of this study is to propose a general method for ring artifact removal in CBCT images. The method is based on the polar coordinate system, in which ring artifacts manifest as stripe artifacts. Using relative total variation, the CBCT images are first smoothed to generate template images with fewer image details and ring artifacts. By subtracting the template images from the CBCT images, residual images containing the image details and ring artifacts are generated. Because the ring artifact manifests as a stripe artifact in the polar coordinate system, the artifact image can be extracted from the residual image by taking the mean along the stripe direction; the image details are then obtained by subtracting the artifact image from the residual image. Finally, the image details are added back to the template image to generate the corrected images. The proposed framework is iterated until the differences in the extracted ring artifacts are minimized. We use a 3D Shepp-Logan phantom, a Catphan©504 phantom, a uniform acrylic cylinder, and images from a head patient to evaluate the proposed method. In the experiments using simulated data, the spatial uniformity is increased by 1.68 times and the structural similarity index is increased from 87.12% to 95.50% using the proposed method. In the experiment using clinical data, our method shows high efficiency in ring artifact removal while preserving the image structure and detail. The iterative approach we propose for ring artifact removal in cone-beam CT is practical and attractive for CBCT-guided radiation therapy.
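A simplified, single-loop version of such an image-domain correction might look like the sketch below. Gaussian smoothing stands in for the relative-total-variation step, the polar resampling is a basic bilinear mapping, and all parameters are illustrative assumptions rather than the published implementation.

```python
# Hedged sketch: estimate ring artifacts as angle-invariant stripes in polar
# coordinates and subtract them, iterating a few times. Gaussian smoothing is
# a stand-in for the relative-total-variation smoothing used in the paper.
import numpy as np
from scipy import ndimage

def to_polar(img, n_r=256, n_t=720):
    cy, cx = (np.asarray(img.shape) - 1) / 2.0
    r = np.linspace(0.0, min(cy, cx), n_r)
    t = np.linspace(0.0, 2 * np.pi, n_t, endpoint=False)
    R, T = np.meshgrid(r, t, indexing="ij")
    coords = np.array([cy + R * np.sin(T), cx + R * np.cos(T)])
    return ndimage.map_coordinates(img, coords, order=1), r, t

def from_polar(polar, r, t, shape):
    yy, xx = np.indices(shape)
    cy, cx = (np.asarray(shape) - 1) / 2.0
    rad = np.hypot(yy - cy, xx - cx) / (r[1] - r[0])
    ang = np.mod(np.arctan2(yy - cy, xx - cx), 2 * np.pi) / (t[1] - t[0])
    return ndimage.map_coordinates(polar, [rad, ang], order=1, mode="nearest")

def remove_rings(img, n_iter=5, sigma=3.0):
    corrected = img.astype(float).copy()
    for _ in range(n_iter):
        polar, r, t = to_polar(corrected)
        template = ndimage.gaussian_filter(polar, sigma)          # few details, few rings
        stripe = (polar - template).mean(axis=1, keepdims=True)   # rings: constant in angle
        ring = from_polar(np.repeat(stripe, polar.shape[1], axis=1), r, t, corrected.shape)
        corrected -= ring
    return corrected
```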
Stidd, D A; Theessen, H; Deng, Y; Li, Y; Scholz, B; Rohkohl, C; Jhaveri, M D; Moftakhar, R; Chen, M; Lopes, D K
2014-01-01
Flat panel detector CT images are degraded by streak artifacts caused by radiodense implanted materials such as coils or clips. A new metal artifact reduction prototype algorithm has been used to minimize these artifacts. The application of this new metal artifact reduction algorithm was evaluated for flat panel detector CT imaging performed in a routine clinical setting. Flat panel detector CT images were obtained from 59 patients immediately following cerebral endovascular procedures or as surveillance imaging for cerebral endovascular or surgical procedures previously performed. The images were independently evaluated by 7 physicians for metal artifact reduction on a 3-point scale at 2 locations: immediately adjacent to the metallic implant and 3 cm away from it. The number of visible vessels before and after metal artifact reduction correction was also evaluated within a 3-cm radius around the metallic implant. The metal artifact reduction algorithm was applied to the 59 flat panel detector CT datasets without complications. The metal artifacts in the corrected flat panel detector CT images were significantly reduced in the area immediately adjacent to the implanted metal object (P = .05) and in the area 3 cm away from the metal object (P = .03). The average number of visible vessel segments increased from 4.07 to 5.29 (P = .1235) after application of the metal artifact reduction algorithm to the flat panel detector CT images. Metal artifact reduction is an effective method to improve flat panel detector CT images degraded by metal artifacts. Metal artifacts are significantly decreased by the metal artifact reduction algorithm, and there was a trend toward increased vessel-segment visualization. © 2014 by American Journal of Neuroradiology.
Redox artifacts in electrophysiological recordings
Berman, Jonathan M.
2013-01-01
Electrophysiological techniques make use of Ag/AgCl electrodes that are in direct contact with cells or the bath. In the bath, electrodes are exposed to numerous experimental conditions and chemical reagents that can modify electrode voltage. We examined voltage offsets created in Ag/AgCl electrodes by exposure to redox reagents used in electrophysiological studies. Voltage offsets were measured in reference to an electrode separated from the solution by an agar bridge. The reducing reagents tris(2-carboxyethyl)phosphine (TCEP), dithiothreitol (DTT), and glutathione, as well as the oxidizing agent H2O2, used at experimentally relevant concentrations, reacted with Ag in the electrodes to produce voltage offsets. Chloride ions and strong acids and bases produced offsets at millimolar concentrations. Electrolytic depletion of the AgCl layer, to replicate voltage clamp and sustained use, resulted in increased sensitivity to flow and DTT. Offsets were sensitive to electrode silver purity and to the amount and method of chloride deposition. For example, exposure to 10 μM DTT produced a voltage offset between 10 and 284 mV depending on the chloride deposition method. Currents generated by these offsets are significant, depend on membrane conductance and by extension on the expression of ion channels, and may therefore appear to be biological in origin. These data demonstrate a new source of artifacts in electrophysiological recordings that can affect measurements obtained from a variety of experimental techniques from patch clamp to two-electrode voltage clamp. PMID:23344161
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paudel, M. R.; Mackenzie, M.; Rathee, S.
2013-08-15
Purpose: To evaluate the metal artifacts in kilovoltage computed tomography (kVCT) images that are corrected using a normalized metal artifact reduction (NMAR) method with megavoltage CT (MVCT) prior images. Methods: Tissue characterization phantoms containing bilateral steel inserts are used in all experiments. Two MVCT images, one without any metal artifact corrections and the other corrected using a modified iterative maximum likelihood polychromatic algorithm for CT (IMPACT), are translated to pseudo-kVCT images. These are then used as prior images, without tissue classification, in an NMAR technique for correcting the experimental kVCT image. The IMPACT method in MVCT included an additional model for the pair/triplet production process and the energy dependent response of the MVCT detectors. An experimental kVCT image, without the metal inserts and reconstructed using the filtered back projection (FBP) method, is artificially patched with the known steel inserts to obtain a reference image. The regular NMAR image containing the steel inserts, which uses a tissue-classified kVCT prior, and the NMAR images reconstructed using MVCT priors are compared with the reference image for metal artifact reduction. The Eclipse treatment planning system is used to calculate radiotherapy dose distributions on the corrected images and on the reference image using the Anisotropic Analytical Algorithm with 6 MV parallel opposed 5 × 10 cm² fields passing through the bilateral steel inserts, and the results are compared. Gafchromic film is used to measure the actual dose delivered in a plane perpendicular to the beams at the isocenter. Results: The streaking and shading in the NMAR image using tissue classification are significantly reduced. However, the structures, including metal, are deformed. Some uniform regions appear to have eroded from one side. There is a large variation of attenuation values inside the metal inserts. Similar results are seen in the commercially corrected image. Use of MVCT prior images without tissue classification in NMAR significantly reduces these problems. The radiation dose calculated on the reference image is close to the dose measured using the film. Compared to the reference image, the calculated dose difference at the isocenter in the conventional NMAR image and in the images corrected using the uncorrected MVCT image and the IMPACT-corrected MVCT image as priors is ∼15.5%, ∼5%, and ∼2.7%, respectively. Conclusions: The deformation and erosion of the structures present in regular NMAR-corrected images can be largely reduced by using MVCT priors without tissue segmentation. Because the attenuation value of metal is incorrect, large dose differences relative to the true value can result when using the conventional NMAR image. This difference can be significantly reduced if MVCT images are used as priors. Reduced tissue deformation, better tissue visualization, and correct information about the electron density of the tissues and metals in the artifact-corrected images could help delineate the structures better, as well as calculate radiation dose more correctly, thus enhancing the quality of radiotherapy treatment planning.
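For context, the normalization/interpolation/denormalization core of an NMAR-style correction can be sketched as below; here the prior sinogram would be forward-projected from the MVCT-based prior image. The epsilon, the per-view linear interpolation, and the function names are standard choices but remain assumptions in this form.

```python
# Hedged sketch of the NMAR-style correction step: normalize the measured
# sinogram by a prior sinogram, interpolate across the metal trace in the
# flattened data, then denormalize. The prior sinogram is assumed to come
# from forward-projecting the (MVCT-derived) prior image.
import numpy as np

def nmar_correct(sinogram, prior_sinogram, metal_trace, eps=1e-6):
    """All arrays are (n_views, n_bins); metal_trace is a boolean mask of metal rays."""
    norm = sinogram / (prior_sinogram + eps)          # flatten the sinogram with the prior
    corrected = norm.copy()
    bins = np.arange(sinogram.shape[1])
    for v in range(sinogram.shape[0]):                # interpolate across metal bins per view
        bad = metal_trace[v]
        if bad.any() and (~bad).any():
            corrected[v, bad] = np.interp(bins[bad], bins[~bad], norm[v, ~bad])
    return corrected * (prior_sinogram + eps)         # denormalize
```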
Clinical Assessment of Mirror Artifacts in Spectral-Domain Optical Coherence Tomography
Ho, Joseph; Castro, Dinorah P. E.; Castro, Leonardo C.; Chen, Yueli; Liu, Jonathan; Mattox, Cynthia; Krishnan, Chandrasekharan; Fujimoto, James G.; Schuman, Joel S.
2010-01-01
Purpose. To investigate the characteristics of a spectral-domain optical coherence tomography (SD-OCT) image phenomenon known as the mirror artifact, calculate its prevalence, analyze potential risk factors, measure severity, and correlate it to spherical equivalent and central visual acuity (VA). Methods. OCT macular cube 512 × 128 scans taken between January 2008 and February 2009 at the New England Eye Center were analyzed for the presence of mirror artifacts. Artifact severity was determined by the degree of segmentation breakdown that it caused on the macular map. A retrospective review was conducted of the medical records of patients with artifacts and of a random control group without artifacts. Results. Of 1592 patients, 9.3% (148 patients, 200 eyes) had scans that contained mirror artifacts. A significantly more myopic spherical equivalent (P < 0.001), worse VA (P < 0.001), longer axial lengths (P = 0.004), and higher proportions of moderate to high myopia (P < 0.001) were found in patients with mirror artifacts than in patients without artifacts. Worse VA was associated with increased artifact severity (P = 0.04). Conclusions. In all scans analyzed, a high prevalence of mirror artifacts was found. This image artifact was often associated with patients with moderate to high myopia. Improvements in instrumentation may be necessary to resolve this problem in moderately and highly myopic eyes. Operators should be advised to properly position the retina when scanning eyes. In cases in which peripheral abnormalities in topographic measurements of retinal thickness are found, corresponding OCT scans should be examined for the presence of mirror artifacts. PMID:20181840
The Current Experimental Status of the High Tc Problem
NASA Astrophysics Data System (ADS)
Greene, Richard
Over 50,000 experimental papers have been published since 1987 on the copper oxide (cuprate) high Tc superconductors. In this talk, I will attempt to summarize the experimental properties that we presently understand and those that we don't yet understand. I will not speculate on the "unknown unknowns", although some examples of these have appeared during the past 30 years of research. I may also present a few slides about the status of iron-based superconductors, the other major class of unconventional high Tc materials.
Penalized maximum likelihood reconstruction for x-ray differential phase-contrast tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brendel, Bernhard, E-mail: bernhard.brendel@philips.com; Teuffenbach, Maximilian von; Noël, Peter B.
2016-01-15
Purpose: The purpose of this work is to propose a cost function with regularization to iteratively reconstruct attenuation, phase, and scatter images simultaneously from differential phase contrast (DPC) acquisitions, without the need for phase retrieval, and to examine its properties. Furthermore, this reconstruction method is applied to an acquisition pattern that is suitable for a DPC tomographic system with a continuously rotating gantry (sliding window acquisition), overcoming the severe smearing in noniterative reconstruction. Methods: We derive a penalized maximum likelihood reconstruction algorithm to directly reconstruct attenuation, phase, and scatter images from the measured detector values of a DPC acquisition. The proposed penalty comprises, for each of the three images, an independent smoothing prior. Image quality of the proposed reconstruction is compared to images generated with FBP and iterative reconstruction after phase retrieval. Furthermore, the mutual influence between the priors is analyzed. Finally, the proposed reconstruction algorithm is applied to experimental sliding window data acquired at a synchrotron and results are compared to reconstructions based on phase retrieval. Results: The results show that the proposed algorithm significantly increases image quality in comparison to reconstructions based on phase retrieval. No significant mutual influence between the proposed independent priors could be observed. It is further illustrated that the iterative reconstruction of a sliding window acquisition results in images with substantially reduced smearing artifacts. Conclusions: Although the proposed cost function is inherently nonconvex, it can be used to reconstruct images with fewer aliasing artifacts and fewer streak artifacts than reconstruction methods based on phase retrieval. Furthermore, the proposed method can be used to reconstruct images of sliding window acquisitions with negligible smearing artifacts.
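In sketch form, a penalized maximum-likelihood cost of the kind described above can be written as follows, with notation assumed here rather than taken from the paper: a, p, and s are the attenuation, phase, and scatter volumes, A is the line-integral projection operator, D the differential projection operator along the grating-sensitive direction, and I_i, V_i, phi_i the blank-scan intensity, visibility, and reference phase of measurement i; R denotes a smoothness penalty with weights beta.

```latex
% Hedged sketch of a penalized maximum-likelihood DPC cost (notation assumed).
\begin{aligned}
(\hat{a},\hat{p},\hat{s}) &= \arg\min_{a,\,p,\,s}\;
\sum_{i}\frac{\bigl(y_{i}-\bar{y}_{i}(a,p,s)\bigr)^{2}}{2\sigma_{i}^{2}}
+\beta_{a}R(a)+\beta_{p}R(p)+\beta_{s}R(s),\\[4pt]
\bar{y}_{i}(a,p,s) &= I_{i}\,e^{-[\mathbf{A}a]_{i}}
\Bigl(1+V_{i}\,e^{-[\mathbf{A}s]_{i}}
\cos\bigl(\phi_{i}+[\mathbf{D}p]_{i}\bigr)\Bigr).
\end{aligned}
```

Minimizing a cost of this form fits the raw phase-stepping measurements directly, which is what removes the need for an explicit phase-retrieval step.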
Carbon nanotube mechanics in scanning probe microscopy
NASA Astrophysics Data System (ADS)
Strus, Mark Christopher
Carbon nanotubes (CNTs) possess unique electrical, thermal, and mechanical properties which have led to the development of novel nanomechanical materials and devices. In this thesis, the mechanical properties of carbon nanotubes are studied with an Atomic Force Microscope (AFM) and, conversely, the use of CNTs to enhance conventional AFM probes is also investigated. First, the performance of AFM probes with multiwalled CNT tips are evaluated during attractive regime AFM imaging of high aspect ratio structures. The presented experimental results show two distinct imaging artifacts, the divot and large ringing artifacts, which are inherent to such CNT AFM probes. Through the adjustment of operating parameters, the connection of these artifacts to CNT bending, adhesion, and stiction is described qualitatively and explained. Next, the adhesion and peeling of CNTs on different substrates is quantitatively investigated with theoretical models and a new AFM mode for nanomechanical peeling. The theoretical model uncovers the rich physics of peeling of CNTs from surfaces, including sudden transitions between different geometric configurations of the nanotube with vastly different interfacial energies. The experimental peeling of CNTs is shown to be capable of resolving differences in CNT peeling energies at attoJoule levels on different materials. AFM peeling force spectroscopy is further studied on a variety of materials, including several polymers, to demonstrate the capability of direct measurement of interfacial energy between an individual nanotube or nanofiber and a given material surface. Theoretical investigations demonstrate that interfacial and flexural energies can be decoupled so that the work of the applied peeling force can be used to estimate the CNT-substrate interfacial fracture energy and nanotube's flexural stiffness. Hundreds of peeling force experiments on graphite, epoxy, and polyimide demonstrate that the peeling force spectroscopy offers a convenient experimental framework to quickly screen different combinations of polymers and functionalized nanotubes for optimal interfacial strength. Finally, multiple CNT AFM probe oscillation states in tapping mode AFM as the cantilever is brought closer to a sample are fully investigated, including two kinds of permanent contact and two types of intermittent contact. Large deformation continuum elastica models of MWCNTs with different end boundary conditions are used to identify whether the CNT remains anchored to the sample in line-contact or in point-contact in the permanent contact regime. Energy dissipation spectroscopy and phase contrast are demonstrated as a way to predict the state of CNT-substrate boundary condition in the intermittent tapping regime on different substrates and to highlight the implications of these different imaging regimes for critical dimension AFM, biological sensing, and nanolithography. Together, this work studies the effect of CNT mechanical interactions in AFM, including artifact-avoidance optimization of and new compositional mapping using CNT AFM probes as well as novel techniques that will potentially enhance the future development of CNT-based nanodevices and materials.
Automatic removal of eye-movement and blink artifacts from EEG signals.
Gao, Jun Feng; Yang, Yong; Lin, Pan; Wang, Pei; Zheng, Chong Xun
2010-03-01
Frequent occurrence of electrooculography (EOG) artifacts leads to serious problems in interpreting and analyzing the electroencephalogram (EEG). In this paper, a robust method is presented to automatically eliminate eye-movement and eye-blink artifacts from EEG signals. Independent Component Analysis (ICA) is used to decompose EEG signals into independent components. Topography and power spectral density features of those components are then extracted to identify eye-movement artifact components, and a support vector machine (SVM) classifier is adopted because it outperforms several other classifiers. The classification results show that these feature-extraction methods are unsuitable for identifying eye-blink artifact components, so a novel peak detection algorithm for independent components (PDAIC) is proposed to identify them. Finally, the artifact removal method proposed here is evaluated by comparing EEG data before and after artifact removal. The results indicate that the proposed method removes EOG artifacts effectively from EEG signals with little distortion of the underlying brain signals.
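A compact illustration of the ICA-based part of such a pipeline is sketched below; a simple correlation with an ocular reference channel stands in for the paper's SVM classifier and PDAIC peak-detection stage, and the threshold and channel layout are assumptions.

```python
# Hedged sketch: decompose EEG with ICA, flag components correlated with an
# ocular reference channel, zero them, and reconstruct. The correlation rule
# replaces the paper's SVM/PDAIC identification stages.
import numpy as np
from sklearn.decomposition import FastICA

def remove_ocular(eeg, eog_ref, corr_thresh=0.6):
    """eeg: samples x channels; eog_ref: samples (frontal or EOG reference)."""
    ica = FastICA(n_components=eeg.shape[1], random_state=0, max_iter=1000)
    sources = ica.fit_transform(eeg)                 # samples x components
    corr = np.array([abs(np.corrcoef(sources[:, k], eog_ref)[0, 1])
                     for k in range(sources.shape[1])])
    cleaned = sources.copy()
    cleaned[:, corr > corr_thresh] = 0.0             # zero the ocular components
    return ica.inverse_transform(cleaned), np.where(corr > corr_thresh)[0]
```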
Chen, Yang; Budde, Adam; Li, Ke; Li, Yinsheng; Hsieh, Jiang; Chen, Guang-Hong
2017-01-01
When the scan field of view (SFOV) of a CT system is not large enough to enclose the entire cross-section of the patient, or the patient needs to be positioned partially outside the SFOV for certain clinical applications, truncation artifacts often appear in the reconstructed CT images. Many truncation artifact correction methods perform extrapolations of the truncated projection data based on certain a priori assumptions. The purpose of this work was to develop a novel CT truncation artifact reduction method that directly operates on DICOM images. The blooming of pixel values associated with truncation was modeled using exponential decay functions, and based on this model, a discriminative dictionary was constructed to represent truncation artifacts and nonartifact image information in a mutually exclusive way. The discriminative dictionary consists of a truncation artifact subdictionary and a nonartifact subdictionary. The truncation artifact subdictionary contains 1000 atoms with different decay parameters, while the nonartifact subdictionary contains 1000 independent realizations of Gaussian white noise that are exclusive with the artifact features. By sparsely representing an artifact-contaminated CT image with this discriminative dictionary, the image was separated into a truncation artifact-dominated image and a complementary image with reduced truncation artifacts. The artifact-dominated image was then subtracted from the original image with an appropriate weighting coefficient to generate the final image with reduced artifacts. This proposed method was validated via physical phantom studies and retrospective human subject studies. Quantitative image evaluation metrics including the relative root-mean-square error (rRMSE) and the universal image quality index (UQI) were used to quantify the performance of the algorithm. For both phantom and human subject studies, truncation artifacts at the peripheral region of the SFOV were effectively reduced, revealing soft tissue and bony structure once buried in the truncation artifacts. For the phantom study, the proposed method reduced the relative RMSE from 15% (original images) to 11%, and improved the UQI from 0.34 to 0.80. A discriminative dictionary representation method was developed to mitigate CT truncation artifacts directly in the DICOM image domain. Both phantom and human subject studies demonstrated that the proposed method can effectively reduce truncation artifacts without access to projection data. © 2016 American Association of Physicists in Medicine.
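A one-dimensional toy version of the dictionary idea is sketched below for a single radial profile near the SFOV edge: the first dictionary block holds exponential-decay atoms (the artifact model), the second holds white-noise atoms, and only the decay block is used to rebuild and subtract the artifact. The atom counts, decay range, OMP settings, and the subtraction weight are assumptions, not the paper's values.

```python
# Hedged 1-D toy of the discriminative-dictionary idea: sparse-code a radial
# profile over [exponential-decay atoms | white-noise atoms] and subtract the
# part explained by the decay (artifact) block.
import numpy as np
from sklearn.linear_model import orthogonal_mp

n = 64
x = np.arange(n)
decay_atoms = np.exp(-np.outer(x, 1.0 / np.linspace(1.0, 30.0, 200)))  # n x 200
rng = np.random.default_rng(0)
noise_atoms = rng.standard_normal((n, 200))                            # n x 200
D = np.hstack([decay_atoms, noise_atoms])
D = D / np.linalg.norm(D, axis=0)                     # unit-norm atoms

profile = 50.0 + 5.0 * rng.standard_normal(n)         # toy soft-tissue profile
profile += 400.0 * np.exp(-x / 6.0)                   # truncation "blooming" at the edge

code = orthogonal_mp(D, profile - np.median(profile), n_nonzero_coefs=10)
artifact = D[:, :200] @ code[:200]                     # contribution of decay atoms only
corrected = profile - 0.9 * artifact                   # weighted subtraction (weight assumed)
```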
Voting strategy for artifact reduction in digital breast tomosynthesis.
Wu, Tao; Moore, Richard H; Kopans, Daniel B
2006-07-01
Artifacts are observed in digital breast tomosynthesis (DBT) reconstructions due to the small number of projections and the narrow angular range that are typically employed in tomosynthesis imaging. In this work, we investigate the reconstruction artifacts that are caused by high-attenuation features in breast and develop several artifact reduction methods based on a "voting strategy." The voting strategy identifies the projection(s) that would introduce artifacts to a voxel and rejects the projection(s) when reconstructing the voxel. Four approaches to the voting strategy were compared, including projection segmentation, maximum contribution deduction, one-step classification, and iterative classification. The projection segmentation method, based on segmentation of high-attenuation features from the projections, effectively reduces artifacts caused by metal and large calcifications that can be reliably detected and segmented from projections. The other three methods are based on the observation that contributions from artifact-inducing projections have higher value than those from normal projections. These methods attempt to identify the projection(s) that would cause artifacts by comparing contributions from different projections. Among the three methods, the iterative classification method provides the best artifact reduction; however, it can generate many false positive classifications that degrade the image quality. The maximum contribution deduction method and one-step classification method both reduce artifacts well from small calcifications, although the performance of artifact reduction is slightly better with the one-step classification. The combination of one-step classification and projection segmentation removes artifacts from both large and small calcifications.
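As an illustration of the "maximum contribution deduction" flavour of the voting strategy, the sketch below drops, at every voxel, the single largest backprojected contribution before averaging. The toy parallel shift-and-add geometry and the names are assumptions purely for illustration, not the DBT geometry used in the paper.

```python
# Hedged sketch of voting by maximum-contribution deduction in a toy
# shift-and-add tomosynthesis backprojection. Geometry and names are assumed.
import numpy as np
from scipy import ndimage

def backproject_with_voting(projections, shifts, shape):
    """projections: (n_views, n_det); shifts[i]: lateral shift of view i for this
    slice; shape: (rows, n_det) of the reconstructed slice."""
    contribs = np.zeros((len(projections),) + shape)
    for i, (proj, s) in enumerate(zip(projections, shifts)):
        view = np.tile(proj, (shape[0], 1))                 # smear the view over the slice
        contribs[i] = ndimage.shift(view, (0.0, s), order=1)
    # voting: exclude the largest contribution at every voxel, then average
    total = contribs.sum(axis=0) - contribs.max(axis=0)
    return total / (len(projections) - 1)
```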
Detection of artifacts from high energy bursts in neonatal EEG.
Bhattacharyya, Sourya; Biswas, Arunava; Mukherjee, Jayanta; Majumdar, Arun Kumar; Majumdar, Bandana; Mukherjee, Suchandra; Singh, Arun Kumar
2013-11-01
Detection of non-cerebral activities or artifacts, intermixed within the background EEG, is essential so that they can be discarded from subsequent pattern analysis. The problem is much harder in neonatal EEG, where the background EEG contains spikes, waves, and rapid fluctuations in amplitude and frequency. Existing artifact detection methods are mostly limited to detecting only a subset of artifacts, such as ocular, muscle, or power line artifacts. A few methods integrate different modules, each for detection of one specific category of artifact. Furthermore, most of the reference approaches are implemented and tested on adult EEG recordings. Direct application of those methods to neonatal EEG causes performance deterioration, due to greater pattern variation and inherent complexity. A method for detection of a wide range of artifact categories in neonatal EEG is thus required. At the same time, the method should be specific enough to preserve the background EEG information. The current study describes a feature-based classification approach to detect both repetitive (generated from ECG, EMG, pulse, respiration, etc.) and transient (generated from eye blinking, eye movement, patient movement, etc.) artifacts. It focuses on artifact detection within high-energy burst patterns, instead of detecting artifacts within the complete background EEG with its wide pattern variation. The objective is to find true burst patterns, which can later be used to identify the Burst-Suppression (BS) pattern commonly observed during newborn seizure. Such selective artifact detection is shown to be more sensitive to artifacts and specific to bursts, compared to the existing artifact detection approaches applied to the complete background EEG. Several time-domain, frequency-domain, and statistical features, as well as features generated by wavelet decomposition, are analyzed to model the proposed binary classification between burst and artifact segments. A feature selection method is also applied to select the feature subset producing the highest classification accuracy. The suggested feature-based classification method is evaluated on our recorded neonatal EEG dataset, consisting of burst and artifact segments. We obtain 78% sensitivity and 72% specificity as the accuracy measures. The accuracy obtained using the proposed method is found to be about 20% higher than that of the reference approaches. Joint use of the proposed method with our previous work on burst detection outperforms reference methods on simultaneous burst and artifact detection. As the proposed method supports detection of a wide range of artifact patterns, it can be extended to incorporate the detection of artifacts within other seizure patterns and background EEG information as well. © 2013 Elsevier Ltd. All rights reserved.
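The kind of per-segment feature vector the abstract describes can be sketched as below; the specific features, the db4 wavelet, and the SVM settings are illustrative assumptions rather than the study's exact pipeline (PyWavelets, SciPy, and scikit-learn are assumed to be available).

```python
# Hedged sketch of a per-segment feature vector (time-domain, spectral,
# statistical, and wavelet sub-band energies) feeding a binary burst-vs-
# artifact classifier. Features and classifier settings are assumptions.
import numpy as np
import pywt
from scipy import signal, stats
from sklearn.svm import SVC

def segment_features(x, fs=256):
    f, psd = signal.welch(x, fs=fs, nperseg=min(len(x), fs))
    coeffs = pywt.wavedec(x, "db4", level=4)
    return np.array([
        np.ptp(x), np.std(x), stats.skew(x), stats.kurtosis(x),   # time / statistical
        f[np.argmax(psd)], psd.max() / (psd.sum() + 1e-12),       # spectral
        *[np.sum(c ** 2) for c in coeffs],                        # wavelet sub-band energies
    ])

# Assumed usage: `segments` is a list of 1-D EEG arrays, `labels` has 1 = true
# burst and 0 = artifact-contaminated segment.
# X = np.vstack([segment_features(s) for s in segments])
# clf = SVC(kernel="rbf").fit(X, labels)
```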
Wojnarowicz, Mark W.; Fisher, Andrew M.; Minaeva, Olga; Goldstein, Lee E.
2017-01-01
Animal models of concussion, traumatic brain injury (TBI), and chronic traumatic encephalopathy (CTE) are widely available and routinely deployed in laboratories around the world. Effective animal modeling requires careful consideration of four basic principles. First, animal model use must be guided by clarity of definitions regarding the human disease or condition being modeled. Concussion, TBI, and CTE represent distinct clinical entities that require clear differentiation: concussion is a neurological syndrome, TBI is a neurological event, and CTE is a neurological disease. While these conditions are all associated with head injury, the pathophysiology, clinical course, and medical management of each are distinct. Investigators who use animal models of these conditions must take into account these clinical distinctions to avoid misinterpretation of results and category mistakes. Second, model selection must be grounded by clarity of purpose with respect to experimental questions and frame of reference of the investigation. Distinguishing injury context (“inputs”) from injury consequences (“outputs”) may be helpful during animal model selection, experimental design and execution, and interpretation of results. Vigilance is required to rout out, or rigorously control for, model artifacts with potential to interfere with primary endpoints. The widespread use of anesthetics in many animal models illustrates the many ways that model artifacts can confound preclinical results. Third, concordance between key features of the animal model and the human disease or condition being modeled is required to confirm model biofidelity. Fourth, experimental results observed in animals must be confirmed in human subjects for model validation. Adherence to these principles serves as a bulwark against flawed interpretation of results, study replication failure, and confusion in the field. Implementing these principles will advance basic science discovery and accelerate clinical translation to benefit people affected by concussion, TBI, and CTE. PMID:28620350
Blue Mountain Lake; An Archeological Survey and an Experimental Study of Inundation Impacts.
1978-02-01
Seasonal exploitation of available plant and animal foods, with scheduled occupation of sites reflecting an annual cycle, is often detectable in the... Archeological Significance: Shell is found in many archeological sites, both as artifacts (culturally modified) and as food residue. Quantification of... samples. The vessels (one small bottle of aboriginal manufacture with the neck broken off and one large jar manufactured in 1973 using aboriginal...
An Examination of the True Reliability of Lower Limb Stiffness Measures During Overground Hopping.
Diggin, David; Anderson, Ross; Harrison, Andrew J
2016-06-01
Evidence suggests reports describing the reliability of leg-spring (kleg) and joint stiffness (kjoint) measures are contaminated by artifacts originating from digital filtering procedures. In addition, the intraday reliability of kleg and kjoint requires investigation. This study examined the effects of experimental procedures on the inter- and intraday reliability of kleg and kjoint. Thirty-two participants completed 2 trials of single-legged hopping at 1.5, 2.2, and 3.0 Hz at the same time of day across 3 days. On the final test day a fourth experimental bout took place 6 hours before or after participants' typical testing time. Kinematic and kinetic data were collected throughout. Stiffness was calculated using models of kleg and kjoint. Classifications of measurement agreement were established using thresholds for absolute and relative reliability statistics. Results illustrated that kleg and kankle exhibited strong agreement. In contrast, kknee and khip demonstrated weak-to-moderate consistency. Results suggest limits in kjoint reliability persist despite employment of appropriate filtering procedures. Furthermore, diurnal fluctuations in lower-limb muscle-tendon stiffness exhibit little effect on intraday reliability. The present findings support the existence of kleg as an attractor state during hopping, achieved through fluctuations in kjoint variables. Limits to kjoint reliability appear to represent biological function rather than measurement artifact.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, P; Schreibmann, E; Fox, T
2014-06-15
Purpose: Severe CT artifacts can impair our ability to accurately calculate proton range, thereby resulting in a clinically unacceptable treatment plan. In this work, we investigated a novel CT artifact correction method based on a coregistered MRI and its ability to estimate CT HU and proton range in the presence of severe CT artifacts. Methods: The proposed method corrects corrupted CT data using a coregistered MRI to guide the mapping of CT values from a nearby artifact-free region. First, patient MRI and CT images were registered using 3D deformable image registration software based on B-spline and mutual information. The CT slice with severe artifacts was selected as well as a nearby slice free of artifacts (e.g., 1 cm away from the artifact). The two sets of paired MRI and CT images at different slice locations were further registered by applying 2D deformable image registration. Based on the artifact-free paired MRI and CT images, a comprehensive geospatial analysis was performed to predict the correct CT HU of the CT image with severe artifact. For a proof of concept, a known artifact was introduced that changed the ground truth CT HU values by up to 30% and introduced up to 5 cm of proton range error. The ability of the proposed method to recover the ground truth was quantified using a selected head and neck case. Results: A significant improvement in image quality was observed visually. Our proof of concept study showed that 90% of the area that had 30% errors in CT HU was corrected to within 3% of its ground truth value. Furthermore, the maximum proton range error of up to 5 cm was reduced to 4 mm. Conclusion: The MRI-based CT artifact correction method can improve CT image quality and proton range calculation for patients with severe CT artifacts.
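A minimal sketch of the deformable-registration step described above, assuming SimpleITK is available. The file paths, control-point mesh size, and optimizer settings are illustrative placeholders, not values from the study, and the subsequent MRI-guided HU mapping is not reproduced here.

```python
import SimpleITK as sitk

def register_mri_to_ct(ct_path, mri_path):
    """B-spline deformable registration of an MRI volume onto a CT volume,
    driven by Mattes mutual information (the abstract's first step)."""
    ct = sitk.ReadImage(ct_path, sitk.sitkFloat32)    # fixed image
    mri = sitk.ReadImage(mri_path, sitk.sitkFloat32)  # moving image

    # Initialize a coarse B-spline control-point grid over the CT domain.
    bspline = sitk.BSplineTransformInitializer(ct, transformDomainMeshSize=[8, 8, 8])

    reg = sitk.ImageRegistrationMethod()
    reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
    reg.SetMetricSamplingStrategy(reg.RANDOM)
    reg.SetMetricSamplingPercentage(0.1)
    reg.SetInterpolator(sitk.sitkLinear)
    reg.SetOptimizerAsLBFGSB(gradientConvergenceTolerance=1e-5, numberOfIterations=100)
    reg.SetInitialTransform(bspline, inPlace=True)

    transform = reg.Execute(ct, mri)

    # Resample the MRI into CT space; the warped MRI would then guide HU
    # mapping from an artifact-free slice onto the corrupted slice.
    return sitk.Resample(mri, ct, transform, sitk.sitkLinear, 0.0, mri.GetPixelID())
```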
Fang, Jieming; Zhang, Da; Wilcox, Carol; Heidinger, Benedikt; Raptopoulos, Vassilios; Brook, Alexander; Brook, Olga R
2017-03-01
To assess single energy metal artifact reduction (SEMAR) and spectral energy metal artifact reduction (MARS) algorithms in reducing artifacts generated by different metal implants. A phantom was scanned with and without SEMAR (Aquilion One, Toshiba) and MARS (Discovery CT750 HD, GE), with various metal implants. Images were evaluated objectively by measuring the standard deviation in regions of interest and subjectively by two independent reviewers grading on a scale of 0 (no artifact) to 4 (severe artifact). Reviewers also graded new artifacts introduced by the metal artifact reduction algorithms. SEMAR and MARS significantly decreased variability of the density measurement adjacent to the metal implant, with a median SD (standard deviation of the density measurement) of 52.1 HU without SEMAR vs. 12.3 HU with SEMAR, p < 0.001. The median SD without MARS of 63.1 HU decreased to 25.9 HU with MARS, p < 0.001. The median SD with SEMAR was significantly lower than the median SD with MARS (p = 0.0011). SEMAR improved subjective image quality, with a reduction in the overall artifact grading from 3.2 ± 0.7 to 1.4 ± 0.9, p < 0.001. Improvement of overall image quality by MARS did not reach statistical significance (3.2 ± 0.6 to 2.6 ± 0.8, p = 0.088). The MARS algorithm introduced significant new artifacts (2.4 ± 1.0), whereas SEMAR introduced minimal new artifacts (0.4 ± 0.7), p < 0.001. CT iterative reconstruction algorithms with single and spectral energy are both effective in reducing metal artifacts. The single energy-based algorithm provides better overall image quality than the spectral CT-based algorithm. The spectral metal artifact reduction algorithm introduces mild to moderate artifacts in the far field.
Reference-Free Removal of EEG-fMRI Ballistocardiogram Artifacts with Harmonic Regression
Krishnaswamy, Pavitra; Bonmassar, Giorgio; Poulsen, Catherine; Pierce, Eric T; Purdon, Patrick L.; Brown, Emery N.
2016-01-01
Combining electroencephalogram (EEG) recording and functional magnetic resonance imaging (fMRI) offers the potential for imaging brain activity with high spatial and temporal resolution. This potential remains limited by the significant ballistocardiogram (BCG) artifacts induced in the EEG by cardiac pulsation-related head movement within the magnetic field. We model the BCG artifact using a harmonic basis, pose the artifact removal problem as a local harmonic regression analysis, and develop an efficient maximum likelihood algorithm to estimate and remove BCG artifacts. Our analysis paradigm accounts for time-frequency overlap between the BCG artifacts and neurophysiologic EEG signals, and tracks the spatiotemporal variations in both the artifact and the signal. We evaluate performance on: simulated oscillatory and evoked responses constructed with realistic artifacts; actual anesthesia-induced oscillatory recordings; and actual visual evoked potential recordings. In each case, the local harmonic regression analysis effectively removes the BCG artifacts, and recovers the neurophysiologic EEG signals. We further show that our algorithm outperforms commonly used reference-based and component analysis techniques, particularly in low SNR conditions, the presence of significant time-frequency overlap between the artifact and the signal, and/or large spatiotemporal variations in the BCG. Because our algorithm does not require reference signals and has low computational complexity, it offers a practical tool for removing BCG artifacts from EEG data recorded in combination with fMRI. PMID:26151100
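A minimal, self-contained sketch of the harmonic-regression idea described above: within each analysis window the BCG artifact is modeled as a sum of sinusoids at harmonics of the cardiac rate and subtracted. The window length, number of harmonics, and fixed heart-rate value are illustrative assumptions; the paper's local maximum-likelihood estimation and spatiotemporal tracking are not reproduced here.

```python
import numpy as np

def remove_bcg_harmonics(eeg, fs, heart_rate_hz, n_harmonics=8, win_sec=4.0):
    """eeg: 1-D EEG trace; fs: sampling rate (Hz); returns artifact-reduced trace."""
    clean = eeg.astype(float).copy()
    win = int(win_sec * fs)
    for start in range(0, len(eeg) - win + 1, win):
        seg = clean[start:start + win]
        t = np.arange(win) / fs
        # Design matrix: DC term plus sine/cosine pairs at cardiac harmonics.
        cols = [np.ones(win)]
        for k in range(1, n_harmonics + 1):
            cols.append(np.sin(2 * np.pi * k * heart_rate_hz * t))
            cols.append(np.cos(2 * np.pi * k * heart_rate_hz * t))
        X = np.column_stack(cols)
        beta, *_ = np.linalg.lstsq(X, seg, rcond=None)
        clean[start:start + win] = seg - X @ beta   # residual = BCG-reduced EEG
    return clean
```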
Kandala, Sridhar; Nolan, Dan; Laumann, Timothy O.; Power, Jonathan D.; Adeyemo, Babatunde; Harms, Michael P.; Petersen, Steven E.; Barch, Deanna M.
2016-01-01
Like all resting-state functional connectivity data, the data from the Human Connectome Project (HCP) are adversely affected by structured noise artifacts arising from head motion and physiological processes. Functional connectivity estimates (Pearson's correlation coefficients) were inflated for high-motion time points and for high-motion participants. This inflation occurred across the brain, suggesting the presence of globally distributed artifacts. The degree of inflation was further increased for connections between nearby regions compared with distant regions, suggesting the presence of distance-dependent spatially specific artifacts. We evaluated several denoising methods: censoring high-motion time points, motion regression, the FMRIB independent component analysis-based X-noiseifier (FIX), and mean grayordinate time series regression (MGTR; as a proxy for global signal regression). The results suggest that FIX denoising reduced both types of artifacts, but left substantial global artifacts behind. MGTR significantly reduced global artifacts, but left substantial spatially specific artifacts behind. Censoring high-motion time points resulted in a small reduction of distance-dependent and global artifacts, eliminating neither type. All denoising strategies left differences between high- and low-motion participants, but only MGTR substantially reduced those differences. Ultimately, functional connectivity estimates from HCP data showed spatially specific and globally distributed artifacts, and the most effective approach to address both types of motion-correlated artifacts was a combination of FIX and MGTR. PMID:27571276
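A minimal sketch of mean grayordinate time series regression (MGTR) as described above: the spatial mean time course is regressed out of every grayordinate's time series. The array layout is an illustrative assumption; FIX denoising and motion censoring are separate steps not shown here.

```python
import numpy as np

def mgtr(data):
    """data: (n_timepoints, n_grayordinates) BOLD matrix; returns MGTR residuals."""
    g = data.mean(axis=1, keepdims=True)      # mean grayordinate time series
    X = np.hstack([np.ones_like(g), g])       # intercept + global regressor
    beta, *_ = np.linalg.lstsq(X, data, rcond=None)
    return data - X @ beta                    # time series with the global fit removed
```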
NASA Astrophysics Data System (ADS)
Kuniyil Ajith Singh, Mithun; Jaeger, Michael; Frenz, Martin; Steenbergen, Wiendelt
2016-03-01
Reflection artifacts caused by acoustic inhomogeneities are a main challenge to deep-tissue photoacoustic imaging. Photoacoustic transients generated by the skin surface and superficial vasculature will propagate into the tissue and reflect back from echogenic structures to generate reflection artifacts. These artifacts can cause problems in image interpretation and limit imaging depth. In its basic version, PAFUSion mimics the inward travelling wave-field from blood vessel-like PA sources by applying focused ultrasound pulses, and thus provides a way to identify reflection artifacts. In this work, we demonstrate reflection artifact correction in addition to identification, towards obtaining an artifact-free photoacoustic image. In view of clinical applications, we implemented an improved version of PAFUSion in which photoacoustic data is backpropagated to imitate the inward travelling wave-field and thus the reflection artifacts of a more arbitrary distribution of PA sources that also includes the skin melanin layer. The backpropagation is performed in a synthetic way based on the pulse-echo acquisitions after transmission on each single element of the transducer array. We present a phantom experiment and initial in vivo measurements on human volunteers where we demonstrate significant reflection artifact reduction using our technique. The results provide a direct confirmation that reflection artifacts are prominent in clinical epi-photoacoustic imaging, and that PAFUSion can reduce these artifacts significantly to improve the deep-tissue photoacoustic imaging.
Johari, Masoumeh; Abdollahzadeh, Milad; Esmaeili, Farzad; Sakhamanesh, Vahideh
2018-01-01
Dental cone beam computed tomography (CBCT) images suffer from severe metal artifacts. These artifacts degrade the quality of acquired image and in some cases make it unsuitable to use. Streaking artifacts and cavities around teeth are the main reason of degradation. In this article, we have proposed a new artifact reduction algorithm which has three parallel components. The first component extracts teeth based on the modeling of image histogram with a Gaussian mixture model. Striking artifact reduction component reduces artifacts using converting image into the polar domain and applying morphological filtering. The third component fills cavities through a simple but effective morphological filtering operation. Finally, results of these three components are combined into a fusion step to create a visually good image which is more compatible to human visual system. Results show that the proposed algorithm reduces artifacts of dental CBCT images and produces clean images.
Lawhern, Vernon; Hairston, W David; McDowell, Kaleb; Westerfield, Marissa; Robbins, Kay
2012-07-15
We examine the problem of accurate detection and classification of artifacts in continuous EEG recordings. Manual identification of artifacts, by means of an expert or panel of experts, can be tedious, time-consuming and infeasible for large datasets. We use autoregressive (AR) models for feature extraction and characterization of EEG signals containing several kinds of subject-generated artifacts. AR model parameters are scale-invariant features that can be used to develop models of artifacts across a population. We use a support vector machine (SVM) classifier to discriminate among artifact conditions using the AR model parameters as features. Results indicate reliable classification among several different artifact conditions across subjects (approximately 94%). These results suggest that AR modeling can be a useful tool for discriminating among artifact signals both within and across individuals. Copyright © 2012 Elsevier B.V. All rights reserved.
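A minimal sketch of the AR-feature plus SVM pipeline described above, assuming Yule-Walker estimation of the AR coefficients (the paper does not tie itself to this estimator) and scikit-learn for classification. The model order, epoch layout, and labels are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_toeplitz
from sklearn.svm import SVC

def ar_coefficients(x, order=6):
    """Estimate AR coefficients of a 1-D signal via the Yule-Walker equations."""
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:] / len(x)  # autocorrelation
    return solve_toeplitz(r[:order], r[1:order + 1])           # AR parameters

def train_artifact_classifier(epochs, labels, order=6):
    """epochs: (n_epochs, n_samples) EEG segments; labels: artifact class per epoch."""
    feats = np.array([ar_coefficients(e, order) for e in epochs])
    return SVC(kernel="rbf", C=1.0).fit(feats, labels)
```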
Johari, Masoumeh; Abdollahzadeh, Milad; Esmaeili, Farzad; Sakhamanesh, Vahideh
2018-01-01
Background: Dental cone beam computed tomography (CBCT) images suffer from severe metal artifacts. These artifacts degrade the quality of acquired image and in some cases make it unsuitable to use. Streaking artifacts and cavities around teeth are the main reason of degradation. Methods: In this article, we have proposed a new artifact reduction algorithm which has three parallel components. The first component extracts teeth based on the modeling of image histogram with a Gaussian mixture model. Striking artifact reduction component reduces artifacts using converting image into the polar domain and applying morphological filtering. The third component fills cavities through a simple but effective morphological filtering operation. Results: Finally, results of these three components are combined into a fusion step to create a visually good image which is more compatible to human visual system. Conclusions: Results show that the proposed algorithm reduces artifacts of dental CBCT images and produces clean images. PMID:29535920
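A minimal sketch of the first component described above: modeling the CBCT intensity histogram with a Gaussian mixture and keeping the brightest class as a rough tooth mask. The number of mixture components and the 2-D slice input are illustrative assumptions, not the authors' exact settings; the polar-domain streak filtering and cavity filling are not shown.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def extract_teeth_mask(slice_2d, n_components=3):
    """slice_2d: 2-D CBCT slice; returns a boolean mask of the brightest class."""
    intensities = slice_2d.reshape(-1, 1).astype(float)
    gmm = GaussianMixture(n_components=n_components, random_state=0).fit(intensities)
    labels = gmm.predict(intensities).reshape(slice_2d.shape)
    brightest = int(np.argmax(gmm.means_.ravel()))  # highest-mean component = teeth/metal
    return labels == brightest
```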
Effect of pressure and padding on motion artifact of textile electrodes.
Cömert, Alper; Honkala, Markku; Hyttinen, Jari
2013-04-08
With the aging population and rising healthcare costs, wearable monitoring is gaining importance. The motion artifact affecting dry electrodes is one of the main challenges preventing the widespread use of wearable monitoring systems. In this paper we investigate the motion artifact and ways of making a textile electrode more resilient against it. Our aim is to study the effects of the pressure exerted onto the electrode, and the effects of inserting padding between the applied pressure and the electrode. Using a measurement setup designed to estimate the relation of the motion artifact to the signals, we measure real-time electrode-skin interface impedance, ECG from two channels, the motion-artifact-related surface potential, and the exerted pressure during controlled motion. We use different foam padding materials with various mechanical properties and apply electrode pressures between 5 and 25 mmHg to understand their effect. A QRS and noise detection algorithm based on a modified Pan-Tompkins QRS detection algorithm estimates the electrode behaviour with respect to the motion artifact from two channels: one dominated by the motion artifact and one containing both the motion artifact and the ECG. This procedure enables us to quantify a given setup's susceptibility to the motion artifact. Pressure is found to strongly affect signal quality, as is the use of padding. In general, the paddings reduce the motion artifact. However, the shape and frequency components of the motion artifact vary with the different paddings and their material and physical properties. Electrode impedance at 100 kHz correlates in some cases with the motion artifact, but it is not a good predictor of it. From the results of this study, guidelines for improving electrode design regarding padding and pressure can be formulated: paddings are a necessary part of the system for reducing the motion artifact, and their effect is maximized between 15 mmHg and 20 mmHg of exerted pressure. In addition, we present new methods for evaluating electrode sensitivity to motion, utilizing the detection of noise peaks that fall into the same frequency band as R-peaks.
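A minimal sketch of a Pan-Tompkins-style QRS detector of the kind the authors modified for noise detection: band-pass filtering, differentiation, squaring, moving-window integration, and thresholding. The cutoffs, window lengths, and the crude fixed threshold are illustrative assumptions; the study's modified algorithm and its noise-peak detection are not reproduced here.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_qrs(ecg, fs):
    """ecg: 1-D ECG trace; fs: sampling rate (Hz); returns approximate R-peak indices."""
    b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)                 # band-pass roughly 5-15 Hz
    squared = np.diff(filtered) ** 2               # differentiate and square
    win = int(0.15 * fs)                           # ~150 ms integration window
    integrated = np.convolve(squared, np.ones(win) / win, mode="same")
    threshold = 0.5 * integrated.max()             # crude fixed threshold
    peaks, _ = find_peaks(integrated, height=threshold, distance=int(0.25 * fs))
    return peaks
```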
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeong, K; Kuo, H; Ritter, J
Purpose: To evaluate the feasibility of using a metal artifact reduction technique to reduce metal artifacts and its application in improving dose calculation in external radiation therapy planning. Methods: A CIRS electron density phantom was scanned with and without steel drill bits placed in some plug holes. Metal artifact reduction software with the Metal Deletion Technique (MDT) was used to remove metal artifacts from the scanned image with metal. Hounsfield units of the electron density plugs from the artifact-free reference image and the MDT-processed images were compared. To test the dose calculation improvement with MDT-processed images, a clinically approved head and neck plan with manual dental artifact correction was tested. Patient images were exported and processed with MDT, and the plan was recalculated on the new MDT images without manual correction. Dose profiles near the metal artifacts were compared. Results: The MDT used in this study effectively reduced the metal artifact caused by beam hardening and scatter. The windmill artifact around the metal drill bits was greatly reduced, leaving a smooth, rounded appearance. Differences in the mean HU of each density plug between the reference and MDT images were less than 10 HU for most of the plugs. Dose differences between the original plan and the MDT-based plan were minimal. Conclusion: Most metal artifact reduction methods were developed for diagnostic purposes, so Hounsfield unit accuracy has not been rigorously tested before. In our test, MDT effectively eliminated metal artifacts with good HU reproducibility. However, it can introduce new mild artifacts, so MDT images should be checked against the original images.
An extension to artifact-free projection overlaps
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Jianyu, E-mail: jianyulin@hotmail.com
2015-05-15
Purpose: In multipinhole single photon emission computed tomography, the overlapping of projections has been used to increase sensitivity. Avoiding artifacts in the reconstructed image associated with projection overlaps (multiplexing) is a critical issue. In our previous report, two types of artifact-free projection overlaps, i.e., projection overlaps that do not lead to artifacts in the reconstructed image, were formally defined and proved, and were validated via simulations. In this work, a new proposition is introduced to extend the previously defined type-II artifact-free projection overlaps so that a broader range of artifact-free overlaps is accommodated. One practical purpose of the new extension is to design a baffle window multipinhole system with artifact-free projection overlaps. Methods: First, the extended type-II artifact-free overlap was theoretically defined and proved. The new proposition accommodates the situation where the extended type-II artifact-free projection overlaps can be produced with incorrectly reconstructed portions in the reconstructed image. Next, to validate the theory, the extended-type-II artifact-free overlaps were employed in designing the multiplexing multipinhole spiral orbit imaging systems with a baffle window. Numerical validations were performed via simulations, where the corresponding 1-pinhole nonmultiplexing reconstruction results were used as the benchmark for artifact-free reconstructions. The mean square error (MSE) was the metric used for comparisons of noise-free reconstructed images. Noisy reconstructions were also performed as part of the validations. Results: Simulation results show that for noise-free reconstructions, the MSEs of the reconstructed images of the artifact-free multiplexing systems are very similar to those of the corresponding 1-pinhole systems. No artifacts were observed in the reconstructed images. Therefore, the testing results for artifact-free multiplexing systems designed using the extended type-II artifact-free overlaps numerically validated the developed theory. Conclusions: First, the extension itself is of theoretical importance because it broadens the selection range for optimizing multiplexing multipinhole designs. Second, the extension has an immediate application: using a baffle window to design a special spiral orbit multipinhole imaging system with projection overlaps in the orbit axial direction. Such an artifact-free baffle window design makes it possible for us to image any axial portion of interest of a long object with projection overlaps to increase sensitivity.
Noury, Nima; Hipp, Joerg F; Siegel, Markus
2016-10-15
Transcranial electric stimulation (tES) is a promising tool to non-invasively manipulate neuronal activity in the human brain. Several studies have shown behavioral effects of tES, but stimulation artifacts complicate the simultaneous investigation of neural activity with EEG or MEG. Here, we first show for EEG and MEG, that contrary to previous assumptions, artifacts do not simply reflect stimulation currents, but that heartbeat and respiration non-linearly modulate stimulation artifacts. These modulations occur irrespective of the stimulation frequency, i.e. during both transcranial alternating and direct current stimulations (tACS and tDCS). Second, we show that, although at first sight previously employed artifact rejection methods may seem to remove artifacts, data are still contaminated by non-linear stimulation artifacts. Because of their complex nature and dependence on the subjects' physiological state, these artifacts are prone to be mistaken as neural entrainment. In sum, our results uncover non-linear tES artifacts, show that current techniques fail to fully remove them, and pave the way for new artifact rejection methods. Copyright © 2016 Elsevier Inc. All rights reserved.
WE-G-209-00: Identifying Image Artifacts, Their Causes, and How to Fix Them
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Digital radiography, CT, PET, and MR are complicated imaging modalities which are composed of many hardware and software components. These components work together in a highly coordinated chain of events with the intent to produce high quality images. Acquisition, processing and reconstruction of data must occur in a precise way for optimum image quality to be achieved. Any error or unexpected event in the entire process can produce unwanted pixel intensities in the final images which may contribute to visible image artifacts. The diagnostic imaging physicist is uniquely qualified to investigate and contribute to resolution of image artifacts. This course will teach the participant to identify common artifacts found clinically in digital radiography, CT, PET, and MR, to determine the causes of artifacts, and to make recommendations for how to resolve artifacts. Learning Objectives: Identify common artifacts found clinically in digital radiography, CT, PET and MR. Determine causes of various clinical artifacts from digital radiography, CT, PET and MR. Describe how to resolve various clinical artifacts from digital radiography, CT, PET and MR.
Pictorial Review of Digital Radiography Artifacts.
Walz-Flannigan, Alisa I; Brossoit, Kimberly J; Magnuson, Dayne J; Schueler, Beth A
2018-01-01
Visual familiarity with the variety of digital radiographic artifacts is needed to identify, resolve, or prevent image artifacts from creating issues with patient imaging. Because the mechanism for image creation is different between flat-panel detectors and computed radiography, the causes and appearances of some artifacts can be unique to these different modalities. Examples are provided of artifacts that were found on clinical images or during quality control testing with flat-panel detectors. The examples are meant to serve as learning tools for future identification and troubleshooting of artifacts and as a reminder for steps that can be taken for prevention. The examples of artifacts provided are classified according to their causal connection in the imaging chain, including an equipment defect as a result of an accident or mishandling, debris or gain calibration flaws, a problematic acquisition technique, signal transmission failures, and image processing issues. Specific artifacts include those that are due to flat-panel detector drops, backscatter, debris in the x-ray field during calibration, detector saturation or underexposure, or collimation detection errors, as well as a variety of artifacts that are processing induced. © RSNA, 2018.
Reduction of metal artifacts from alloy hip prostheses in computer tomography.
Wang, Fengdan; Xue, Huadan; Yang, Xianda; Han, Wei; Qi, Bing; Fan, Yu; Qian, Wenwei; Wu, Zhihong; Zhang, Yan; Jin, Zhengyu
2014-01-01
The objective of this study was to evaluate the feasibility of reducing artifacts from large metal implants with gemstone spectral imaging (GSI) and metal artifact reduction software (MARS). Twenty-three in-vivo cobalt-chromium-molybdenum alloy total hip prostheses were prospectively scanned by fast kV-switching GSI between 80 and 140 kVp. The computed tomography images were reconstructed with monochromatic energy and with/without MARS. Both subjective and objective measurements were performed to assess the severity of metal artifacts. Increasing photon energy was associated with reduced metal artifacts in GSI images (P < 0.001). Combination of GSI with MARS further diminished the metal artifacts (P < 0.001). Artifact reduction at 3 anatomical levels (femoral head, neck, and shaft) were evaluated, with data showing that GSI and MARS could reduce metal artifacts at all 3 levels (P = 0.011, P < 0.001, and P = 0.003, respectively). Nevertheless, in certain cases, GSI without MARS produced more realistic images for the clinical situation. Proper usage of GSI with/without MARS could reduce the computed tomography artifacts of large metal parts and improve the radiological evaluation of postarthroplasty patients.
Pulsar-aided SETI experimental observations
NASA Technical Reports Server (NTRS)
Heidmann, J.; Biraud, F.; Tarter, J.
1989-01-01
The rotational frequencies of pulsars are used to select preferred radio frequencies for SETI. Pulsar rotational frequencies are converted into SETI frequencies in the 1-10 GHz Galactic radio window. Experimental observations using the frequencies are conducted for target stars closer than 25 parsecs, unknown targets in a globular cluster, and unknown targets in the Galaxy closer than 2.5 kpc. The status of these observations is discussed.
Ketelsen, D; Werner, M K; Thomas, C; Tsiflikas, I; Koitschev, A; Reimann, A; Claussen, C D; Heuschmid, M
2009-01-01
Important oropharyngeal structures can be obscured by metallic artifacts caused by dental implants. The aim of this study was to compare the image quality of multiplanar reconstructions and an angulated spiral in dual-source computed tomography (DSCT) of the neck. Sixty-two patients were included for neck imaging with DSCT. MPRs from an axial dataset and an additional short spiral parallel to the mouth floor were acquired. Leading anatomical structures were then evaluated with respect to the extent to which they were affected by dental artifacts using a visual scale, ranging from 1 (least artifacts) to 4 (most artifacts). In MPR, 87.1% of anatomical structures had significant artifacts (3.12 +/- 0.86), while in angulated slices leading anatomical structures of the oropharynx showed negligible artifacts (1.28 +/- 0.46). The diagnostic gain of the primarily angulated slices with respect to artifact severity was significant (p < 0.01). MPRs are not capable of reducing dental artifacts sufficiently. In patients with dental artifacts overlying the anatomical structures of the oropharynx, an additional short angulated spiral parallel to the floor of the mouth is recommended and should be applied in daily routine. As a result of the static gantry design of DSCT, the use of a flexible head holder is essential.
Discriminative Ocular Artifact Correction for Feature Learning in EEG Analysis.
Xinyang Li; Cuntai Guan; Haihong Zhang; Kai Keng Ang
2017-08-01
Electrooculogram (EOG) artifact contamination is a common critical issue in general electroencephalogram (EEG) studies as well as in brain-computer interface (BCI) research. It is especially challenging when dedicated EOG channels are unavailable or when there are very few EEG channels available for independent component analysis based ocular artifact removal. It is even more challenging to avoid loss of the signal of interest during the artifact correction process, where the signal of interest can be multiple magnitudes weaker than the artifact. To address these issues, we propose a novel discriminative ocular artifact correction approach for feature learning in EEG analysis. Without extra ocular movement measurements, the artifact is extracted from raw EEG data, which is totally automatic and requires no visual inspection of artifacts. Then, artifact correction is optimized jointly with feature extraction by maximizing oscillatory correlations between trials from the same class and minimizing them between trials from different classes. We evaluate this approach on a real-world EEG dataset comprising 68 subjects performing cognitive tasks. The results showed that the approach is capable of not only suppressing the artifact components but also improving the discriminative power of a classifier with statistical significance. We also demonstrate that the proposed method addresses the confounding issues induced by ocular movements in cognitive EEG study.
Metallic artifact in MRI after removal of orthopedic implants.
Bagheri, Mohammad Hadi; Hosseini, Mehrdad Mohammad; Emami, Mohammad Jafar; Foroughi, Amin Aiboulhassani
2012-03-01
The aim of the present study was to evaluate metallic artifacts in MRI of orthopedic patients after removal of metallic implants. From March to August 2009, 40 orthopedic patients who underwent removal of metallic orthopedic implants were studied by post-operative MRI of the implant removal site. A grading scale of 0-3 was assigned for artifacts in the MR images, whereby 0 indicated no artifact and grades 1-3 indicated mild, moderate, and severe metallic artifacts, respectively. These grading records were correlated with other variables, including the type, size, number, and composition of the metallic devices, and the site and duration of the devices' stay in the body. Metallic susceptibility artifacts were detected in the MRI of 18 of 40 cases (45%). Screws and pins in the removed hardware were the most important factors causing artifacts in MRI. The artifacts were found more frequently in patients who had more screws and pins in the removed implants. Gender, age, site of implantation, length of the hardware, composition of the metallic implants (stainless steel versus titanium), and duration of implantation had no effect on the production of metallic artifacts after implant removal. Short-TE MRI sequences (such as T1-weighted) showed fewer artifacts. Metallic susceptibility artifacts are a frequent phenomenon in MRI of patients after removal of metallic orthopedic implants. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Wu, Wenchuan; Fang, Sheng; Guo, Hua
2014-06-01
Aiming at motion artifacts and off-resonance artifacts in multi-shot diffusion magnetic resonance imaging (MRI), we proposed a joint correction method in this paper to correct the two kinds of artifacts simultaneously without additional acquisition of navigation data and field map. We utilized the proposed method using multi-shot variable density spiral sequence to acquire MRI data and used auto-focusing technique for image deblurring. We also used direct method or iterative method to correct motion induced phase errors in the process of deblurring. In vivo MRI experiments demonstrated that the proposed method could effectively suppress motion artifacts and off-resonance artifacts and achieve images with fine structures. In addition, the scan time was not increased in applying the proposed method.
Landsat TM memory effect characterization and correction
Helder, D.; Boncyk, W.; Morfitt, R.
1997-01-01
Before radiometric calibration of Landsat Thematic Mapper (TM) data can be done accurately, it is necessary to minimize the effects of artifacts present in the data that originate in the instrument's signal processing path. These artifacts have been observed in downlinked image data since shortly after launch of Landsat 4 and 5. However, no comprehensive work has been done to characterize all the artifacts and develop methods for their correction. In this paper, the most problematic artifact is discussed: memory effect (ME). Characterization of this artifact is presented, including the parameters necessary for its correction. In addition, a correction algorithm is described that removes the artifact from TM imagery. It will be shown that this artifact causes significant radiometry errors, but the effect can be removed in a straightforward manner.
A dual-view digital tomosynthesis imaging technique for improved chest imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhong, Yuncheng; Lai, Chao-Jen; Wang, Tianpeng
Purpose: Digital tomosynthesis (DTS) has been shown to be useful for reducing the overlapping of abnormalities with anatomical structures at various depth levels along the posterior–anterior (PA) direction in chest radiography. However, DTS provides crude three-dimensional (3D) images that have poor resolution in the lateral view and can only be displayed with reasonable quality in the PA view. Furthermore, the spillover of high-contrast objects from off-fulcrum planes generates artifacts that may impede the diagnostic use of the DTS images. In this paper, the authors describe and demonstrate the use of a dual-view DTS technique to improve the accuracy of the reconstructed volume image data for more accurate rendition of the anatomy and slice images with improved resolution and reduced artifacts, thus allowing the 3D image data to be viewed in views other than the PA view. Methods: With the dual-view DTS technique, limited angle scans are performed and projection images are acquired in two orthogonal views: PA and lateral. The dual-view projection data are used together to reconstruct 3D images using the maximum likelihood expectation maximization iterative algorithm. In this study, projection images were simulated or experimentally acquired over 360° using the scanning geometry for cone beam computed tomography (CBCT). While all projections were used to reconstruct CBCT images, selected projections were extracted and used to reconstruct single- and dual-view DTS images for comparison with the CBCT images. For realistic demonstration and comparison, a digital chest phantom derived from clinical CT images was used for the simulation study. An anthropomorphic chest phantom was imaged for the experimental study. The resultant dual-view DTS images were visually compared with the single-view DTS images and CBCT images for the presence of image artifacts and accuracy of CT numbers and anatomy and quantitatively compared with root-mean-square-deviation (RMSD) values computed using the digital chest phantom or the CBCT images as the reference in the simulation and experimental study, respectively. High-contrast wires with vertical, oblique, and horizontal orientations in a PA view plane were also imaged to investigate the spatial resolutions and how the wire signals spread in the PA view and lateral view slice images. Results: Both the digital phantom images (simulated) and the anthropomorphic phantom images (experimentally generated) demonstrated that the dual-view DTS technique resulted in improved spatial resolution in the depth (PA) direction, more accurate representation of the anatomy, and significantly reduced artifacts. The RMSD values corroborate well with visual observations, with substantially lower RMSD values measured for the dual-view DTS images as compared to those measured for the single-view DTS images. The imaging experiment with the high-contrast wires shows that while the vertical and oblique wires could be resolved in the lateral view in both single- and dual-view DTS images, the horizontal wire could only be resolved in the dual-view DTS images. This indicates that with single-view DTS, the wire signals spread liberally to off-fulcrum planes and generated wire shadow there.
Conclusions: The authors have demonstrated both visually and quantitatively that the dual-view DTS technique can be used to achieve more accurate rendition of the anatomy and to obtain slice images with improved resolution and reduced artifacts as compared to the single-view DTS technique, thus allowing the 3D image data to be viewed in views other than the PA view. These advantages could make the dual-view DTS technique useful in situations where better separation of the objects-of-interest from the off-fulcrum structures or more accurate 3D rendition of the anatomy are required while a regular CT examination is undesirable due to radiation dose considerations.
NASA Astrophysics Data System (ADS)
Deprez, Hanne; Gransier, Robin; Hofmann, Michael; van Wieringen, Astrid; Wouters, Jan; Moonen, Marc
2018-02-01
Objective. Electrically evoked auditory steady-state responses (EASSRs) are potentially useful for objective cochlear implant (CI) fitting and follow-up of the auditory maturation in infants and children with a CI. EASSRs are recorded in the electro-encephalogram (EEG) in response to electrical stimulation with continuous pulse trains, and are distorted by significant CI artifacts related to this electrical stimulation. The aim of this study is to evaluate a CI artifacts attenuation method based on independent component analysis (ICA) for three EASSR datasets. Approach. ICA has often been used to remove CI artifacts from the EEG to record transient auditory responses, such as cortical evoked auditory potentials. Independent components (ICs) corresponding to CI artifacts are then often manually identified. In this study, an ICA based CI artifacts attenuation method was developed and evaluated for EASSR measurements with varying CI artifacts and EASSR characteristics. Artifactual ICs were automatically identified based on their spectrum. Main results. For 40 Hz amplitude modulation (AM) stimulation at comfort level, in high SNR recordings, ICA succeeded in removing CI artifacts from all recording channels, without distorting the EASSR. For lower SNR recordings, with 40 Hz AM stimulation at lower levels, or 90 Hz AM stimulation, ICA either distorted the EASSR or could not remove all CI artifacts in most subjects, except for two of the seven subjects tested with low level 40 Hz AM stimulation. Noise levels were reduced after ICA was applied, and up to 29 ICs were rejected, suggesting poor ICA separation quality. Significance. We hypothesize that ICA is capable of separating CI artifacts and EASSR in case the contralateral hemisphere is EASSR dominated. For small EASSRs or large CI artifact amplitudes, ICA separation quality is insufficient to ensure complete CI artifacts attenuation without EASSR distortion.
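A minimal sketch of the ICA-based attenuation idea described above: decompose the multichannel EEG, flag components whose spectrum is dominated by energy near the stimulation-related frequency, zero them, and reconstruct. The relative-power rejection rule and the bandwidth are illustrative assumptions, not the authors' automatic criterion.

```python
import numpy as np
from sklearn.decomposition import FastICA

def attenuate_ci_artifacts(eeg, fs, artifact_freq, rel_power_thresh=0.5):
    """eeg: (n_samples, n_channels); artifact_freq: stimulation-related frequency (Hz)."""
    ica = FastICA(n_components=eeg.shape[1], random_state=0)
    sources = ica.fit_transform(eeg)                     # (n_samples, n_components)

    freqs = np.fft.rfftfreq(sources.shape[0], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(sources, axis=0)) ** 2
    band = np.abs(freqs - artifact_freq) < 1.0           # +/- 1 Hz around the artifact frequency

    # Keep components whose in-band power fraction stays below the threshold.
    keep = psd[band].sum(axis=0) / psd.sum(axis=0) < rel_power_thresh
    sources_clean = sources * keep                       # zero out artifactual components

    return sources_clean @ ica.mixing_.T + ica.mean_     # back-project to channel space
```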
Reconstruction artifacts in VRX CT scanner images
NASA Astrophysics Data System (ADS)
Rendon, David A.; DiBianca, Frank A.; Keyes, Gary S.
2008-03-01
Variable Resolution X-ray (VRX) CT scanners allow imaging of different sized anatomy at the same level of detail using the same device. This is achieved by tilting the x-ray detectors so that the projected size of the detecting elements is varied to produce reconstructions of smaller fields of view with higher spatial resolution. As with regular CT scanners, the images obtained with VRX scanners are affected by different kinds of artifacts of various origins. This work studies some of these artifacts and the impact that the VRX effect has on them. For this, computational models of single-arm single-slice VRX scanners are used to produce images with artifacts commonly found in routine use. These images and artifacts are produced using our VRX CT scanner simulator, which allows us to isolate the system parameters that have a greater effect on the artifacts. A study of the behavior of the artifacts at varying VRX opening angles is presented for scanners implemented using two different detectors. The results show that, although varying the VRX angle will have an effect on the severity of each of the artifacts studied, for some of these artifacts the effect of other factors (such as the distribution of the detector cells and the position of the phantom in the reconstruction grid) is overwhelmingly more significant. This is shown to be the case for streak artifacts produced by thin metallic objects. For some artifacts related to beam hardening, their severity was found to decrease along with the VRX angle. These observations allow us to infer that in regular use the effect of the VRX angle artifacts similar to the ones studied here will not be noticeable as it will be overshadowed by parameters that cannot be easily controlled outside of a computational model.
Barker, Jeffrey W.; Rosso, Andrea L.; Sparto, Patrick J.; Huppert, Theodore J.
2016-01-01
Functional near-infrared spectroscopy (fNIRS) is a relatively low-cost, portable, noninvasive neuroimaging technique for measuring task-evoked hemodynamic changes in the brain. Because fNIRS can be applied to a wide range of populations, such as children or infants, and under a variety of study conditions, including those involving physical movement, gait, or balance, fNIRS data are often confounded by motion artifacts. Furthermore, the high sampling rate of fNIRS leads to high temporal autocorrelation due to systemic physiology. These two factors can reduce the sensitivity and specificity of detecting hemodynamic changes. In a previous work, we showed that these factors could be mitigated by autoregressive-based prewhitening followed by the application of an iterative reweighted least squares algorithm offline. This current work extends these same ideas to real-time analysis of brain signals by modifying the linear Kalman filter, resulting in an algorithm for online estimation that is robust to systemic physiology and motion artifacts. We evaluated the performance of the proposed method via simulations of evoked hemodynamics that were added to experimental resting-state data, which provided realistic fNIRS noise. Last, we applied the method post hoc to data from a standing balance task. Overall, the new method showed good agreement with the analogous offline algorithm, in which both methods outperformed ordinary least squares methods. PMID:27226974
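A minimal sketch of online coefficient estimation with a linear Kalman filter, in the spirit of the real-time method described above: the GLM weights are treated as a random-walk state and updated sample by sample. The noise variances and design-matrix layout are illustrative assumptions; the robust reweighting and AR prewhitening of the actual method are omitted.

```python
import numpy as np

def kalman_glm(y, X, q=1e-5, r=1.0):
    """y: (n_samples,) fNIRS channel; X: (n_samples, n_regressors) design matrix.
    Returns the running estimate of the regression weights at every sample."""
    n, p = X.shape
    beta = np.zeros(p)          # state estimate (regression weights)
    P = np.eye(p)               # state covariance
    Q = q * np.eye(p)           # random-walk process noise
    history = np.zeros((n, p))
    for t in range(n):
        P = P + Q                               # predict (random-walk state model)
        x = X[t]
        S = x @ P @ x + r                       # innovation variance
        K = P @ x / S                           # Kalman gain
        beta = beta + K * (y[t] - x @ beta)     # update with the prediction residual
        P = P - np.outer(K, x) @ P
        history[t] = beta
    return history
```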
Widjaja, Effendi; Tan, Boon Hong; Garland, Marc
2006-03-01
Two-dimensional (2D) correlation spectroscopy has been extensively applied to analyze various vibrational spectroscopic data, especially infrared and Raman. However, when it is applied to real-world experimental data, which often contains various imperfections (such as noise interference, baseline fluctuations, and band-shifting) and highly overlapping bands, many artifacts and misleading features in synchronous and asynchronous maps will emerge, and this will lead to difficulties with interpretation. Therefore, an approach that counters many artifacts and therefore leads to simplified interpretation of 2D correlation analysis is certainly useful. In the present contribution, band-target entropy minimization (BTEM) is employed as a spectral pretreatment to handle many of the artifact problems before the application of 2D correlation analysis. BTEM is employed to elucidate the pure component spectra of mixtures and their corresponding concentration profiles. Two alternate forms of analysis result. In the first, the normally vxv problem is converted to an equivalent nvxnv problem, where n represents the number of species present. In the second, the pure component spectra are transformed into simple distributions, and an equivalent and less computationally intensive nv'xnv' problem results (v'
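A minimal sketch of generalized 2D correlation analysis (synchronous and asynchronous maps) of the kind applied after a pretreatment such as BTEM. The input layout (perturbation points by wavenumbers) is an assumption, and the BTEM decomposition itself is not reproduced here.

```python
import numpy as np

def two_d_correlation(spectra):
    """spectra: (m_perturbations, n_wavenumbers) spectral series.
    Returns the synchronous and asynchronous 2D correlation maps."""
    m = spectra.shape[0]
    dyn = spectra - spectra.mean(axis=0)             # dynamic (mean-centered) spectra

    sync = dyn.T @ dyn / (m - 1)                     # synchronous map

    # Hilbert-Noda transformation matrix for the asynchronous map.
    j, k = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
    with np.errstate(divide="ignore"):
        noda = np.where(j == k, 0.0, 1.0 / (np.pi * (k - j)))

    asyn = dyn.T @ noda @ dyn / (m - 1)              # asynchronous map
    return sync, asyn
```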
Region-based multifocus image fusion for the precise acquisition of Pap smear images.
Tello-Mijares, Santiago; Bescós, Jesús
2018-05-01
A multifocus image fusion method to obtain a single focused image from a sequence of microscopic high-magnification Papanicolau source (Pap smear) images is presented. These images, captured each in a different position of the microscope lens, frequently show partially focused cells or parts of cells, which makes them unpractical for the direct application of image analysis techniques. The proposed method obtains a focused image with a high preservation of original pixels information while achieving a negligible visibility of the fusion artifacts. The method starts by identifying the best-focused image of the sequence; then, it performs a mean-shift segmentation over this image; the focus level of the segmented regions is evaluated in all the images of the sequence, and best-focused regions are merged in a single combined image; finally, this image is processed with an adaptive artifact removal process. The combination of a region-oriented approach, instead of block-based approaches, and a minimum modification of the value of focused pixels in the original images achieve a highly contrasted image with no visible artifacts, which makes this method especially convenient for the medical imaging domain. The proposed method is compared with several state-of-the-art alternatives over a representative dataset. The experimental results show that our proposal obtains the best and more stable quality indicators. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
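A minimal sketch of region-wise focus selection in the spirit of the method above: for each labeled region, the frame with the highest variance-of-Laplacian focus measure contributes its pixels. The segmentation is assumed to be given (the paper uses mean-shift), the focus measure is a stand-in, and the adaptive artifact-removal step is omitted.

```python
import numpy as np
from scipy.ndimage import laplace

def fuse_by_region(stack, labels):
    """stack: (n_frames, H, W) grayscale focus stack; labels: (H, W) region map."""
    fused = np.zeros(stack.shape[1:], dtype=stack.dtype)
    for region in np.unique(labels):
        mask = labels == region
        # Focus measure per frame: variance of the Laplacian inside the region.
        scores = [laplace(frame.astype(float))[mask].var() for frame in stack]
        fused[mask] = stack[int(np.argmax(scores))][mask]  # copy best-focused pixels
    return fused
```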
Signal processing in urodynamics: towards high definition urethral pressure profilometry.
Klünder, Mario; Sawodny, Oliver; Amend, Bastian; Ederer, Michael; Kelp, Alexandra; Sievert, Karl-Dietrich; Stenzl, Arnulf; Feuer, Ronny
2016-03-22
Urethral pressure profilometry (UPP) is used in the diagnosis of stress urinary incontinence (SUI) which is a significant medical, social, and economic problem. Low spatial pressure resolution, common occurrence of artifacts, and uncertainties in data location limit the diagnostic value of UPP. To overcome these limitations, high definition urethral pressure profilometry (HD-UPP) combining enhanced UPP hardware and signal processing algorithms has been developed. In this work, we present the different signal processing steps in HD-UPP and show experimental results from female minipigs. We use a special microtip catheter with high angular pressure resolution and an integrated inclination sensor. Signals from the catheter are filtered and time-correlated artifacts removed. A signal reconstruction algorithm processes pressure data into a detailed pressure image on the urethra's inside. Finally, the pressure distribution on the urethra's outside is calculated through deconvolution. A mathematical model of the urethra is contained in a point-spread-function (PSF) which is identified depending on geometric and material properties of the urethra. We additionally investigate the PSF's frequency response to determine the relevant frequency band for pressure information on the urinary sphincter. Experimental pressure data are spatially located and processed into high resolution pressure images. Artifacts are successfully removed from data without blurring other details. The pressure distribution on the urethra's outside is reconstructed and compared to the one on the inside. Finally, the pressure images are mapped onto the urethral geometry calculated from inclination and position data to provide an integrated image of pressure distribution, anatomical shape, and location. With its advanced sensing capabilities, the novel microtip catheter collects an unprecedented amount of urethral pressure data. Through sequential signal processing steps, physicians are provided with detailed information on the pressure distribution in and around the urethra. Therefore, HD-UPP overcomes many current limitations of conventional UPP and offers the opportunity to evaluate urethral structures, especially the sphincter, in context of the correct anatomical location. This could enable the development of focal therapy approaches in the treatment of SUI.
SU-F-I-41: Calibration-Free Material Decomposition for Dual-Energy CT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, W; Xing, L; Zhang, Q
2016-06-15
Purpose: To eliminate tedious phantom calibration or manual region of interest (ROI) selection as required in dual-energy CT material decomposition, we establish a new projection-domain material decomposition framework with incorporation of the energy spectrum. Methods: Similar to the case of dual-energy CT, the integral of the basis material image in our model is expressed as a linear combination of basis functions, which are the polynomials of high- and low-energy raw projection data. To yield the unknown coefficients of the linear combination, the proposed algorithm minimizes the quadratic error between the high- and low-energy raw projection data and the projection calculated using material images. We evaluate the algorithm with an iodine concentration numerical phantom at different dose and iodine concentration levels. The x-ray energy spectra of the high and low energy are estimated using an indirect transmission method. The derived monochromatic images are compared with the high- and low-energy CT images to demonstrate beam hardening artifact reduction. Quantitative results were measured and compared to the true values. Results: The differences between the true density values used for simulation and those obtained from the monochromatic images are 1.8%, 1.3%, 2.3%, and 2.9% for the dose levels from standard dose to 1/8 dose, and 0.4%, 0.7%, 1.5%, and 1.8% for the four iodine concentration levels from 6 mg/mL to 24 mg/mL. For all of the cases, beam hardening artifacts, especially streaks shown between dense inserts, are almost completely removed in the monochromatic images. Conclusion: The proposed algorithm provides an effective way to yield material images and artifact-free monochromatic images at different dose levels without the need for phantom calibration or ROI selection. Furthermore, the approach also yields accurate results when the concentration of the iodine insert is very low, suggesting the algorithm is robust with respect to the low-contrast scenario.
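A minimal sketch of the projection-domain parameterization described above: each basis-material line integral is written as a linear combination of polynomial terms of the high- and low-energy projections, and the coefficients are found by least squares. Here the target values are taken from known training rays (for example spectrum-simulated ones), which stands in for the paper's spectrum-driven objective; the polynomial order is an illustrative assumption.

```python
import numpy as np

def poly_terms(p_high, p_low, order=3):
    """Stack polynomial terms pH^i * pL^j with i + j <= order, column-wise."""
    cols = [np.ones_like(p_high)]
    for d in range(1, order + 1):
        for i in range(d + 1):
            cols.append(p_high ** i * p_low ** (d - i))
    return np.column_stack(cols)

def fit_decomposition(p_high, p_low, basis_integral, order=3):
    """Fit coefficients mapping (pH, pL) to one basis material's line integral."""
    X = poly_terms(p_high, p_low, order)
    coeffs, *_ = np.linalg.lstsq(X, basis_integral, rcond=None)
    return coeffs

def apply_decomposition(p_high, p_low, coeffs, order=3):
    """Evaluate the fitted mapping on new projection data."""
    return poly_terms(p_high, p_low, order) @ coeffs
```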
A Novel Method for Characterizing Beam Hardening Artifacts in Cone-beam Computed Tomographic Images.
Fox, Aaron; Basrani, Bettina; Kishen, Anil; Lam, Ernest W N
2018-05-01
The beam hardening (BH) artifact produced by root filling materials in cone-beam computed tomographic (CBCT) images is influenced by their radiologic K absorption edge values. The purpose of this study was to describe a novel technique to characterize BH artifacts in CBCT images produced by 3 root canal filling materials and to evaluate the effects of a zirconium (Zr)-based root filling material with a lower K edge (17.99 keV) on the production of BH artifacts. The palatal root canals of 3 phantom model teeth were prepared and root filled with gutta-percha (GP), a Zr root filling material, and calcium hydroxide paste. Each phantom tooth was individually imaged using the CS 9000 CBCT unit (Carestream, Atlanta, GA). The "light" and "dark" components of the BH artifacts were quantified separately using ImageJ software (National Institutes of Health, Bethesda, MD) in 3 regions of the root. Mixed-design analysis of variance was used to evaluate differences in the artifact area for the light and dark elements of the BH artifacts. A statistically significant difference in the area of the dark portion of the BH artifact was found between all fill materials and in all regions of the phantom tooth root (P < .05). GP generated a significantly greater dark but not light artifact area compared with Zr (P < .05). Moreover, statistically significant differences between the areas of both the light and dark artifacts were observed within all regions of the tooth root, with the greatest artifact being generated in the coronal third of the root (P < .001). Root canal filling materials with lower K edge material properties reduce BH artifacts along the entire length of the root canal and reduce the contribution of the dark artifact. Copyright © 2018 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
The Effects of Filter Cutoff Frequency on Musculoskeletal Simulations of High-Impact Movements.
Tomescu, Sebastian; Bakker, Ryan; Beach, Tyson A C; Chandrashekar, Naveen
2018-02-12
Estimation of muscle forces through musculoskeletal simulation is important in understanding human movement and injury. Unmatched filter frequencies used to low-pass filter marker and force platform data can create artifacts during inverse dynamics analysis, but their effects on muscle force calculations are unknown. The objective of this study was to determine the effects of filter cutoff frequency on simulation parameters and magnitudes of lower extremity muscle and resultant joint contact forces during a high-impact maneuver. Eight participants performed a single leg jump-landing. Kinematics were captured with a 3D motion capture system and ground reaction forces were recorded with a force platform. The marker and force platform data were filtered using two matched filter frequencies (10-10Hz, 15-15Hz) and two unmatched frequencies (10-50Hz, 15-50Hz). Musculoskeletal simulations using Computed Muscle Control were performed in OpenSim. The results revealed significantly higher peak quadriceps (13%), hamstrings (48%), and gastrocnemius forces (69%) in the unmatched (10-50Hz, 15-50Hz) conditions than in the matched (10-10Hz, 15-15Hz) conditions (p<0.05). Resultant joint contact forces and reserve (non-physiologic) moments were similarly larger in the unmatched filter categories (p<0.05). This study demonstrated that artifacts created from filtering with unmatched filter cutoffs result in altered muscle forces and dynamics which are not physiologic.
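A minimal sketch of matched low-pass filtering of marker and force-platform data before inverse dynamics, illustrating the matched 10-10 Hz condition discussed above. The filter order, the zero-lag implementation, and the sampling rates in the usage comment are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass(data, cutoff_hz, fs_hz, order=4):
    """Zero-lag Butterworth low-pass filter applied along the first (time) axis."""
    b, a = butter(order, cutoff_hz / (fs_hz / 2), btype="low")
    return filtfilt(b, a, data, axis=0)

# Matched condition: the same cutoff for kinematics and ground reaction forces, e.g.
# markers_filtered = lowpass(markers, 10.0, fs_hz=200.0)
# grf_filtered     = lowpass(grf,     10.0, fs_hz=1000.0)
```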
Li, Xiaoqian; Yow, W Quin
2018-09-01
Prior work has shown that young children trust single accurate and inaccurate individuals to a similar extent in their endorsement of novel information. However, it remains unknown to what extent children trust a credible or noncredible individual when given information that is pitted against their own beliefs. The current study examined whether children, when given unexpected testimony that contradicted their initial beliefs but was not completely unbelievable, would selectively revise their beliefs depending on the informant's past history of accuracy. The participants (3- and 4-year-olds; N = 100) were familiarized with an informant who labeled a series of common objects either accurately or inaccurately. Following that, all children saw a picture of an ambiguous hybrid artifact that consisted of features of two typical common artifacts and were asked to identify the hybrid object with their own label. Subsequently, children watched the previously accurate or inaccurate informant give the same hybrid object a different but plausible label. Children expressed a greater tendency to override their initial judgments and endorse the unexpected testimony from a previously accurate informant than from someone who had consistently made naming errors. The findings provide novel understandings of the circumstances under which 3- and 4-year-old preschoolers may or may not rely on the informant's prior reliability in their selective learning. Copyright © 2018 Elsevier Inc. All rights reserved.
Nash, David J; Coulson, Sheila; Staurset, Sigrid; Ullyott, J Stewart; Babutsi, Mosarwa; Hopkinson, Laurence; Smith, Martin P
2013-04-01
Lithic artifacts from the African Middle Stone Age (MSA) offer an avenue to explore a range of human behaviors, including mobility, raw material acquisition, trade and exchange. However, to date, in southern Africa it has not been possible to provenance the locations from which commonly used stone materials were acquired prior to transport to archaeological sites. Here we present results of the first investigation to geochemically fingerprint silcrete, a material widely used for tool manufacture across the subcontinent. The study focuses on the provenancing of silcrete artifacts from the MSA of White Paintings Shelter (WPS), Tsodilo Hills, in the Kalahari Desert of northwest Botswana. Our results suggest that: (i) despite having access to local quartz and quartzite at Tsodilo Hills, MSA peoples chose to transport silcrete over 220 km to WPS from sites south of the Okavango Delta; (ii) these sites were preferred to silcrete sources much closer to Tsodilo Hills; (iii) the same source areas were repeatedly used for silcrete supply throughout the 3 m MSA sequence; (iv) during periods of colder, wetter climate, silcrete may have been sourced from unknown, more distant, sites. Our results offer a new provenancing approach for exploring prehistoric behavior at other sites where silcrete is present in the archaeological record. Copyright © 2013 Elsevier Ltd. All rights reserved.
Artifacts as Authoritative Actors in Educational Reform
ERIC Educational Resources Information Center
März, Virginie; Kelchtermans, Geert; Vermeir, Karen
2017-01-01
Educational reforms are often translated in and implemented through artifacts. Although research has frequently treated artifacts as merely functional, more recent work acknowledges the complex relationship between material artifacts and human/organizational behavior. This article aims at disentangling this relationship in order to deepen our…
Kirberger, R M; Roos, C J
1995-06-01
Radiographic artifacts commonly occur, particularly with hand processing. The artifacts may originate between the X-ray tube and the cassette as extraneous material on the patient or contamination of positioning aids, or result from debris within the cassette, or damage to, or staining of the screens. These artifacts are white to grey, may have a constant or different position on follow-up radiographs, and their size and shape are reflective of the inciting cause. A number of artifacts may occur in the darkroom during handling, developing, fixing and drying of the film. White to shiny artifacts are caused by the contamination of films with fixer, inability of developer to reach parts of the film or loss of emulsion from the developed film. Black artifacts result from improper handling or storage of films, resulting in exposure to light, or from pressure marks or static electricity discharges. Dropped levels of hand-processing chemicals may result in a variety of tide-marks on films. Most radiographic artifacts can be prevented by proper storage and handling of films and by optimal darkroom technique.
A Novel Stimulus Artifact Removal Technique for High-Rate Electrical Stimulation
Heffer, Leon F; Fallon, James B
2008-01-01
Electrical stimulus artifacts corrupting electrophysiological recordings often make the subsequent analysis of the underlying neural response difficult. This is particularly evident when investigating short-latency neural activity in response to high-rate electrical stimulation. We developed and evaluated an off-line technique for the removal of stimulus artifact from electrophysiological recordings. Pulsatile electrical stimulation was presented at rates of up to 5000 pulses/s during extracellular recordings of guinea pig auditory nerve fibers. Stimulus artifact was removed by replacing the sample points at each stimulus artifact event with values interpolated along a straight line, computed from neighbouring sample points. This technique required only that artifact events be identifiable and that the artifact duration remained less than both the inter-stimulus interval and the time course of the action potential. We have demonstrated that this computationally efficient sample-and-interpolate technique removes the stimulus artifact with minimal distortion of the action potential waveform. We suggest that this technique may have potential applications in a range of electrophysiological recording systems. PMID:18339428
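A minimal Python sketch of the sample-and-interpolate idea described above. The inputs (the signal array, the artifact onset indices, and the artifact length in samples) are assumptions for illustration, not details taken from the paper:

```python
import numpy as np

def remove_stimulus_artifact(signal, artifact_onsets, artifact_len):
    """Replace the samples of each stimulus artifact with a straight line
    interpolated between the neighbouring clean samples."""
    cleaned = signal.astype(float).copy()
    n = len(cleaned)
    for onset in artifact_onsets:
        start = max(onset - 1, 0)                  # last clean sample before the artifact
        stop = min(onset + artifact_len, n - 1)    # first clean sample after it
        cleaned[start:stop + 1] = np.linspace(cleaned[start], cleaned[stop],
                                              stop - start + 1)
    return cleaned

# usage sketch: cleaned = remove_stimulus_artifact(recording, onsets, artifact_len=12)
```

As the authors note, this only works when artifact events are detectable and shorter than both the inter-stimulus interval and the action potential.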
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kemp, B.
2016-06-15
Digital radiography, CT, PET, and MR are complicated imaging modalities which are composed of many hardware and software components. These components work together in a highly coordinated chain of events with the intent to produce high quality images. Acquisition, processing and reconstruction of data must occur in a precise way for optimum image quality to be achieved. Any error or unexpected event in the entire process can produce unwanted pixel intensities in the final images which may contribute to visible image artifacts. The diagnostic imaging physicist is uniquely qualified to investigate and contribute to resolution of image artifacts. This course will teach the participant to identify common artifacts found clinically in digital radiography, CT, PET, and MR, to determine the causes of artifacts, and to make recommendations for how to resolve artifacts. Learning Objectives: Identify common artifacts found clinically in digital radiography, CT, PET and MR. Determine causes of various clinical artifacts from digital radiography, CT, PET and MR. Describe how to resolve various clinical artifacts from digital radiography, CT, PET and MR.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kofler, J.
2016-06-15
Digital radiography, CT, PET, and MR are complicated imaging modalities which are composed of many hardware and software components. These components work together in a highly coordinated chain of events with the intent to produce high quality images. Acquisition, processing and reconstruction of data must occur in a precise way for optimum image quality to be achieved. Any error or unexpected event in the entire process can produce unwanted pixel intensities in the final images which may contribute to visible image artifacts. The diagnostic imaging physicist is uniquely qualified to investigate and contribute to resolution of image artifacts. This course will teach the participant to identify common artifacts found clinically in digital radiography, CT, PET, and MR, to determine the causes of artifacts, and to make recommendations for how to resolve artifacts. Learning Objectives: Identify common artifacts found clinically in digital radiography, CT, PET and MR. Determine causes of various clinical artifacts from digital radiography, CT, PET and MR. Describe how to resolve various clinical artifacts from digital radiography, CT, PET and MR.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pooley, R.
2016-06-15
Digital radiography, CT, PET, and MR are complicated imaging modalities which are composed of many hardware and software components. These components work together in a highly coordinated chain of events with the intent to produce high quality images. Acquisition, processing and reconstruction of data must occur in a precise way for optimum image quality to be achieved. Any error or unexpected event in the entire process can produce unwanted pixel intensities in the final images which may contribute to visible image artifacts. The diagnostic imaging physicist is uniquely qualified to investigate and contribute to resolution of image artifacts. This course will teach the participant to identify common artifacts found clinically in digital radiography, CT, PET, and MR, to determine the causes of artifacts, and to make recommendations for how to resolve artifacts. Learning Objectives: Identify common artifacts found clinically in digital radiography, CT, PET and MR. Determine causes of various clinical artifacts from digital radiography, CT, PET and MR. Describe how to resolve various clinical artifacts from digital radiography, CT, PET and MR.
NASA Astrophysics Data System (ADS)
Young, D.; Willett, F.; Memberg, W. D.; Murphy, B.; Walter, B.; Sweet, J.; Miller, J.; Hochberg, L. R.; Kirsch, R. F.; Ajiboye, A. B.
2018-04-01
Objective. Functional electrical stimulation (FES) is a promising technology for restoring movement to paralyzed limbs. Intracortical brain-computer interfaces (iBCIs) have enabled intuitive control over virtual and robotic movements, and more recently over upper extremity FES neuroprostheses. However, electrical stimulation of muscles creates artifacts in intracortical microelectrode recordings that could degrade iBCI performance. Here, we investigate methods for reducing the cortically recorded artifacts that result from peripheral electrical stimulation. Approach. One participant in the BrainGate2 pilot clinical trial had two intracortical microelectrode arrays placed in the motor cortex, and thirty-six stimulating intramuscular electrodes placed in the muscles of the contralateral limb. We characterized intracortically recorded electrical artifacts during both intramuscular and surface stimulation. We compared the performance of three artifact reduction methods: blanking, common average reference (CAR) and linear regression reference (LRR), which creates channel-specific reference signals, composed of weighted sums of other channels. Main results. Electrical artifacts resulting from surface stimulation were 175 × larger than baseline neural recordings (which were 110 µV peak-to-peak), while intramuscular stimulation artifacts were only 4 × larger. The artifact waveforms were highly consistent across electrodes within each array. Application of LRR reduced artifact magnitudes to less than 10 µV and largely preserved the original neural feature values used for decoding. Unmitigated stimulation artifacts decreased iBCI decoding performance, but performance was almost completely recovered using LRR, which outperformed CAR and blanking and extracted useful neural information during stimulation artifact periods. Significance. The LRR method was effective at reducing electrical artifacts resulting from both intramuscular and surface FES, and almost completely restored iBCI decoding performance (>90% recovery for surface stimulation and full recovery for intramuscular stimulation). The results demonstrate that FES-induced artifacts can be easily mitigated in FES + iBCI systems by using LRR for artifact reduction, and suggest that the LRR method may also be useful in other noise reduction applications.
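A hedged numpy sketch of a linear regression reference in the spirit described above: for each channel, weights over the remaining channels are fit by least squares on artifact-dominated samples, and the resulting channel-specific reference is subtracted. The data layout, the artifact mask, and the choice to subtract over the whole recording are illustrative assumptions, not the authors' exact implementation:

```python
import numpy as np

def lrr_clean(data, artifact_mask):
    """data: (n_channels, n_samples) array; artifact_mask: boolean vector
    marking samples dominated by stimulation artifact."""
    cleaned = data.astype(float).copy()
    art = data[:, artifact_mask].astype(float)
    for ch in range(data.shape[0]):
        others = np.delete(art, ch, axis=0)              # predictors: all other channels
        w, *_ = np.linalg.lstsq(others.T, art[ch], rcond=None)
        reference = w @ np.delete(data, ch, axis=0)      # channel-specific reference signal
        cleaned[ch] = data[ch] - reference
    return cleaned
```

Restricting the subtraction to artifact periods, or refitting the weights per stimulation pulse, are natural variations of the same idea.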
Ultrasound artifacts: classification, applied physics with illustrations, and imaging appearances.
Prabhu, Somnath J; Kanal, Kalpana; Bhargava, Puneet; Vaidya, Sandeep; Dighe, Manjiri K
2014-06-01
Ultrasound has become a widely used diagnostic imaging modality in medicine because of its safety and portability. Because of rapid advances in technology in recent years, sonographic imaging quality has increased significantly. Despite these advances, the potential to encounter artifacts while imaging remains. This article classifies both common and uncommon gray-scale and Doppler ultrasound artifacts into those resulting from physiology and those caused by hardware. A brief applied-physics explanation for each artifact is listed along with an illustrated diagram. The imaging appearance of artifacts is presented in case examples, along with strategies to minimize the artifacts in real time or use them for clinical advantage where applicable.
NASA Astrophysics Data System (ADS)
Bai, Yang; Wan, Xiaohong; Zeng, Ke; Ni, Yinmei; Qiu, Lirong; Li, Xiaoli
2016-12-01
Objective. When prefrontal transcranial magnetic stimulation (p-TMS) is performed, it may evoke a hybrid artifact mixing muscle and blink activity in EEG recordings. Reducing this kind of hybrid artifact challenges traditional preprocessing methods. We aim to explore a method for removing the p-TMS-evoked hybrid artifact. Approach. We propose a novel method, used as a post-processing step after independent component analysis (ICA), to reduce the p-TMS-evoked hybrid artifact. Ensemble empirical mode decomposition (EEMD) was used to decompose the signal into multiple components, and the artifact was then reduced by separating these components with a blind source separation (BSS) method. Three standard BSS methods were tested: ICA, independent vector analysis, and canonical correlation analysis (CCA). Main results. Synthetic results showed that EEMD-CCA outperformed the other combinations as an ICA post-processing step for hybrid artifact reduction. Its advantage was clearer when the signal-to-noise ratio (SNR) was lower. In application to real experimental data, the SNR was significantly increased and the p-TMS-evoked potential could be recovered from the artifact-contaminated signal. Our proposed method can effectively reduce p-TMS-evoked hybrid artifacts. Significance. The proposed method may facilitate future prefrontal TMS-EEG research.
Reducing Artifacts in TMS-Evoked EEG
NASA Astrophysics Data System (ADS)
Fuertes, Juan José; Travieso, Carlos M.; Álvarez, A.; Ferrer, M. A.; Alonso, J. B.
Transcranial magnetic stimulation induces weak currents within the cranium to activate neuronal firing, and its response is recorded using electroencephalography in order to study the brain directly. However, different artifacts contaminate the results. The goal of this study is to process these artifacts and reduce them digitally. Electromagnetic, blink and auditory artifacts are considered, and Signal-Space Projection, Independent Component Analysis and Wiener Filtering methods are used to reduce them. The latter two provide a successful solution for electromagnetic artifacts. The remaining artifacts, processed with Signal-Space Projection, are reduced, but the signal is modified as well. Nonetheless, the signals are modified in an exactly known way, and the vector used for the projection is retained so that it can be taken into account when analyzing the resulting signals. A system which combines the proposed methods would improve the quality of the information presented to physicians.
Metal artifact reduction in MRI-based cervical cancer intracavitary brachytherapy
NASA Astrophysics Data System (ADS)
Rao, Yuan James; Zoberi, Jacqueline E.; Kadbi, Mo; Grigsby, Perry W.; Cammin, Jochen; Mackey, Stacie L.; Garcia-Ramirez, Jose; Goddu, S. Murty; Schwarz, Julie K.; Gach, H. Michael
2017-04-01
Magnetic resonance imaging (MRI) plays an increasingly important role in brachytherapy planning for cervical cancer. Yet, metal tandem, ovoid intracavitary applicators, and fiducial markers used in brachytherapy cause magnetic susceptibility artifacts in standard MRI. These artifacts may impact the accuracy of brachytherapy treatment and the evaluation of tumor response by misrepresenting the size and location of the metal implant, and distorting the surrounding anatomy and tissue. Metal artifact reduction sequences (MARS) with high bandwidth RF selective excitations and turbo spin-echo readouts were developed for MRI of orthopedic implants. In this study, metal artifact reduction was applied to brachytherapy of cervical cancer using the orthopedic metal artifact reduction (O-MAR) sequence. O-MAR combined MARS features with view angle tilting and slice encoding for metal artifact correction (SEMAC) to minimize in-plane and through-plane susceptibility artifacts. O-MAR improved visualization of the tandem tip on T2 and proton density weighted (PDW) imaging in phantoms and accurately represented the diameter of the tandem. In a pilot group of cervical cancer patients (N = 7), O-MAR significantly minimized the blooming artifact at the tip of the tandem in PDW MRI. There was no significant difference observed in artifact reduction between the weak (5 kHz, 7 z-phase encodes) and medium (10 kHz, 13 z-phase encodes) SEMAC settings. However, the weak setting allowed a significantly shorter acquisition time than the medium setting. O-MAR also reduced susceptibility artifacts associated with metal fiducial markers so that they appeared on MRI at their true dimensions.
De Crop, An; Casselman, Jan; Van Hoof, Tom; Dierens, Melissa; Vereecke, Elke; Bossu, Nicolas; Pamplona, Jaime; D'Herde, Katharina; Thierens, Hubert; Bacher, Klaus
2015-08-01
Metal artifacts may negatively affect radiologic assessment in the oral cavity. The aim of this study was to evaluate different metal artifact reduction techniques for metal artifacts induced by dental hardware in CT scans of the oral cavity. Clinical image quality was assessed using a Thiel-embalmed cadaver. A Catphan phantom and a polymethylmethacrylate (PMMA) phantom were used to evaluate physical-technical image quality parameters such as artifact area, artifact index (AI), and contrast detail (IQFinv). Metal cylinders were inserted in each phantom to create metal artifacts. CT images of both phantoms and the Thiel-embalmed cadaver were acquired on a multislice CT scanner using 80, 100, 120, and 140 kVp; model-based iterative reconstruction (Veo); and synthesized monochromatic keV images with and without metal artifact reduction software (MARs). Four radiologists assessed the clinical image quality, using an image criteria score (ICS). Increasing kVp and the use of Veo had a significant influence on clinical image quality (p = 0.007 and p = 0.014, respectively). Application of MARs resulted in a smaller artifact area (p < 0.05). However, MARs reconstructed images resulted in lower ICS. Of all investigated techniques, Veo appears to be the most promising, with a significant improvement of both the clinical and physical-technical image quality without adversely affecting contrast detail. MARs reconstruction of CT images of the oral cavity to reduce metal artifacts from dental hardware is not sufficient and may even adversely influence image quality.
Automatic identification of artifacts in electrodermal activity data.
Taylor, Sara; Jaques, Natasha; Chen, Weixuan; Fedor, Szymon; Sano, Akane; Picard, Rosalind
2015-01-01
Recently, wearable devices have allowed for long term, ambulatory measurement of electrodermal activity (EDA). Despite the fact that ambulatory recording can be noisy, and recording artifacts can easily be mistaken for a physiological response during analysis, to date there is no automatic method for detecting artifacts. This paper describes the development of a machine learning algorithm for automatically detecting EDA artifacts, and provides an empirical evaluation of classification performance. We have encoded our results into a freely available web-based tool for artifact and peak detection.
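The abstract does not give implementation details, so the following is only an illustrative Python sketch of a windowed-feature classifier for EDA artifact detection; the feature set, window length, classifier choice, and the `epochs`/`labels` arrays are all assumptions:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def eda_window_features(windows):
    """Simple per-window features for short EDA epochs (rows of `windows`)."""
    diffs = np.diff(windows, axis=1)
    return np.column_stack([
        windows.mean(axis=1),        # mean skin conductance level
        windows.std(axis=1),         # within-window variability
        np.abs(diffs).max(axis=1),   # largest one-sample jump (steep artifacts)
        diffs.mean(axis=1),          # overall slope
    ])

def evaluate(epochs, labels):
    """epochs: (n_windows, n_samples) EDA segments; labels: 1 = artifact, 0 = clean."""
    X = eda_window_features(epochs)
    clf = SVC(kernel="rbf", C=1.0)
    return cross_val_score(clf, X, labels, cv=5).mean()
```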
Yoo, Wook Jae; Shin, Sang Hun; Jeon, Dayeong; Hong, Seunghan; Sim, Hyeok In; Kim, Seon Geun; Jang, Kyoung Won; Cho, Seunghyun; Youn, Won Sik; Lee, Bongsoo
2014-01-01
A miniature fiber-optic dosimeter (FOD) system was fabricated using a plastic scintillating fiber, a plastic optical fiber, and a multi-pixel photon counter to measure real-time entrance surface dose (ESD) during radiation diagnosis. Under varying exposure parameters of a digital radiography (DR) system, we measured the scintillating light related to the ESD using the sensing probe of the FOD, which was placed at the center of the beam field on an anthropomorphic thorax phantom. Also, we obtained DR images using a flat panel detector of the DR system to evaluate the effects of the dosimeter on image artifacts during posteroanterior (PA) chest radiography. From the experimental results, the scintillation output signals of the FOD were similar to the ESDs including backscatter simultaneously obtained using a semiconductor dosimeter. We demonstrated that the proposed miniature FOD can be used to measure real-time ESDs with minimization of DR image artifacts in the X-ray energy range of diagnostic radiology. PMID:24694678
Yoo, Wook Jae; Shin, Sang Hun; Jeon, Dayeong; Hong, Seunghan; Sim, Hyeok In; Kim, Seon Geun; Jang, Kyoung Won; Cho, Seunghyun; Youn, Won Sik; Lee, Bongsoo
2014-04-01
A miniature fiber-optic dosimeter (FOD) system was fabricated using a plastic scintillating fiber, a plastic optical fiber, and a multi-pixel photon counter to measure real-time entrance surface dose (ESD) during radiation diagnosis. Under varying exposure parameters of a digital radiography (DR) system, we measured the scintillating light related to the ESD using the sensing probe of the FOD, which was placed at the center of the beam field on an anthropomorphic thorax phantom. Also, we obtained DR images using a flat panel detector of the DR system to evaluate the effects of the dosimeter on image artifacts during posteroanterior (PA) chest radiography. From the experimental results, the scintillation output signals of the FOD were similar to the ESDs including backscatter simultaneously obtained using a semiconductor dosimeter. We demonstrated that the proposed miniature FOD can be used to measure real-time ESDs with minimization of DR image artifacts in the X-ray energy range of diagnostic radiology.
Blind technique using blocking artifacts and entropy of histograms for image tampering detection
NASA Astrophysics Data System (ADS)
Manu, V. T.; Mehtre, B. M.
2017-06-01
The tremendous technological advancements of recent times have enabled people to create, edit and circulate images more easily than ever before. As a result of this, ensuring the integrity and authenticity of images has become challenging. Malicious editing of images to deceive the viewer is referred to as image tampering. A widely used image tampering technique is image splicing or compositing, in which regions from different images are copied and pasted. In this paper, we propose a tamper detection method utilizing the blocking and blur artifacts which are the footprints of splicing. The classification of images as tampered or not is based on the standard deviations of the entropy histograms and block discrete cosine transformations. We can detect the exact boundaries of the tampered area in the image if the image is classified as tampered. Experimental results on publicly available image tampering datasets show that the proposed method outperforms existing methods in terms of accuracy.
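As a rough illustration of block-level features of this kind (not the authors' exact formulation), the sketch below computes per-block DCT energy and histogram entropy for an 8-bit grayscale image and summarizes them by their standard deviations; the block size, bin count, and 0-255 range are assumptions:

```python
import numpy as np
from scipy.fft import dctn

def blocking_entropy_features(gray, block=8):
    """gray: 2-D uint8 array. Returns (std of block DCT AC energy,
    std of block histogram entropy) across all non-overlapping blocks."""
    h, w = gray.shape
    h, w = h - h % block, w - w % block
    ac_energy, entropies = [], []
    for i in range(0, h, block):
        for j in range(0, w, block):
            patch = gray[i:i + block, j:j + block].astype(float)
            coeffs = dctn(patch, norm="ortho")
            ac_energy.append(np.abs(coeffs).sum() - abs(coeffs[0, 0]))  # AC energy of the block
            hist, _ = np.histogram(patch, bins=32, range=(0, 255), density=True)
            hist = hist[hist > 0]
            entropies.append(float(-(hist * np.log2(hist)).sum()))      # histogram entropy
    return np.std(ac_energy), np.std(entropies)

# a simple detector would threshold (or feed to a classifier) these two statistics
```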
First Epigravettian Ceramic Figurines from Europe (Vela Spila, Croatia)
Farbstein, Rebecca; Radić, Dinko; Brajković, Dejana; Miracle, Preston T.
2012-01-01
Recent finds of 36 ceramic artifacts from the archaeological site of Vela Spila, Croatia, offer the first evidence of ceramic figurative art in late Upper Palaeolithic Europe, c. 17,500–15,000 years before present (BP). The size and diversity of this artistic ceramic assemblage indicate the emergence of a social tradition, rather than more ephemeral experimentation with a new material. Vela Spila ceramics offer compelling technological and stylistic comparisons with the only other evidence of a developed Palaeolithic ceramic tradition found at the sites of Pavlov I and Dolní Věstonice I, in the Czech Republic, c. 31,000–27,000 cal BP. Because of the 10,000-year gap between the two assemblages, the Vela Spila ceramics are interpreted as evidence of an independent invention of this technology. Consequently, these artifacts provide evidence of a new social context in which ceramics developed and were used to make art in the Upper Palaeolithic. PMID:22848495
Fast 3D shape measurements with reduced motion artifacts
NASA Astrophysics Data System (ADS)
Feng, Shijie; Zuo, Chao; Chen, Qian; Gu, Guohua
2017-10-01
Fringe projection is an extensively used technique for high speed three-dimensional (3D) measurements of dynamic objects. However, the motion often leads to artifacts in reconstructions due to the sequential recording of the set of patterns. In order to reduce the adverse impact of the movement, we present a novel high speed 3D scanning technique combining the fringe projection and stereo. Firstly, promising measuring speed is achieved by modifying the traditional aperiodic sinusoidal patterns so that the fringe images can be cast at kilohertz with the widely used defocusing strategy. Next, a temporal intensity tracing algorithm is developed to further alleviate the influence of motion by accurately tracing the ideal intensity for stereo matching. Then, a combined cost measure is suggested to robustly estimate the cost for each pixel. In comparison with the traditional method where the effect of motion is not considered, experimental results show that the reconstruction accuracy for dynamic objects can be improved by an order of magnitude with the proposed method.
An Automatic Quality Control Pipeline for High-Throughput Screening Hit Identification.
Zhai, Yufeng; Chen, Kaisheng; Zhong, Yang; Zhou, Bin; Ainscow, Edward; Wu, Ying-Ta; Zhou, Yingyao
2016-09-01
The correction or removal of signal errors in high-throughput screening (HTS) data is critical to the identification of high-quality lead candidates. Although a number of strategies have been previously developed to correct systematic errors and to remove screening artifacts, they are not universally effective and still require a fair amount of human intervention. We introduce a fully automated quality control (QC) pipeline that can correct generic interplate systematic errors and remove intraplate random artifacts. The new pipeline was first applied to ~100 large-scale historical HTS assays; in silico analysis showed auto-QC led to a noticeably stronger structure-activity relationship. The method was further tested in several independent HTS runs, where QC results were sampled for experimental validation. Significantly increased hit confirmation rates were obtained after the QC steps, confirming that the proposed method was effective in enriching true-positive hits. An implementation of the algorithm is available to the screening community. © 2016 Society for Laboratory Automation and Screening.
Bijttebier, Sebastiaan; D'Hondt, Els; Noten, Bart; Hermans, Nina; Apers, Sandra; Voorspoels, Stefan
2014-11-15
Alkaline saponification is often used to remove interfering chlorophylls and lipids during carotenoids analysis. However, saponification also hydrolyses esterified carotenoids and is known to induce artifacts. To avoid carotenoid artifact formation during saponification, Larsen and Christensen (2005) developed a gentler and simpler analytical clean-up procedure involving the use of a strong basic resin (Ambersep 900 OH). They hypothesised a saponification mechanism based on their Liquid Chromatography-Photodiode Array (LC-PDA) data. In the present study, we show with LC-PDA-accurate mass-Mass Spectrometry that the main chlorophyll removal mechanism is not based on saponification, apolar adsorption or anion exchange, but most probably an adsorption mechanism caused by H-bonds and dipole-dipole interactions. We showed experimentally that esterified carotenoids and glycerolipids were not removed, indicating a much more selective mechanism than initially hypothesised. This opens new research opportunities towards a much wider scope of applications (e.g. the refinement of oils rich in phytochemical content). Copyright © 2014 Elsevier Ltd. All rights reserved.
The Information Architecture of E-Commerce: An Experimental Study on User Performance and Preference
NASA Astrophysics Data System (ADS)
Wan Mohd, Wan Abdul Rahim; Md Noor, Nor Laila; Mehad, Shafie
Too often, designers of e-commerce web sites use models, concepts, guidelines, and designs that focus on the artifacts while ignoring the context in which the artifacts will be used. Furthermore, the link between culture and usability in web site information architecture (IA) remains largely uncharted territory, as it has received little theoretical consideration. In an effort toward addressing these issues, our study provides a theoretical and empirical link between culture and usability through the application of 'Venustas' (delight), drawn from the architectural field, and Hofstede's cultural dimensions. We use Islamic culture as the case study and report on an experiment investigating the effect of IA designs based on these cultural dimensions on e-commerce web sites. The result provides partial empirical support for the theorized link between culture and usability, based on usability measurements of user performance and preference. In addition, practical cultural design prescriptions for web site IA are also provided.
The Complex Point Cloud for the Knowledge of the Architectural Heritage. Some Experiences
NASA Astrophysics Data System (ADS)
Aveta, C.; Salvatori, M.; Vitelli, G. P.
2017-05-01
The present paper aims to present a series of experiences and experiments that a group of PhD researchers from the University of Naples Federico II conducted over the past decade. This work has concerned the survey and the graphic restitution of monuments and works of art, aimed at their conservation. The targeted query of complex point clouds acquired by 3D scanners, integrated with photo sensors and thermal imaging, has allowed new possibilities of investigation to be explored. In particular, we will present the scientific results of the experiments carried out on some important historical artifacts with distinct morphological and typological characteristics. According to the aims and needs that emerged during the connotative process, with the support of archival and iconographic historical research, the laser scanner technology has been used in many different ways. New forms of representation, obtained directly from the point cloud, have been tested for the elaboration of thematic studies documenting the pathologies and the decay of materials, and for correlating visible aspects with invisible aspects of the artifact.
MapX: 2D XRF for Planetary Exploration - Image Formation and Optic Characterization
Sarrazin, P.; Blake, D.; Gailhanou, M.; ...
2018-04-01
Map-X is a planetary instrument concept for 2D X-Ray Fluorescence (XRF) spectroscopy. The instrument is placed directly on the surface of an object and held in a fixed position during the measurement. The formation of XRF images on the CCD detector relies on a multichannel optic configured for 1:1 imaging and can be analyzed through the point spread function (PSF) of the optic. The PSF can be directly measured using a micron-sized monochromatic X-ray source in place of the sample. Such PSF measurements were carried out at the Stanford Synchrotron and are compared with ray tracing simulations. It is shown that artifacts are introduced by the periodicity of the PSF at the channel scale and the proximity of the CCD pixel size and the optic channel size. A strategy of sub-channel random moves was used to cancel out these artifacts and provide a clean experimental PSF directly usable for XRF image deconvolution.
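Once a clean experimental PSF is available, it can be used to deconvolve the XRF images. The abstract does not say which deconvolution algorithm is used; the sketch below is a hedged example using Richardson-Lucy iteration, one common choice for PSF-based deconvolution:

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=30):
    """Deconvolve `image` with the measured `psf` (both 2-D arrays)."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full(image.shape, image.mean(), dtype=float)
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / (blurred + 1e-12)          # avoid division by zero
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate
```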
Four-thousand-year-old gold artifacts from the Lake Titicaca basin, southern Peru
Aldenderfer, Mark; Craig, Nathan M.; Speakman, Robert J.; Popelka-Filcoff, Rachel
2008-01-01
Artifacts of cold-hammered native gold have been discovered in a secure and undisturbed Terminal Archaic burial context at Jiskairumoko, a multicomponent Late Archaic–Early Formative period site in the southwestern Lake Titicaca basin, Peru. The burial dates to 3776 to 3690 carbon-14 years before the present (2155 to 1936 calendar years B.C.), making this the earliest worked gold recovered to date not only from the Andes, but from the Americas as well. This discovery lends support to the hypothesis that the earliest metalworking in the Andes was experimentation with native gold. The presence of gold in a society of low-level food producers undergoing social and economic transformations coincident with the onset of sedentary life is an indicator of possible early social inequality and aggrandizing behavior and further shows that hereditary elites and a societal capacity to create significant agricultural surpluses are not requisite for the emergence of metalworking traditions. PMID:18378903
MapX: 2D XRF for Planetary Exploration - Image Formation and Optic Characterization
NASA Astrophysics Data System (ADS)
Sarrazin, P.; Blake, D.; Gailhanou, M.; Marchis, F.; Chalumeau, C.; Webb, S.; Walter, P.; Schyns, E.; Thompson, K.; Bristow, T.
2018-04-01
Map-X is a planetary instrument concept for 2D X-Ray Fluorescence (XRF) spectroscopy. The instrument is placed directly on the surface of an object and held in a fixed position during the measurement. The formation of XRF images on the CCD detector relies on a multichannel optic configured for 1:1 imaging and can be analyzed through the point spread function (PSF) of the optic. The PSF can be directly measured using a micron-sized monochromatic X-ray source in place of the sample. Such PSF measurements were carried out at the Stanford Synchrotron and are compared with ray tracing simulations. It is shown that artifacts are introduced by the periodicity of the PSF at the channel scale and the proximity of the CCD pixel size and the optic channel size. A strategy of sub-channel random moves was used to cancel out these artifacts and provide a clean experimental PSF directly usable for XRF image deconvolution.
MapX: 2D XRF for Planetary Exploration - Image Formation and Optic Characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarrazin, P.; Blake, D.; Gailhanou, M.
Map-X is a planetary instrument concept for 2D X-Ray Fluorescence (XRF) spectroscopy. The instrument is placed directly on the surface of an object and held in a fixed position during the measurement. The formation of XRF images on the CCD detector relies on a multichannel optic configured for 1:1 imaging and can be analyzed through the point spread function (PSF) of the optic. The PSF can be directly measured using a micron-sized monochromatic X-ray source in place of the sample. Such PSF measurements were carried out at the Stanford Synchrotron and are compared with ray tracing simulations. It is shown that artifacts are introduced by the periodicity of the PSF at the channel scale and the proximity of the CCD pixel size and the optic channel size. A strategy of sub-channel random moves was used to cancel out these artifacts and provide a clean experimental PSF directly usable for XRF image deconvolution.
Improved fuzzy clustering algorithms in segmentation of DC-enhanced breast MRI.
Kannan, S R; Ramathilagam, S; Devi, Pandiyarajan; Sathya, A
2012-02-01
Segmentation of medical images is a difficult and challenging problem due to poor image contrast and artifacts that result in missing or diffuse organ/tissue boundaries. Many researchers have applied various techniques; however, fuzzy c-means (FCM)-based algorithms are more effective than other methods. The objective of this work is to develop robust fuzzy clustering segmentation systems for effective segmentation of DCE breast MRI. This paper obtains robust fuzzy clustering algorithms by incorporating kernel methods, penalty terms, tolerance of the neighborhood attraction, an additional entropy term and fuzzy parameters. The initial centers are obtained using an initialization algorithm to reduce the computational complexity and running time of the proposed algorithms. Experimental work on breast images shows that the proposed algorithms are effective in improving the similarity measurement, in handling large amounts of noise, and in dealing with data corrupted by noise and other artifacts. The clustering results of the proposed methods are validated using the Silhouette Method.
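The paper's algorithms extend fuzzy c-means with kernels, penalty terms, and neighborhood attraction; the sketch below is only the plain FCM baseline those variants build on, applied to pixel feature vectors, with hyperparameters chosen arbitrarily:

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Plain fuzzy c-means. X: (n_samples, n_features) feature vectors
    (e.g. pixel intensities). Returns cluster centers and memberships U."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)                 # memberships sum to 1 per sample
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-10
        inv = d ** (-2.0 / (m - 1.0))
        U_new = inv / inv.sum(axis=1, keepdims=True)  # standard FCM membership update
        if np.abs(U_new - U).max() < tol:
            return centers, U_new
        U = U_new
    return centers, U
```

For segmentation, each pixel is then assigned to the cluster with the highest membership; initializing the centers with a cheaper algorithm, as the authors do, mainly reduces the number of iterations needed.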
ADJUST: An automatic EEG artifact detector based on the joint use of spatial and temporal features.
Mognon, Andrea; Jovicich, Jorge; Bruzzone, Lorenzo; Buiatti, Marco
2011-02-01
A successful method for removing artifacts from electroencephalogram (EEG) recordings is Independent Component Analysis (ICA), but its implementation remains largely user-dependent. Here, we propose a completely automatic algorithm (ADJUST) that identifies artifacted independent components by combining stereotyped artifact-specific spatial and temporal features. Features were optimized to capture blinks, eye movements, and generic discontinuities on a feature selection dataset. Validation on a totally different EEG dataset shows that (1) ADJUST's classification of independent components largely matches a manual one by experts (agreement on 95.2% of the data variance), and (2) Removal of the artifacted components detected by ADJUST leads to neat reconstruction of visual and auditory event-related potentials from heavily artifacted data. These results demonstrate that ADJUST provides a fast, efficient, and automatic way to use ICA for artifact removal. Copyright © 2010 Society for Psychophysiological Research.
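ADJUST's actual features and thresholds are specific and data-driven; the sketch below only illustrates the general idea of combining one temporal feature (kurtosis of the component time course) with one spatial feature (frontal concentration of the topography) to flag likely blink components. The thresholds and channel layout are arbitrary assumptions:

```python
import numpy as np
from scipy.stats import kurtosis

def flag_blink_like_ics(activations, topographies, frontal_idx,
                        kurt_thresh=5.0, frontal_thresh=0.6):
    """activations: (n_components, n_samples) IC time courses;
    topographies: (n_components, n_channels) mixing-matrix columns;
    frontal_idx: indices of frontal electrodes."""
    flagged = []
    for k in range(activations.shape[0]):
        temporal = kurtosis(activations[k])                  # spiky, blink-like time course
        w = np.abs(topographies[k])
        spatial = w[frontal_idx].sum() / w.sum()             # fraction of weight on frontal sites
        if temporal > kurt_thresh and spatial > frontal_thresh:
            flagged.append(k)
    return flagged
```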
Teaching and Learning the Nature of Technical Artifacts
ERIC Educational Resources Information Center
Frederik, Ineke; Sonneveld, Wim; de Vries, Marc J.
2011-01-01
Artifacts are probably our most obvious everyday encounter with technology. Therefore, a good understanding of the nature of technical artifacts is a relevant part of technological literacy. In this article we draw from the philosophy of technology to develop a conceptualization of technical artifacts that can be used for educational purposes.…
NASA Astrophysics Data System (ADS)
Limbacher, J.; Kahn, R. A.
2015-12-01
MISR aerosol optical depth retrievals are fairly robust to small radiometric calibration artifacts, due to the multi-angle observations. However, even small errors in the MISR calibration, especially structured artifacts in the imagery, have a disproportionate effect on the retrieval of aerosol properties from these data. Using MODIS, POLDER-3, AERONET, MAN, and MISR lunar images, we diagnose and correct various calibration and radiometric artifacts found in the MISR radiance (Level 1) data, using empirical image analysis. The calibration artifacts include temporal trends in MISR top-of-atmosphere reflectance at relatively stable desert sites and flat-fielding artifacts detected by comparison to MODIS over bright, low-contrast scenes. The radiometric artifacts include ghosting (as compared to MODIS, POLDER-3, and forward model results) and point-spread function mischaracterization (using the MISR lunar data and MODIS). We minimize the artifacts to the extent possible by parametrically modeling the artifacts and then removing them from the radiance (reflectance) data. Validation is performed using non-training scenes (reflectance comparison), and also by using the MISR Research Aerosol retrieval algorithm results compared to MAN and AERONET.
WE-G-209-01: Digital Radiography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schueler, B.
Digital radiography, CT, PET, and MR are complicated imaging modalities which are composed of many hardware and software components. These components work together in a highly coordinated chain of events with the intent to produce high quality images. Acquisition, processing and reconstruction of data must occur in a precise way for optimum image quality to be achieved. Any error or unexpected event in the entire process can produce unwanted pixel intensities in the final images which may contribute to visible image artifacts. The diagnostic imaging physicist is uniquely qualified to investigate and contribute to resolution of image artifacts. This course will teach the participant to identify common artifacts found clinically in digital radiography, CT, PET, and MR, to determine the causes of artifacts, and to make recommendations for how to resolve artifacts. Learning Objectives: Identify common artifacts found clinically in digital radiography, CT, PET and MR. Determine causes of various clinical artifacts from digital radiography, CT, PET and MR. Describe how to resolve various clinical artifacts from digital radiography, CT, PET and MR.
Adjustable shunt valve-induced magnetic resonance imaging artifact: a comparative study.
Toma, Ahmed K; Tarnaris, Andrew; Grieve, Joan P; Watkins, Laurence D; Kitchen, Neil D
2010-07-01
In this paper, the authors' goal was to compare the artifacts induced by implanted (in vivo) adjustable shunt valves in spin echo, diffusion weighted (DW), and gradient echo MR imaging pulse sequences. The MR images obtained in 8 patients with proGAV and 6 patients with Strata II adjustable shunt valves were assessed for artifact areas in different planes as well as the total artifact volume for different pulse sequences. Artifacts induced by the Strata II valve were significantly larger than those induced by the proGAV valve in the spin echo MR imaging pulse sequence (29,761 vs 2450 mm(3) on T2-weighted fast spin echo, p = 0.003) and on DW images (100,138 vs 38,955 mm(3), p = 0.025). Artifacts were more marked on DW MR images than on the spin echo pulse sequence for both valve types. Adjustable valve-induced artifacts can conceal brain pathology on MR images. This should influence the choice of valve implantation site and the type of valve used. The effect of artifacts on DW images should be highlighted pending the development of adjustable shunt valves that induce fewer MR imaging artifacts.
Ouyang, Guang; Sommer, Werner; Zhou, Changsong; Aristei, Sabrina; Pinkpank, Thomas; Abdel Rahman, Rasha
2016-11-01
Overt articulation produces strong artifacts in the electroencephalogram and in event-related potentials (ERPs), posing a serious problem for investigating language production with these variables. Here we describe the properties of articulation-related artifacts and propose a novel correction procedure. Experiment 1 co-recorded ERPs and trajectories of the articulators with an electromagnetic articulograph from a single participant. The generalization of the findings from the single participant to standard picture naming was investigated in Experiment 2. Both experiments provided evidence that articulation-induced artifacts may start up to 300 ms or more prior to voice onset or voice key onset, depending on the specific measure; they are highly similar in topography across many different phoneme patterns and differ mainly in their time course and amplitude. ERPs were separated from articulation-related artifacts with residue iteration decomposition (RIDE). After obtaining the artifact-free ERPs, their correlations with the articulatory trajectories dropped to near zero. Artifact removal with independent component analysis was less successful; while correlations with the articulatory movements remained substantial, early components prior to voice onset were attenuated in the reconstructed ERPs. These findings offer new insights into the nature of articulation artifacts; together with RIDE as a method for artifact removal, the present report offers a fresh perspective for ERP studies requiring overt articulation.
van Gorp, Maarten J; van der Graaf, Yolanda; de Mol, Bas A J M; Bakker, Chris J G; Witkamp, Theo D; Ramos, Lino M P; Mali, Willem P T M
2004-03-01
To assess the relationship between heart valve history and susceptibility artifacts at magnetic resonance (MR) imaging of the brain in patients with Björk-Shiley convexoconcave (BSCC) valves. MR images of the brain were obtained in 58 patients with prosthetic heart valves: 20 patients had BSCC valve replacements, and 38 had other types of heart valves. Two experienced neuroradiologists determined the presence or absence of susceptibility artifacts in a consensus reading. Artifacts were defined as characteristic black spots that were visible on T2*-weighted gradient-echo MR images. The statuses of the 20 explanted BSCC valves, specifically whether they were intact or had an outlet strut fracture (OSF) or a single-leg fracture (SLF), had been determined earlier. The number of artifacts seen at brain MR imaging was correlated with explanted valve status, and differences were analyzed with nonparametric statistical tests. Significantly more patients with BSCC valves (17 [85%] of 20 patients) than patients with other types of prosthetic valves (18 [47%] of 38 patients) had susceptibility artifacts at MR imaging (P =.005). BSCC valve OSFs were associated with a significantly higher number of artifacts than were intact BSCC valves (P =.01). No significant relationship between SLF and number of artifacts was observed. Susceptibility artifacts at brain MR imaging are not restricted to patients with BSCC valves. These artifacts can be seen on images obtained in patients with various other types of fractured and intact prosthetic heart valves. Copyright RSNA, 2004
Body MR Imaging: Artifacts, k-Space, and Solutions
Seethamraju, Ravi T.; Patel, Pritesh; Hahn, Peter F.; Kirsch, John E.; Guimaraes, Alexander R.
2015-01-01
Body magnetic resonance (MR) imaging is challenging because of the complex interaction of multiple factors, including motion arising from respiration and bowel peristalsis, susceptibility effects secondary to bowel gas, and the need to cover a large field of view. The combination of these factors makes body MR imaging more prone to artifacts, compared with imaging of other anatomic regions. Understanding the basic MR physics underlying artifacts is crucial to recognizing the trade-offs involved in mitigating artifacts and improving image quality. Artifacts can be classified into three main groups: (a) artifacts related to magnetic field imperfections, including the static magnetic field, the radiofrequency (RF) field, and gradient fields; (b) artifacts related to motion; and (c) artifacts arising from methods used to sample the MR signal. Static magnetic field homogeneity is essential for many MR techniques, such as fat saturation and balanced steady-state free precession. Susceptibility effects become more pronounced at higher field strengths and can be ameliorated by using spin-echo sequences when possible, increasing the receiver bandwidth, and aligning the phase-encoding gradient with the strongest susceptibility gradients, among other strategies. Nonuniformities in the RF transmit field, including dielectric effects, can be minimized by applying dielectric pads or imaging at lower field strength. Motion artifacts can be overcome through respiratory synchronization, alternative k-space sampling schemes, and parallel imaging. Aliasing and truncation artifacts derive from limitations in digital sampling of the MR signal and can be rectified by adjusting the sampling parameters. Understanding the causes of artifacts and their possible solutions will enable practitioners of body MR imaging to meet the challenges of novel pulse sequence design, parallel imaging, and increasing field strength. ©RSNA, 2015 PMID:26207581
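As a concrete illustration of one of the sampling-related artifacts discussed above, the toy numpy example below reproduces wrap-around aliasing: discarding every other phase-encode line of k-space halves the effective field of view and folds a ghost copy of the object into the image. The phantom and matrix size are arbitrary:

```python
import numpy as np

# simple rectangular "anatomy" in a 128 x 128 matrix
img = np.zeros((128, 128))
img[40:88, 50:78] = 1.0

kspace = np.fft.fftshift(np.fft.fft2(img))
undersampled = kspace.copy()
undersampled[1::2, :] = 0                     # drop every other phase-encode line

aliased = np.abs(np.fft.ifft2(np.fft.ifftshift(undersampled)))
# `aliased` contains the rectangle plus a ghost replica shifted by half the FOV
# in the phase-encode direction -- the classic wrap-around appearance
```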
d'Entremont, Agnes G; Kolind, Shannon H; Mädler, Burkhard; Wilson, David R; MacKay, Alexander L
2014-03-01
To evaluate the effect of metal artifact reduction techniques on dGEMRIC T1 calculation with surgical hardware present. We examined the effect of stainless-steel and titanium hardware on dGEMRIC T1 maps. We tested two strategies to reduce metal artifact in dGEMRIC: (1) saturation recovery (SR) instead of inversion recovery (IR) and (2) applying the metal artifact reduction sequence (MARS), in a gadolinium-doped agarose gel phantom and in vivo with titanium hardware. T1 maps were obtained using custom curve-fitting software, and phantom ROIs were defined to compare conditions (metal, MARS, IR, SR). A large area of artifact appeared in phantom IR images with metal when the inversion time TI was ≤ 700 ms. IR maps with metal had additional artifact both in vivo and in the phantom (shifted null points, increased mean T1 (+151% in the artifact ROI) and decreased mean inversion efficiency f (0.45 in the artifact ROI, versus 2 for perfect inversion)) compared to the SR maps (artifact ROI: +13% T1 for SR, 0.95 versus 1 for perfect excitation); however, SR produced noisier T1 maps than IR (phantom SNR: 118 SR, 212 IR). MARS subtly reduced the extent of artifact in the phantom (IR and SR). dGEMRIC measurement in the presence of surgical hardware at 3T is possible with appropriately applied strategies. Measurements may work best in the presence of titanium and are severely limited with stainless steel. For regions near hardware where IR produces large artifacts making dGEMRIC analysis impossible, SR-MARS may allow dGEMRIC measurements. The position and size of the IR artifact are variable and must be assessed for each implant/imaging set-up.
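For context, T1 mapping of this kind fits a relaxation model per voxel; a hedged sketch of the two signal models at issue is shown below, using scipy's curve_fit. The starting values and the magnitude-signal assumption are illustrative, not taken from the study's custom software:

```python
import numpy as np
from scipy.optimize import curve_fit

def ir_model(ti, s0, t1, f):
    """Inversion-recovery magnitude signal; f = 2 for a perfect inversion."""
    return np.abs(s0 * (1.0 - f * np.exp(-ti / t1)))

def sr_model(ts, s0, t1):
    """Saturation-recovery signal; no inversion-efficiency term to corrupt near metal."""
    return s0 * (1.0 - np.exp(-ts / t1))

def fit_ir(ti, signal):
    p0 = [signal.max(), 800.0, 2.0]               # initial guesses: S0, T1 (ms), f
    (s0, t1, f), _ = curve_fit(ir_model, ti, signal, p0=p0, maxfev=5000)
    return t1, f

def fit_sr(ts, signal):
    p0 = [signal.max(), 800.0]                    # initial guesses: S0, T1 (ms)
    (s0, t1), _ = curve_fit(sr_model, ts, signal, p0=p0, maxfev=5000)
    return t1
```

The comparison in the abstract corresponds to f dropping well below 2 near metal in IR (biasing T1), whereas the SR model has no such term.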
A Robust Post-Processing Workflow for Datasets with Motion Artifacts in Diffusion Kurtosis Imaging
Li, Xianjun; Yang, Jian; Gao, Jie; Luo, Xue; Zhou, Zhenyu; Hu, Yajie; Wu, Ed X.; Wan, Mingxi
2014-01-01
Purpose: The aim of this study was to develop a robust post-processing workflow for motion-corrupted datasets in diffusion kurtosis imaging (DKI). Materials and methods: The proposed workflow consisted of brain extraction, rigid registration, distortion correction, artifacts rejection, spatial smoothing and tensor estimation. Rigid registration was utilized to correct misalignments. Motion artifacts were rejected by using local Pearson correlation coefficient (LPCC). The performance of LPCC in characterizing relative differences between artifacts and artifact-free images was compared with that of the conventional correlation coefficient in 10 randomly selected DKI datasets. The influence of rejected artifacts with information of gradient directions and b values for the parameter estimation was investigated by using mean square error (MSE). The variance of noise was used as the criterion for MSEs. The clinical practicality of the proposed workflow was evaluated by the image quality and measurements in regions of interest on 36 DKI datasets, including 18 artifact-free (18 pediatric subjects) and 18 motion-corrupted datasets (15 pediatric subjects and 3 essential tremor patients). Results: The relative difference between artifacts and artifact-free images calculated by LPCC was larger than that of the conventional correlation coefficient (p<0.05). It indicated that LPCC was more sensitive in detecting motion artifacts. MSEs of all derived parameters from the reserved data after the artifacts rejection were smaller than the variance of the noise. It suggested that influence of rejected artifacts was less than influence of noise on the precision of derived parameters. The proposed workflow improved the image quality and reduced the measurement biases significantly on motion-corrupted datasets (p<0.05). Conclusion: The proposed post-processing workflow was reliable to improve the image quality and the measurement precision of the derived parameters on motion-corrupted DKI datasets. The workflow provided an effective post-processing method for clinical applications of DKI in subjects with involuntary movements. PMID:24727862
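A hedged numpy sketch of a local Pearson correlation coefficient between a diffusion-weighted slice and a reference slice, the kind of score that can be thresholded to reject motion-corrupted data. The patch size, the use of non-overlapping patches, and the averaging are illustrative assumptions rather than the paper's exact definition:

```python
import numpy as np

def local_pearson_cc(slice_img, ref_img, patch=8):
    """Mean Pearson correlation over corresponding local patches; low values
    suggest the slice is motion-corrupted relative to the reference."""
    h, w = slice_img.shape
    coeffs = []
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            a = slice_img[i:i + patch, j:j + patch].ravel().astype(float)
            b = ref_img[i:i + patch, j:j + patch].ravel().astype(float)
            if a.std() > 0 and b.std() > 0:                  # skip flat background patches
                coeffs.append(np.corrcoef(a, b)[0, 1])
    return float(np.mean(coeffs)) if coeffs else 0.0

# a slice (or volume) is rejected when its LPCC falls below a chosen threshold
```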
Prior-based artifact correction (PBAC) in computed tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heußer, Thorsten, E-mail: thorsten.heusser@dkfz-heidelberg.de; Brehm, Marcus; Ritschl, Ludwig
2014-02-15
Purpose: Image quality in computed tomography (CT) often suffers from artifacts which may reduce the diagnostic value of the image. In many cases, these artifacts result from missing or corrupt regions in the projection data, e.g., in the case of metal, truncation, and limited angle artifacts. The authors propose a generalized correction method for different kinds of artifacts resulting from missing or corrupt data by making use of available prior knowledge to perform data completion. Methods: The proposed prior-based artifact correction (PBAC) method requires prior knowledge in the form of a planning CT of the same patient or in the form of a CT scan of a different patient showing the same body region. In both cases, the prior image is registered to the patient image using a deformable transformation. The registered prior is forward projected and data completion of the patient projections is performed using smooth sinogram inpainting. The obtained projection data are used to reconstruct the corrected image. Results: The authors investigate metal and truncation artifacts in patient data sets acquired with a clinical CT and limited angle artifacts in an anthropomorphic head phantom data set acquired with a gantry-based flat detector CT device. In all cases, the corrected images obtained by PBAC are nearly artifact-free. Compared to conventional correction methods, PBAC achieves better artifact suppression while preserving the patient-specific anatomy at the same time. Further, the authors show that prominent anatomical details in the prior image seem to have only minor impact on the correction result. Conclusions: The results show that PBAC has the potential to effectively correct for metal, truncation, and limited angle artifacts if adequate prior data are available. Since the proposed method makes use of a generalized algorithm, PBAC may also be applicable to other artifacts resulting from missing or corrupt data.
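A compact sketch of the data-completion idea, using scikit-image's radon/iradon as the projector pair. Real PBAC performs smooth sinogram inpainting at the boundaries of the corrupt regions; here the replacement is a hard swap, and the mask, geometry, and parallel-beam assumption are illustrative:

```python
import numpy as np
from skimage.transform import radon, iradon

def pbac_sketch(patient_sino, corrupt_mask, registered_prior, theta):
    """patient_sino and corrupt_mask: (n_detectors, n_angles) arrays;
    registered_prior: prior image already deformably registered to the patient;
    theta: projection angles in degrees."""
    prior_sino = radon(registered_prior, theta=theta, circle=True)
    completed = np.where(corrupt_mask, prior_sino, patient_sino)   # fill corrupt bins from the prior
    return iradon(completed, theta=theta, circle=True)             # reconstruct the completed sinogram
```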
Kalénine, Solène; Buxbaum, Laurel J.
2016-01-01
Converging evidence supports the existence of functionally and neuroanatomically distinct taxonomic (similarity-based; e.g., hammer-screwdriver) and thematic (event-based; e.g., hammer-nail) semantic systems. Processing of thematic relations between objects has been shown to selectively recruit the left posterior temporoparietal cortex. Similar posterior regions have also been shown to be critical for knowledge of relationships between actions and manipulable human-made objects (artifacts). Based on the hypothesis that thematic relationships for artifacts are based, at least in part, on action relationships, we assessed the prediction that the same regions of the left posterior temporoparietal cortex would be critical for conceptual processing of artifact-related actions and thematic relations for artifacts. To test this hypothesis, we evaluated processing of taxonomic and thematic relations for artifact and natural objects as well as artifact action knowledge (gesture recognition) abilities in a large sample of 48 stroke patients with a range of lesion foci in the left hemisphere. Like control participants, patients identified thematic relations faster than taxonomic relations for artifacts, whereas they identified taxonomic relations faster than thematic relations for natural objects. Moreover, response times for identifying thematic relations for artifacts selectively predicted performance in gesture recognition. Whole brain Voxel Based Lesion-Symptom Mapping (VLSM) analyses and Region of Interest (ROI) regression analyses further demonstrated that lesions to the left posterior temporal cortex, overlapping with LTO and visual motion area hMT+, were associated both with relatively slower response times in identifying thematic relations for artifacts and poorer artifact action knowledge in patients. These findings provide novel insights into the functional role of left posterior temporal cortex in thematic knowledge, and suggest that the close association between thematic relations for artifacts and action representations may reflect their common dependence on visual motion and manipulation information. PMID:27389801
A robust post-processing workflow for datasets with motion artifacts in diffusion kurtosis imaging.
Li, Xianjun; Yang, Jian; Gao, Jie; Luo, Xue; Zhou, Zhenyu; Hu, Yajie; Wu, Ed X; Wan, Mingxi
2014-01-01
The aim of this study was to develop a robust post-processing workflow for motion-corrupted datasets in diffusion kurtosis imaging (DKI). The proposed workflow consisted of brain extraction, rigid registration, distortion correction, artifacts rejection, spatial smoothing and tensor estimation. Rigid registration was utilized to correct misalignments. Motion artifacts were rejected by using local Pearson correlation coefficient (LPCC). The performance of LPCC in characterizing relative differences between artifacts and artifact-free images was compared with that of the conventional correlation coefficient in 10 randomly selected DKI datasets. The influence of rejected artifacts with information of gradient directions and b values for the parameter estimation was investigated by using mean square error (MSE). The variance of noise was used as the criterion for MSEs. The clinical practicality of the proposed workflow was evaluated by the image quality and measurements in regions of interest on 36 DKI datasets, including 18 artifact-free (18 pediatric subjects) and 18 motion-corrupted datasets (15 pediatric subjects and 3 essential tremor patients). The relative difference between artifacts and artifact-free images calculated by LPCC was larger than that of the conventional correlation coefficient (p<0.05). It indicated that LPCC was more sensitive in detecting motion artifacts. MSEs of all derived parameters from the reserved data after the artifacts rejection were smaller than the variance of the noise. It suggested that influence of rejected artifacts was less than influence of noise on the precision of derived parameters. The proposed workflow improved the image quality and reduced the measurement biases significantly on motion-corrupted datasets (p<0.05). The proposed post-processing workflow was reliable to improve the image quality and the measurement precision of the derived parameters on motion-corrupted DKI datasets. The workflow provided an effective post-processing method for clinical applications of DKI in subjects with involuntary movements.
Dengg, S; Kneissl, S
2013-01-01
Ferromagnetic material in microchips, used for animal identification, causes local signal increase, signal void or distortion (susceptibility artifact) on MR images. To measure the impact of microchip geometry on the artifact's size, an MRI phantom study was performed. Microchips of the labels Datamars®, Euro-I.D.® and Planet-ID® (n = 15) were placed consecutively in a phantom and examined with respect to the ASTM Standard Test Method F2119-07 using spin echo (TR 500 ms, TE 20 ms), gradient echo (TR 300 ms, TE 15 ms, flip angle 30°) and otherwise constant imaging parameters (slice thickness 3 mm, field of view 250 x 250 mm, acquisition matrix 256 x 256 pixels, bandwidth 32 kHz) at 1.5 Tesla. Image acquisition was undertaken with a microchip positioned in the x- and z-direction and in each case with a phase-encoding direction in the y- and z-direction. The artifact size was determined with a) a measurement according to the test method F2119-07 using a homogeneous point operation, b) signal intensity measurement according to Matsuura et al. and c) pixel counts in the artifact according to Port and Pomper. There was a significant difference in artifact size between the three microchips tested (Wilcoxon p = 0.032). A two- to three-fold increase in microchip volume generated an up to 76% larger artifact, depending on the sequence type, phase-encoding direction and chip position relative to B0. The smaller the microchip geometry, the smaller the susceptibility artifact. Spin echoes (SE) generated smaller artifacts than gradient echoes (GE). In relation to the spatial measurement of the artifact, the switch in phase-encoding direction had less influence on the artifact size in GE than in SE sequences. However, the artifact shape and direction of SE sequences can be changed by altering the phase. The artifact size caused by the microchip plays a major clinical role in the evaluation of MRI of the head, shoulder and neck regions.
Plöchl, Michael; Ossandón, José P.; König, Peter
2012-01-01
Eye movements introduce large artifacts to electroencephalographic recordings (EEG) and thus render data analysis difficult or even impossible. Trials contaminated by eye movement and blink artifacts have to be discarded, hence in standard EEG-paradigms subjects are required to fixate on the screen. To overcome this restriction, several correction methods including regression and blind source separation have been proposed. Yet, there is no automated standard procedure established. By simultaneously recording eye movements and 64-channel-EEG during a guided eye movement paradigm, we investigate and review the properties of eye movement artifacts, including corneo-retinal dipole changes, saccadic spike potentials and eyelid artifacts, and study their interrelations during different types of eye- and eyelid movements. In concordance with earlier studies, our results confirm that these artifacts arise from different independent sources and that depending on electrode site, gaze direction, and choice of reference these sources contribute differently to the measured signal. We assess the respective implications for artifact correction methods and therefore compare the performance of two prominent approaches, namely linear regression and independent component analysis (ICA). We show and discuss that due to the independence of eye artifact sources, regression-based correction methods inevitably over- or under-correct individual artifact components, while ICA is in principle suited to address such mixtures of different types of artifacts. Finally, we propose an algorithm, which uses eye tracker information to objectively identify eye-artifact related ICA-components (ICs) in an automated manner. In the data presented here, the algorithm performed very similarly to human experts when those were given both the topographies of the ICs and their respective activations in a large number of trials. Moreover, it performed more reliably and was almost twice as effective as human experts when those had to base their decision on IC topographies only. Furthermore, a receiver operating characteristic (ROC) analysis demonstrated an optimal balance of false positives and false negatives at an area under curve (AUC) of more than 0.99. Removing the automatically detected ICs from the data resulted in removal or substantial suppression of ocular artifacts, including microsaccadic spike potentials, while the relevant neural signal remained unaffected. In conclusion, the present work aims at a better understanding of individual eye movement artifacts, their interrelations and the respective implications for eye artifact correction. Additionally, the proposed ICA-procedure provides a tool for optimized detection and correction of eye movement-related artifact components. PMID:23087632
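The paper's algorithm uses richer eye-tracker-derived criteria than a plain correlation, but the basic idea of scoring each IC against the gaze data can be sketched as below; the correlation threshold and the assumption that EEG and gaze are already synchronized and equally sampled are illustrative:

```python
import numpy as np

def eye_related_ics(ic_activations, gaze_x, gaze_y, r_thresh=0.3):
    """ic_activations: (n_components, n_samples) ICA time courses;
    gaze_x, gaze_y: eye tracker position traces of the same length.
    Returns indices of components that track the gaze signals."""
    flagged = []
    for k, act in enumerate(ic_activations):
        rx = abs(np.corrcoef(act, gaze_x)[0, 1])
        ry = abs(np.corrcoef(act, gaze_y)[0, 1])
        if max(rx, ry) > r_thresh:
            flagged.append(k)
    return flagged

# flagged components are zeroed out before projecting the ICs back to channel space
```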
ERIC Educational Resources Information Center
Nicholson, Scott
2005-01-01
Archaeologists have used material artifacts found in a physical space to gain an understanding about the people who occupied that space. Likewise, as users wander through a digital library, they leave behind data-based artifacts of their activity in the virtual space. Digital library archaeologists can gather these artifacts and employ inductive…
ERIC Educational Resources Information Center
Banerjee, Konika; Kominsky, Jonathan F.; Fernando, Madhawee; Keil, Frank C.
2015-01-01
Across 3 experiments, we found evidence that information about who owns an artifact influenced 5- to 10-year-old children's and adults' judgments about that artifact's primary function. Children's and adults' use of ownership information was underpinned by their inference that owners are typically familiar with owned artifacts and are therefore…
Sampling and handling artifacts can bias filter-based measurements of particulate organic carbon (OC). Several measurement-based methods for OC artifact reduction and/or estimation are currently used in research-grade field studies. OC frequently is not artifact-corrected in larg...
Evaluating Experimental Artifacts in Hydrothermal Prebiotic Synthesis Experiments
NASA Astrophysics Data System (ADS)
Smirnov, Alexander; Schoonen, Martin A. A.
2003-04-01
Control experiments with ultra pure deionized water were conducted to evaluate the organic contamination in hydrothermal prebiotic experiments. Different combinations of reaction vessel material, sampling tubing and stirring were tested and the amounts of organic contaminants determined. All tested types of polymer tubing were proven to introduce organic contaminants (formate, acetate and propionate ions) into the reacting solution. Stainless steel has a catalytic effect on the decomposition of formate, consistent with earlier work at high temperatures and pressures.
Superpower Crises in a Less Confrontational World: Results of an Experimental Simulation
1990-04-01
interteam communications via RAND's electronic mail system for gaming purposes. The "pilot runs" were designated Games Three and Four, and are referred... game, the impact of a changing real-world international context - a more benign image of the adversary and a less competitive superpower relationship... information overload they faced. Several believed that these problems were game artifacts (which they attributed to our use of the electronic mail system
Immediate spectral flexibility in singing chiffchaffs during experimental exposure to highway noise.
Verzijden, M N; Ripmeester, E A P; Ohms, V R; Snelderwaard, P; Slabbekoorn, H
2010-08-01
Sound plays an important role in the life of many animals, including many bird species. Typically, male birds sing to defend a territory and to attract mates. Ambient noise may negatively affect the signal efficiency of their songs, which may be critical to reproductive success. Consequently, anthropogenic noise may be detrimental to individual birds and to populations in cities and along highways. Several bird species that are still common in urban areas have been shown to sing at higher frequency at locations where there is more low-frequency traffic noise. Here we show that chiffchaffs along noisy highways also sing with a higher minimum frequency than chiffchaffs nearby at a quiet riverside. Furthermore, through experimental exposure to highway noise we show that these birds are capable of making such adjustments over a very short time scale. The first 10 songs sung during the noise exposure revealed an immediate shift to higher frequencies, with a return to pre-exposure levels in recordings without noise the following day. In a transmission re-recording experiment we tested the impact of a potential measurement artifact by recording playback of the same songs repeatedly under different controlled noise conditions. We found an upward shift in the minimum frequency measurement associated with noisier recordings of the same song, but this artifact was not of a scale that could explain the noise-dependent spectral shifts in chiffchaffs.
Li, Zhenyu; Wang, Bin; Liu, Hong
2016-08-30
Satellite capturing with free-floating space robots is still a challenging task due to the non-fixed base and unknown mass property issues. In this paper gyro and eye-in-hand camera data are adopted as an alternative choice for solving this problem. For this improved system, a new modeling approach that reduces the complexity of system control and identification is proposed. With the newly developed model, the space robot is equivalent to a ground-fixed manipulator system. Accordingly, a self-tuning control scheme is applied to handle such a control problem including unknown parameters. To determine the controller parameters, an estimator is designed based on the least-squares technique for identifying the unknown mass properties in real time. The proposed method is tested with a credible 3-dimensional ground verification experimental system, and the experimental results confirm the effectiveness of the proposed control scheme.
Li, Zhenyu; Wang, Bin; Liu, Hong
2016-01-01
Satellite capturing with free-floating space robots is still a challenging task due to the non-fixed base and unknown mass property issues. In this paper gyro and eye-in-hand camera data are adopted as an alternative choice for solving this problem. For this improved system, a new modeling approach that reduces the complexity of system control and identification is proposed. With the newly developed model, the space robot is equivalent to a ground-fixed manipulator system. Accordingly, a self-tuning control scheme is applied to handle such a control problem including unknown parameters. To determine the controller parameters, an estimator is designed based on the least-squares technique for identifying the unknown mass properties in real time. The proposed method is tested with a credible 3-dimensional ground verification experimental system, and the experimental results confirm the effectiveness of the proposed control scheme. PMID:27589748
Structure Elucidation of Unknown Metabolites in Metabolomics by Combined NMR and MS/MS Prediction
Boiteau, Rene M.; Hoyt, David W.; Nicora, Carrie D.; ...
2018-01-17
Here, we introduce a cheminformatics approach that combines highly selective and orthogonal structure elucidation parameters; accurate mass, MS/MS (MS2), and NMR in a single analysis platform to accurately identify unknown metabolites in untargeted studies. The approach starts with an unknown LC-MS feature, and then combines the experimental MS/MS and NMR information of the unknown to effectively filter the false positive candidate structures based on their predicted MS/MS and NMR spectra. We demonstrate the approach on a model mixture and then we identify an uncatalogued secondary metabolite in Arabidopsis thaliana. The NMR/MS2 approach is well suited for discovery of new metabolites in plant extracts, microbes, soils, dissolved organic matter, food extracts, biofuels, and biomedical samples, facilitating the identification of metabolites that are not present in experimental NMR and MS metabolomics databases.
Structure Elucidation of Unknown Metabolites in Metabolomics by Combined NMR and MS/MS Prediction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boiteau, Rene M.; Hoyt, David W.; Nicora, Carrie D.
Here, we introduce a cheminformatics approach that combines highly selective and orthogonal structure elucidation parameters; accurate mass, MS/MS (MS2), and NMR in a single analysis platform to accurately identify unknown metabolites in untargeted studies. The approach starts with an unknown LC-MS feature, and then combines the experimental MS/MS and NMR information of the unknown to effectively filter the false positive candidate structures based on their predicted MS/MS and NMR spectra. We demonstrate the approach on a model mixture and then we identify an uncatalogued secondary metabolite in Arabidopsis thaliana. The NMR/MS2 approach is well suited for discovery of new metabolites in plant extracts, microbes, soils, dissolved organic matter, food extracts, biofuels, and biomedical samples, facilitating the identification of metabolites that are not present in experimental NMR and MS metabolomics databases.
Structure Elucidation of Unknown Metabolites in Metabolomics by Combined NMR and MS/MS Prediction
Hoyt, David W.; Nicora, Carrie D.; Kinmonth-Schultz, Hannah A.; Ward, Joy K.
2018-01-01
We introduce a cheminformatics approach that combines highly selective and orthogonal structure elucidation parameters; accurate mass, MS/MS (MS2), and NMR into a single analysis platform to accurately identify unknown metabolites in untargeted studies. The approach starts with an unknown LC-MS feature, and then combines the experimental MS/MS and NMR information of the unknown to effectively filter out the false positive candidate structures based on their predicted MS/MS and NMR spectra. We demonstrate the approach on a model mixture, and then we identify an uncatalogued secondary metabolite in Arabidopsis thaliana. The NMR/MS2 approach is well suited to the discovery of new metabolites in plant extracts, microbes, soils, dissolved organic matter, food extracts, biofuels, and biomedical samples, facilitating the identification of metabolites that are not present in experimental NMR and MS metabolomics databases. PMID:29342073
Madaan, Nitesh; Bao, Jie; Nandasiri, Manjula I.; ...
2015-08-31
The experimental atom probe tomography results from two different specimen orientations (top-down and sideways) of a high oxygen ion conducting Samaria-doped-ceria/Scandia-stabilized-zirconia multilayer thin film solid oxide fuel cell electrolyte were correlated with level-set method based field evaporation simulations for the same specimen orientations. This experiment-theory correlation explains the dynamic specimen shape evolution and ion trajectory aberrations that can induce density artifacts in the final reconstruction, leading to inaccurate estimation of interfacial intermixing. This study highlights the need and importance of correlating experimental results with field evaporation simulations when using atom probe tomography for studying oxide heterostructure interfaces.
Jamzad, Amoon; Setarehdan, Seyed Kamaledin
2014-04-01
The twinkling artifact is an undesired phenomenon within color Doppler sonograms that usually appears at the site of internal calcifications. Since the appearance of the twinkling artifact is correlated with the roughness of the calculi, noninvasive roughness estimation of internal stones may be considered a potential twinkling artifact application. This article proposes a novel quantitative approach for measurement and analysis of twinkling artifact data for roughness estimation. A phantom was developed with 7 quantified levels of roughness. The Doppler system was initially calibrated by the proposed procedure to facilitate the analysis. A total of 1050 twinkling artifact images were acquired from the phantom, and 32 novel numerical measures were introduced and computed for each image. The measures were then ranked on the basis of roughness quantification ability using different methods. The performance of the proposed twinkling artifact-based surface roughness quantification method was finally investigated for different combinations of features and classifiers. Eleven features were shown to be the most efficient numerical twinkling artifact measures in roughness characterization. The linear classifier outperformed the other methods for twinkling artifact classification. The pixel count measures produced better results than the other categories. The sequential selection method showed higher accuracy than the other individual rankings. The best roughness recognition average accuracy of 98.33% was obtained with the first 5 principal components and the linear classifier. The proposed twinkling artifact analysis method could recognize the phantom surface roughness with an average accuracy of 98.33%. This method may also be applicable for noninvasive calculi characterization in treatment management.
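A minimal sketch of the kind of pipeline suggested by the reported result (first 5 principal components plus a linear classifier): PCA followed by linear discriminant analysis, cross-validated on a feature matrix. The feature values, labels and library choices here are synthetic placeholders, not the study's data or code.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Hypothetical data: 1050 twinkling-artifact images x 32 numerical measures,
# with 7 roughness levels as class labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(1050, 32))
y = rng.integers(0, 7, size=1050)

# Keep the first 5 principal components, then apply a linear classifier.
clf = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)
print("mean cross-validated accuracy:", scores.mean())
```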
NASA Astrophysics Data System (ADS)
Dong, Xue; Yang, Xiaofeng; Rosenfield, Jonathan; Elder, Eric; Dhabaan, Anees
2017-03-01
X-ray computed tomography (CT) has been widely used in radiation therapy treatment planning in recent years. However, metal implants such as dental fillings and hip prostheses can cause severe bright and dark streaking artifacts in reconstructed CT images. These artifacts decrease image contrast and degrade HU accuracy, leading to inaccuracies in target delineation and dose calculation. In this work, a metal artifact reduction method is proposed based on the intrinsic anatomical similarity between neighboring CT slices. Neighboring CT slices from the same patient exhibit similar anatomical features. Exploiting this anatomical similarity, a gamma map is calculated as a weighted summation of relative HU error and distance error for each pixel in an artifact-corrupted CT image relative to a neighboring, artifact-free image. The minimum value in the gamma map for each pixel is used to identify an appropriate pixel from the artifact-free CT slice to replace the corresponding artifact-corrupted pixel. With the proposed method, the mean CT HU error was reduced from 360 HU and 460 HU to 24 HU and 34 HU on head and pelvis CT images, respectively. Dose calculation accuracy also improved, as the dose difference was reduced from greater than 20% to less than 4%. Using 3%/3 mm criteria, the gamma analysis failure rate was reduced from 23.25% to 0.02%. An image-based metal artifact reduction method is proposed that replaces corrupted image pixels with pixels from neighboring CT slices free of metal artifacts. This method is shown to be capable of suppressing streaking artifacts, thereby improving HU and dose calculation accuracy.
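A rough numpy sketch of the gamma-map idea described above: for each flagged artifact pixel, a small window of a neighboring artifact-free slice is searched and the pixel minimizing a weighted sum of relative HU error and distance error is copied in. The window radius, tolerances and the brute-force loop are illustrative assumptions; the authors' exact weighting and pixel-selection rules are not reproduced here.

```python
import numpy as np

def gamma_map_replace(corrupted, clean_neighbor, artifact_mask,
                      hu_tol=50.0, dist_tol=3.0, radius=3):
    """For every pixel flagged in `artifact_mask`, search a small window of the
    neighboring artifact-free slice and copy the pixel with the smallest gamma
    value, where gamma is a weighted sum of relative HU error and distance error."""
    out = corrupted.astype(float).copy()
    rows, cols = corrupted.shape
    for r, c in zip(*np.nonzero(artifact_mask)):
        r0, r1 = max(0, r - radius), min(rows, r + radius + 1)
        c0, c1 = max(0, c - radius), min(cols, c + radius + 1)
        window = clean_neighbor[r0:r1, c0:c1].astype(float)
        rr, cc = np.meshgrid(np.arange(r0, r1), np.arange(c0, c1), indexing="ij")
        hu_err = np.abs(window - corrupted[r, c]) / hu_tol
        dist_err = np.sqrt((rr - r) ** 2 + (cc - c) ** 2) / dist_tol
        gamma = hu_err + dist_err                       # weighted sum of both errors
        best = np.unravel_index(np.argmin(gamma), gamma.shape)
        out[r, c] = window[best]
    return out
```

The explicit per-pixel loop keeps the logic readable; a practical implementation would vectorize or restrict the search to the metal-artifact region only.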
O'Daniel, Jennifer C; Rosenthal, David I; Garden, Adam S; Barker, Jerry L; Ahamad, Anesa; Ang, K Kian; Asper, Joshua A; Blanco, Angel I; de Crevoisier, Renaud; Holsinger, F Christopher; Patel, Chirag B; Schwartz, David L; Wang, He; Dong, Lei
2007-04-01
To investigate interobserver variability in the delineation of head-and-neck (H&N) anatomic structures on CT images, including the effects of image artifacts and observer experience. Nine observers (7 radiation oncologists, 1 surgeon, and 1 physician assistant) with varying levels of H&N delineation experience independently contoured H&N gross tumor volumes and critical structures on radiation therapy treatment planning CT images alongside reference diagnostic CT images for 4 patients with oropharynx cancer. Image artifacts from dental fillings partially obstructed 3 images. Differences in the structure volumes, center-of-volume positions, and boundary positions (1 SD) were measured. In-house software created three-dimensional overlap distributions, including all observers. The effects of dental artifacts and observer experience on contouring precision were investigated, and the need for contrast media was assessed. In the absence of artifacts, all 9 participants achieved reasonable precision (1 SD ≤ 3 mm for all boundaries). The structures obscured by dental image artifacts had larger variations when measured by the 3 metrics (1 SD = 8 mm at the cranial/caudal boundary). Experience improved the interobserver consistency of contouring for structures obscured by artifacts (1 SD = 2 mm at the cranial/caudal boundary). Interobserver contouring variability for anatomic H&N structures, specifically oropharyngeal gross tumor volumes and parotid glands, was acceptable in the absence of artifacts. Dental artifacts increased the contouring variability, but experienced participants achieved reasonable precision even with artifacts present. With a staging contrast CT image as a reference, delineation on a noncontrast treatment planning CT image can achieve acceptable precision.
Siniatchkin, Michael; Moeller, Friederike; Jacobs, Julia; Stephani, Ulrich; Boor, Rainer; Wolff, Stephan; Jansen, Olav; Siebner, Hartwig; Scherg, Michael
2007-09-01
The ballistocardiogram (BCG) represents one of the most prominent sources of artifacts that contaminate the electroencephalogram (EEG) during functional MRI. The BCG artifacts may affect the detection of interictal epileptiform discharges (IED) in patients with epilepsy, reducing the sensitivity of the combined EEG-fMRI method. In this study we improved the BCG artifact correction using a multiple source correction (MSC) approach. On the one hand, a source analysis of the IEDs was applied to the EEG data obtained outside the MRI scanner to prevent the distortion of EEG signals of interest during the correction of BCG artifacts. On the other hand, the topographies of the BCG artifacts were defined based on the EEG recorded inside the scanner. The topographies of the BCG artifacts were then added to the surrogate model of IED sources and a combined source model was applied to the data obtained inside the scanner. The artifact signal was then subtracted without considerable distortion of the IED topography. The MSC approach was compared with the traditional averaged artifact subtraction (AAS) method. Both methods reduced the spectral power of BCG-related harmonics and enabled better detection of IEDs. Compared with the conventional AAS method, the MSC approach increased the sensitivity of IED detection because the IED signal was less attenuated when subtracting the BCG artifacts. The proposed MSC method is particularly useful in situations in which the BCG artifact is spatially correlated and time-locked with the EEG signal produced by the focal brain activity of interest.
Automated Classification and Removal of EEG Artifacts With SVM and Wavelet-ICA.
Sai, Chong Yeh; Mokhtar, Norrima; Arof, Hamzah; Cumming, Paul; Iwahashi, Masahiro
2018-05-01
Brain electrical activity recordings by electroencephalography (EEG) are often contaminated with signal artifacts. Procedures for automated removal of EEG artifacts are frequently sought for clinical diagnostics and brain-computer interface applications. In recent years, a combination of independent component analysis (ICA) and discrete wavelet transform has been introduced as standard technique for EEG artifact removal. However, in performing the wavelet-ICA procedure, visual inspection or arbitrary thresholding may be required for identifying artifactual components in the EEG signal. We now propose a novel approach for identifying artifactual components separated by wavelet-ICA using a pretrained support vector machine (SVM). Our method presents a robust and extendable system that enables fully automated identification and removal of artifacts from EEG signals, without applying any arbitrary thresholding. Using test data contaminated by eye blink artifacts, we show that our method performed better in identifying artifactual components than did existing thresholding methods. Furthermore, wavelet-ICA in conjunction with SVM successfully removed target artifacts, while largely retaining the EEG source signals of interest. We propose a set of features including kurtosis, variance, Shannon's entropy, and range of amplitude as training and test data of SVM to identify eye blink artifacts in EEG signals. This combinatorial method is also extendable to accommodate multiple types of artifacts present in multichannel EEG. We envision future research to explore other descriptive features corresponding to other types of artifactual components.
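The classifier inputs named above (kurtosis, variance, Shannon's entropy and amplitude range) are straightforward to compute; the sketch below trains a generic SVM on such features for synthetic blink-like versus neural-like components. The wavelet-ICA decomposition itself is omitted, and the heavy-tailed surrogate data, histogram-based entropy estimate and RBF kernel are assumptions for illustration, not the authors' trained model.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.svm import SVC

def component_features(component, n_bins=50):
    """Kurtosis, variance, Shannon entropy, and amplitude range of one
    decomposed component (1-D array)."""
    hist, _ = np.histogram(component, bins=n_bins, density=True)
    p = hist[hist > 0]
    p = p / p.sum()
    entropy = -np.sum(p * np.log2(p))
    return np.array([kurtosis(component), np.var(component),
                     entropy, np.ptp(component)])

# Hypothetical training data: components labelled artifact (1) / neural (0).
rng = np.random.default_rng(0)
artifact_like = rng.laplace(0, 3, size=(40, 1000))   # heavy-tailed, blink-like
neural_like = rng.normal(0, 1, size=(40, 1000))
X = np.vstack([component_features(c)
               for c in np.vstack([artifact_like, neural_like])])
y = np.array([1] * 40 + [0] * 40)

svm = SVC(kernel="rbf").fit(X, y)
print(svm.predict(component_features(rng.laplace(0, 3, size=1000)).reshape(1, -1)))
```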
Naehle, Claas P; Hechelhammer, Lukas; Richter, Heiko; Ryffel, Fabian; Wildermuth, Simon; Weber, Johannes
To evaluate the effectiveness and clinical utility of a metal artifact reduction (MAR) image reconstruction algorithm for the reduction of high-attenuation object (HAO)-related image artifacts. Images were quantitatively evaluated for image noise (noiseSD and noiserange) and qualitatively for artifact severity, gray-white-matter delineation, and diagnostic confidence with conventional reconstruction and after applying a MAR algorithm. Metal artifact reduction reduces noiseSD and noiserange (median [interquartile range]) at the level of HAO in 1-cm distance compared with conventional reconstruction (noiseSD: 60.0 [71.4] vs 12.8 [16.1] and noiserange: 262.0 [236.8] vs 72.0 [28.3]; P < 0.0001). Artifact severity (reader 1 [mean ± SD]: 1.1 ± 0.6 vs 2.4 ± 0.5, reader 2: 0.8 ± 0.6 vs 2.0 ± 0.4) at level of HAO and diagnostic confidence (reader 1: 1.6 ± 0.7 vs 2.6 ± 0.5, reader 2: 1.0 ± 0.6 vs 2.3 ± 0.7) significantly improved with MAR (P < 0.0001). Metal artifact reduction did not affect gray-white-matter delineation. Metal artifact reduction effectively reduces image artifacts caused by HAO and significantly improves diagnostic confidence without worsening gray-white-matter delineation.
Preschoolers Favor the Creator's Label when Reasoning about an Artifact's Function
ERIC Educational Resources Information Center
Jaswal, Vikram K.
2006-01-01
The creator of an artifact, by virtue of having made the object, has privileged knowledge about its intended function. Do children recognize that the label an artifact's creator uses can convey this privileged information? 3- and 4-year-olds were presented with an object that looked like a member of one familiar artifact category, but which the…
ERIC Educational Resources Information Center
Brandone, Amanda C.; Gelman, Susan A.
2013-01-01
The goal of the present study was to explore domain differences in young children's expectations about the structure of animal and artifact categories. We examined 5-year-olds' and adults' use of category-referring generic noun phrases (e.g., "Birds fly") about novel animals and artifacts. The same stimuli served as both animals and artifacts;…
Fieselmann, Andreas; Dennerlein, Frank; Deuerling-Zheng, Yu; Boese, Jan; Fahrig, Rebecca; Hornegger, Joachim
2011-06-21
Filtered backprojection is the basis for many CT reconstruction tasks. It assumes constant attenuation values of the object during the acquisition of the projection data. Reconstruction artifacts can arise if this assumption is violated. For example, contrast flow in perfusion imaging with C-arm CT systems, which have acquisition times of several seconds per C-arm rotation, can cause this violation. In this paper, we derived and validated a novel spatio-temporal model to describe these kinds of artifacts. The model separates the temporal dynamics due to contrast flow from the scan and reconstruction parameters. We introduced derivative-weighted point spread functions to describe the spatial spread of the artifacts. The model allows prediction of reconstruction artifacts for given temporal dynamics of the attenuation values. Furthermore, it can be used to systematically investigate the influence of different reconstruction parameters on the artifacts. We have shown that with optimized redundancy weighting function parameters the spatial spread of the artifacts around a typical arterial vessel can be reduced by about 70%. Finally, an inversion of our model could be used as the basis for novel dynamic reconstruction algorithms that further minimize these artifacts.
[Comparison of magnetic resonance imaging artifacts of five common dental materials].
Xu, Yisheng; Yu, Risheng
2015-06-01
To compare five materials commonly used in dentistry, including three types of metals and two types of ceramics, using different sequences at three magnetic resonance imaging (MRI) field strengths (0.35, 1.5, and 3.0 T). Three types of metals and two types of ceramics that were fabricated into the same size and thickness as an incisor crown were placed in a plastic tank filled with saline. The crowns were scanned using a magnetic resonance (MR) machine at 0.35, 1.5, and 3.0 T field strengths. T1WI and T2WI images were obtained. The differences in artifacts among the various materials at the different MR field strengths were determined. The zirconia crown presented no significant artifacts when scanned at the three MRI field strengths. The artifacts of casting ceramic were minimal. All dental precious metal alloys, nickel-chromium alloy dental porcelain, and cobalt-chromium ceramic alloy showed varying degrees of artifacts at the three MRI field strengths. Zirconia and casting ceramics present almost no or faint artifacts. By contrast, precious metal alloys, nickel-chromium alloy dental porcelain and cobalt-chromium ceramic alloy display MRI artifacts. The artifact area increases with increasing magnetic field strength.
Simultaneous ocular and muscle artifact removal from EEG data by exploiting diverse statistics.
Chen, Xun; Liu, Aiping; Chen, Qiang; Liu, Yu; Zou, Liang; McKeown, Martin J
2017-09-01
Electroencephalography (EEG) recordings are frequently contaminated by both ocular and muscle artifacts. These are normally dealt with separately, by employing blind source separation (BSS) techniques relying on either second-order or higher-order statistics (SOS and HOS, respectively). When HOS-based methods are used, it is usually under the assumption that artifacts are statistically independent of the EEG. When SOS-based methods are used, it is assumed that artifacts have autocorrelation characteristics distinct from the EEG. In reality, ocular and muscle artifacts neither strictly follow the assumption of temporal independence from the EEG nor have completely unique autocorrelation characteristics, suggesting that exploiting HOS or SOS alone may be insufficient to remove these artifacts. Here we employ a novel BSS technique, independent vector analysis (IVA), to exploit HOS and SOS simultaneously to remove ocular and muscle artifacts. Numerical simulations and application to real EEG recordings were used to explore the utility of the IVA approach. IVA was superior in isolating both ocular and muscle artifacts, especially for raw EEG data with a low signal-to-noise ratio, and also integrated the usually separate SOS and HOS steps into a single unified step. Copyright © 2017 Elsevier Ltd. All rights reserved.
Artifact removal from EEG signals using adaptive filters in cascade
NASA Astrophysics Data System (ADS)
Garcés Correa, A.; Laciar, E.; Patiño, H. D.; Valentinuzzi, M. E.
2007-11-01
Artifacts in EEG (electroencephalogram) records are caused by various factors, such as line interference, EOG (electro-oculogram) and ECG (electrocardiogram). These noise sources increase the difficulty of analyzing the EEG and of obtaining clinical information. For this reason, it is necessary to design specific filters to decrease such artifacts in EEG records. In this paper, a cascade of three adaptive filters based on a least mean squares (LMS) algorithm is proposed. The first eliminates line interference, the second adaptive filter removes the ECG artifacts and the last one cancels EOG spikes. Each stage uses a finite impulse response (FIR) filter, which adjusts its coefficients to produce an output similar to the artifacts present in the EEG. The proposed cascade adaptive filter was tested on five real EEG records acquired in polysomnographic studies. In all cases, line-frequency, ECG and EOG artifacts were attenuated. It is concluded that the proposed filter reduces the common artifacts present in EEG signals without removing significant information embedded in these records.
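A minimal numpy sketch of one LMS noise-cancelling stage of the kind described above: an FIR filter adapts its coefficients so that its output mimics the artifact predicted from a reference channel, and the filter's error signal is the cleaned EEG; three such stages are then chained for line, ECG and EOG references. The filter length, step size and data handling are illustrative assumptions, not the authors' parameter choices.

```python
import numpy as np

def lms_cancel(primary, reference, n_taps=32, mu=0.01):
    """One LMS adaptive-noise-cancelling stage: an FIR filter adapts its
    coefficients so its output mimics the artifact present in `primary`,
    using `reference` (e.g. an ECG or EOG channel); the filter's error
    signal is the EEG with the artifact removed."""
    w = np.zeros(n_taps)
    cleaned = np.zeros(len(primary), dtype=float)
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps:n][::-1]      # most recent reference samples first
        y = np.dot(w, x)                       # estimated artifact
        e = primary[n] - y                     # cleaned sample (filter error)
        w += 2 * mu * e * x                    # LMS coefficient update
        cleaned[n] = e
    return cleaned

def cascade_clean(eeg, line_ref, ecg_ref, eog_ref):
    """Chain three LMS stages: line interference, then ECG, then EOG."""
    stage1 = lms_cancel(eeg, line_ref)
    stage2 = lms_cancel(stage1, ecg_ref)
    return lms_cancel(stage2, eog_ref)
```

With unscaled signals the step size mu may need tuning to keep the adaptation stable; this is a sketch of the principle rather than a ready-to-use filter.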
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Wenkun; Zhang, Hanming; Li, Lei
2016-08-15
X-ray computed tomography (CT) is a powerful and common inspection technique used for the industrial non-destructive testing. However, large-sized and heavily absorbing objects cause the formation of artifacts because of either the lack of specimen penetration in specific directions or the acquisition of data from only a limited angular range of views. Although the sparse optimization-based methods, such as the total variation (TV) minimization method, can suppress artifacts to some extent, reconstructing the images such that they converge to accurate values remains difficult because of the deficiency in continuous angular data and inconsistency in the projections. To address this problem, we use the idea of regional enhancement of the true values and suppression of the illusory artifacts outside the region to develop an efficient iterative algorithm. This algorithm is based on the combination of regional enhancement of the true values and TV minimization for the limited angular reconstruction. In this algorithm, the segmentation approach is introduced to distinguish the regions of different image knowledge and generate the support mask of the image. A new regularization term, which contains the support knowledge to enhance the true values of the image, is incorporated into the objective function. Then, the proposed optimization model is solved by variable splitting and the alternating direction method efficiently. A compensation approach is also designed to extract useful information from the initial projections and thus reduce false segmentation result and correct the segmentation support and the segmented image. The results obtained from comparing both simulation studies and real CT data set reconstructions indicate that the proposed algorithm generates a more accurate image than do the other reconstruction methods. The experimental results show that this algorithm can produce high-quality reconstructed images for the limited angular reconstruction and suppress the illusory artifacts caused by the deficiency in valid data.
Evaluation of the OSC-TV iterative reconstruction algorithm for cone-beam optical CT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matenine, Dmitri, E-mail: dmitri.matenine.1@ulaval.ca; Mascolo-Fortin, Julia, E-mail: julia.mascolo-fortin.1@ulaval.ca; Goussard, Yves, E-mail: yves.goussard@polymtl.ca
Purpose: The present work evaluates an iterative reconstruction approach, namely, the ordered subsets convex (OSC) algorithm with regularization via total variation (TV) minimization in the field of cone-beam optical computed tomography (optical CT). One of the uses of optical CT is gel-based 3D dosimetry for radiation therapy, where it is employed to map dose distributions in radiosensitive gels. Model-based iterative reconstruction may improve optical CT image quality and contribute to a wider use of optical CT in clinical gel dosimetry. Methods: This algorithm was evaluated using experimental data acquired by a cone-beam optical CT system, as well as complementary numerical simulations. A fast GPU implementation of OSC-TV was used to achieve reconstruction times comparable to those of conventional filtered backprojection. Images obtained via OSC-TV were compared with the corresponding filtered backprojections. Spatial resolution and uniformity phantoms were scanned and respective reconstructions were subject to evaluation of the modulation transfer function, image uniformity, and accuracy. The artifacts due to refraction and total signal loss from opaque objects were also studied. Results: The cone-beam optical CT data reconstructions showed that OSC-TV outperforms filtered backprojection in terms of image quality, thanks to a model-based simulation of the photon attenuation process. It was shown to significantly improve the image spatial resolution and reduce image noise. The accuracy of the estimation of linear attenuation coefficients remained similar to that obtained via filtered backprojection. Certain image artifacts due to opaque objects were reduced. Nevertheless, the common artifact due to the gel container walls could not be eliminated. Conclusions: The use of iterative reconstruction improves cone-beam optical CT image quality in many ways. The comparisons between OSC-TV and filtered backprojection presented in this paper demonstrate that OSC-TV can potentially improve the rendering of spatial features and reduce cone-beam optical CT artifacts.
Chang, Po-Chun; Seol, Yang-Jo; Goldstein, Steven A.; Giannobile, William V.
2014-01-01
Purpose It is currently a challenge to determine the biomechanical properties of the hard tissue–dental implant interface. Recent advances in intraoral imaging and tomographic methods, such as microcomputed tomography (micro-CT), provide three-dimensional details, offering significant potential to evaluate the bone-implant interface, but yield limited information regarding osseointegration because of physical scattering effects emanating from metallic implant surfaces. In the present study, it was hypothesized that functional apparent moduli (FAM), generated from functional incorporation of the peri-implant structure, would eliminate the radiographic artifact–affected layer and serve as a feasible means to evaluate the biomechanical dynamics of tissue-implant integration in vivo. Materials and Methods Cylindric titanium mini-implants were placed in osteotomies and osteotomies with defects in rodent maxillae. The layers affected by radiographic artifacts were identified, and the pattern of tissue-implant integration was evaluated from histology and micro-CT images over a 21-day observation period. Analyses of structural information, FAM, and the relationship between FAM and interfacial stiffness (IS) were done before and after eliminating artifacts. Results Physical artifacts were present within a zone of about 100 to 150 μm around the implant in both experimental defect situations (osteotomy alone and osteotomy + defect). All correlations were evaluated before and after eliminating the artifact-affected layers, most notably during the maturation period of osseointegration. A strong correlation existed between functional bone apparent modulus and IS within 300 μm at the osteotomy defects (r > 0.9) and functional composite tissue apparent modulus in the osteotomy defects (r > 0.75). Conclusion Micro-CT imaging and FAM were of value in measuring the temporal process of tissue-implant integration in vivo. This approach will be useful to complement imaging technologies for longitudinal monitoring of osseointegration. PMID:23377049
Evaluation of the OSC-TV iterative reconstruction algorithm for cone-beam optical CT.
Matenine, Dmitri; Mascolo-Fortin, Julia; Goussard, Yves; Després, Philippe
2015-11-01
The present work evaluates an iterative reconstruction approach, namely, the ordered subsets convex (OSC) algorithm with regularization via total variation (TV) minimization in the field of cone-beam optical computed tomography (optical CT). One of the uses of optical CT is gel-based 3D dosimetry for radiation therapy, where it is employed to map dose distributions in radiosensitive gels. Model-based iterative reconstruction may improve optical CT image quality and contribute to a wider use of optical CT in clinical gel dosimetry. This algorithm was evaluated using experimental data acquired by a cone-beam optical CT system, as well as complementary numerical simulations. A fast GPU implementation of OSC-TV was used to achieve reconstruction times comparable to those of conventional filtered backprojection. Images obtained via OSC-TV were compared with the corresponding filtered backprojections. Spatial resolution and uniformity phantoms were scanned and respective reconstructions were subject to evaluation of the modulation transfer function, image uniformity, and accuracy. The artifacts due to refraction and total signal loss from opaque objects were also studied. The cone-beam optical CT data reconstructions showed that OSC-TV outperforms filtered backprojection in terms of image quality, thanks to a model-based simulation of the photon attenuation process. It was shown to significantly improve the image spatial resolution and reduce image noise. The accuracy of the estimation of linear attenuation coefficients remained similar to that obtained via filtered backprojection. Certain image artifacts due to opaque objects were reduced. Nevertheless, the common artifact due to the gel container walls could not be eliminated. The use of iterative reconstruction improves cone-beam optical CT image quality in many ways. The comparisons between OSC-TV and filtered backprojection presented in this paper demonstrate that OSC-TV can potentially improve the rendering of spatial features and reduce cone-beam optical CT artifacts.
NASA Astrophysics Data System (ADS)
Zhang, Wenkun; Zhang, Hanming; Li, Lei; Wang, Linyuan; Cai, Ailong; Li, Zhongguo; Yan, Bin
2016-08-01
X-ray computed tomography (CT) is a powerful and common inspection technique used for the industrial non-destructive testing. However, large-sized and heavily absorbing objects cause the formation of artifacts because of either the lack of specimen penetration in specific directions or the acquisition of data from only a limited angular range of views. Although the sparse optimization-based methods, such as the total variation (TV) minimization method, can suppress artifacts to some extent, reconstructing the images such that they converge to accurate values remains difficult because of the deficiency in continuous angular data and inconsistency in the projections. To address this problem, we use the idea of regional enhancement of the true values and suppression of the illusory artifacts outside the region to develop an efficient iterative algorithm. This algorithm is based on the combination of regional enhancement of the true values and TV minimization for the limited angular reconstruction. In this algorithm, the segmentation approach is introduced to distinguish the regions of different image knowledge and generate the support mask of the image. A new regularization term, which contains the support knowledge to enhance the true values of the image, is incorporated into the objective function. Then, the proposed optimization model is solved by variable splitting and the alternating direction method efficiently. A compensation approach is also designed to extract useful information from the initial projections and thus reduce false segmentation result and correct the segmentation support and the segmented image. The results obtained from comparing both simulation studies and real CT data set reconstructions indicate that the proposed algorithm generates a more accurate image than do the other reconstruction methods. The experimental results show that this algorithm can produce high-quality reconstructed images for the limited angular reconstruction and suppress the illusory artifacts caused by the deficiency in valid data.
Ex Vivo Artifacts and Histopathologic Pitfalls in the Lung.
Thunnissen, Erik; Blaauwgeers, Hans J L G; de Cuba, Erienne M V; Yick, Ching Yong; Flieder, Douglas B
2016-03-01
Surgical and pathologic handling of lung physically affects lung tissue. This leads to artifacts that alter the morphologic appearance of pulmonary parenchyma. To describe and illustrate mechanisms of ex vivo artifacts that may lead to diagnostic pitfalls. In this study 4 mechanisms of ex vivo artifacts and corresponding diagnostic pitfalls are described and illustrated. The 4 patterns of artifacts are: (1) surgical collapse, due to the removal of air and blood from pulmonary resections; (2) ex vivo contraction of bronchial and bronchiolar smooth muscle; (3) clamping edema of open lung biopsies; and (4) spreading of tissue fragments and individual cells through a knife surface. Morphologic pitfalls include diagnostic patterns of adenocarcinoma, asthma, constrictive bronchiolitis, and lymphedema. Four patterns of pulmonary ex vivo artifacts are important to recognize in order to avoid morphologic misinterpretations.
Artifacts, intentions, and contraceptives: the problem with having a plan B for plan B.
Reed, Philip A
2013-12-01
It is commonly proposed that artifacts cannot be understood without reference to human intentions. This fact, I contend, has relevance to the use of artifacts in intentional action. I argue that because artifacts have intentions embedded into them antecedently, when we use artifacts we are sometimes compelled to intend descriptions of our actions that we might, for various reasons, be inclined to believe that we do not intend. I focus this argument to a specific set of artifacts, namely, medical devices, before considering an extended application to emergency contraceptive devices. Although there is some debate about whether emergency contraception has an abortifacient effect, I argue that if there is an abortifacient effect, then the effect cannot normally be a side effect of one's action.
Evaluation of motion artifact metrics for coronary CT angiography.
Ma, Hongfeng; Gros, Eric; Szabo, Aniko; Baginski, Scott G; Laste, Zachary R; Kulkarni, Naveen M; Okerlund, Darin; Schmidt, Taly G
2018-02-01
This study quantified the performance of coronary artery motion artifact metrics relative to human observer ratings. Motion artifact metrics have been used as part of motion correction and best-phase selection algorithms for Coronary Computed Tomography Angiography (CCTA). However, the lack of ground truth makes it difficult to validate how well the metrics quantify the level of motion artifact. This study investigated five motion artifact metrics, including two novel metrics, using a dynamic phantom, clinical CCTA images, and an observer study that provided ground-truth motion artifact scores from a series of pairwise comparisons. Five motion artifact metrics were calculated for the coronary artery regions on both phantom and clinical CCTA images: positivity, entropy, normalized circularity, Fold Overlap Ratio (FOR), and Low-Intensity Region Score (LIRS). CT images were acquired of a dynamic cardiac phantom that simulated cardiac motion and contained six iodine-filled vessels of varying diameter and with regions of soft plaque and calcifications. Scans were repeated with different gantry start angles. Images were reconstructed at five phases of the motion cycle. Clinical images were acquired from 14 CCTA exams with patient heart rates ranging from 52 to 82 bpm. The vessel and shading artifacts were manually segmented by three readers and combined to create ground-truth artifact regions. Motion artifact levels were also assessed by readers using a pairwise comparison method to establish a ground-truth reader score. The Kendall's Tau coefficients were calculated to evaluate the statistical agreement in ranking between the motion artifacts metrics and reader scores. Linear regression between the reader scores and the metrics was also performed. On phantom images, the Kendall's Tau coefficients of the five motion artifact metrics were 0.50 (normalized circularity), 0.35 (entropy), 0.82 (positivity), 0.77 (FOR), 0.77(LIRS), where higher Kendall's Tau signifies higher agreement. The FOR, LIRS, and transformed positivity (the fourth root of the positivity) were further evaluated in the study of clinical images. The Kendall's Tau coefficients of the selected metrics were 0.59 (FOR), 0.53 (LIRS), and 0.21 (Transformed positivity). In the study of clinical data, a Motion Artifact Score, defined as the product of FOR and LIRS metrics, further improved agreement with reader scores, with a Kendall's Tau coefficient of 0.65. The metrics of FOR, LIRS, and the product of the two metrics provided the highest agreement in motion artifact ranking when compared to the readers, and the highest linear correlation to the reader scores. The validated motion artifact metrics may be useful for developing and evaluating methods to reduce motion in Coronary Computed Tomography Angiography (CCTA) images. © 2017 American Association of Physicists in Medicine.
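The agreement statistic used throughout this study, Kendall's tau between metric values and reader rankings, can be computed directly with scipy; the numbers below are made-up placeholders, not the study's measurements.

```python
import numpy as np
from scipy.stats import kendalltau

# Hypothetical per-vessel motion-artifact metric values and reader scores.
metric_values = np.array([0.12, 0.43, 0.37, 0.80, 0.55, 0.91, 0.22, 0.64])
reader_scores = np.array([1, 3, 2, 7, 4, 8, 1, 5])

tau, p_value = kendalltau(metric_values, reader_scores)
print(f"Kendall's tau = {tau:.2f}, p = {p_value:.3f}")
```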
A mixed-order nonlinear diffusion compressed sensing MR image reconstruction.
Joy, Ajin; Paul, Joseph Suresh
2018-03-07
Avoid formation of staircase artifacts in nonlinear diffusion-based MR image reconstruction without compromising computational speed. Whereas second-order diffusion encourages the evolution of pixel neighborhoods with uniform intensities, fourth-order diffusion considers a smooth region to be not necessarily a uniform intensity region but also a planar region. Therefore, a controlled application of the fourth-order diffusivity function is used to encourage second-order diffusion to reconstruct the smooth regions of the image as a plane rather than a group of blocks, while not being strong enough to introduce the undesirable speckle effect. The proposed method is compared with second- and fourth-order nonlinear diffusion reconstruction, total variation (TV), total generalized variation, and higher degree TV using in vivo data sets for different undersampling levels with application to dictionary learning-based reconstruction. It is observed that the proposed technique preserves sharp boundaries in the image while preventing the formation of staircase artifacts in the regions of smoothly varying pixel intensities. It also shows reduced error measures compared with second-order nonlinear diffusion reconstruction or TV and converges faster than TV-based methods. Because nonlinear diffusion is known to be an effective alternative to TV for edge-preserving reconstruction, the crucial aspect of staircase artifact removal is addressed. Reconstruction is found to be stable for the experimentally determined range of the fourth-order regularization parameter, and therefore does not introduce a parameter search. Hence, the computational simplicity of second-order diffusion is retained. © 2018 International Society for Magnetic Resonance in Medicine.
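For readers unfamiliar with the second-order building block discussed above, the sketch below applies one explicit Perona-Malik-type diffusion update to an image. It is a generic denoising step with an assumed exponential diffusivity and wrap-around boundaries, not the authors' mixed-order reconstruction, but it illustrates why pure second-order diffusion favors piecewise-constant (staircase-prone) solutions in smoothly varying regions.

```python
import numpy as np

def perona_malik_step(img, kappa=20.0, dt=0.15):
    """One explicit second-order nonlinear diffusion update. The diffusivity
    g = exp(-(|grad|/kappa)^2) slows diffusion across strong edges, so repeated
    updates tend to flatten smooth ramps into uniform blocks."""
    # Differences toward the four neighbours (np.roll wraps at the borders).
    dn = np.roll(img, -1, axis=0) - img
    ds = np.roll(img, 1, axis=0) - img
    de = np.roll(img, -1, axis=1) - img
    dw = np.roll(img, 1, axis=1) - img
    g = lambda d: np.exp(-(d / kappa) ** 2)
    return img + dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
```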
NASA Astrophysics Data System (ADS)
Abu Anas, Emran Mohammad; Kim, Jae Gon; Lee, Soo Yeol; Kamrul Hasan, Md
2011-10-01
The use of an x-ray flat panel detector is increasingly becoming popular in 3D cone beam volume CT machines. Due to the deficient semiconductor array manufacturing process, the cone beam projection data are often corrupted by different types of abnormalities, which cause severe ring and radiant artifacts in a cone beam reconstruction image, and as a result, the diagnostic image quality is degraded. In this paper, a novel technique is presented for the correction of error in the 2D cone beam projections due to abnormalities often observed in 2D x-ray flat panel detectors. Template images are derived from the responses of the detector pixels using their statistical properties and then an effective non-causal derivative-based detection algorithm in 2D space is presented for the detection of defective and mis-calibrated detector elements separately. An image inpainting-based 3D correction scheme is proposed for the estimation of responses of defective detector elements, and the responses of the mis-calibrated detector elements are corrected using the normalization technique. For real-time implementation, a simplification of the proposed off-line method is also suggested. Finally, the proposed algorithms are tested using different real cone beam volume CT images and the experimental results demonstrate that the proposed methods can effectively remove ring and radiant artifacts from cone beam volume CT images compared to other reported techniques in the literature.
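A simplified numpy/scipy illustration of the two-step idea described above: flag detector pixels whose response deviates strongly from a smooth template and then estimate their values from neighbors. Here a median-filtered image stands in for the statistically derived template and the derivative-based detection, and the threshold and filter size are arbitrary assumptions rather than the authors' off-line or real-time algorithms.

```python
import numpy as np
from scipy.ndimage import median_filter

def correct_defective_pixels(projection, threshold=0.10, size=5):
    """Flag pixels whose response deviates from a local median template by more
    than `threshold` (as a fraction) and replace them with the template value."""
    template = median_filter(projection.astype(float), size=size)
    denom = np.where(template == 0, 1.0, template)
    deviation = np.abs(projection - template) / denom
    defective = deviation > threshold
    corrected = projection.astype(float)
    corrected[defective] = template[defective]
    return corrected, defective
```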
Spectral CT Image Restoration via an Average Image-Induced Nonlocal Means Filter.
Zeng, Dong; Huang, Jing; Zhang, Hua; Bian, Zhaoying; Niu, Shanzhou; Zhang, Zhang; Feng, Qianjin; Chen, Wufan; Ma, Jianhua
2016-05-01
Spectral computed tomography (SCT) images reconstructed by an analytical approach often suffer from a poor signal-to-noise ratio and strong streak artifacts when sufficient photon counts are not available in SCT imaging. To reduce noise-induced artifacts in SCT images, in this study, we propose an average image-induced nonlocal means (aviNLM) filter for restoring each energy-specific image. Methods: The present aviNLM algorithm exploits redundant information in the whole energy domain. Specifically, the proposed aviNLM algorithm yields the restored results by performing a nonlocal weighted average operation on the noisy energy-specific images with the nonlocal weight matrix between the target and prior images, in which the prior image is generated from all of the images reconstructed in each energy bin. Results: Qualitative and quantitative studies were conducted to evaluate the aviNLM filter using digital phantom, physical phantom, and clinical patient data acquired from energy-resolved and energy-integrated detectors, respectively. Experimental results show that the present aviNLM filter can achieve promising results for SCT image restoration in terms of noise-induced artifact suppression, cross profile, and contrast-to-noise ratio and material decomposition assessment. Conclusion and Significance: The present aviNLM algorithm has useful potential for radiation dose reduction by lowering the mAs in SCT imaging, and it may be useful for some other clinical applications, such as myocardial perfusion imaging and radiotherapy.
Oh, Se-Hong; Chung, Jun-Young; In, Myung-Ho; Zaitsev, Maxim; Kim, Young-Bo; Speck, Oliver; Cho, Zang-Hee
2012-10-01
Despite its wide use, echo-planar imaging (EPI) suffers from geometric distortions due to off-resonance effects, i.e., strong magnetic field inhomogeneity and susceptibility. This article reports a novel method for correcting the distortions observed in EPI acquired at ultra-high-field such as 7 T. Point spread function (PSF) mapping methods have been proposed for correcting the distortions in EPI. The PSF shift map can be derived either along the nondistorted or the distorted coordinates. Along the nondistorted coordinates more information about compressed areas is present but it is prone to PSF-ghosting artifacts induced by large k-space shift in PSF encoding direction. In contrast, shift maps along the distorted coordinates contain more information in stretched areas and are more robust against PSF-ghosting. In ultra-high-field MRI, an EPI contains both compressed and stretched regions depending on the B0 field inhomogeneity and local susceptibility. In this study, we present a new geometric distortion correction scheme, which selectively applies the shift map with more information content. We propose a PSF-ghost elimination method to generate an artifact-free pixel shift map along nondistorted coordinates. The proposed method can correct the effects of the local magnetic field inhomogeneity induced by the susceptibility effects along with the PSF-ghost artifact cancellation. We have experimentally demonstrated the advantages of the proposed method in EPI data acquisitions in phantom and human brain using 7-T MRI. Copyright © 2011 Wiley Periodicals, Inc.
Pervasive access to MRI bias artifact suppression service on a grid.
Ardizzone, Edoardo; Gambino, Orazio; Genco, Alessandro; Pirrone, Roberto; Sorce, Salvatore
2009-01-01
Bias artifact corrupts MRIs in such a way that the image is afflicted by illumination variations. Some of the authors proposed the exponential entropy-driven homomorphic unsharp masking (E(2)D-HUM) algorithm that corrects this artifact without any a priori hypothesis about the tissues or the MRI modality. Moreover, E(2)D-HUM does not care about the body part under examination and does not require any particular training task. People who want to use this algorithm, which is Matlab-based, have to set up their own computers in order to execute it. Furthermore, they have to be Matlab-skilled to exploit all the features of the algorithm. In this paper, we propose to make such an algorithm available as a service on a grid infrastructure, so that people can use it almost from everywhere, in a pervasive fashion, by means of a suitable user interface running on smartphones. The proposed solution allows physicians to use the E(2)D-HUM algorithm (or any other kind of algorithm, given that it is available as a service on the grid), with execution taking place remotely somewhere in the grid and the results sent back to the user's device. This way, physicians do not need to be aware of how to use Matlab to process their images. The pervasive service provision for medical image enhancement is presented, along with some experimental results obtained using smartphones connected to an existing Globus-based grid infrastructure.
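E(2)D-HUM itself is an entropy-driven Matlab implementation that is not reproduced here; the following is only a generic homomorphic unsharp masking sketch from the same family of corrections: estimate the slowly varying bias field as a heavy Gaussian low-pass of the log-image and divide it out. The Gaussian width, the epsilon offset and the mean-preserving rescaling are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def homomorphic_unsharp_mask(image, sigma=40.0, eps=1e-6):
    """Generic homomorphic unsharp masking: the slowly varying bias field is
    estimated as a heavy Gaussian low-pass of the log-image and divided out."""
    log_img = np.log(image.astype(float) + eps)
    bias = gaussian_filter(log_img, sigma=sigma)
    corrected = np.exp(log_img - bias)
    # Restore the original mean intensity so tissue contrast is comparable.
    return corrected * (image.mean() / corrected.mean())
```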
Limited-angle multi-energy CT using joint clustering prior and sparsity regularization
NASA Astrophysics Data System (ADS)
Zhang, Huayu; Xing, Yuxiang
2016-03-01
In this article, we present an easy-to-implement multi-energy CT scanning strategy and a corresponding reconstruction method, which facilitate spectral CT imaging by improving the data efficiency by the number-of-energy-channel fold without introducing visible limited-angle artifacts caused by reducing projection views. Leveraging the structure coherence at different energies, we first pre-reconstruct a prior structure information image using projection data from all energy channels. Then, we perform a k-means clustering on the prior image to generate a sparse dictionary representation for the image, which serves as a structure information constraint. We combine this constraint with a conventional compressed sensing method and propose a new model which we refer to as Joint Clustering Prior and Sparsity Regularization (CPSR). CPSR is a convex problem and we solve it by the Alternating Direction Method of Multipliers (ADMM). We verify our CPSR reconstruction method with a numerical simulation experiment. A dental phantom with complicated structures of teeth and soft tissues is used. X-ray beams from three spectra of different peak energies (120 kVp, 90 kVp, 60 kVp) irradiate the phantom to form tri-energy projections. Projection data covering only 75° from each energy spectrum are collected for reconstruction. Independent reconstruction for each energy would cause severe limited-angle artifacts even with the help of compressed sensing approaches. Our CPSR provides us with images free of limited-angle artifacts. All edge details are well preserved in our experimental study.
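A sketch of the clustering step described above: k-means on the grey values of the pre-reconstructed prior image yields a label map and cluster means that can serve as a sparse, piecewise-constant structural representation. The number of clusters is an assumption, and the compressed sensing reconstruction itself is omitted.

```python
import numpy as np
from sklearn.cluster import KMeans

def clustering_prior(prior_image, n_clusters=4, seed=0):
    """Cluster the prior image's grey values with k-means; the label map and
    cluster means form a sparse structural representation of the kind used as
    a constraint in a CPSR-style reconstruction."""
    values = prior_image.reshape(-1, 1).astype(float)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(values)
    label_map = km.labels_.reshape(prior_image.shape)
    cluster_means = km.cluster_centers_.ravel()
    return label_map, cluster_means
```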
NASA Astrophysics Data System (ADS)
Shen, Zhengwei; Cheng, Lishuang
2017-09-01
Total variation (TV)-based image deblurring methods can produce staircase artifacts in the homogeneous regions of the latent images recovered from degraded images, while wavelet/frame-based image deblurring methods lead to spurious noise spikes and pseudo-Gibbs artifacts in the vicinity of discontinuities of the latent images. To suppress these artifacts efficiently, we propose a nonconvex composite wavelet/frame and TV-based image deblurring model. In this model, the wavelet/frame- and TV-based methods may complement each other, which is verified by theoretical analysis and experimental results. To further improve the quality of the latent images, nonconvex penalty functions are used as the regularization terms of the model, which may induce a stronger sparse solution and more accurately estimate the relatively large gradients or wavelet/frame coefficients of the latent images. In addition, by choosing a suitable parameter for the nonconvex penalty function, the subproblems split from the proposed model by the alternating direction method of multipliers algorithm can be guaranteed to be convex optimization problems; hence, each subproblem can converge to a global optimum. The mean doubly augmented Lagrangian and the isotropic split Bregman algorithms are used to solve these convex subproblems, where the designed proximal operator is used to reduce the computational complexity of the algorithms. Extensive numerical experiments indicate that the proposed model and algorithms are comparable to other state-of-the-art models and methods.
Experimental Evidence of Weak Excluded Volume Effects for Nanochannel Confined DNA
NASA Astrophysics Data System (ADS)
Gupta, Damini; Miller, Jeremy J.; Muralidhar, Abhiram; Mahshid, Sara; Reisner, Walter; Dorfman, Kevin D.
In the classical de Gennes picture of weak polymer nanochannel confinement, the polymer contour is envisioned as divided into a series of isometric blobs. Strong excluded volume interactions are present both within a blob and between blobs. In contrast, for semiflexible polymers like DNA, excluded volume interactions are of borderline strength within a blob but appreciable between blobs, giving rise to a chain description consisting of a string of anisometric blobs. We present experimental validation of this subtle effect of excluded volume for DNA nanochannel confinement by performing measurements of variance in chain extension of T4 DNA molecules as a function of effective nanochannel size (305-453 nm). Additionally, we show an approach to systematically reduce the effect of molecular weight dispersity of DNA samples, a typical experimental artifact, by combining confinement spectroscopy with simulations.
Research methodology of the artifact effect in the blood to the result of cell classification
NASA Astrophysics Data System (ADS)
Polyakov, E. V.; Nikitaev, V. G.; Seldyukov, S. O.
2017-01-01
A study was conducted of the influence of artifacts on the discrimination between blasts and lymphocytes in the problem of diagnosing the types of acute leukemia. A group of artifacts was assembled to conduct the study. Preliminary studies allowed us to estimate the degree to which artifacts influence the results of the classification of red blood cells.
Keep Your Eye on the Ball: Investigating Artifacts-in-Use in Physical Education
ERIC Educational Resources Information Center
Quennerstedt, Mikael; Almqvist, Jonas; Ohman, Marie
2011-01-01
The purpose of this article is to develop a method of approach that can be used to explore the meaning and use of artifacts in education by applying a socio-cultural perspective to learning and artifacts. An empirical material of video recorded physical education lessons in Sweden is used to illustrate the approach in terms of how artifacts in…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Altunbas, Cem, E-mail: caltunbas@gmail.com; Lai, Chao-Jen; Zhong, Yuncheng
Purpose: In using flat panel detectors (FPD) for cone beam computed tomography (CBCT), pixel gain variations may lead to structured nonuniformities in projections and ring artifacts in CBCT images. Such gain variations can be caused by changes in detector entrance exposure levels or beam hardening, and they are not accounted for by conventional flat field correction methods. In this work, the authors presented a method to identify isolated pixel clusters that exhibit gain variations and proposed a pixel gain correction (PGC) method to suppress both beam hardening and exposure level dependent gain variations. Methods: To modulate both beam spectrum and entrance exposure, flood field FPD projections were acquired using beam filters with varying thicknesses. “Ideal” pixel values were estimated by performing polynomial fits in both raw and flat field corrected projections. Residuals were calculated by taking the difference between measured and ideal pixel values to identify clustered image and FPD artifacts in flat field corrected and raw images, respectively. To correct clustered image artifacts, the ratios of ideal to measured pixel values in filtered images were utilized as pixel-specific gain correction factors, referred to as the PGC method, and they were tabulated as a function of pixel value in a look-up table. Results: 0.035% of detector pixels led to clustered image artifacts in flat field corrected projections, where 80% of these pixels were traced back and linked to artifacts in the FPD. The performance of the PGC method was tested in a variety of imaging conditions and phantoms. The PGC method reduced clustered image artifacts and fixed pattern noise in projections, and ring artifacts in CBCT images. Conclusions: Clustered projection image artifacts that lead to ring artifacts in CBCT can be better identified with our artifact detection approach. When compared to the conventional flat field correction method, the proposed PGC method enables characterization of nonlinear pixel gain variations as a function of change in x-ray spectrum and intensity. Hence, it can better suppress image artifacts due to beam hardening as well as artifacts that arise from detector entrance exposure variation.
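A minimal sketch of the detection and correction idea under simplifying assumptions of my own (a single 1-D flood-field row, a low-order polynomial as the "ideal" response, arbitrary thresholds): fit a smooth polynomial to the flood-field signal, flag pixels whose residuals exceed a threshold, and use the ratio of fitted to measured values as a pixel-specific gain correction factor. The tabulation of these factors versus pixel value across different beam filters is omitted here.

```python
# Sketch: estimate "ideal" flood-field pixel values with a polynomial fit,
# flag pixels with large residuals as gain outliers, and correct them with
# the ratio of ideal to measured values (a simplified, single-row version
# of the pixel gain correction idea; thresholds are arbitrary).
import numpy as np

rng = np.random.default_rng(1)
x = np.arange(512)
ideal_truth = 1000 + 0.4 * x - 0.0005 * x**2            # smooth flood-field profile
measured = ideal_truth * (1 + 0.002 * rng.standard_normal(x.size))
measured[200:204] *= 0.95                                 # small cluster with a gain error

coeffs = np.polyfit(x, measured, deg=3)                   # low-order fit = "ideal" estimate
ideal_fit = np.polyval(coeffs, x)
residual = (measured - ideal_fit) / ideal_fit

bad = np.abs(residual) > 0.01                             # flag clustered gain deviations
gain = np.ones_like(measured)
gain[bad] = ideal_fit[bad] / measured[bad]                # pixel-specific correction factor
corrected = measured * gain

print("flagged pixels:", np.flatnonzero(bad))
print("max residual after correction:",
      float(np.max(np.abs((corrected - ideal_fit) / ideal_fit))))
```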
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kadbi, M
Purpose: Utilization of titanium tandem and ring (T&R) applicators in MR-guided brachytherapy has become widespread for gynecological cancer treatment. However, titanium causes magnetic field disturbance and susceptibility artifacts, which complicate image interpretation. In this study, metal artifact reduction techniques were employed to improve the image quality and reduce the metal-related artifacts. Methods: Several techniques were employed to reduce the metal artifact caused by the titanium T&R applicator. These techniques include the Metal Artifact Reduction Sequence (MARS), View Angle Tilting (VAT) to correct in-plane distortion, and Slice Encoding for Metal Artifact Correction (SEMAC) for through-plane artifact correction. Moreover, MARS can be combined with VAT to further reduce the in-plane artifact by reapplying the selection gradients during the readout (MARS+VAT). SEMAC uses a slice-selective excitation but acquires additional z-encodings in order to resolve off-resonant signal and to reduce through-plane distortions. Results: Comparison between the clinical sequences revealed that increasing the bandwidth reduces the error in the measured diameter of the T&R. However, the error is larger than 4 mm for the best case with the highest bandwidth and spatial resolution. MARS+VAT with an isotropic resolution of 1 mm reduced the error to 1.9 mm, which is the least among the examined 2D sequences. The measured diameter of the tandem from SEMAC+VAT has the closest value to the actual diameter of the tandem (3.2 mm), and the error was reduced to less than 1 mm. In addition, SEMAC+VAT significantly reduces the blooming artifact in the ring compared to the clinical sequences. Conclusion: A higher bandwidth and spatial resolution sequence reduces the artifact and the measured diameter of the applicator with a slight compromise in SNR. Metal artifact reduction sequences decrease the distortion associated with the titanium applicator. SEMAC+VAT revealed promising results for titanium imaging and can be utilized for MR-guided brachytherapy in gynecological cancer. The author is an employee of Philips Healthcare.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, X; Yang, X; Rosenfield, J
Purpose: Metal implants such as orthopedic hardware and dental fillings cause severe bright and dark streaking in reconstructed CT images. These artifacts decrease image contrast and degrade HU accuracy, leading to inaccuracies in target delineation and dose calculation. Additionally, such artifacts negatively impact patient set-up in image guided radiation therapy (IGRT). In this work, we propose a novel method for metal artifact reduction which utilizes the anatomical similarity between neighboring CT slices. Methods: Neighboring CT slices show similar anatomy. Based on this anatomical similarity, the proposed method replaces corrupted CT pixels with pixels from adjacent, artifact-free slices. A gamma map, which is the weighted summation of relative HU error and distance error, is calculated for each pixel in the artifact-corrupted CT image. The minimum value in each pixel’s gamma map is used to identify a pixel from the adjacent CT slice to replace the corresponding artifact-corrupted pixel. This replacement only occurs if the minimum value in a particular pixel’s gamma map is larger than a threshold. The proposed method was evaluated with clinical images. Results: Highly attenuating dental fillings and hip implants cause severe streaking artifacts on CT images. The proposed method eliminates the dark and bright streaking and improves the implant delineation and visibility. In particular, the image non-uniformity in the central region of interest was reduced from 1.88 and 1.01 to 0.28 and 0.35, respectively. Further, the mean CT HU error was reduced from 328 HU and 460 HU to 60 HU and 36 HU, respectively. Conclusions: The proposed metal artifact reduction method replaces corrupted image pixels with pixels from neighboring slices that are free of metal artifacts. This method proved capable of suppressing streaking artifacts, improving HU accuracy and image detectability.
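A hedged sketch of the slice-similarity idea as read from the abstract (the weights, search window, HU normalization, and threshold are my own illustrative choices): for each pixel of the artifact-corrupted slice, a gamma-like score combining relative HU error and spatial distance is evaluated over a small neighborhood of the adjacent slice, and the pixel is replaced from the minimum-score location when it appears corrupted.

```python
# Sketch of replacing artifact-corrupted CT pixels using an adjacent slice.
# For each pixel, a gamma-like score (weighted relative HU error + distance)
# is computed over a small search window in the neighboring slice; following
# the abstract, replacement is applied when the minimum score exceeds a
# threshold. Weights, window size, and threshold below are illustrative only.
import numpy as np

def replace_from_neighbor(corrupted, neighbor, w_hu=1.0, w_dist=0.1,
                          window=3, threshold=0.3):
    out = corrupted.copy()
    rows, cols = corrupted.shape
    for i in range(rows):
        for j in range(cols):
            best_gamma, best_val = np.inf, corrupted[i, j]
            for di in range(-window, window + 1):
                for dj in range(-window, window + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < rows and 0 <= jj < cols:
                        hu_err = abs(corrupted[i, j] - neighbor[ii, jj]) / 1000.0
                        gamma = w_hu * hu_err + w_dist * np.hypot(di, dj)
                        if gamma < best_gamma:
                            best_gamma, best_val = gamma, neighbor[ii, jj]
            if best_gamma > threshold:          # pixel looks corrupted: replace it
                out[i, j] = best_val
    return out

# Toy example: a bright streak in one slice, clean anatomy in the neighbor.
slice_a = np.zeros((32, 32)); slice_a[15, :] += 2000.0
slice_b = np.zeros((32, 32))
cleaned = replace_from_neighbor(slice_a, slice_b)
print(float(np.abs(cleaned).max()))
```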
WE-AB-207A-12: HLCC Based Quantitative Evaluation Method of Image Artifact in Dental CBCT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Y; Wu, S; Qi, H
Purpose: Image artifacts are usually evaluated qualitatively via visual observation of the reconstructed images, which is susceptible to subjective factors due to the lack of an objective evaluation criterion. In this work, we propose a Helgason-Ludwig consistency condition (HLCC) based evaluation method to quantify the severity level of different image artifacts in dental CBCT. Methods: Our evaluation method consists of four steps: 1) acquire cone-beam CT (CBCT) projections; 2) convert the 3D CBCT projection to a fan-beam projection by extracting its central-plane projection; 3) convert the fan-beam projection to a parallel-beam projection using a sinogram-based or detail-based rebinning algorithm; 4) obtain the HLCC profile by integrating the parallel-beam projection per view and calculate the wave percentage and variance of the HLCC profile, which can be used to describe the severity level of image artifacts. Results: Several sets of dental CBCT projections containing only one type of artifact (i.e., geometry, scatter, beam hardening, lag, or noise artifact) were simulated using gDRR, a GPU tool developed for efficient, accurate, and realistic simulation of CBCT projections. These simulated CBCT projections were used to test our proposed method. The HLCC profile wave percentage and variance induced by geometry distortion are about 3∼21 times and 16∼393 times as large as those of the artifact-free projection, respectively. The increase factors of wave percentage and variance are 6 and 133 times for beam hardening, 19 and 1184 times for scatter, and 4 and 16 times for lag artifacts, respectively. In contrast, for the noisy projection the wave percentage, variance, and inconsistency level are almost the same as those of the noise-free one. Conclusion: We have proposed a quantitative evaluation method of image artifacts based on HLCC theory. According to our simulation results, the severity of the different artifact types in dental CBCT is found to be in the following order: scatter > geometry > beam hardening > lag > noise > artifact-free.
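The zeroth-order Helgason-Ludwig condition implies that the total attenuation integrated over each parallel-beam view should be constant across views, so a per-view integral profile that "waves" indicates inconsistent data. A toy sketch of how a wave percentage and variance of that profile could be computed (the exact definitions used in the abstract may differ):

```python
# Sketch: the per-view integral of a parallel-beam sinogram should be constant
# for consistent data (zeroth Helgason-Ludwig moment). Wave percentage and
# variance of this profile then act as simple inconsistency scores.
import numpy as np

def hlcc_scores(sinogram):
    """sinogram: (n_views, n_detectors) parallel-beam projections."""
    profile = sinogram.sum(axis=1)                       # total attenuation per view
    wave_pct = (profile.max() - profile.min()) / profile.mean() * 100.0
    return profile, wave_pct, profile.var()

rng = np.random.default_rng(0)
consistent = np.full((180, 256), 1.0) + 1e-3 * rng.standard_normal((180, 256))
inconsistent = consistent.copy()
inconsistent[60:120] *= 1.05                             # e.g. scatter / gain drift on some views

for name, s in [("consistent", consistent), ("inconsistent", inconsistent)]:
    _, wave, var = hlcc_scores(s)
    print(name, round(wave, 3), round(var, 3))
```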
DOE Office of Scientific and Technical Information (OSTI.GOV)
Korpics, Mark; Surucu, Murat; Mescioglu, Ibrahim
Purpose and Objectives: To quantify, through an observer study, the reduction in metal artifacts on cone beam computed tomographic (CBCT) images using a projection-interpolation algorithm, on images containing metal artifacts from dental fillings and implants in patients treated for head and neck (H&N) cancer. Methods and Materials: An interpolation-substitution algorithm was applied to H&N CBCT images containing metal artifacts from dental fillings and implants. Image quality with respect to metal artifacts was evaluated subjectively and objectively. First, 6 independent radiation oncologists were asked to rank randomly sorted blinded images (before and after metal artifact reduction) using a 5-point rating scale (1 = severe artifacts; 5 = no artifacts). Second, the standard deviation of different regions of interest (ROI) within each image was calculated and compared with the mean rating scores. Results: The interpolation-substitution technique successfully reduced metal artifacts in 70% of the cases. From a total of 60 images from 15 H&N cancer patients undergoing image guided radiation therapy, the mean rating score on the uncorrected images was 2.3 ± 1.1, versus 3.3 ± 1.0 for the corrected images. The mean difference in ranking score between uncorrected and corrected images was 1.0 (95% confidence interval: 0.9-1.2, P<.05). The standard deviation of each ROI significantly decreased after artifact reduction (P<.01). Moreover, a negative correlation between the mean rating score for each image and the standard deviation of the oral cavity and bilateral cheeks was observed. Conclusion: The interpolation-substitution algorithm is efficient and effective for reducing metal artifacts caused by dental fillings and implants on CBCT images, as demonstrated by the statistically significant increase in observer image quality ranking and by the decrease in ROI standard deviation between uncorrected and corrected images.
MSVAT-SPACE-STIR and SEMAC-STIR for Reduction of Metallic Artifacts in 3T Head and Neck MRI.
Hilgenfeld, T; Prager, M; Schwindling, F S; Nittka, M; Rammelsberg, P; Bendszus, M; Heiland, S; Juerchott, A
2018-05-24
The incidence of metallic dental restorations and implants is increasing, and head and neck MR imaging is becoming challenging regarding artifacts. Our aim was to evaluate whether multiple-slab acquisition with view angle tilting gradient based on a sampling perfection with application-optimized contrasts by using different flip angle evolution (MSVAT-SPACE)-STIR and slice-encoding for metal artifact correction (SEMAC)-STIR are beneficial regarding artifact suppression compared with the SPACE-STIR and TSE-STIR in vitro and in vivo. At 3T, 3D artifacts of 2 dental implants, supporting different single crowns, were evaluated. Image quality was evaluated quantitatively (normalized signal-to-noise ratio) and qualitatively (2 reads by 2 blinded radiologists). Feasibility was tested in vivo in 5 volunteers and 5 patients, respectively. Maximum achievable resolution and the normalized signal-to-noise ratio of MSVAT-SPACE-STIR were higher compared with SEMAC-STIR. Performance in terms of artifact correction was dependent on the material composition. For highly paramagnetic materials, SEMAC-STIR was superior to MSVAT-SPACE-STIR (27.8% smaller artifact volume) and TSE-STIR (93.2% less slice distortion). However, MSVAT-SPACE-STIR reduced the artifact size compared with SPACE-STIR by 71.5%. For low-paramagnetic materials, MSVAT-SPACE-STIR performed as well as SEMAC-STIR. Furthermore, MSVAT-SPACE-STIR decreased artifact volume by 69.5% compared with SPACE-STIR. The image quality of all sequences did not differ systematically. In vivo results were comparable with in vitro results. Regarding susceptibility artifacts and acquisition time, MSVAT-SPACE-STIR might be advantageous over SPACE-STIR for high-resolution and isotropic head and neck imaging. Only for materials with high-susceptibility differences to soft tissue, the use of SEMAC-STIR might be beneficial. Within limited acquisition times, SEMAC-STIR cannot exploit its full advantage over TSE-STIR regarding artifact suppression. © 2018 by American Journal of Neuroradiology.
Hoorweg, Anne-Lee J; Pasma, Wietze; van Wolfswinkel, Leo; de Graaff, Jurgen C
2018-02-01
Vital parameter data collected in anesthesia information management systems are often used for clinical research. The validity of this type of research is dependent on the number of artifacts. In this prospective observational cohort study, the incidence of artifacts in anesthesia information management system data was investigated in children undergoing anesthesia for noncardiac procedures. Secondary outcomes included the incidence of artifacts among deviating and nondeviating values, among the anesthesia phases, and among different anesthetic techniques. We included 136 anesthetics representing 10,236 min of anesthesia time. The incidence of artifacts was 0.5% for heart rate (95% CI: 0.4 to 0.7%), 1.3% for oxygen saturation (1.1 to 1.5%), 7.5% for end-tidal carbon dioxide (6.9 to 8.0%), 5.0% for noninvasive blood pressure (4.0 to 6.0%), and 7.3% for invasive blood pressure (5.9 to 8.8%). The incidence of artifacts among deviating values was 3.1% for heart rate (2.1 to 4.4%), 10.8% for oxygen saturation (7.6 to 14.8%), 14.1% for end-tidal carbon dioxide (13.0 to 15.2%), 14.4% for noninvasive blood pressure (10.3 to 19.4%), and 38.4% for invasive blood pressure (30.3 to 47.1%). Not all values in anesthesia information management systems are valid. The incidence of artifacts stored in the present pediatric anesthesia practice was low for heart rate and oxygen saturation, whereas noninvasive and invasive blood pressure and end-tidal carbon dioxide had higher artifact incidences. Deviating values are more often artifacts than values in a normal range, and artifacts are associated with the phase of anesthesia and anesthetic technique. Development of (automatic) data validation systems or solutions to deal with artifacts in data is warranted.
Yuan, Fu-song; Sun, Yu-chun; Xie, Xiao-yan; Wang, Yong; Lv, Pei-jun
2013-12-18
To quantitatively evaluate the appearance of artifacts from eight kinds of common dental restorative materials, such as zirconia. For a full-crown tooth preparation of the mandibular first molar, eight kinds of full crowns (zirconia all-ceramic crown, glass ceramic crown, ceramage crown, Au-Pt based porcelain-fused-to-metal (PFM) crown, pure titanium PFM crown, Co-Cr PFM crown, Ni-Cr PFM crown, and Au-Pd metal crown) were fabricated, and natural teeth in vitro were used as controls. The full crowns and the natural teeth in vitro were mounted on an ultraviolet-curable resin fixation plate. High-resolution cone beam computed tomography (CBCT) was used to scan all of the crowns and the natural teeth in vitro, and the DICOM data were imported into the MIMICS 10.0 software. The number of stripes and the maximum diameters of the artifacts around the full crowns were then evaluated quantitatively in two-dimensional tomography images. In the two-dimensional tomography images, no artifacts appeared around the natural teeth in vitro, the glass ceramic crown, or the ceramage crown, but artifacts appeared around the zirconia all-ceramic and metal crowns. The number of stripes of artifacts was five to nine per crown. The maximum diameters of the artifacts were 2.4 to 2.6 cm and 2.2 to 2.7 cm. In the two-dimensional tomography images of CBCT, stripe-like and radial artifacts were produced around the zirconia all-ceramic crown and the metal-based porcelain-fused-to-metal crowns. These artifacts can greatly lower the imaging quality of the full crown shape. No artifacts were produced around the natural teeth in vitro, the glass ceramic crown, or the ceramage crown.
Artifact removal in the context of group ICA: a comparison of single-subject and group approaches
Du, Yuhui; Allen, Elena A.; He, Hao; Sui, Jing; Wu, Lei; Calhoun, Vince D.
2018-01-01
Independent component analysis (ICA) has been widely applied to identify intrinsic brain networks from fMRI data. Group ICA computes group-level components from all data and subsequently estimates individual-level components to recapture inter-subject variability. However, the best approach to handle artifacts, which may vary widely among subjects, is not yet clear. In this work, we study and compare two ICA approaches for artifacts removal. One approach, recommended in recent work by the Human Connectome Project, first performs ICA on individual subject data to remove artifacts, and then applies a group ICA on the cleaned data from all subjects. We refer to this approach as Individual ICA based artifacts Removal Plus Group ICA (IRPG). A second proposed approach, called Group Information Guided ICA (GIG-ICA), performs ICA on group data, then removes the group-level artifact components, and finally performs subject-specific ICAs using the group-level non-artifact components as spatial references. We used simulations to evaluate the two approaches with respect to the effects of data quality, data quantity, variable number of sources among subjects, and spatially unique artifacts. Resting-state test-retest datasets were also employed to investigate the reliability of functional networks. Results from simulations demonstrate GIG-ICA has greater performance compared to IRPG, even in the case when single-subject artifacts removal is perfect and when individual subjects have spatially unique artifacts. Experiments using test-retest data suggest that GIG-ICA provides more reliable functional networks. Based on high estimation accuracy, ease of implementation, and high reliability of functional networks, we find GIG-ICA to be a promising approach. PMID:26859308
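A rough sketch of the group-information-guided workflow described above, with the subject-specific step approximated by dual-regression-style least squares rather than the actual GIG-ICA optimization; data sizes and the artifact-component selection are toy choices of my own.

```python
# Rough sketch of the group-information-guided workflow: (1) group ICA on
# temporally concatenated data, (2) drop group-level components judged to be
# artifacts, (3) re-estimate subject-specific maps using the remaining group
# maps as spatial references. Step (3) is approximated here by least squares
# (dual-regression style), not by the actual GIG-ICA optimization.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_subjects, n_time, n_voxels, n_comp = 4, 120, 500, 5

subject_data = [rng.standard_normal((n_time, n_voxels)) for _ in range(n_subjects)]
group_data = np.vstack(subject_data)                  # temporal concatenation

ica = FastICA(n_components=n_comp, random_state=0, max_iter=1000)
ica.fit(group_data)
group_maps = ica.components_                          # (components, voxels)

artifact_components = [4]                             # toy choice of artifact components
keep = [k for k in range(n_comp) if k not in artifact_components]
ref_maps = group_maps[keep]                           # group-level spatial references

subject_maps = []
for data in subject_data:
    # Regress subject data on the reference maps: first time courses, then maps.
    tc, *_ = np.linalg.lstsq(ref_maps.T, data.T, rcond=None)   # (kept comps, time)
    maps, *_ = np.linalg.lstsq(tc.T, data, rcond=None)         # (kept comps, voxels)
    subject_maps.append(maps)

print(len(subject_maps), subject_maps[0].shape)
```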
Le, Yuan; Kipfer, Hal D; Majidi, Shadie S; Holz, Stephanie; Lin, Chen
2014-09-01
The purpose of this article is to evaluate and compare the artifacts caused by metal implants in breast MR images acquired with dual-echo Dixon and two conventional fat-suppression techniques. Two types of biopsy markers were embedded into a uniform fat-water emulsion. T1-weighted gradient-echo images were acquired on a clinical 3-T MRI scanner with three different fat-suppression techniques (conventional or quick fat saturation, spectrally selective adiabatic inversion recovery [SPAIR], and dual-echo Dixon), and the 3D volumes of artifacts were measured. Among the subjects of a clinical breast MRI study using the same scanner, five patients were found to have one or more metal implants. The artifacts in Dixon and SPAIR fat-suppressed images were evaluated by three radiologists, and the results were compared with those of the phantom study. In the phantom study, the artifacts appeared as interleaved bright and dark rings on SPAIR and quick-fat-saturation images, whereas they appeared as dark regions with a thin bright rim on Dixon images. The artifacts imaged with the Dixon technique had the smallest total volume. However, the reviewers found larger artifact diameters on patient images using the Dixon sequence because only the central region was recognized as an artifact on the SPAIR images. Metal implants introduce artifacts of different types and sizes, according to the different fat-suppression techniques used. The dual-echo Dixon technique produces a larger central void, allowing the implant to be easily identified, but presents a smaller overall artifact volume by obscuring less area in the image, according to a quantitative phantom study.
Physiological artifacts in scalp EEG and ear-EEG.
Kappel, Simon L; Looney, David; Mandic, Danilo P; Kidmose, Preben
2017-08-11
A problem inherent to recording EEG is the interference arising from noise and artifacts. While in a laboratory environment artifacts and interference can, to a large extent, be avoided or controlled, in real-life scenarios this is a challenge. Ear-EEG is a concept where EEG is acquired from electrodes in the ear. We present a characterization of physiological artifacts generated in a controlled environment for nine subjects. The influence of the artifacts was quantified in terms of the signal-to-noise ratio (SNR) deterioration of the auditory steady-state response. Alpha band modulation was also studied in an open/closed eyes paradigm. Artifacts related to jaw muscle contractions were present all over the scalp and in the ear, with the highest SNR deteriorations in the gamma band. The SNR deterioration for jaw artifacts was in general higher in the ear compared to the scalp. Whereas eye-blinking did not influence the SNR in the ear, it was significant for all groups of scalp electrodes in the delta and theta bands. Eye movements resulted in statistically significant SNR deterioration in frontal, temporal, and ear electrodes. Recordings of alpha band modulation showed increased power and coherence of the EEG for ear and scalp electrodes in the closed-eyes periods. Ear-EEG is a method developed for unobtrusive and discreet recording over long periods of time and in real-life environments. This study investigated the influence of the most important types of physiological artifacts, and demonstrated that spontaneous activity, in terms of alpha band oscillations, could be recorded from the ear-EEG platform. In its present form, ear-EEG was more prone to jaw-related artifacts and less prone to eye-blinking artifacts compared with state-of-the-art scalp-based systems.
Cha, Jihoon; Kim, Hyung-Jin; Kim, Sung Tae; Kim, Yi Kyung; Kim, Ha Youn; Park, Gyeong Min
2017-11-01
Background Metallic dental prostheses may degrade image quality on head and neck computed tomography (CT). However, there is little information available on the use of dual-energy CT (DECT) and metal artifact reduction software (MARS) in the head and neck regions to reduce metallic dental artifacts. Purpose To assess the usefulness of DECT with virtual monochromatic imaging and MARS to reduce metallic dental artifacts. Material and Methods DECT was performed using fast kilovoltage (kV)-switching between 80 kV and 140 kV in 20 patients with metallic dental prostheses. CT data were reconstructed with and without MARS, and with synthesized monochromatic energy in the range of 40-140 kiloelectron volts (keV). For quantitative analysis, the artifact index of the tongue, buccal, and parotid areas was calculated for each scan. For qualitative analysis, two radiologists evaluated 70-keV and 100-keV images with and without MARS for the tongue, buccal, and parotid areas, and the metallic denture. The locations and characteristics of the MARS-related artifacts, if any, were also recorded. Results DECT with MARS markedly reduced metallic dental artifacts and improved image quality in the buccal area (P < 0.001) and the tongue (P < 0.001), but not in the parotid area. The margin and internal architecture of the metallic dentures were more clearly delineated with MARS (P < 0.001) and in the higher-energy images than in the lower-energy images (P = 0.042). MARS-related artifacts most commonly occurred in the deep center of the neck. Conclusion DECT with MARS can reduce metallic dental artifacts and improve delineation of the metallic prosthesis and periprosthetic region.
Kasten, Florian H; Negahbani, Ehsan; Fröhlich, Flavio; Herrmann, Christoph S
2018-05-31
Amplitude modulated transcranial alternating current stimulation (AM-tACS) has been recently proposed as a possible solution to overcome the pronounced stimulation artifact encountered when recording brain activity during tACS. In theory, AM-tACS does not entail power at its modulating frequency, thus avoiding the problem of spectral overlap between brain signal of interest and stimulation artifact. However, the current study demonstrates how weak non-linear transfer characteristics inherent to stimulation and recording hardware can reintroduce spurious artifacts at the modulation frequency. The input-output transfer functions (TFs) of different stimulation setups were measured. Setups included recordings of signal-generator and stimulator outputs and M/EEG phantom measurements. 6th-degree polynomial regression models were fitted to model the input-output TFs of each setup. The resulting TF models were applied to digitally generated AM-tACS signals to predict the frequency of spurious artifacts in the spectrum. All four setups measured for the study exhibited low-frequency artifacts at the modulation frequency and its harmonics when recording AM-tACS. Fitted TF models showed non-linear contributions significantly different from zero (all p < .05) and successfully predicted the frequency of artifacts observed in AM-signal recordings. Results suggest that even weak non-linearities of stimulation and recording hardware can lead to spurious artifacts at the modulation frequency and its harmonics. These artifacts were substantially larger than alpha-oscillations of a human subject in the MEG. Findings emphasize the need for more linear stimulation devices for AM-tACS and careful analysis procedures, taking into account low-frequency artifacts to avoid confusion with effects of AM-tACS on the brain. Copyright © 2018 Elsevier Inc. All rights reserved.
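A small sketch of the reported mechanism (toy signal parameters and a hand-picked quadratic term standing in for the measured transfer functions): passing an amplitude-modulated signal through even a mildly non-linear transfer function reintroduces spectral power at the modulation frequency, which is absent in the linear case.

```python
# Toy demonstration: an amplitude-modulated signal has no power at its
# modulation frequency, but a weakly non-linear transfer function (here a
# small quadratic term standing in for a fitted polynomial TF) reintroduces
# power at the modulation frequency and its harmonics.
import numpy as np

fs, dur = 1000.0, 10.0                     # sampling rate (Hz), duration (s)
t = np.arange(0, dur, 1 / fs)
f_carrier, f_mod = 220.0, 10.0             # carrier and modulation frequencies (Hz)

am = (1 + 0.5 * np.sin(2 * np.pi * f_mod * t)) * np.sin(2 * np.pi * f_carrier * t)
nonlinear = am + 0.02 * am**2              # weak non-linearity of the hardware chain

freqs = np.fft.rfftfreq(t.size, 1 / fs)

def power_at(signal, f):
    spec = np.abs(np.fft.rfft(signal)) / t.size
    return float(spec[np.argmin(np.abs(freqs - f))])

for name, sig in [("linear", am), ("non-linear", nonlinear)]:
    print(name, "amplitude at f_mod:", power_at(sig, f_mod))
```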
Vashaee, S; Goora, F; Britton, M M; Newling, B; Balcom, B J
2015-01-01
Magnetic resonance imaging (MRI) in the presence of metallic structures is very common in medical and non-medical fields. Metallic structures cause MRI image distortions by three mechanisms: (1) static field distortion through magnetic susceptibility mismatch, (2) eddy currents induced by switched magnetic field gradients and (3) radio frequency (RF) induced eddy currents. Single point ramped imaging with T1 enhancement (SPRITE) MRI measurements are largely immune to susceptibility and gradient induced eddy current artifacts. As a result, one can isolate the effects of metal objects on the RF field. The RF field affects both the excitation and detection of the magnetic resonance (MR) signal. This is challenging with conventional MRI methods, which cannot readily separate the three effects. RF induced MRI artifacts were investigated experimentally at 2.4 T by analyzing image distortions surrounding two geometrically identical metallic strips of aluminum and lead. The strips were immersed in agar gel doped with contrast agent and imaged employing the conical SPRITE sequence. B1 mapping with pure phase encode SPRITE was employed to measure the B1 field around the strips of metal. The strip geometry was chosen to mimic metal electrodes employed in electrochemistry studies. Simulations are employed to investigate the RF field induced eddy currents in the two metallic strips. The RF simulation results are in good agreement with experimental results. Experimental and simulation results show that the metal has a pronounced effect on the B1 distribution and B1 amplitude in the surrounding space. The electrical conductivity of the metal has a minimal effect. Copyright © 2014 Elsevier Inc. All rights reserved.
Ko, Hoon; Jeong, Kwanmoon; Lee, Chang-Hoon; Jun, Hong Young; Jeong, Changwon; Lee, Myeung Su; Nam, Yunyoung; Yoon, Kwon-Ha; Lee, Jinseok
2016-01-01
Image artifacts affect the quality of medical images and may obscure anatomic structure and pathology. Numerous methods for suppression and correction of scattered image artifacts have been suggested in the past three decades. In this paper, we assessed the feasibility of using information on scattered artifacts to estimate bone mineral density (BMD) without dual-energy X-ray absorptiometry (DXA) or quantitative computed tomographic imaging (QCT). To investigate the relationship between scattered image artifacts and BMD, we first used a forearm phantom and cone-beam computed tomography. In the phantom, we considered two regions of interest (bone-equivalent solid material containing 50 mg HA per cm³ and water) to represent low- and high-density trabecular bone, respectively. We compared the scattered image artifacts in the high-density material with those in the low-density material. The technique was then applied to osteoporosis patients and healthy subjects to assess its feasibility for BMD estimation. The high-density material produced a greater number of scattered image artifacts than the low-density material. Moreover, the radius and ulna of healthy subjects produced a greater number of scattered image artifacts than those of osteoporosis patients. Although other parameters, such as bone thickness and X-ray incidence, should be considered, our technique facilitated BMD estimation directly without DXA or QCT. We believe that BMD estimation based on assessment of scattered image artifacts may benefit the prevention, early treatment and management of osteoporosis.
Spectral CT metal artifact reduction with an optimization-based reconstruction algorithm
NASA Astrophysics Data System (ADS)
Gilat Schmidt, Taly; Barber, Rina F.; Sidky, Emil Y.
2017-03-01
Metal objects cause artifacts in computed tomography (CT) images. This work investigated the feasibility of a spectral CT method to reduce metal artifacts. Spectral CT acquisition combined with optimization-based reconstruction is proposed to reduce artifacts by modeling the physical effects that cause metal artifacts and by providing the flexibility to selectively remove corrupted spectral measurements in the spectral-sinogram space. The proposed Constrained 'One-Step' Spectral CT Image Reconstruction (cOSSCIR) algorithm directly estimates the basis material maps while enforcing convex constraints. The incorporation of constraints on the reconstructed basis material maps is expected to mitigate undersampling effects that occur when corrupted data is excluded from reconstruction. The feasibility of the cOSSCIR algorithm to reduce metal artifacts was investigated through simulations of a pelvis phantom. The cOSSCIR algorithm was investigated with and without the use of a third basis material representing metal. The effects of excluding data corrupted by metal were also investigated. The results demonstrated that the proposed cOSSCIR algorithm reduced metal artifacts and improved CT number accuracy. For example, CT number error in a bright shading artifact region was reduced from 403 HU in the reference filtered backprojection reconstruction to 33 HU using the proposed algorithm in simulation. In the dark shading regions, the error was reduced from 1141 HU to 25 HU. Of the investigated approaches, decomposing the data into three basis material maps and excluding the corrupted data demonstrated the greatest reduction in metal artifacts.
Metal artifact reduction in tomosynthesis imaging
NASA Astrophysics Data System (ADS)
Zhang, Zhaoxia; Yan, Ming; Tao, Kun; Xuan, Xiao; Sabol, John M.; Lai, Hao
2015-03-01
The utility of digital tomosynthesis has been shown for many clinical scenarios, including post orthopedic surgery applications. However, two kinds of metal artifacts can influence diagnosis: undershooting and ripple. In this paper, we describe a novel metal artifact reduction (MAR) algorithm to reduce both of these artifacts within the filtered backprojection framework. First, metal areas that are prone to cause artifacts are identified in the raw projection images. These areas are filled with values similar to those in the local neighborhood. During the filtering step, the filled projection is free of undershooting due to the resulting smooth transition near the metal edge. Finally, the filled area is fused with the filtered raw projection data to recover the metal. Since the metal areas are recognized during the backprojection step, anatomy and metal can be distinguished, reducing ripple artifacts. Phantom and clinical experiments were designed to quantitatively and qualitatively evaluate the algorithm. Based on phantom images with and without metal implants, the Artifact Spread Function (ASF) was used to quantify image quality in the ripple artifact area. The tail of the ASF with MAR decreases from in-plane to out-of-plane, implying good artifact reduction, while the ASF without MAR remains high over a wider range. An intensity plot was utilized to analyze the edges of undershooting areas. The results illustrate that MAR reduces undershooting while preserving the edge and size of the metal. Clinical images evaluated by physicists and technologists agree with these quantitative results and further demonstrate the algorithm's effectiveness.
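A simplified projection-domain sketch of the described workflow (the 1-D projection, threshold, and ramp filter are toy stand-ins): segment metal in the raw projection, fill the metal region from its neighborhood before filtering so the filter sees a smooth transition, then fuse the metal signal back after filtering.

```python
# Sketch of the projection-domain workflow: (1) threshold metal in a raw
# projection, (2) fill the metal region by interpolating from neighboring
# values before filtering so the ramp filter sees a smooth transition,
# (3) fuse the metal signal back after filtering. 1-D toy "projection line".
import numpy as np

proj = np.full(256, 2.0)
proj[100:110] = 8.0                              # strongly attenuating metal segment

metal_mask = proj > 5.0                          # (1) identify metal areas
idx = np.arange(proj.size)
filled = proj.copy()
filled[metal_mask] = np.interp(idx[metal_mask], idx[~metal_mask], proj[~metal_mask])

def ramp_filter(line):
    """Minimal frequency-domain ramp filter (stand-in for the FBP filter step)."""
    freqs = np.fft.rfftfreq(line.size)
    return np.fft.irfft(np.fft.rfft(line) * np.abs(freqs), n=line.size)

filtered_raw = ramp_filter(proj)                 # shows undershoot at the metal edges
filtered_filled = ramp_filter(filled)            # smooth, no undershoot

# (3) fuse: metal signal from the raw filtered data, background from the
# filled filtered data, so metal is recovered while undershoot is avoided.
fused = np.where(metal_mask, filtered_raw, filtered_filled)
print(float(filtered_raw.min()), float(fused[~metal_mask].min()))
```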
Virtual museum of Japanese Buddhist temple features for intercultural communication
NASA Astrophysics Data System (ADS)
Kawai, Takashi; Takao, Hidenobu; Inoue, Tetsuri; Miyamoto, Hiroyuki; Noro, Kageyu
1998-04-01
This paper describes the production and presentation of an experimental virtual museum of Japanese Buddhist art. This medium can provide an easy way to introduce a cultural heritage to people of different cultures. The virtual museum consisted of a multimedia program that included stereoscopic 3D movies of Buddhist statues, binaural 3D sounds of Buddhist ceremonies, and the fragrance of incense from the Buddhist temple. The aim was to reproduce both the Buddhist artifacts and atmosphere as realistically as possible.
Experimental investigations of recent anomalous results in superconductivity
NASA Astrophysics Data System (ADS)
Souw, Victor K.
2000-12-01
This thesis examines three recent anomalous results associated with irreversibility in type-II superconductivity: (1) the magnetic properties of the predicted superconductors LiBeH3 and Li2BeH4, (2) the paramagnetic transition near T = Tc in Nb, and (3) a noise transition in a YBa2Cu3O7-delta thin film near the vortex-solid transition. The investigation of Li2BeH4 and LiBeH3 was prompted by theoretical predictions of room-temperature superconductivity for Li2BeH4 and LiBeH3 and a recent report that Li2BeH4 showed magnetic irreversibilities similar to those of type-II superconductors. A modified experimental method is introduced in order to avoid artifacts due to background signals. The resulting data are suggestive of a superparamagnetic impurity from one of the reagents used in the synthesis, and after subtracting this contribution, the temperature-dependent susceptibilities of Li2BeH4 and LiBeH3 are estimated. However, no magnetic irreversibility suggestive of superconductivity is observed. The anomalous paramagnetic transition in Nb is intriguing because Nb does not share the d-wave order parameter symmetry often invoked to explain the phenomenon in other superconductors. A modified experimental method was developed in order to avoid instrumental artifacts known to produce a similar apparently paramagnetic response, but the results of this method indicate that the paramagnetic response is a physical property of the sample. Finally, a very sharp noise transition in a YBa2Cu3O7-delta thin film was found to be distinct from previously reported features in the voltage noise commonly associated with vortex fluctuations near the irreversibility line. In each of these three cases the examination of experimental techniques is an integral part of the investigation of novel vortex behavior near the onset of irreversibility.
Hartmann, Cornelia; Dosen, Strahinja; Amsuess, Sebastian; Farina, Dario
2015-09-01
Electrocutaneous stimulation is a promising approach to provide sensory feedback to amputees, and thus close the loop in upper limb prosthetic systems. However, the stimulation introduces artifacts in the recorded electromyographic (EMG) signals, which may be detrimental for the control of myoelectric prostheses. In this study, artifact blanking with three data segmentation approaches was investigated as a simple method to restore the performance of pattern recognition in prosthesis control (eight motions) when EMG signals are corrupted by stimulation artifacts. The methods were tested over a range of stimulation conditions and using four feature sets, comprising both time and frequency domain features. The results demonstrated that when stimulation artifacts were present, the classification performance improved with blanking in all tested conditions. In some cases, the classification performance with blanking was at the level of the benchmark (artifact-free data). The greatest pulse duration and frequency that allowed a full performance recovery were 400 μs and 150 Hz, respectively. These results show that artifact blanking can be used as a practical solution to eliminate the negative influence of the stimulation artifact on EMG pattern classification in a broad range of conditions, thus allowing the loop to be closed in myoelectric prostheses using electrotactile feedback.
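A small sketch of the blanking idea (stimulation timing, window length, and the single feature shown are arbitrary stand-ins): samples in a short window around each known stimulation pulse are discarded before features are computed for pattern recognition.

```python
# Sketch of stimulation-artifact blanking for EMG: zero out a short window
# around each known stimulation pulse before computing features for pattern
# recognition. Pulse times, window length, and the single feature shown are
# illustrative only.
import numpy as np

fs = 2000.0                                    # sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)
emg = 0.05 * rng.standard_normal(t.size)       # background EMG (toy)

stim_times = np.arange(0.01, 1.0, 1 / 100.0)   # 100 Hz electrotactile stimulation
artifact = np.zeros_like(emg)
for st in stim_times:                          # add large, short artifacts
    i = int(st * fs)
    artifact[i:i + 2] += 2.0
corrupted = emg + artifact

def blank(signal, stim_times, fs, width_s=0.002):
    out = signal.copy()
    for st in stim_times:
        i0 = int(st * fs)
        i1 = min(signal.size, i0 + int(width_s * fs))
        out[i0:i1] = 0.0                       # blanking: discard corrupted samples
    return out

blanked = blank(corrupted, stim_times, fs)
# Mean absolute value (a common time-domain EMG feature) before/after blanking:
print(round(float(np.abs(corrupted).mean()), 4), round(float(np.abs(blanked).mean()), 4))
```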
Khan, Hassan Aqeel; Gore, Amit; Ashe, Jeff; Chakrabartty, Shantanu
2017-07-01
Physical activities are known to introduce motion artifacts in electrical impedance plethysmographic (EIP) sensors. The existing literature considers motion artifacts a nuisance and generally discards the artifact-containing portion of the sensor output. This paper examines the notion of exploiting motion artifacts for detecting the underlying physical activities that give rise to the artifacts in question. In particular, we investigate whether the artifact pattern associated with a physical activity is unique, and whether it varies from one human subject to another. Data were recorded from 19 adult human subjects while they conducted 5 distinct, artifact-inducing activities. A set of novel features based on the time-frequency signatures of the sensor outputs is then constructed. Our analysis demonstrates that these features enable high-accuracy detection of the underlying physical activity. Using an SVM classifier, we are able to differentiate between 5 distinct physical activities (coughing, reaching, walking, eating, and rolling-on-bed) with an average accuracy of 85.46%. Classification is performed solely using features designed specifically to capture the time-frequency signatures of different physical activities. This enables us to measure both respiratory and motion information using only one type of sensor. This is in contrast to conventional approaches to physical activity monitoring, which rely on additional hardware such as accelerometers to capture activity information.
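A generic sketch of such a pipeline (synthetic signals and simple band-power features stand in for the paper's actual data and feature set): compute a spectrogram of the sensor output, summarize it into band-power features, and train an SVM to separate activities.

```python
# Generic sketch of time-frequency feature extraction plus SVM classification
# for activity detection from motion-artifact-laden signals. Synthetic data
# and simple band-power features stand in for the paper's actual feature set.
import numpy as np
from scipy.signal import spectrogram
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

fs = 50.0
rng = np.random.default_rng(0)

def make_signal(freq):
    t = np.arange(0, 10, 1 / fs)
    return np.sin(2 * np.pi * freq * t) + 0.5 * rng.standard_normal(t.size)

def band_power_features(x, fs, n_bands=8):
    f, _, Sxx = spectrogram(x, fs=fs, nperseg=64)
    bands = np.array_split(np.arange(f.size), n_bands)
    return np.array([Sxx[b].mean() for b in bands])

# Two toy "activities" with different dominant motion frequencies.
X = np.array([band_power_features(make_signal(1.0 if k % 2 else 5.0), fs)
              for k in range(40)])
y = np.array([k % 2 for k in range(40)])

clf = SVC(kernel="rbf", C=1.0, gamma="scale")
print(cross_val_score(clf, X, y, cv=5).mean())
```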
Robust artifactual independent component classification for BCI practitioners.
Winkler, Irene; Brandl, Stephanie; Horn, Franziska; Waldburger, Eric; Allefeld, Carsten; Tangermann, Michael
2014-06-01
EEG artifacts of non-neural origin can be separated from neural signals by independent component analysis (ICA). It is unclear (1) how robustly recently proposed artifact classifiers transfer to novel users, novel paradigms or changed electrode setups, and (2) how artifact cleaning by a machine learning classifier impacts the performance of brain-computer interfaces (BCIs). Addressing (1), the robustness of different strategies for a recently proposed classifier with respect to the transfer between paradigms and electrode setups is investigated on offline data from 35 users and 3 EEG paradigms, which contain 6303 expert-labeled components from two ICA and preprocessing variants. Addressing (2), the effect of artifact removal on single-trial BCI classification is estimated on BCI trials from 101 users and 3 paradigms. We show that (1) the proposed artifact classifier generalizes to completely different EEG paradigms. To obtain similar results under massively reduced electrode setups, a proposed novel strategy improves artifact classification. Addressing (2), ICA artifact cleaning has little influence on average BCI performance when analyzed by state-of-the-art BCI methods. When slow motor-related features are exploited, performance varies strongly between individuals, as artifacts may obstruct relevant neural activity or be inadvertently used for BCI control. Robustness of the proposed strategies can be reproduced by EEG practitioners as the method is made available as an EEGLAB plug-in.
Quantitative analysis of titanium-induced artifacts and correlated factors during micro-CT scanning.
Li, Jun Yuan; Pow, Edmond Ho Nang; Zheng, Li Wu; Ma, Li; Kwong, Dora Lai Wan; Cheung, Lim Kwong
2014-04-01
To investigate the impact of cover screw, resin embedment, and implant angulation on artifacts in microcomputed tomography (micro-CT) scanning of implants. A total of twelve implants were randomly divided into 4 groups: (i) implant only; (ii) implant with cover screw; (iii) implant with resin embedment; and (iv) implant with cover screw and resin embedment. Implants angulated at 0°, 45°, and 90° were scanned by micro-CT. Images were assessed, and the ratio of artifact volume to total volume (AV/TV) was calculated. A multiple regression analysis in a stepwise model was used to determine the significance of the different factors. One-way ANOVA was performed to identify which combination of factors could minimize the artifact. In the regression analysis, implant angulation was identified as the best predictor of artifact among the factors (P < 0.001). Resin embedment also had a significant effect on artifact volume (P = 0.028), while cover screw did not (P > 0.05). Non-embedded implants with the axis parallel to the X-ray source of the micro-CT produced minimal artifact. Implant angulation and resin embedment affected the artifact volume of micro-CT scanning for implants, while cover screw did not. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Gelman, Susan A.
2013-01-01
Psychological essentialism is an intuitive folk belief positing that certain categories have a non-obvious inner “essence” that gives rise to observable features. Although this belief most commonly characterizes natural kind categories, I argue that psychological essentialism can also be extended in important ways to artifact concepts. Specifically, concepts of individual artifacts include the non-obvious feature of object history, which is evident when making judgments regarding authenticity and ownership. Classic examples include famous works of art (e.g., the Mona Lisa is authentic because of its provenance), but ordinary artifacts likewise receive value from their history (e.g., a worn and tattered blanket may have special value if it was one's childhood possession). Moreover, in some cases, object history may be thought to have causal effects on individual artifacts, much as an animal essence has causal effects. I review empirical support for these claims and consider the implications for both artifact concepts and essentialism. This perspective suggests that artifact concepts cannot be contained in a theoretical framework that focuses exclusively on similarity or even function. Furthermore, although there are significant differences between essentialism of natural kinds and essentialism of artifact individuals, the commonalities suggest that psychological essentialism may not derive from folk biology but instead may reflect more domain-general perspectives on the world. PMID:23976903
Motion artifact detection in four-dimensional computed tomography images
NASA Astrophysics Data System (ADS)
Bouilhol, G.; Ayadi, M.; Pinho, R.; Rit, S.; Sarrut, D.
2014-03-01
Motion artifacts appear in four-dimensional computed tomography (4DCT) images because of suboptimal acquisition parameters or patient breathing irregularities. Frequency of motion artifacts is high and they may introduce errors in radiation therapy treatment planning. Motion artifact detection can be useful for image quality assessment and 4D reconstruction improvement but manual detection in many images is a tedious process. We propose a novel method to evaluate the quality of 4DCT images by automatic detection of motion artifacts. The method was used to evaluate the impact of the optimization of acquisition parameters on image quality at our institute. 4DCT images of 114 lung cancer patients were analyzed. Acquisitions were performed with a rotation period of 0.5 seconds and a pitch of 0.1 (74 patients) or 0.081 (40 patients). A sensitivity of 0.70 and a specificity of 0.97 were observed. End-exhale phases were less prone to motion artifacts. In phases where motion speed is high, the number of detected artifacts was systematically reduced with a pitch of 0.081 instead of 0.1 and the mean reduction was 0.79. The increase of the number of patients with no artifact detected was statistically significant for the 10%, 70% and 80% respiratory phases, indicating a substantial image quality improvement.
Hublin, Jean-Jacques; Talamo, Sahra; Julien, Michèle; David, Francine; Connet, Nelly; Bodu, Pierre; Vandermeersch, Bernard; Richards, Michael P.
2012-01-01
The transition from the Middle Paleolithic (MP) to Upper Paleolithic (UP) is marked by the replacement of late Neandertals by modern humans in Europe between 50,000 and 40,000 y ago. Châtelperronian (CP) artifact assemblages found in central France and northern Spain date to this time period. So far, it is the only such assemblage type that has yielded Neandertal remains directly associated with UP style artifacts. CP assemblages also include body ornaments, otherwise virtually unknown in the Neandertal world. However, it has been argued that instead of the CP being manufactured by Neandertals, site formation processes and layer admixture resulted in the chance association of Neanderthal remains, CP assemblages, and body ornaments. Here, we report a series of accelerator mass spectrometry radiocarbon dates on ultrafiltered bone collagen extracted from 40 well-preserved bone fragments from the late Mousterian, CP, and Protoaurignacian layers at the Grotte du Renne site (at Arcy-sur-Cure, France). Our radiocarbon results are inconsistent with the admixture hypothesis. Further, we report a direct date on the Neandertal CP skeleton from Saint-Césaire (France). This date corroborates the assignment of CP assemblages to the latest Neandertals of western Europe. Importantly, our results establish that the production of body ornaments in the CP postdates the arrival of modern humans in neighboring regions of Europe. This new behavior could therefore have been the result of cultural diffusion from modern to Neandertal groups. PMID:23112183
Hagmann, Cornelia Franziska; Robertson, Nicola Jayne; Azzopardi, Denis
2006-12-01
This is a case report and a descriptive study demonstrating that artifacts are common during long-term recording of amplitude-integrated electroencephalograms and may lead to erroneous classification of the amplitude-integrated electroencephalogram trace. Artifacts occurred in 12% of 200 hours of recording time sampled from a representative sample of 20 infants with neonatal encephalopathy. Artifacts derived from electrical or movement interference occurred with similar frequency; both types of artifacts influenced the voltage and width of the amplitude-integrated electroencephalogram band. This is important knowledge especially if amplitude-integrated electroencephalogram is used as a selection tool for neuroprotection intervention studies.
Detecting stripe artifacts in ultrasound images.
Maciak, Adam; Kier, Christian; Seidel, Günter; Meyer-Wiethe, Karsten; Hofmann, Ulrich G
2009-10-01
Brain perfusion diseases such as acute ischemic stroke are detectable through computed tomography (CT)- or magnetic resonance imaging (MRI)-based methods. An alternative approach makes use of ultrasound imaging. In this low-cost bedside method, noise and artifacts degrade the imaging process. Stripe artifacts in particular show a signal behavior similar to that of acute stroke or other brain perfusion diseases. This document describes how stripe artifacts can be detected and eliminated in ultrasound images obtained through harmonic imaging (HI). On the basis of this new method, both proper identification of areas with critically reduced brain tissue perfusion and classification between brain perfusion defects and ultrasound stripe artifacts are made possible.
A simple system for detection of EEG artifacts in polysomnographic recordings.
Durka, P J; Klekowicz, H; Blinowska, K J; Szelenberger, W; Niemcewicz, Sz
2003-04-01
We present an efficient parametric system for automatic detection of electroencephalogram (EEG) artifacts in polysomnographic recordings. For each of the selected types of artifacts, a relevant parameter was calculated for a given epoch. If any of these parameters exceeded a threshold, the epoch was marked as an artifact. Performance of the system, evaluated on 18 overnight polysomnographic recordings, revealed concordance with decisions of human experts close to the interexpert agreement and the repeatability of expert's decisions, assessed via a double-blind test. Complete software (Matlab source code) for the presented system is freely available from the Internet at http://brain.fuw.edu.pl/artifacts.
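The described rule (one parameter per artifact type per epoch; threshold exceedance flags the epoch) can be sketched as follows; the parameters and thresholds below are generic examples, not those used by the presented system.

```python
# Sketch of per-epoch, threshold-based EEG artifact flagging: compute one
# parameter per artifact type for every epoch and mark the epoch as artifact
# if any parameter exceeds its threshold. Parameters/thresholds are examples.
import numpy as np

def epoch_parameters(epoch, fs):
    """A few simple per-epoch parameters (amplitude, slope, 50 Hz line power)."""
    spec = np.abs(np.fft.rfft(epoch)) ** 2
    freqs = np.fft.rfftfreq(epoch.size, 1 / fs)
    line_power = spec[(freqs > 48) & (freqs < 52)].sum() / spec.sum()
    return {
        "max_abs_amplitude": float(np.max(np.abs(epoch))),
        "max_sample_to_sample_step": float(np.max(np.abs(np.diff(epoch)))),
        "relative_50hz_power": float(line_power),
    }

THRESHOLDS = {"max_abs_amplitude": 200.0,          # microvolts
              "max_sample_to_sample_step": 50.0,
              "relative_50hz_power": 0.5}

def is_artifact(epoch, fs):
    params = epoch_parameters(epoch, fs)
    return any(params[k] > THRESHOLDS[k] for k in THRESHOLDS)

fs = 200.0
t = np.arange(0, 20, 1 / fs)                        # one 20 s epoch
rng = np.random.default_rng(0)
clean = 30 * np.sin(2 * np.pi * 10 * t) + 5 * rng.standard_normal(t.size)
noisy = clean + 300 * np.exp(-((t - 10) ** 2) / 0.01)   # large movement-like transient

print(is_artifact(clean, fs), is_artifact(noisy, fs))
```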
Color holography for museums: bringing the artifacts back to the people
NASA Astrophysics Data System (ADS)
Bjelkhagen, Hans I.; Osanlou, Ardie
2011-02-01
Color display holography, which is the most accurate imaging technology known to science, has been used to produce holographic images for display of artifacts in museums. This article presents the 'Bringing the Artifacts back to the people' project. Holograms of twelve different artifacts were recorded using the single-beam Denisyuk color reflection hologram technique. 'White' laser light was produced from three combined cw RGB lasers: a red krypton-ion laser, a green frequency-doubled Nd-YAG laser, and an argon-ion laser. Panchromatic ultra-fine-grain silver halide materials were used for the recording of the holograms. During 2009 the artifacts were brought to St Asaph in Wales at the Centre for Modern Optics, to undergo holographic recording. One of the recorded artifacts included a 14,000-year-old decorated horse jaw bone from the ice age, which is kept at British Museum in London. The recorded color holograms of this object and others have been arranged in a touring exhibition, the 'Virtual Artifacts Exhibition.' During 2010- 2011, this will be installed in a number of local museums in North Wales and surrounding areas.
Automated EEG artifact elimination by applying machine learning algorithms to ICA-based features.
Radüntz, Thea; Scouten, Jon; Hochmuth, Olaf; Meffert, Beate
2017-08-01
Biological and non-biological artifacts cause severe problems when dealing with electroencephalogram (EEG) recordings. Independent component analysis (ICA) is a widely used method for eliminating various artifacts from recordings. However, evaluating and classifying the calculated independent components (IC) as artifact or EEG is not fully automated at present. In this study, we propose a new approach for automated artifact elimination, which applies machine learning algorithms to ICA-based features. We compared the performance of our classifiers with the visual classification results given by experts. The best result with an accuracy rate of 95% was achieved using features obtained by range filtering of the topoplots and IC power spectra combined with an artificial neural network. Compared with the existing automated solutions, our proposed method is not limited to specific types of artifacts, electrode configurations, or number of EEG channels. The main advantages of the proposed method is that it provides an automatic, reliable, real-time capable, and practical tool, which avoids the need for the time-consuming manual selection of ICs during artifact removal.
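A generic sketch of this kind of pipeline on synthetic data (the range-filtered topoplot/spectrum features and the exact network of the study are not reproduced): decompose multichannel signals with ICA, compute simple per-component features, and train a small neural-network classifier to label components as artifact or EEG.

```python
# Generic sketch of ICA-based component classification: decompose multichannel
# data with FastICA, derive simple per-component features (range of the mixing
# column as a crude "topography range", spectral peakiness, amplitude ratio),
# and train a small neural network to label components. Synthetic data and toy
# features; not the feature set used in the study.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
fs, n_ch, n_samp = 250.0, 8, 5000
t = np.arange(n_samp) / fs

# Toy sources: alpha-like rhythms plus one blink-like artifact source.
sources = 0.5 * rng.standard_normal((n_ch, n_samp))
for k in range(n_ch - 1):
    sources[k] += np.sin(2 * np.pi * (9 + 0.3 * k) * t)
blink = np.zeros(n_samp)
blink[::500] = 50.0
sources[-1] = np.convolve(blink, np.hanning(50), mode="same")

mixing = rng.standard_normal((n_ch, n_ch))
eeg = mixing @ sources

ica = FastICA(n_components=n_ch, random_state=0, max_iter=2000)
ics = ica.fit_transform(eeg.T).T            # independent components (comp, samples)
topo = ica.mixing_                          # mixing matrix columns ~ "topographies"

def features(ic, col):
    spec = np.abs(np.fft.rfft(ic))
    return [float(np.ptp(col)),                        # spatial range of the column
            float(spec.max() / spec.mean()),           # spectral peakiness
            float(np.max(np.abs(ic)) / np.std(ic))]    # transient-amplitude ratio

X = np.array([features(ics[k], topo[:, k]) for k in range(n_ch)])

# Ground-truth labels from the simulation: the component most correlated with
# the known blink source is the artifact (1), the rest are "EEG" (0).
corr = [abs(np.corrcoef(ics[k], sources[-1])[0, 1]) for k in range(n_ch)]
y = np.zeros(n_ch, dtype=int)
y[int(np.argmax(corr))] = 1

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X, y)
print(clf.predict(X), y)
```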
Automated EEG artifact elimination by applying machine learning algorithms to ICA-based features
NASA Astrophysics Data System (ADS)
Radüntz, Thea; Scouten, Jon; Hochmuth, Olaf; Meffert, Beate
2017-08-01
Objective. Biological and non-biological artifacts cause severe problems when dealing with electroencephalogram (EEG) recordings. Independent component analysis (ICA) is a widely used method for eliminating various artifacts from recordings. However, evaluating and classifying the calculated independent components (IC) as artifact or EEG is not fully automated at present. Approach. In this study, we propose a new approach for automated artifact elimination, which applies machine learning algorithms to ICA-based features. Main results. We compared the performance of our classifiers with the visual classification results given by experts. The best result with an accuracy rate of 95% was achieved using features obtained by range filtering of the topoplots and IC power spectra combined with an artificial neural network. Significance. Compared with the existing automated solutions, our proposed method is not limited to specific types of artifacts, electrode configurations, or number of EEG channels. The main advantages of the proposed method is that it provides an automatic, reliable, real-time capable, and practical tool, which avoids the need for the time-consuming manual selection of ICs during artifact removal.
Dual energy CT: How well can pseudo-monochromatic imaging reduce metal artifacts?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuchenbecker, Stefan, E-mail: stefan.kuchenbecker@dkfz.de; Faby, Sebastian; Sawall, Stefan
2015-02-15
Purpose: Dual Energy CT (DECT) provides so-called monoenergetic images based on a linear combination of the original polychromatic images. At certain patient-specific energy levels, corresponding to certain patient- and slice-dependent linear combination weights, e.g., E = 160 keV corresponds to α = 1.57, a significant reduction of metal artifacts may be observed. The authors aimed at analyzing the method for its artifact reduction capabilities to identify its limitations. The results are compared with raw data-based processing. Methods: Clinical DECT uses a simplified version of monochromatic imaging by linearly combining the low and the high kV images and by assigning an energy to that linear combination. Those pseudo-monochromatic images can be used by radiologists to obtain images with reduced metal artifacts. The authors analyzed the underlying physics and carried out a series expansion of the polychromatic attenuation equations. The resulting nonlinear terms are responsible for the artifacts, but they are not linearly related between the low and the high kV scan: a linear combination of both images cannot eliminate the nonlinearities, it can only reduce their impact. Scattered radiation yields additional noncanceling nonlinearities. This method is compared to raw data-based artifact correction methods. To quantify the artifact reduction potential of pseudo-monochromatic images, they simulated the FORBILD abdomen phantom with metal implants, and they assessed patient data sets of a clinical dual source CT system (100, 140 kV Sn) containing artifacts induced by a highly concentrated contrast agent bolus and by metal. In each case, they manually selected an optimal α and compared it to a raw data-based material decomposition in the case of the simulation, to a raw data-based material decomposition of inconsistent rays in the case of the patient data set containing contrast agent, and to the frequency split normalized metal artifact reduction in the case of the metal implant. For each case, the contrast-to-noise ratio (CNR) was assessed. Results: In the simulation, the pseudo-monochromatic images yielded acceptable artifact reduction results. However, the CNR in the artifact-reduced images was more than 60% lower than in the original polychromatic images. In contrast, the raw data-based material decomposition did not significantly reduce the CNR in the virtual monochromatic images. Regarding the patient data with beam hardening artifacts and with metal artifacts from small implants, the pseudo-monochromatic method was able to reduce the artifacts, again with the downside of a significant CNR reduction. More intense metal artifacts, e.g., those caused by an artificial hip joint, could not be suppressed. Conclusions: Pseudo-monochromatic imaging is able to reduce beam hardening, scatter, and metal artifacts in some cases but it cannot remove them. In all cases, the CNR is significantly reduced, thereby rendering the method questionable, unless special post-processing algorithms are implemented to restore the high CNR from the original images (e.g., by using a frequency split technique). Raw data-based dual energy decomposition methods should be preferred, in particular, because the CNR penalty is almost negligible.
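The linear-combination step and the CNR penalty discussed above can be illustrated with a toy computation (the α value quoted for 160 keV is taken from the abstract; contrast and noise levels are invented): weights that sum to one but exceed one in magnitude amplify image noise and lower CNR relative to the original polychromatic images.

```python
# Toy illustration of the pseudo-monochromatic blend I_mono = a*I_low + (1-a)*I_high.
# For artifact-reducing weights such as a = 1.57 (quoted for ~160 keV in the
# abstract), noise is amplified and CNR drops relative to the original images
# (a = 0 or a = 1). Contrast and noise values below are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
shape = (256, 256)
contrast_low, contrast_high = 80.0, 40.0     # lesion contrast (HU) at low / high kV
noise_low, noise_high = 15.0, 12.0           # image noise (HU)

lesion = np.zeros(shape)
lesion[96:160, 96:160] = 1.0
img_low = contrast_low * lesion + noise_low * rng.standard_normal(shape)
img_high = contrast_high * lesion + noise_high * rng.standard_normal(shape)

def cnr(img):
    roi, bkg = img[96:160, 96:160], img[:64, :64]
    return (roi.mean() - bkg.mean()) / bkg.std()

for a in (0.0, 0.5, 1.0, 1.57):
    mono = a * img_low + (1.0 - a) * img_high    # pseudo-monochromatic image
    print(f"alpha={a:.2f}  CNR={cnr(mono):.1f}")
```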
1H NMR study and multivariate data analysis of reindeer skin tanning methods.
Zhu, Lizheng; Ilott, Andrew J; Del Federico, Eleonora; Kehlet, Cindie; Klokkernes, Torunn; Jerschow, Alexej
2017-04-01
Reindeer skin clothing has been an essential component in the lives of indigenous people of the arctic and sub-arctic regions, keeping them warm during harsh winters. However, the skin processing technology, which often conveys the history and tradition of the indigenous group, has not been well documented. In this study, NMR spectra and relaxation behaviors of reindeer skin samples treated with a variety of vegetable tannin extracts, oils and fatty substances are studied and compared. With the assistance of principal component analysis (PCA), one can recognize patterns and identify groupings of differently treated samples. These methods could be important aids in efforts to conserve museum leather artifacts with unknown treatment methods and in the analysis of reindeer skin tanning processes. Copyright © 2016 John Wiley & Sons, Ltd.
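The sketch below illustrates the kind of PCA grouping mentioned above: spectra (assumed already aligned and normalized) are stacked row-wise and projected onto the first two principal components, with one scatter group per treatment. The spectra, treatment labels, and dimensions are placeholders.

```python
# Hedged sketch: PCA scores plot of NMR spectra from differently treated samples.
import matplotlib.pyplot as plt
import numpy as np
from sklearn.decomposition import PCA

spectra = np.random.rand(24, 4096)                            # 24 samples x 4096 points (placeholder)
labels = ["tannin extract"] * 8 + ["oil"] * 8 + ["fat"] * 8   # hypothetical treatments

scores = PCA(n_components=2).fit_transform(spectra)
for lab in sorted(set(labels)):
    idx = [i for i, l in enumerate(labels) if l == lab]
    plt.scatter(scores[idx, 0], scores[idx, 1], label=lab)
plt.xlabel("PC1"); plt.ylabel("PC2"); plt.legend(); plt.show()
```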
Can healthy, young adults uncover personal details of unknown target individuals in their dreams?
Smith, Carlyle
2013-01-01
We investigated the possibility that undergraduate college students could incubate dreams containing information about unknown target individuals with significant life problems. In Experiment 1, students provided two baseline dreams. They were then exposed to a photo of an individual and invited to dream about a health problem (unknown to them and the experimenter) of that individual and asked to provide two more dreams. From a class of 65 students, 12 dreamers volunteered dreams about the unknown target. In Experiment 2, 66 students were asked to dream about the life problems of a second individual, simply by looking at the photo (experimental group). Another 56 students were exposed to this same paradigm, but the photo that they examined was computer generated and the target individual was fictitious (control group). The dream elements were objectively scored with categories devised using the Hall-Van de Castle system as a model. Data were ordinal, and the nonparametric Wilcoxon signed rank test was used to examine preincubation (baseline) versus postincubation (photo examination and incubation) dream content in Experiment 1. In Experiment 2, a Z score for proportions was used to compare differences in frequency of devised categories between experimental and control groups. In Experiment 1, the comparison of postincubation dreams (all categories combined) was significant compared with the preincubation dreams (Z = 2.09, P = .036). The postincubation dreams reflected the health problem of the target. In Experiment 2, the proportions of scored categories in the experimental and control groups were compared at the preincubation and postincubation conditions. The proportion of "Combined" (all categories) was very significantly larger at the postincubation condition (Z = 6.27, P < .00001). The groups did not differ at the preincubation condition (Z = -1.12, not significant). Individual postincubation condition comparisons of the experimental versus control groups revealed significant differences in three of the devised scoring categories, ranging from P < .002 to P < .05. There were no experimental versus control preincubation differences. The postincubation dreams of the experimental group were related to the problems of the target individual. Young, healthy adults are capable of dreaming details about the personal problems of an unknown individual simply by examining a picture of the target and then planning to dream about that individual's problems. Copyright © 2013 Elsevier Inc. All rights reserved.
Learning in Home Care: A Digital Artifact as a Designated Boundary Object-in-Use
ERIC Educational Resources Information Center
Islind, Anna Sigridur; Lundh Snis, Ulrika
2017-01-01
Purpose: The aim of this paper is to understand how the role of an mHealth artifact plays out in home care settings. An mHealth artifact, in terms of a mobile app, was tested to see how the quality of home care work practice was enhanced and changed. The research question is: In what ways does an mHealth artifact re-shape a home care practice and…
ERIC Educational Resources Information Center
Spektor-Precel, Karen; Mioduser, David
2015-01-01
Nowadays, we are surrounded by artifacts that are capable of adaptive behavior, such as electric pots, boiler timers, automatic doors, and robots. The literature concerning human beings' conceptions of "traditional" artifacts is vast; however, little is known about our conceptions of behaving artifacts, nor of the influence of the…
Utility of CT-compatible EEG electrodes in critically ill children.
Abend, Nicholas S; Dlugos, Dennis J; Zhu, Xiaowei; Schwartz, Erin S
2015-04-01
Electroencephalographic monitoring is being used with increasing frequency in critically ill children who may require frequent and sometimes urgent brain CT scans. Standard metallic disk EEG electrodes commonly produce substantial imaging artifact, and they must be removed and later reapplied when CT scans are indicated. To determine whether conductive plastic electrodes caused artifact that limited CT interpretation. We describe a retrospective cohort of 13 consecutive critically ill children who underwent 17 CT scans with conductive plastic electrodes during 1 year. CT images were evaluated by a pediatric neuroradiologist for artifact presence, type and severity. All CT scans had excellent quality images without artifact that impaired CT interpretation except for one scan in which improper wire placement resulted in artifact. Conductive plastic electrodes do not cause artifact limiting CT scan interpretation and may be used in critically ill children to permit concurrent electroencephalographic monitoring and CT imaging.
Automatic correction of dental artifacts in PET/MRI
Ladefoged, Claes N.; Andersen, Flemming L.; Keller, Sune. H.; Beyer, Thomas; Law, Ian; Højgaard, Liselotte; Darkner, Sune; Lauze, Francois
2015-01-01
Abstract. A challenge when using current magnetic resonance (MR)-based attenuation correction in positron emission tomography/MR imaging (PET/MRI) is that the MRIs can have a signal void around the dental fillings that is segmented as artificial air-regions in the attenuation map. For artifacts connected to the background, we propose an extension to an existing active contour algorithm to delineate the outer contour using the nonattenuation corrected PET image and the original attenuation map. We propose a combination of two different methods for differentiating the artifacts within the body from the anatomical air-regions by first using a template of artifact regions, and second, representing the artifact regions with a combination of active shape models and k-nearest-neighbors. The accuracy of the combined method has been evaluated using 25 F18-fluorodeoxyglucose PET/MR patients. Results showed that the approach was able to correct an average of 97±3% of the artifact areas. PMID:26158104
MR Image Based Approach for Metal Artifact Reduction in X-Ray CT
2013-01-01
For decades, computed tomography (CT) images have been widely used to discover valuable anatomical information. Metallic implants such as dental fillings cause severe streaking artifacts that significantly degrade the quality of CT images. In this paper, we propose a new method for metal-artifact reduction using complementary magnetic resonance (MR) images. The method exploits the possibilities which arise from the use of emergent trimodality systems. The proposed algorithm corrects reconstructed CT images. The projection data affected by dental fillings are detected, and the missing projections are replaced with data obtained from a corresponding MR image. A simulation study was conducted in order to compare the reconstructed images with images reconstructed through linear interpolation, which is a common metal-artifact reduction technique. The results show that the proposed method is successful in reducing severe metal artifacts without introducing a significant amount of secondary artifacts. PMID:24302860
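As a point of reference for the comparison above, here is a minimal sketch of the linear-interpolation baseline: projection samples flagged as metal-affected are re-estimated by interpolating along each detector row of the sinogram. The metal mask, array layout, and shapes are illustrative assumptions; the MR-based replacement itself is not reproduced here.

```python
# Hedged sketch of sinogram inpainting by linear interpolation (the common baseline
# mentioned above). metal_mask is True where the projection data are corrupted.
import numpy as np

def inpaint_sinogram(sinogram, metal_mask):
    """sinogram, metal_mask: arrays of shape (n_angles, n_detectors)."""
    out = sinogram.copy()
    cols = np.arange(sinogram.shape[1])
    for i in range(sinogram.shape[0]):
        bad = metal_mask[i]
        if bad.any() and (~bad).any():
            out[i, bad] = np.interp(cols[bad], cols[~bad], sinogram[i, ~bad])
    return out

sino = np.random.rand(180, 256)          # placeholder sinogram
mask = np.zeros_like(sino, dtype=bool)
mask[:, 120:130] = True                  # assumed metal-affected detector bins
sino_corrected = inpaint_sinogram(sino, mask)
```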
On the influence of high-pass filtering on ICA-based artifact reduction in EEG-ERP.
Winkler, Irene; Debener, Stefan; Müller, Klaus-Robert; Tangermann, Michael
2015-01-01
Standard artifact removal methods for electroencephalographic (EEG) signals are either based on Independent Component Analysis (ICA) or they regress out ocular activity measured at electrooculogram (EOG) channels. Successful ICA-based artifact reduction relies on suitable pre-processing. Here we systematically evaluate the effects of high-pass filtering at different frequencies. Offline analyses were based on event-related potential data from 21 participants performing a standard auditory oddball task and an automatic artifactual component classifier method (MARA). As a pre-processing step for ICA, high-pass filtering between 1-2 Hz consistently produced good results in terms of signal-to-noise ratio (SNR), single-trial classification accuracy and the percentage of 'near-dipolar' ICA components. Relative to no artifact reduction, ICA-based artifact removal significantly improved SNR and classification accuracy. This was not the case for a regression-based approach to remove EOG artifacts.
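The sketch below shows the pre-processing step evaluated above in its simplest form: zero-phase high-pass filtering of continuous EEG at 1 Hz before fitting ICA. The sampling rate, filter order, channel layout, and the use of FastICA are illustrative assumptions.

```python
# Hedged sketch: high-pass filter EEG at 1 Hz (zero-phase Butterworth), then fit ICA.
import numpy as np
from scipy.signal import butter, sosfiltfilt
from sklearn.decomposition import FastICA

fs = 250.0                                    # sampling rate (Hz), assumed
eeg = np.random.randn(32, 60 * int(fs))       # 32 channels x 60 s of placeholder data

sos = butter(4, 1.0, btype="highpass", fs=fs, output="sos")
eeg_hp = sosfiltfilt(sos, eeg, axis=1)

ica = FastICA(n_components=32, max_iter=1000, random_state=0)
sources = ica.fit_transform(eeg_hp.T)         # (n_samples, n_components)
```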
Mining high-throughput experimental data to link gene and function.
Blaby-Haas, Crysten E; de Crécy-Lagard, Valérie
2011-04-01
Nearly 2200 genomes that encode around 6 million proteins have now been sequenced. Around 40% of these proteins are of unknown function, even when function is loosely and minimally defined as 'belonging to a superfamily'. In addition to in silico methods, the swelling stream of high-throughput experimental data can give valuable clues for linking these unknowns with precise biological roles. The goal is to develop integrative data-mining platforms that allow the scientific community at large to access and utilize this rich source of experimental knowledge. To this end, we review recent advances in generating whole-genome experimental datasets, where this data can be accessed, and how it can be used to drive prediction of gene function. Copyright © 2011 Elsevier Ltd. All rights reserved.
Zhang, Jie; Fan, Xinghua; Graham, Lisa; Chan, Tak W; Brook, Jeffrey R
2013-01-01
Sampling of particle-phase organic carbon (OC) from diesel engines is complicated by adsorption and evaporation of semivolatile organic carbon (SVOC), defined as positive and negative artifacts, respectively. In order to explore these artifacts, an integrated organic gas and particle sampler (IOGAPS) was applied, in which an XAD-coated multichannel annular denuder was placed upstream to remove the gas-phase SVOC and two downstream sorbent-impregnated filters (SIFs) were employed to capture the evaporated SVOC. Positive artifacts can be reduced by using a denuder but particle loss also occurs. This paper investigates the IOGAPS with respect to particle loss, denuder efficiency, and particle-phase OC artifacts by comparing OC, elemental carbon (EC), SVOC, and selected organic species, as well as particle size distributions. Compared to the filter pack methods typically used, the IOGAPS approach results in estimation of both positive and negative artifacts, especially the negative artifact. The positive and negative artifacts were 190 μg/m3 and 67 μg/m3, representing 122% and 43% of the total particle OC measured by the IOGAPS, respectively. However, particle loss and denuder breakthrough were also found to exist. Monitoring particle mass loss by particle number or EC concentration yielded similar results ranging from 10% to 24% depending upon flow rate. Using the measurements of selected particle-phase organic species to infer particle loss resulted in larger estimates, on the order of 32%. The denuder collection efficiency for SVOCs at 74 L/min was found to be less than 100%, with an average of 84%. In addition to these uncertainties, the IOGAPS method requires a considerable amount of extra effort to apply. These disadvantages must be weighed against the benefits of being able to estimate positive artifacts and correct, with some uncertainty, for the negative artifacts when selecting a method for sampling diesel emissions. Measurements of diesel emissions are necessary to understand their adverse impacts. Much of the emissions is organic carbon covering a range of volatilities, complicating determination of the particle fraction because of sampling artifacts. In this paper, an approach to quantify artifacts is evaluated for a diesel engine. This showed that 63% of the particle organic carbon typically measured could be the positive artifact while the negative artifact is about one-third of this value. However, this approach adds time and expense and leads to other uncertainties, implying that effort is needed to develop methods to accurately measure diesel emissions.
Reduction of metal artifacts: beam hardening and photon starvation effects
NASA Astrophysics Data System (ADS)
Yadava, Girijesh K.; Pal, Debashish; Hsieh, Jiang
2014-03-01
The presence of metal-artifacts in CT imaging can obscure relevant anatomy and interfere with disease diagnosis. The cause and occurrence of metal-artifacts are primarily due to beam hardening, scatter, partial volume and photon starvation; however, the contribution to the artifacts from each of them depends on the type of hardware. A comparison of CT images obtained with different metallic hardware in various applications, along with acquisition and reconstruction parameters, helps understand methods for reducing or overcoming such artifacts. In this work, a metal beam hardening correction (BHC) algorithm and a projection-completion-based metal artifact reduction (MAR) algorithm were developed, and applied on phantom and clinical CT scans with various metallic implants. Stainless steel and titanium were used to model and correct for the metal beam hardening effect. In the MAR algorithm, the corrupted projection samples are replaced by the combination of original projections and in-painted data obtained by forward projecting a prior image. The data included spine fixation screws, hip implants, dental fillings, and body extremity fixations, covering a range of clinically used metal implants. Comparison of BHC and MAR on different metallic implants was used to characterize the dominant source of the artifacts, and conceivable methods to overcome those. Results of the study indicate that beam hardening could be a dominant source of artifact in many spine and extremity fixations, whereas dental and hip implants could be a dominant source of photon starvation. The BHC algorithm could significantly improve image quality in CT scans with metallic screws, whereas the MAR algorithm could alleviate artifacts in hip implants and dental fillings.
Kim, Kyungsoo; Punte, Andrea Kleine; Mertens, Griet; Van de Heyning, Paul; Park, Kyung-Joon; Choi, Hongsoo; Choi, Ji-Woong; Song, Jae-Jin
2015-11-30
Quantitative electroencephalography (qEEG) is effective when used to analyze ongoing cortical oscillations in cochlear implant (CI) users. However, localization of cortical activity in such users via qEEG is confounded by the presence of artifacts produced by the device itself. Typically, independent component analysis (ICA) is used to remove CI artifacts in auditory evoked EEG signals collected upon brief stimulation, and it is effective for auditory evoked potentials (AEPs). However, AEPs do not reflect the daily environments of patients, and thus, continuous EEG data that are closer to such environments are desirable. In this case, device-related artifacts in EEG data are difficult to remove selectively via ICA due to over-completion of EEG data removal in the absence of preprocessing. EEGs were recorded for a long time under conditions of continuous auditory stimulation. To obviate the over-completion problem, we limited the frequency of CI artifacts to a significant characteristic peak and applied ICA artifact removal. Topographic brain mapping results analyzed via band-limited (BL)-ICA exhibited a better energy distribution, matched to the CI location, than data obtained using conventional ICA. Also, source localization data verified that BL-ICA effectively removed CI artifacts. The proposed method selectively removes CI artifacts from continuous EEG recordings, whereas the conventional ICA removal method leaves a residual peak and removes important brain activity signals. CI artifacts in EEG data obtained during continuous passive listening can be effectively removed with the aid of BL-ICA, opening up new EEG research possibilities in subjects with CIs. Copyright © 2015 Elsevier B.V. All rights reserved.
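The sketch below illustrates the general idea of exploiting a narrow, device-related spectral peak to decide which independent components to discard; it is a simplified stand-in, not the published BL-ICA procedure. The artifact band (35-45 Hz), power-share threshold, component count, and data are assumptions.

```python
# Hedged sketch: drop ICA components whose power is concentrated in an assumed
# device-related band, then reconstruct the continuous EEG from the remaining ones.
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import FastICA

fs = 500.0
eeg = np.random.randn(32, 60 * int(fs))                 # placeholder continuous EEG

ica = FastICA(n_components=20, max_iter=500, random_state=0)
S = ica.fit_transform(eeg.T)                            # (n_samples, n_components)

keep = []
for k in range(S.shape[1]):
    f, pxx = welch(S[:, k], fs=fs, nperseg=2048)
    band = (f > 35) & (f < 45)                          # assumed artifact band
    keep.append(pxx[band].sum() / pxx.sum() < 0.5)      # flag band-dominated components

eeg_clean = ica.inverse_transform(S * np.array(keep)).T
```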
Reduced aliasing artifacts using shaking projection k-space sampling trajectory
NASA Astrophysics Data System (ADS)
Zhu, Yan-Chun; Du, Jiang; Yang, Wen-Chao; Duan, Chai-Jie; Wang, Hao-Yu; Gao, Song; Bao, Shang-Lian
2014-03-01
Radial imaging techniques, such as projection-reconstruction (PR), are used in magnetic resonance imaging (MRI) for dynamic imaging, angiography, and short-T2 imaging. They are less sensitive to flow and motion artifacts, and support fast imaging with short echo times. However, aliasing and streaking artifacts are two main sources that degrade radial imaging quality. For a given fixed number of k-space projections, data distributions along radial and angular directions will influence the level of aliasing and streaking artifacts. Conventional radial k-space sampling trajectory introduces an aliasing artifact at the first principal ring of point spread function (PSF). In this paper, a shaking projection (SP) k-space sampling trajectory was proposed to reduce aliasing artifacts in MR images. The SP sampling trajectory shifts the projection alternately along the k-space center, which separates k-space data in the azimuthal direction. Simulations based on conventional and SP sampling trajectories were compared with the same number of projections. A significant reduction of aliasing artifacts was observed using the SP sampling trajectory. These two trajectories were also compared with different sampling frequencies. An SP trajectory has the same aliasing character when using half the sampling frequency (or half the data) for reconstruction. SNR comparisons with different white noise levels show that these two trajectories have the same SNR character. In conclusion, the SP trajectory can reduce the aliasing artifact without decreasing SNR and also provide a way for undersampling reconstruction. Furthermore, this method can be applied to three-dimensional (3D) hybrid or spherical radial k-space sampling for a more efficient reduction of aliasing artifacts.
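A toy construction of the two radial trajectories compared above is sketched below: in the conventional case every spoke passes through the k-space centre, while in the shaking-projection case each spoke is shifted alternately by ±δ along its own direction. The shift size, spoke count, and normalization are illustrative assumptions.

```python
# Hedged sketch: conventional vs. "shaking" radial k-space trajectories.
import numpy as np

def radial_trajectory(n_spokes, n_samples, shake=0.0):
    """Return complex k-space coordinates of shape (n_spokes, n_samples)."""
    kr = np.linspace(-0.5, 0.5, n_samples)                 # radial sample positions
    traj = np.empty((n_spokes, n_samples), dtype=complex)
    for i in range(n_spokes):
        phi = np.pi * i / n_spokes                         # spoke angle
        shift = shake if i % 2 == 0 else -shake            # alternate the shift sign
        traj[i] = (kr + shift) * np.exp(1j * phi)
    return traj

conventional = radial_trajectory(64, 256)
shaking = radial_trajectory(64, 256, shake=0.5 / 256)      # shift by roughly half a sample spacing
```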
Neutron activation analysis traces copper artifacts to geographical point of origin
NASA Technical Reports Server (NTRS)
Conway, M.; Fields, P.; Friedman, A.; Kastner, M.; Metta, D.; Milsted, J.; Olsen, E.
1967-01-01
Impurities remaining in the metallic copper are identified and quantified by spectrographic and neutron activation analysis. Determination of the type of ore used for the copper artifact places the geographic point of origin of the artifact.
Cömert, Alper; Hyttinen, Jari
2015-05-15
With advances in technology and increasing demand, wearable biosignal monitoring is developing and new applications are emerging. One of the main challenges facing the widespread use of wearable monitoring systems is the motion artifact. The sources of the motion artifact lie in the skin-electrode interface. Reducing the motion and deformation at this interface should have positive effects on signal quality. In this study, we aim to investigate whether the structure supporting the electrode can be designed to reduce the motion artifact, with the hypothesis that this can be achieved by stabilizing the skin deformations around the electrode. We compare four textile electrodes with different support structure designs: a soft padding larger than the electrode area, a soft padding larger than the electrode area with a novel skin deformation restricting design, a soft padding the same size as the electrode area, and a rigid support the same size as the electrode. With five subjects and two electrode locations placed over different kinds of tissue at various mounting forces, we simultaneously measured the motion artifact, a motion affected ECG, and the real-time skin-electrode impedance during the application of controlled motion to the electrodes. The design of the electrode support structure has an effect on the generated motion artifact; good design with a skin stabilizing structure makes the electrodes physically more motion artifact resilient, directly affecting signal quality. Increasing the applied mounting force shows a positive effect up to 1,000 g of applied force. The properties of tissue under the electrode are an important factor in the generation of the motion artifact and the functioning of the electrodes. The relationship of motion artifact amplitude to the electrode movement magnitude is seen to be linear for smaller movements. For larger movements, the increase of motion generated a disproportionately larger artifact. The motion artifact and the induced impedance change were caused by the electrode motion and contained the same frequency components as the applied electrode motion pattern. We found that stabilizing the skin around the electrode using an electrode structure that manages to successfully distribute the force and movement to an area beyond the borders of the electrical contact area reduces the motion artifact when compared to structures that are the same size as the electrode area.
Picking Up Artifacts: Storyboarding as a Gateway to Reuse
NASA Astrophysics Data System (ADS)
Wahid, Shahtab; Branham, Stacy M.; Cairco, Lauren; McCrickard, D. Scott; Harrison, Steve
Storyboarding offers designers the opportunity to illustrate a visual narrative of use. Because designers often refer to past ideas, we argue storyboards can be constructed by reusing shared artifacts. We present a study in which we explore how designers reuse artifacts consisting of images and rationale during storyboard construction. We find images can aid in accessing rationale and that connections among features aid in deciding what to reuse, creating new artifacts, and constructing. Based on requirements derived from our findings, we present a storyboarding tool, PIC-UP, to facilitate artifact sharing and reuse and evaluate its use in an exploratory study. We conclude with remarks on facilitating reuse and future work.
μ-tempered metadynamics: Artifact independent convergence times for wide hills
NASA Astrophysics Data System (ADS)
Dickson, Bradley M.
2015-12-01
Recent analysis of well-tempered metadynamics (WTmetaD) showed that it converges without mollification artifacts in the bias potential. Here, we explore how metadynamics heals mollification artifacts, how healing impacts convergence time, and whether alternative temperings may be used to improve efficiency. We introduce "μ-tempered" metadynamics as a simple tempering scheme, inspired by a related mollified adaptive biasing potential, that results in artifact independent convergence of the free energy estimate. We use a toy model to examine the role of artifacts in WTmetaD and solvated alanine dipeptide to compare the well-tempered and μ-tempered frameworks demonstrating fast convergence for hill widths as large as 60° for μTmetaD.
μ-tempered metadynamics: Artifact independent convergence times for wide hills.
Dickson, Bradley M
2015-12-21
Recent analysis of well-tempered metadynamics (WTmetaD) showed that it converges without mollification artifacts in the bias potential. Here, we explore how metadynamics heals mollification artifacts, how healing impacts convergence time, and whether alternative temperings may be used to improve efficiency. We introduce "μ-tempered" metadynamics as a simple tempering scheme, inspired by a related mollified adaptive biasing potential, that results in artifact independent convergence of the free energy estimate. We use a toy model to examine the role of artifacts in WTmetaD and solvated alanine dipeptide to compare the well-tempered and μ-tempered frameworks demonstrating fast convergence for hill widths as large as 60° for μTmetaD.
Wavelet-Based Motion Artifact Removal for Electrodermal Activity
Chen, Weixuan; Jaques, Natasha; Taylor, Sara; Sano, Akane; Fedor, Szymon; Picard, Rosalind W.
2017-01-01
Electrodermal activity (EDA) recording is a powerful, widely used tool for monitoring psychological or physiological arousal. However, analysis of EDA is hampered by its sensitivity to motion artifacts. We propose a method for removing motion artifacts from EDA, measured as skin conductance (SC), using a stationary wavelet transform (SWT). We modeled the wavelet coefficients as a Gaussian mixture distribution corresponding to the underlying skin conductance level (SCL) and skin conductance responses (SCRs). The goodness-of-fit of the model was validated on ambulatory SC data. We evaluated the proposed method in comparison with three previous approaches. Our method achieved a greater reduction of artifacts while retaining motion-artifact-free data. PMID:26737714
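Below is a much-simplified illustration of wavelet-domain artifact suppression (not the published Gaussian-mixture model): the skin-conductance signal is decomposed with a stationary wavelet transform, detail coefficients beyond a robust threshold are clipped as presumed motion spikes, and the signal is reconstructed. Wavelet, level, threshold rule, and sampling rate are assumptions.

```python
# Hedged sketch: SWT decomposition, robust clipping of detail coefficients, inverse SWT.
import numpy as np
import pywt

fs = 8.0                                        # EDA sampling rate (Hz), assumed
t = np.arange(0, 120, 1 / fs)                   # 960 samples (divisible by 2**level)
sc = 2 + 0.3 * np.sin(2 * np.pi * 0.05 * t)     # slowly varying placeholder SC signal
sc[400:410] += 3.0                              # injected motion spike

coeffs = pywt.swt(sc, "haar", level=3)          # list of (cA, cD) pairs
cleaned = []
for cA, cD in coeffs:
    thr = 4 * np.median(np.abs(cD - np.median(cD)))
    cleaned.append((cA, np.clip(cD, -thr, thr)))  # clip large detail coefficients
sc_clean = pywt.iswt(cleaned, "haar")
```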
Dealing with noise and physiological artifacts in human EEG recordings: empirical mode methods
NASA Astrophysics Data System (ADS)
Runnova, Anastasiya E.; Grubov, Vadim V.; Khramova, Marina V.; Hramov, Alexander E.
2017-04-01
In this paper we propose a new method for removing noise and physiological artifacts from human EEG recordings based on empirical mode decomposition (Hilbert-Huang transform). As physiological artifacts we consider specific oscillatory patterns that cause problems during EEG analysis and can be detected with additional signals recorded simultaneously with the EEG (ECG, EMG, EOG, etc.). The algorithm of the proposed method includes empirical mode decomposition of the EEG signal, selection of the empirical modes containing artifacts, removal of these empirical modes, and reconstruction of the initial EEG signal. We show the efficiency of the method on the example of filtering eye-movement artifacts from a human EEG signal.
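A minimal sketch of that pipeline is given below using the third-party PyEMD package (an assumption; any EMD implementation would serve): decompose one EEG channel into intrinsic mode functions, drop the IMFs judged artifactual, and rebuild the signal from the rest. Here the artifact IMF is simply chosen by index for illustration; in practice the choice would be guided by the auxiliary ECG/EMG/EOG recordings.

```python
# Hedged sketch: EMD-based artifact removal for a single EEG channel.
import numpy as np
from PyEMD import EMD   # assumes the PyEMD package is installed

fs = 250
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 1 * t)   # placeholder EEG

imfs = EMD().emd(eeg, t)                  # array of IMFs, shape (n_imfs, n_samples)
artifact_idx = {imfs.shape[0] - 1}        # e.g. treat the slowest IMF as artifactual
eeg_clean = sum(imf for k, imf in enumerate(imfs) if k not in artifact_idx)
```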
Mathematical approach to recover EEG brain signals with artifacts by means of Gram-Schmidt transform
NASA Astrophysics Data System (ADS)
Runnova, A. E.; Zhuravlev, M. O.; Koronovskiy, A. A.; Hramov, A. E.
2017-04-01
A novel method for removing oculomotor artifacts from electroencephalographic signals is proposed, based on the orthogonal Gram-Schmidt transform applied to electrooculography data. The method has shown highly efficient removal of artifacts caused by spontaneous movements of the eyeballs (about 95-97% of oculomotor artifacts correctly removed). This method may be recommended for automatic on-line processing of multi-channel electroencephalography data in a variety of psycho-physiological experiments.
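The following sketch conveys the underlying idea in its simplest single-reference form: orthogonalize each EEG channel against the (unit-norm, zero-mean) EOG recording, so that the part of the EEG collinear with the eye signal is removed. The channel counts and data are placeholders, and this is not the full published multi-channel procedure.

```python
# Hedged sketch: Gram-Schmidt-style removal of the EOG direction from EEG channels.
import numpy as np

def remove_eog(eeg, eog):
    """eeg: (n_channels, n_samples); eog: (n_samples,). Returns cleaned EEG."""
    e = eog - eog.mean()
    e /= np.linalg.norm(e)                    # unit-norm EOG reference
    return eeg - np.outer(eeg @ e, e)         # subtract the projection onto the EOG axis

rng = np.random.default_rng(1)
eog = rng.standard_normal(5000)
eeg = rng.standard_normal((19, 5000)) + 0.8 * eog   # EEG contaminated by eye movements
eeg_clean = remove_eog(eeg, eog)
```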
NASA Astrophysics Data System (ADS)
Garcia, J.; Hidalgo, S. S.; Solis, S. E.; Vazquez, D.; Nuñez, J.; Rodriguez, A. O.
2012-10-01
Susceptibility artifacts can degrade magnetic resonance image quality. Electrodes are an important source of artifacts when performing brain imaging. A dedicated phantom was built using a depth electrode to study the susceptibility effects under different pulse sequences. T2-weighted images were acquired with both gradient- and spin-echo sequences. The spin-echo sequences can significantly attenuate the susceptibility artifacts, allowing a straightforward visualization of the regions surrounding the electrode.
EEG artifact elimination by extraction of ICA-component features using image processing algorithms.
Radüntz, T; Scouten, J; Hochmuth, O; Meffert, B
2015-03-30
Artifact rejection is a central issue when dealing with electroencephalogram recordings. Although independent component analysis (ICA) separates data into linearly independent components (ICs), the classification of these components as artifact or EEG signal still requires visual inspection by experts. In this paper, we achieve automated artifact elimination using linear discriminant analysis (LDA) for classification of feature vectors extracted from ICA components via image processing algorithms. We compare the performance of this automated classifier to visual classification by experts and identify range filtering as a feature extraction method with great potential for automated IC artifact recognition (accuracy rate 88%). We obtain almost the same level of recognition performance for geometric features and local binary pattern (LBP) features. Compared to the existing automated solutions, the proposed method has two main advantages: First, it does not depend on direct recording of artifact signals, which then, e.g., have to be subtracted from the contaminated EEG. Second, it is not limited to a specific number or type of artifact. In summary, the present method is an automatic, reliable, real-time capable and practical tool that reduces the time-intensive manual selection of ICs for artifact removal. The results are very promising despite the relatively small channel resolution of 25 electrodes. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
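The sketch below illustrates range filtering as a feature-extraction step: for every pixel of a component topography, the local maximum minus the local minimum within a small window is computed, and the flattened result feeds a linear discriminant classifier. Window size, image size, classifier settings, and the random data are assumptions.

```python
# Hedged sketch: range-filter features from IC topographies, classified with LDA.
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def range_filter(img, size=3):
    """Local maximum minus local minimum in a size x size neighbourhood."""
    return maximum_filter(img, size=size) - minimum_filter(img, size=size)

rng = np.random.default_rng(2)
topoplots = rng.random((300, 16, 16))           # 300 IC topographies (placeholder)
labels = rng.integers(0, 2, size=300)           # expert labels: artifact vs. EEG

features = np.array([range_filter(tp).ravel() for tp in topoplots])
lda = LinearDiscriminantAnalysis().fit(features, labels)
print("training accuracy:", lda.score(features, labels))
```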
Morphologic analysis of artifacts in human fetal eyes confounding histopathologic investigations.
Herwig, Martina C; Müller, Annette M; Holz, Frank G; Loeffler, Karin U
2011-04-25
Human fetal eyes are an excellent source for studies of the normal ocular development and for examining early ocular changes associated with various syndromes in the context of a pediatric pathologic or prenatal sonographic diagnosis. However, artifacts caused by different factors often render an exact interpretation difficult. In this study, the frequency and extent of artifacts in human fetal eyes were investigated with the aim of distinguishing more precisely these artifacts from real findings, allowing also for a more diligent forensic interpretation. The cohort included 341 fetal eyes, ranging in age from 8 to 38 weeks of gestation, that were investigated macroscopically and by light microscopy. In most specimens, artifacts such as pigment spillage and autolytic changes of the retina were noted. Nearly all specimens showed changes of the lens with remarkable similarities to cataractous lenses in adult eyes. Structural ocular changes associated with systemic syndromes were also observed and in most instances could be distinguished from artifacts. Morphologic changes in fetal eyes should be classified in artifacts caused by way of abortion, mechanical effects from the removal of the eyes, delayed fixation with autolysis, and the fixative itself and should be distinguished from genuine structural abnormalities associated with ocular or systemic disease. This classification can be fairly difficult and requires experience. In addition, lens artifacts are often misleading, and the diagnosis of a fetal cataract should not be made based on histopathologic examination alone.
Theories and models on the biology of cells in space
NASA Technical Reports Server (NTRS)
Todd, P.; Klaus, D. M.
1996-01-01
A wide variety of observations on cells in space, admittedly made under constraining and unnatural conditions in many cases, have led to experimental results that were surprising or unexpected. Reproducibility, freedom from artifacts, and plausibility must be considered in all cases, even when results are not surprising. The papers in the symposium on 'Theories and Models on the Biology of Cells in Space' are dedicated to the subject of the plausibility of cellular responses to gravity -- inertial accelerations between 0 and 9.8 m/s² and higher. The mechanical phenomena inside the cell, the gravitactic locomotion of single eukaryotic and prokaryotic cells, and the effects of inertial unloading on cellular physiology are addressed in theoretical and experimental studies.
Intraindividual comparison of image quality in MR urography at 1.5 and 3 tesla in an animal model.
Regier, M; Nolte-Ernsting, C; Adam, G; Kemper, J
2008-10-01
Experimental evaluation of image quality of the upper urinary tract in MR urography (MRU) at 1.5 and 3 Tesla in a porcine model. In this study, four healthy domestic pigs, weighing between 71 and 80 kg (mean 73.6 kg), were examined with a standard T1w 3D-GRE and a high-resolution (HR) T1w 3D-GRE sequence at 1.5 and 3 Tesla. Additionally, at 3 Tesla both sequences were performed with parallel imaging (SENSE factor 2). The MR urographic scans were performed after intravenous injection of gadolinium-DTPA (0.1 mmol/kg body weight (bw)) and low-dose furosemide (0.1 mg/kg bw). Image evaluation was performed by two independent radiologists blinded to sequence parameters and field strength. Image analysis included grading of image quality of the segmented collecting system based on a five-point grading scale regarding anatomical depiction and artifacts observed (1: the majority of the segment (>50%) was not depicted or was obscured by major artifacts; 5: the segment was visualized without artifacts and had sharply defined borders). Signal-to-noise (SNR) and contrast-to-noise (CNR) ratios were determined. Statistical analysis included kappa statistics, Wilcoxon and paired Student t-tests. The mean scores for MR urographies at 1.5 Tesla were 2.83 for the 3D-GRE and 3.48 for the HR3D-GRE sequence. Significantly higher values were determined using the corresponding sequences at 3 Tesla, averaging 3.19 for the 3D-GRE (p = 0.047) and 3.92 for the HR3D-GRE (p = 0.023) sequence. Delineation of the pelvicaliceal system was rated significantly higher at 3 Tesla compared to 1.5 Tesla (3D-GRE: p = 0.015; HR3D-GRE: p = 0.006). At 3 Tesla the mean SNR and CNR were significantly higher (p < 0.05). A kappa of 0.67 indicated good interobserver agreement. In an experimental setup, MR urography at 3 Tesla allowed for significantly higher image quality and SNR compared to 1.5 Tesla, particularly for the visualization of the pelvicaliceal system.
NASA Astrophysics Data System (ADS)
Carey, Austin M.; Paige, Ginger B.; Carr, Bradley J.; Dogan, Mine
2017-10-01
Time-lapse electrical resistivity tomography (ERT) is commonly used as a minimally invasive tool to study infiltration processes. In 2014, we conducted field studies coupling variable intensity rainfall simulation with high-resolution ERT to study the real-time partitioning of rainfall into surface and subsurface response. The significant contrast in resistivity in the subsurface from large changes in subsurface moisture resulted in artifacts during the inversion process of the time-lapse ERT data collected using a dipole-dipole electrode array. These artifacts, which are not representative of real subsurface moisture dynamics, have been shown to arise during time-lapse inversion of ERT data and may be subject to misinterpretation. Forward modeling of the infiltration process post field experiments using a two-layer system (saprolite overlain by a soil layer) was used to generate synthetic datasets. The synthetic data were used to investigate the influence of both changes in volumetric moisture content and electrode configuration on the development of the artifacts identified in the field datasets. For the dipole-dipole array, we found that a decrease in the resistivity of the bottom layer by 67% resulted in a 50% reduction in artifact development. Artifacts for the seven additional array configurations tested, ranged from a 19% increase in artifact development (using an extended dipole-dipole array) to as much as a 96% decrease in artifact development (using a wenner-alpha array), compared to that of the dipole-dipole array. Moreover, these arrays varied in their ability to accurately delineate the infiltration front. Model results showed that the modified pole-dipole array was able to accurately image the infiltration zone and presented fewer artifacts for our experiments. In this study, we identify an optimal array type for imaging rainfall-infiltration dynamics that reduces artifacts. The influence of moisture contrast between the infiltrating water and the bulk subsurface material was characterized and shown to be a major factor in contributing to artifact development. Through forward modeling, this study highlights the importance of considering array type and subsurface moisture conditions when using time-lapse resistivity to obtain reliable estimates of vadose zone flow processes during rainfall-infiltration events.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, S; Jacob, R; Popple, R
Purpose: Fiducial-based imaging is often used in IGRT. Traditional gold fiducial markers often produce substantial reconstruction artifacts. These artifacts result in poor image quality of the DRR for online kV-to-DRR matching. This study evaluated the image quality of PEEK fiducials in DRRs in static and moving phantoms. Methods: CT scans of the Gold and PEEK fiducials (both 1×3 mm) were acquired in a 22 cm cylindrical phantom filled with water. Image artifacts were evaluated with the maximum CT value deviation from water due to artifacts; the volume of artifacts in 10×10 cm in the center slice; and the maximum length of streak artifacts from the fiducial. DRR resolution was measured using FWHM and FWTM. 4DCT of the PEEK fiducial was acquired with the phantom moving sinusoidally in the superior-inferior direction. Motion artifacts were assessed for various 4D phase angles. Results: The maximum CT value deviation was −174 for Gold and −24 for PEEK. The volume of artifacts in a 10×10 cm, 3 mm slice was 0.369 cm3 for Gold and 0.074 cm3 for PEEK. The maximum length of streak artifact was 80 mm for Gold and 7 mm for PEEK. In the DRR, the FWHM was close to the actual size (1.0 mm for Gold and 1.1 mm for PEEK). The FWTM was 1.8 mm for Gold and 1.3 mm for PEEK in the DRR. Barrel motion artifact of the PEEK fiducial was noticeable for the free-breathing scan. The apparent PEEK length due to residual motion was in close agreement with the calculated length (13 mm for the 30–70 phase, 10 mm for the 40–60 phase). Conclusion: Streak artifacts on planning CT associated with the use of gold fiducials can be significantly reduced by PEEK fiducials, while maintaining adequate kV image contrast. DRR image resolution at FWTM was improved from 1.8 mm to 1.3 mm. Because of this improvement, we now routinely use PEEK for liver IGRT.
Huang, Ai-Mei; Nguyen, Truong
2009-04-01
In this paper, we address the problems of unreliable motion vectors that cause visual artifacts but cannot be detected by high residual energy or bidirectional prediction difference in motion-compensated frame interpolation. A correlation-based motion vector processing method is proposed to detect and correct those unreliable motion vectors by explicitly considering motion vector correlation in the motion vector reliability classification, motion vector correction, and frame interpolation stages. Since our method gradually corrects unreliable motion vectors based on their reliability, we can effectively discover the areas where no motion is reliable to be used, such as occlusions and deformed structures. We also propose an adaptive frame interpolation scheme for the occlusion areas based on the analysis of their surrounding motion distribution. As a result, the interpolated frames using the proposed scheme have clearer structure edges and ghost artifacts are also greatly reduced. Experimental results show that our interpolated results have better visual quality than other methods. In addition, the proposed scheme is robust even for those video sequences that contain multiple and fast motions.
Negishi, Michiro; Abildgaard, Mark; Laufer, Ilan; Nixon, Terry; Constable, Robert Todd
2008-01-01
Simultaneous EEG-fMRI (Electroencephalography-functional Magnetic Resonance Imaging) recording provides a means for acquiring high temporal resolution electrophysiological data and high spatial resolution metabolic data of the brain in the same experimental runs. Carbon wire electrodes (not metallic EEG electrodes with carbon wire leads) are suitable for simultaneous EEG-fMRI recording, because they cause less RF (radio-frequency) heating and susceptibility artifacts than metallic electrodes. These characteristics are especially desirable for recording the EEG in high field MRI scanners. Carbon wire electrodes are also comfortable to wear during long recording sessions. However, carbon electrodes have high electrode-electrolyte potentials compared to widely used Ag/AgCl (silver/silver-chloride) electrodes, which may cause slow voltage drifts. This paper introduces a prototype EEG recording system with carbon wire electrodes and a circuit that suppresses the slow voltage drift. The system was tested for the voltage drift, RF heating, susceptibility artifact, and impedance, and was also evaluated in a simultaneous ERP (event-related potential)-fMRI experiment. PMID:18588913
Zhang, Zhilin; Pi, Zhouyue; Liu, Benyuan
2015-02-01
Heart rate monitoring using wrist-type photoplethysmographic signals during subjects' intensive exercise is a difficult problem, since the signals are contaminated by extremely strong motion artifacts caused by subjects' hand movements. So far few works have studied this problem. In this study, a general framework, termed TROIKA, is proposed, which consists of signal decomposiTion for denoising, sparse signal RecOnstructIon for high-resolution spectrum estimation, and spectral peaK trAcking with verification. The TROIKA framework has high estimation accuracy and is robust to strong motion artifacts. Many variants can be straightforwardly derived from this framework. Experimental results on datasets recorded from 12 subjects during fast running at the peak speed of 15 km/h showed that the average absolute error of heart rate estimation was 2.34 beats per minute, and the Pearson correlation between the estimates and the ground truth of heart rate was 0.992. This framework is of great value to wearable devices such as smartwatches, which use PPG signals to monitor heart rate for fitness.
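As a very rough illustration of the spectral peak-tracking idea (not the full TROIKA framework, which adds signal decomposition and sparse spectrum reconstruction), the sketch below picks, in every PPG window, the spectral peak closest to the previous heart-rate estimate, which discourages jumps onto motion-artifact peaks. Window length, search range, and sampling rate are illustrative assumptions.

```python
# Hedged sketch: windowed spectral peak tracking for heart rate from a PPG signal.
import numpy as np
from scipy.signal import welch

def track_heart_rate(ppg, fs, win_s=8.0, step_s=2.0):
    win, step = int(win_s * fs), int(step_s * fs)
    prev_bpm, estimates = None, []
    for start in range(0, len(ppg) - win + 1, step):
        f, pxx = welch(ppg[start:start + win], fs=fs, nperseg=win)
        bpm = f * 60.0
        mask = (bpm > 40) & (bpm < 200)                  # physiologically plausible range
        if prev_bpm is not None:
            mask &= np.abs(bpm - prev_bpm) < 15          # stay near the previous estimate
        cand_bpm, cand_pxx = bpm[mask], pxx[mask]
        if cand_pxx.size:
            prev_bpm = cand_bpm[np.argmax(cand_pxx)]     # strongest admissible peak
        estimates.append(prev_bpm)
    return estimates

fs = 125.0
ppg = np.sin(2 * np.pi * 1.8 * np.arange(0, 60, 1 / fs))  # ~108 bpm placeholder signal
print(track_heart_rate(ppg, fs)[:5])
```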
Sampling limits for electron tomography with sparsity-exploiting reconstructions.
Jiang, Yi; Padgett, Elliot; Hovden, Robert; Muller, David A
2018-03-01
Electron tomography (ET) has become a standard technique for 3D characterization of materials at the nano-scale. Traditional reconstruction algorithms such as weighted back projection suffer from disruptive artifacts with insufficient projections. Popularized by compressed sensing, sparsity-exploiting algorithms have been applied to experimental ET data and show promise for improving reconstruction quality or reducing the total beam dose applied to a specimen. Nevertheless, theoretical bounds for these methods have been less explored in the context of ET applications. Here, we perform numerical simulations to investigate the performance of ℓ1-norm and total-variation (TV) minimization under various imaging conditions. From 36,100 different simulated structures, our results show specimens with more complex structures generally require more projections for exact reconstruction. However, once sufficient data is acquired, dividing the beam dose over more projections provides no improvement, analogous to the traditional dose-fraction theorem. Moreover, a limited tilt range of ±75° or less can result in distorting artifacts in sparsity-exploiting reconstructions. The influence of optimization parameters on reconstructions is also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
Adaptive color demosaicing and false color removal
NASA Astrophysics Data System (ADS)
Guarnera, Mirko; Messina, Giuseppe; Tomaselli, Valeria
2010-04-01
Color interpolation solutions drastically influence the quality of the whole image generation pipeline, so they must guarantee the rendering of high quality pictures by avoiding typical artifacts such as blurring, zipper effects, and false colors. Moreover, demosaicing should avoid emphasizing typical artifacts of real sensors data, such as noise and green imbalance effect, which would be further accentuated by the subsequent steps of the processing pipeline. We propose a new adaptive algorithm that decides the interpolation technique to apply to each pixel, according to its neighborhood analysis. Edges are effectively interpolated through a directional filtering approach that interpolates the missing colors, selecting the suitable filter depending on edge orientation. Regions close to edges are interpolated through a simpler demosaicing approach. Thus flat regions are identified and low-pass filtered to eliminate some residual noise and to minimize the annoying green imbalance effect. Finally, an effective false color removal algorithm is used as a postprocessing step to eliminate residual color errors. The experimental results show how sharp edges are preserved, whereas undesired zipper effects are reduced, improving the edge resolution itself and obtaining superior image quality.
Lorenz, Kevin S.; Salama, Paul; Dunn, Kenneth W.; Delp, Edward J.
2013-01-01
Digital image analysis is a fundamental component of quantitative microscopy. However, intravital microscopy presents many challenges for digital image analysis. In general, microscopy volumes are inherently anisotropic, suffer from decreasing contrast with tissue depth, lack object edge detail, and characteristically have low signal levels. Intravital microscopy introduces the additional problem of motion artifacts, resulting from respiratory motion and heartbeat from specimens imaged in vivo. This paper describes an image registration technique for use with sequences of intravital microscopy images collected in time-series or in 3D volumes. Our registration method involves both rigid and non-rigid components. The rigid registration component corrects global image translations, while the non-rigid component manipulates a uniform grid of control points defined by B-splines. Each control point is optimized by minimizing a cost function consisting of two parts: a term to define image similarity, and a term to ensure deformation grid smoothness. Experimental results indicate that this approach is promising based on the analysis of several image volumes collected from the kidney, lung, and salivary gland of living rodents. PMID:22092443
Analysis of cerebral vessels dynamics using experimental data with missed segments
NASA Astrophysics Data System (ADS)
Pavlova, O. N.; Abdurashitov, A. S.; Ulanova, M. V.; Shihalov, G. M.; Semyachkina-Glushkovskaya, O. V.; Pavlov, A. N.
2018-04-01
Physiological signals often contain various bad segments that occur due to artifacts, failures of the recording equipment or varying experimental conditions. The related experimental data need to be preprocessed to avoid such parts of recordings. In the case of a few bad segments, they can simply be removed from the signal and its analysis is further performed. However, when there are many extracted segments, the internal structure of the analyzed physiological process may be destroyed, and it is unclear whether such a signal can be used in diagnostic-related studies. In this paper, we address this problem for the case of cerebral vessels dynamics. We perform analysis of simulated data in order to reveal general properties of quantifying the scaling features of complex signals with distinct correlation properties and show that the effects of data loss are significantly different for experimental data with long-range correlations and anti-correlations. We conclude that the cerebral vessels dynamics is significantly less sensitive to missed data fragments as compared with signals with anti-correlated statistics.
Is an observed non-co-linear RNA product spliced in trans, in cis or just in vitro?
Yu, Chun-Ying; Liu, Hsiao-Jung; Hung, Li-Yuan; Kuo, Hung-Chih; Chuang, Trees-Juen
2014-01-01
Global transcriptome investigations often result in the detection of an enormous number of transcripts composed of non-co-linear sequence fragments. Such ‘aberrant’ transcript products may arise from post-transcriptional events or genetic rearrangements, or may otherwise be false positives (sequencing/alignment errors or in vitro artifacts). Moreover, post-transcriptionally non-co-linear (‘PtNcl’) transcripts can arise from trans-splicing or back-splicing in cis (to generate so-called ‘circular RNA’). Here, we collected previously-predicted human non-co-linear RNA candidates, and designed a validation procedure integrating in silico filters with multiple experimental validation steps to examine their authenticity. We showed that >50% of the tested candidates were in vitro artifacts, even though some had been previously validated by RT-PCR. After excluding the possibility of genetic rearrangements, we distinguished between trans-spliced and circular RNAs, and confirmed that these two splicing forms can share the same non-co-linear junction. Importantly, the experimentally-confirmed PtNcl RNA events and their corresponding PtNcl splicing types (i.e. trans-splicing, circular RNA, or both sharing the same junction) were all expressed in rhesus macaque, and some were even expressed in mouse. Our study thus describes an essential procedure for confirming PtNcl transcripts, and provides further insight into the evolutionary role of PtNcl RNA events, opening up this important, but understudied, class of post-transcriptional events for comprehensive characterization. PMID:25053845
NASA Astrophysics Data System (ADS)
Valderrama, Joaquin T.; de la Torre, Angel; Van Dun, Bram
2018-02-01
Objective. Artifact reduction in electroencephalogram (EEG) signals is usually necessary to carry out data analysis appropriately. Despite the large number of denoising techniques available with a multichannel setup, there is a lack of efficient algorithms that remove (not only detect) blink-artifacts from a single channel EEG, which is of interest in many clinical and research applications. This paper describes and evaluates the iterative template matching and suppression (ITMS), a new method proposed for detecting and suppressing the artifact associated with the blink activity from a single channel EEG. Approach. The approach of ITMS consists of (a) an iterative process in which blink-events are detected and the blink-artifact waveform of the analyzed subject is estimated, (b) generation of a signal modeling the blink-artifact, and (c) suppression of this signal from the raw EEG. The performance of ITMS is compared with the multi-window summation of derivatives within a window (MSDW) technique using both synthesized and real EEG data. Main results. Results suggest that ITMS presents an adequate performance in detecting and suppressing blink-artifacts from a single channel EEG. When applied to the analysis of cortical auditory evoked potentials (CAEPs), ITMS provides a significant quality improvement in the resulting responses, i.e., in a cohort of 30 adults, the mean correlation coefficient improved from 0.37 to 0.65 when the blink-artifacts were detected and suppressed by ITMS. Significance. ITMS is an efficient solution to the problem of denoising blink-artifacts in single-channel EEG applications, both in clinical and research fields. The proposed ITMS algorithm is stable; automatic, since it does not require human intervention; low-invasive, because the EEG segments not contaminated by blink-artifacts remain unaltered; and easy to implement, as can be observed in the Matlab script implementing the algorithm, provided as supporting material.
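A stripped-down sketch of the template-matching-and-suppression idea is given below: detected blink epochs are averaged into a subject-specific template, which is then subtracted at each detected blink. Blink detection here is a crude threshold-plus-local-maximum rule, and the sampling rate, threshold, blink width, and test signal are assumptions; the real ITMS iterates detection and template estimation.

```python
# Hedged sketch: estimate a blink template from detected events and subtract it.
import numpy as np

def suppress_blinks(eeg, fs, thresh, width_s=0.4):
    half = int(width_s * fs / 2)
    # crude detection: samples above threshold that are local maxima within one window
    peaks = [i for i in range(half, len(eeg) - half)
             if eeg[i] > thresh and eeg[i] == eeg[i - half:i + half].max()]
    if not peaks:
        return eeg.copy()
    template = np.mean([eeg[p - half:p + half] for p in peaks], axis=0)  # blink template
    clean = eeg.copy()
    for p in peaks:
        clean[p - half:p + half] -= template            # suppress each detected blink
    return clean

fs = 250.0
rng = np.random.default_rng(3)
eeg = rng.standard_normal(int(60 * fs)) * 5             # placeholder single-channel EEG
for onset in (1000, 4000, 9000):
    eeg[onset:onset + 100] += 80 * np.hanning(100)      # injected stereotyped "blinks"
eeg_clean = suppress_blinks(eeg, fs, thresh=40.0)
```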
Ziemann, Christian; Stille, Maik; Cremers, Florian; Buzug, Thorsten M; Rades, Dirk
2018-04-17
Metal artifacts caused by high-density implants lead to incorrectly reconstructed Hounsfield units in computed tomography images. This can result in a loss of accuracy in dose calculation in radiation therapy. This study investigates the potential of the metal artifact reduction algorithms, Augmented Likelihood Image Reconstruction and linear interpolation, in improving dose calculation in the presence of metal artifacts. In order to simulate a pelvis with a double-sided total endoprosthesis, a polymethylmethacrylate phantom was equipped with two steel bars. Artifacts were reduced by applying the Augmented Likelihood Image Reconstruction, a linear interpolation, and a manual correction approach. Using the treatment planning system Eclipse™, identical planning target volumes for an idealized prostate as well as structures for bladder and rectum were defined in corrected and noncorrected images. Volumetric modulated arc therapy plans have been created with double arc rotations with and without avoidance sectors that mask out the prosthesis. The irradiation plans were analyzed for variations in the dose distribution and their homogeneity. Dosimetric measurements were performed using isocentric positioned ionization chambers. Irradiation plans based on images containing artifacts lead to a dose error in the isocenter of up to 8.4%. Corrections with the Augmented Likelihood Image Reconstruction reduce this dose error to 2.7%, corrections with linear interpolation to 3.2%, and manual artifact correction to 4.1%. When applying artifact correction, the dose homogeneity was slightly improved for all investigated methods. Furthermore, the calculated mean doses are higher for rectum and bladder if avoidance sectors are applied. Streaking artifacts cause an imprecise dose calculation within irradiation plans. Using a metal artifact correction algorithm, the planning accuracy can be significantly improved. Best results were accomplished using the Augmented Likelihood Image Reconstruction algorithm. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jani, S
Purpose: CT simulation for patients with metal implants can often be challenging due to artifacts that obscure tumor/target delineation and normal organ definition. Our objective was to evaluate the effectiveness of Orthopedic Metal Artifact Reduction (OMAR), commercially available software, in reducing metal-induced artifacts and its effect on computed dose during treatment planning. Methods: CT images of water surrounding metallic cylindrical rods made of aluminum, copper and iron were studied in terms of Hounsfield Units (HU) spread. Metal-induced artifacts were characterized in terms of HU/Volume Histogram (HVH) using the Pinnacle treatment planning system. Effects of OMAR on enhancing our ability to delineate organs on CT and subsequent dose computation were examined in nine (9) patients with hip implants and two (2) patients with breast tissue expanders. Results: Our study characterized water at 1000 HU with a standard deviation (SD) of about 20 HU. The HVHs allowed us to evaluate how the presence of metal changed the HU spread. For example, introducing a 2.54 cm diameter copper rod in water increased the SD in HU of the surrounding water from 20 to 209, representing an increase in artifacts. Subsequent use of OMAR brought the SD down to 78. Aluminum produced the least artifacts, whereas iron showed the largest amount of artifacts. In general, an increase in kVp and mA during CT scanning showed better effectiveness of OMAR in reducing artifacts. Our dose analysis showed that some isodose contours shifted by several mm with OMAR, but only infrequently, and the shifts were not significant for the planning process. Computed volumes of various dose levels showed <2% change. Conclusions: In our experience, OMAR software greatly reduced the metal-induced CT artifacts for the majority of patients with implants, thereby improving our ability to delineate tumor and surrounding organs. OMAR had a clinically negligible effect on computed dose within tissues. Partially funded by an unrestricted educational grant from Philips.
TU-H-206-01: An Automated Approach for Identifying Geometric Distortions in Gamma Cameras
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mann, S; Nelson, J; Samei, E
2016-06-15
Purpose: To develop a clinically-deployable, automated process for detecting artifacts in routine nuclear medicine (NM) quality assurance (QA) bar phantom images. Methods: An artifact detection algorithm was created to analyze bar phantom images as part of an ongoing QA program. A low noise, high resolution reference image was acquired from an x-ray of the bar phantom with a Philips Digital Diagnost system utilizing image stitching. NM bar images, acquired for 5 million counts over a 512×512 matrix, were registered to the template image by maximizing mutual information (MI). The MI index was used as an initial test for artifacts; low values indicate an overall presence of distortions regardless of their spatial location. Images with low MI scores were further analyzed for bar linearity, periodicity, alignment, and compression to locate differences with respect to the template. Findings from each test were spatially correlated and locations failing multiple tests were flagged as potential artifacts requiring additional visual analysis. The algorithm was initially deployed for GE Discovery 670 and Infinia Hawkeye gamma cameras. Results: The algorithm successfully identified clinically relevant artifacts from both systems previously unnoticed by technologists performing the QA. Average MI indices for artifact-free images are 0.55. Images with MI indices < 0.50 have shown 100% sensitivity and specificity for artifact detection when compared with a thorough visual analysis. Correlation of geometric tests confirms the ability to spatially locate the most likely image regions containing an artifact regardless of initial phantom orientation. Conclusion: The algorithm shows the potential to detect gamma camera artifacts that may be missed by routine technologist inspections. Detection and subsequent correction of artifacts ensures maximum image quality and may help to identify failing hardware before it impacts clinical workflow. Going forward, the algorithm is being deployed to monitor data from all gamma cameras within our health system.
SU-F-I-08: CT Image Ring Artifact Reduction Based On Prior Image
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuan, C; Qi, H; Chen, Z
Purpose: In a computed tomography (CT) system, CT images with ring artifacts are reconstructed when some adjacent detector bins do not work. The ring artifacts severely degrade CT image quality. We present a practical CT ring artifact reduction method based on projection data correction, aiming at estimating the missing projection data accurately and thus removing the ring artifacts of CT images. Methods: The method consists of ten steps: 1) Identification of abnormal pixel lines in the projection sinogram; 2) Linear interpolation within the pixel lines of the projection sinogram; 3) FBP reconstruction using the interpolated projection data; 4) Filtering of the FBP image using a mean filter; 5) Forward projection of the filtered FBP image; 6) Subtraction of the forward projection from the original projection; 7) Linear interpolation of the abnormal pixel line area in the subtraction projection; 8) Adding the interpolated subtraction projection to the forward projection; 9) FBP reconstruction using the corrected projection data; 10) Return to step 4 until the pre-set iteration number is reached. The method is validated on simulated and real data to restore missing projection data and reconstruct ring artifact-free CT images. Results: We have studied the impact of the amount of dead bins of the CT detector on the accuracy of missing data estimation in the projection sinogram. For the simulated case with a 256 × 256 Shepp-Logan phantom, three iterations are sufficient to restore the projection data and reconstruct ring artifact-free images when the dead-bin rate is under 30%. The dead-bin-induced artifacts are substantially reduced. More iterations are needed to reconstruct satisfactory images as the rate of dead bins increases. Similar results were found for a real head phantom case. Conclusion: A practical CT image ring artifact correction scheme based on projection data is developed. This method can produce ring artifact-free CT images feasibly and effectively.
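A minimal sketch of the iterative loop above, assuming skimage's radon/iradon for forward projection and FBP and an illustrative mean-filter size and iteration count (not taken from the abstract):

```python
import numpy as np
from scipy.ndimage import uniform_filter
from skimage.transform import radon, iradon

def ring_artifact_correction(sinogram, dead_bins, theta, n_iter=3):
    """Prior-image ring-artifact correction sketch (steps 1-10 above).

    sinogram  : (n_bins, n_angles) array with corrupted detector rows
    dead_bins : indices of the abnormal detector bins (step 1 assumed done)
    """
    n_bins = sinogram.shape[0]
    good = np.setdiff1d(np.arange(n_bins), dead_bins)

    def interp_dead_rows(s):
        # Linear interpolation across the dead bins, column by column (steps 2 and 7)
        out = s.copy()
        for a in range(s.shape[1]):
            out[dead_bins, a] = np.interp(dead_bins, good, s[good, a])
        return out

    corrected = interp_dead_rows(sinogram)
    for _ in range(n_iter):
        recon = iradon(corrected, theta=theta, circle=True)    # steps 3 / 9
        prior = uniform_filter(recon, size=3)                  # step 4: mean filter
        forward = radon(prior, theta=theta, circle=True)       # step 5
        residual = interp_dead_rows(sinogram - forward)        # steps 6-7
        corrected = forward + residual                         # step 8
    return iradon(corrected, theta=theta, circle=True)         # final FBP
```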
Critical Evaluation of Kinetic Method Measurements: Possible Origins of Nonlinear Effects
NASA Astrophysics Data System (ADS)
Bourgoin-Voillard, Sandrine; Afonso, Carlos; Lesage, Denis; Zins, Emilie-Laure; Tabet, Jean-Claude; Armentrout, P. B.
2013-03-01
The kinetic method is a widely used approach for the determination of thermochemical data such as proton affinities (PA) and gas-phase acidities (ΔH°acid). These data are easily obtained from decompositions of noncovalent heterodimers if care is taken in the choice of the method, references used, and experimental conditions. Previously, several papers have focused on theoretical considerations concerning the nature of the references. Few investigations have been devoted to conditions required to validate the quality of the experimental results. In the present work, we are interested in rationalizing the origin of nonlinear effects that can be obtained with the kinetic method. It is shown that such deviations result from intrinsic properties of the systems investigated but can also be enhanced by artifacts resulting from experimental issues. Overall, it is shown that orthogonal distance regression (ODR) analysis of kinetic method data provides the optimum way of acquiring accurate thermodynamic information.
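As a hedged illustration of the recommended ODR analysis (not code from the paper), SciPy's odr module fits kinetic-method plot data while accounting for uncertainties on both axes; the data and uncertainties below are placeholders:

```python
import numpy as np
from scipy import odr

# Placeholder kinetic-method data: x and y stand for the two correlated
# quantities plotted in a kinetic-method analysis, each with its own uncertainty.
x = np.array([0.2, 0.5, 0.9, 1.4, 1.8])
y = np.array([1.1, 1.9, 3.2, 4.6, 5.8])
sx = np.full_like(x, 0.05)   # assumed x uncertainties
sy = np.full_like(y, 0.10)   # assumed y uncertainties

linear = odr.Model(lambda beta, x: beta[0] * x + beta[1])   # straight-line model
data = odr.RealData(x, y, sx=sx, sy=sy)
fit = odr.ODR(data, linear, beta0=[1.0, 0.0]).run()

print("slope, intercept:", fit.beta)         # fitted parameters
print("parameter std errors:", fit.sd_beta)  # uncertainties from ODR
```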
Li, Chengwei; Zhan, Liwei
2015-08-01
To estimate the coefficient of friction between tire and runway surface during airplane touchdowns, we designed an experimental rig to simulate such events and to record the impact and friction forces being exerted. Because of noise in the measured signals, we developed a filtering method that is based on the ensemble empirical mode decomposition and the bandwidth of the probability density function of each intrinsic mode function to extract friction and impact force signals. We can quantify the coefficient of friction by calculating the maximum values of the filtered force signals. Signal measurements are recorded for different drop heights and tire rotational speeds, and the corresponding coefficient of friction is calculated. The result shows that the values of the coefficient of friction change only slightly. Random noise and experimental artifacts are the major reasons for the change.
Analyzing EEG and MEG signals recorded during tES, a reply.
Noury, Nima; Siegel, Markus
2018-02-15
Transcranial Electric Stimulation (tES) is a widely used non-invasive brain stimulation technique. However, strong stimulation artifacts complicate the investigation of neural activity with EEG or MEG during tES. Thus, studying brain signals during tES requires detailed knowledge about the properties of these artifacts. Recently, we characterized the phase- and amplitude-relationship between tES stimulation currents and tES artifacts in EEG and MEG and provided a mathematical model of these artifacts (Noury and Siegel, 2017, and Noury et al., 2016, respectively). Among several other features, we showed that, independent of the stimulation current, the amplitude of tES artifacts is modulated time locked to heartbeat and respiration. In response to our work, a recent paper (Neuling et al., 2017) raised several points concerning the employed stimulation device and methodology. Here, we discuss these points, explain potential misunderstandings, and show that none of the raised concerns are applicable to our results. Furthermore, we explain in detail the physics underlying tES artifacts, and discuss several approaches how to study brain function during tES in the presence of residual artifacts. Copyright © 2017 Elsevier Inc. All rights reserved.
Huang, Chih-Sheng; Yang, Wen-Yu; Chuang, Chun-Hsiang; Wang, Yu-Kai
2018-01-01
Electroencephalogram (EEG) signals are usually contaminated with various artifacts, such as signal associated with muscle activity, eye movement, and body motion, which have a noncerebral origin. The amplitude of such artifacts is larger than that of the electrical activity of the brain, so they mask the cortical signals of interest, resulting in biased analysis and interpretation. Several blind source separation methods have been developed to remove artifacts from the EEG recordings. However, the iterative process for measuring separation within multichannel recordings is computationally intractable. Moreover, manually excluding the artifact components requires a time-consuming offline process. This work proposes a real-time artifact removal algorithm that is based on canonical correlation analysis (CCA), feature extraction, and the Gaussian mixture model (GMM) to improve the quality of EEG signals. The CCA was used to decompose EEG signals into components followed by feature extraction to extract representative features and GMM to cluster these features into groups to recognize and remove artifacts. The feasibility of the proposed algorithm was demonstrated by effectively removing artifacts caused by blinks, head/body movement, and chewing from EEG recordings while preserving the temporal and spectral characteristics of the signals that are important to cognitive research. PMID:29599950
Sutherland-Smith, James; Tilley, Brenda
2012-01-01
Magnetic resonance imaging (MRI) artifacts secondary to metallic implants and foreign bodies are well described. Herein, we provide quantitative data from veterinary implants including total hip arthroplasty implants, cranial cruciate repair implants, surgical screws, a skin staple, ligation clips, an identification microchip, ameroid constrictor, and potential foreign bodies including air gun and BB projectiles and a sewing needle. The objects were scanned in a gelatin phantom with plastic grid using standardized T2-weighted turbo-spin echo (TSE), T1-weighted spin echo, and T2*-weighted gradient recalled echo (GRE) image acquisitions at 1.5 T. Maximum linear dimensions and areas of signal voiding and grid distortion were calculated using a DICOM workstation for each sequence and object. Artifact severity was similar between the T2-weighted TSE and T1-weighted images, while the T2*-weighted images were most susceptible to artifact. Metal type influenced artifact size with the largest artifacts arising from steel objects followed by surgical stainless steel, titanium, and lead. For animals with metallic surgical implants or foreign bodies, the quantification of the artifact size will help guide clinicians on the viability of MRI. © 2012 Veterinary Radiology & Ultrasound.
Posatskiy, A O; Chau, T
2012-04-01
Mechanomyography (MMG) is an important kinesiological tool and potential communication pathway for individuals with disabilities. However, MMG is highly susceptible to contamination by motion artifact due to limb movement. A better understanding of the nature of this contamination and its effects on different sensing methods is required to inform robust MMG sensor design. Therefore, in this study, we recorded MMG from the extensor carpi ulnaris of six able-bodied participants using three different co-located condenser microphone and accelerometer pairings. Contractions at 30% MVC were recorded with and without a shaker-induced single-frequency forearm motion artifact delivered via a custom test rig. Using a signal-to-signal-plus-noise-ratio and the adaptive Neyman curve-based statistic, we found that microphone-derived MMG spectra were significantly less influenced by motion artifact than corresponding accelerometer-derived spectra (p⩽0.05). However, non-vanishing motion artifact harmonics were present in both spectra, suggesting that simple bandpass filtering may not remove artifact influences permeating into typical MMG bands of interest. Our results suggest that condenser microphones are preferred for MMG recordings when the mitigation of motion artifact effects is important. Copyright © 2011. Published by Elsevier Ltd.
Ritchie, Nicholas W M; Newbury, Dale E; Lindstrom, Abigail P
2011-12-01
Artifacts are the nemesis of trace element analysis in electron-excited energy dispersive X-ray spectrometry. Peaks that result from nonideal behavior in the detector or sample can fool even an experienced microanalyst into believing that they have trace amounts of an element that is not present. Many artifacts, such as the Si escape peak, absorption edges, and coincidence peaks, can be traced to the detector. Others, such as secondary fluorescence peaks and scatter peaks, can be traced to the sample. We have identified a new sample-dependent artifact that we attribute to Compton scattering of energetic X-rays generated in a small feature and subsequently scattered from a low atomic number matrix. It seems likely that this artifact has not previously been reported because it only occurs under specific conditions and represents a relatively small signal. However, with the advent of silicon drift detectors and their utility for trace element analysis, we anticipate that more people will observe it and possibly misidentify it. Though small, the artifact is not inconsequential. Under some conditions, it is possible to mistakenly identify the Compton scatter artifact as approximately 1% of an element that is not present.
Roy, Vandana; Shukla, Shailja; Shukla, Piyush Kumar; Rawat, Paresh
2017-01-01
The motion generated at the capturing time of the electro-encephalography (EEG) signal leads to artifacts, which may reduce the quality of the obtained information. Existing artifact removal methods use canonical correlation analysis (CCA) for removing artifacts along with ensemble empirical mode decomposition (EEMD) and wavelet transform (WT). A new approach is proposed to further analyse and improve the filtering performance and reduce the filter computation time under highly noisy environments. This new approach to CCA is based on the Gaussian elimination method, which is used for calculating the correlation coefficients using the backslash operation, and is designed for EEG signal motion artifact removal. Gaussian elimination is used for solving linear equations to calculate eigenvalues, which reduces the computation cost of the CCA method. This novel proposed method is tested against currently available artifact removal techniques using EEMD-CCA and wavelet transform. The performance is tested on synthetic and real EEG signal data. The proposed artifact removal technique is evaluated using efficiency metrics such as del signal to noise ratio (DSNR), lambda (λ), root mean square error (RMSE), elapsed time, and ROC parameters. The results indicate suitability of the proposed algorithm for use as a supplement to algorithms currently in use.
Artifact-Based Transformation of IBM Global Financing
NASA Astrophysics Data System (ADS)
Chao, Tian; Cohn, David; Flatgard, Adrian; Hahn, Sandy; Linehan, Mark; Nandi, Prabir; Nigam, Anil; Pinel, Florian; Vergo, John; Wu, Frederick Y.
IBM Global Financing (IGF) is transforming its business using the Business Artifact Method, an innovative business process modeling technique that identifies key business artifacts and traces their life cycles as they are processed by the business. IGF is a complex, global business operation with many business design challenges. The Business Artifact Method is a fundamental shift in how to conceptualize, design and implement business operations. The Business Artifact Method was extended to solve the problem of designing a global standard for a complex, end-to-end process while supporting local geographic variations. Prior to employing the Business Artifact method, process decomposition, Lean and Six Sigma methods were each employed on different parts of the financing operation. Although they provided critical input to the final operational model, they proved insufficient for designing a complete, integrated, standard operation. The artifact method resulted in a business operations model that was at the right level of granularity for the problem at hand. A fully functional rapid prototype was created early in the engagement, which facilitated an improved understanding of the redesigned operations model. The resulting business operations model is being used as the basis for all aspects of business transformation in IBM Global Financing.
Artifacts Affecting Musculoskeletal Magnetic Resonance Imaging: Their Origins and Solutions.
Roth, Eira; Hoff, Michael; Richardson, Michael L; Ha, Alice S; Porrino, Jack
2016-01-01
Among articles within the radiology literature, few present the manifestations of magnetic resonance imaging artifacts in a clinically oriented manner. Recognizing such artifacts is imperative given the increasing clinical use of magnetic resonance imaging and the emphasis by the American Board of Radiology on practical physics applications. The purpose of this article is to present magnetic resonance physics principles visually and conceptually in the context of common musculoskeletal radiology artifacts and their solutions, described using nonmathematical explanations. Copyright © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Peter C.; Schreibmann, Eduard; Roper, Justin
2015-03-15
Purpose: Computed tomography (CT) artifacts can severely degrade dose calculation accuracy in proton therapy. Prompted by the recently increased popularity of magnetic resonance imaging (MRI) in the radiation therapy clinic, we developed an MRI-based CT artifact correction method for improving the accuracy of proton range calculations. Methods and Materials: The proposed method replaces corrupted CT data by mapping CT Hounsfield units (HU number) from a nearby artifact-free slice, using a coregistered MRI. MRI and CT volumetric images were registered with use of 3-dimensional (3D) deformable image registration (DIR). The registration was fine-tuned on a slice-by-slice basis by using 2D DIR. Based on the intensity of paired MRI pixel values and HU from an artifact-free slice, we performed a comprehensive analysis to predict the correct HU for the corrupted region. For a proof-of-concept validation, metal artifacts were simulated on a reference data set. Proton range was calculated using reference, artifactual, and corrected images to quantify the reduction in proton range error. The correction method was applied to 4 unique clinical cases. Results: The correction method resulted in substantial artifact reduction, both quantitatively and qualitatively. On respective simulated brain and head and neck CT images, the mean error was reduced from 495 and 370 HU to 108 and 92 HU after correction. Correspondingly, the absolute mean proton range errors of 2.4 cm and 1.7 cm were reduced to less than 2 mm in both cases. Conclusions: Our MRI-based CT artifact correction method can improve CT image quality and proton range calculation accuracy for patients with severe CT artifacts.
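The HU-prediction step (mapping MRI intensity to CT number using a nearby artifact-free slice) could be prototyped as a simple regression; the sketch below assumes a low-order polynomial mapping, which is an illustrative choice rather than the authors' exact analysis:

```python
import numpy as np

def predict_hu_from_mri(mri_clean, ct_clean, mri_corrupt, degree=3):
    """Estimate HU for a corrupted CT region from co-registered MRI intensities.

    mri_clean, ct_clean : 1D arrays of paired voxel values from an
                          artifact-free slice (after deformable registration)
    mri_corrupt         : MRI intensities at voxels whose CT values are corrupted
    A low-order polynomial MRI-to-HU mapping is assumed here for illustration.
    """
    coeffs = np.polyfit(mri_clean, ct_clean, deg=degree)   # fit MRI -> HU mapping
    return np.polyval(coeffs, mri_corrupt)                  # predicted HU values

# Usage sketch: replace only the voxels flagged as metal-artifact-corrupted
# ct_slice[artifact_mask] = predict_hu_from_mri(
#     mri_ref[clean_mask], ct_ref[clean_mask], mri_slice[artifact_mask])
```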
Yang, Ping; Dumont, Guy A; Ansermino, J Mark
2009-04-01
Intraoperative heart rate is routinely measured independently from the ECG monitor, pulse oximeter, and the invasive blood pressure monitor if available. The presence of artifacts in one or more of these signals, especially sustained artifacts, represents a critical challenge for physiological monitoring. When temporal filters are used to suppress sustained artifacts, unwanted delays or signal distortion are often introduced. The aim of this study was to remove artifacts and derive accurate estimates for the heart rate signal by using measurement redundancy. Heart rate measurements from multiple sensors and previous estimates that fall in a short moving window were treated as samples of the same heart rate. A hybrid median filter was used to align these samples into one ordinal series and to select the median as the fused estimate. This method can successfully remove artifacts that are sustained for shorter than half the length of the filter window, or artifacts that are sustained for a longer duration but present in no more than half of the sensors. The method was tested on both simulated and clinical cases. The performance of the hybrid median filter in the simulated study was compared with that of a two-step estimation process, comprising a threshold-controlled artifact-removal module and a Kalman filter. The estimation accuracy of the hybrid median filter is better than that of the Kalman filter in the presence of artifacts. The hybrid median filter combines the structural and temporal information from two or more sensors and generates a robust estimate of heart rate without requiring strict assumptions about the signal's characteristics. This method is intuitive, computationally simple, and the performance can be easily adjusted. These considerable benefits make this method highly suitable for clinical use.
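A minimal sketch of the hybrid-median fusion idea, assuming an illustrative window length and three heart-rate sources (ECG, pulse oximeter, ABP):

```python
import numpy as np
from collections import deque

class HybridMedianHeartRate:
    """Pool heart-rate samples from several monitors and recent fused estimates
    in a short moving window and take the median as the fused estimate.
    The window length is an assumed parameter, not the paper's value."""

    def __init__(self, window_beats=5):
        self.history = deque(maxlen=window_beats)   # previous fused estimates

    def update(self, ecg_hr, spo2_hr, abp_hr):
        # Current samples from all sensors plus previous fused estimates
        samples = list(self.history) + [ecg_hr, spo2_hr, abp_hr]
        estimate = float(np.median(samples))
        self.history.append(estimate)
        return estimate

# fused = HybridMedianHeartRate()
# print(fused.update(72.0, 180.0, 71.0))  # the artifactual 180 bpm sample is rejected
```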
Sundseth, Jarle; Jacobsen, Eva A; Kolstad, Frode; Nygaard, Oystein P; Zwart, John A; Hol, Per K
2013-10-01
Cervical disc prostheses induce a significant amount of artifact in magnetic resonance imaging, which may complicate radiologic follow-up after surgery. The purpose of this study was to investigate to what extent the artifact induced by the frequently used Discover® cervical disc prosthesis impedes interpretation of the MR images at operated and adjacent levels in 1.5 and 3 Tesla MR. Ten consecutive patients were investigated in both 1.5 and 3 Tesla MR with standard image sequences one year following anterior cervical discectomy with arthroplasty. Two neuroradiologists evaluated the images by consensus. Emphasis was placed on signal changes in the medulla at all levels and visualization of the root canals at operated and adjacent levels. A "blur artifact ratio" was calculated and defined as the height of the artifact on T1 sagittal images relative to the operated level. The artifacts induced in 1.5 and 3 Tesla MR were of entirely different character, and evaluation of the spinal cord at the operated level was impossible in both magnets. Artifacts also made the root canals difficult to assess at the operated level and were more pronounced in the 3 Tesla MR. At the adjacent levels, however, the spinal cord and root canals were completely visualized in all patients. The "blur artifact" induced at the operated level was also more pronounced in the 3 Tesla MR. The artifact induced by the Discover® titanium disc prosthesis in both 1.5 and 3 Tesla MR makes interpretation of the spinal cord impossible and visualization of the root canals difficult at the operated level. Adjusting the MR sequences to produce the least amount of artifact is important.
Artifact reduction of different metallic implants in flat detector C-arm CT.
Hung, S-C; Wu, C-C; Lin, C-J; Guo, W-Y; Luo, C-B; Chang, F-C; Chang, C-Y
2014-07-01
Flat detector CT has been increasingly used as a follow-up examination after endovascular intervention. Metal artifact reduction has been successfully demonstrated in coil mass cases, but only in a small series. We attempted to objectively and subjectively evaluate the feasibility of metal artifact reduction with various metallic objects and coil lengths. We retrospectively reprocessed the flat detector CT data of 28 patients (15 men, 13 women; mean age, 55.6 years) after they underwent endovascular treatment (20 coiling ± stent placement, 6 liquid embolizers) or shunt drainage (n = 2) between January 2009 and November 2011 by using a metal artifact reduction correction algorithm. We measured CT value ranges and noise by using region-of-interest methods, and 2 experienced neuroradiologists rated the degrees of improved imaging quality and artifact reduction by comparing uncorrected and corrected images. After we applied the metal artifact reduction algorithm, the CT value ranges and the noise were substantially reduced (1815.3 ± 793.7 versus 231.7 ± 95.9 and 319.9 ± 136.6 versus 45.9 ± 14.0; both P < .001) regardless of the types of metallic objects and various sizes of coil masses. The rater study achieved an overall improvement of imaging quality and artifact reduction (85.7% and 78.6% of cases by 2 raters, respectively), with the greatest improvement in the coiling group, moderate improvement in the liquid embolizers, and the smallest improvement in ventricular shunting (overall agreement, 0.857). The metal artifact reduction algorithm substantially reduced artifacts and improved the objective image quality in every studied case. It also allowed improved diagnostic confidence in most cases. © 2014 by American Journal of Neuroradiology.
Automatic EEG artifact removal: a weighted support vector machine approach with error correction.
Shao, Shi-Yun; Shen, Kai-Quan; Ong, Chong Jin; Wilder-Smith, Einar P V; Li, Xiao-Ping
2009-02-01
An automatic electroencephalogram (EEG) artifact removal method is presented in this paper. Compared to past methods, it has two unique features: 1) a weighted version of support vector machine formulation that handles the inherent unbalanced nature of component classification and 2) the ability to accommodate structural information typically found in component classification. The advantages of the proposed method are demonstrated on real-life EEG recordings with comparisons made to several benchmark methods. Results show that the proposed method is preferable to the other methods in the context of artifact removal by achieving a better tradeoff between removing artifacts and preserving inherent brain activities. Qualitative evaluation of the reconstructed EEG epochs also demonstrates that after artifact removal inherent brain activities are largely preserved.
Cultural Artifact Detection in Long Wave Infrared Imagery.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Dylan Zachary; Craven, Julia M.; Ramon, Eric
2017-01-01
Detection of cultural artifacts from airborne remotely sensed data is an important task in the context of on-site inspections. Airborne artifact detection can reduce the size of the search area the ground based inspection team must visit, thereby improving the efficiency of the inspection process. This report details two algorithms for detection of cultural artifacts in aerial long wave infrared imagery. The first algorithm creates an explicit model for cultural artifacts, and finds data that fits the model. The second algorithm creates a model of the background and finds data that does not fit the model. Both algorithms are applied to orthomosaic imagery generated as part of the MSFE13 data collection campaign under the spectral technology evaluation project.
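The report's algorithms are not given here; as a generic stand-in for the background-modeling variant, an RX-style anomaly detector (Mahalanobis distance to a global background model) is sketched below, with a global background and multi-band input as explicit assumptions:

```python
import numpy as np

def rx_anomaly_scores(image):
    """RX-style anomaly detection sketch for multi-band imagery.

    image : (rows, cols, bands) array, e.g., LWIR-derived feature bands (assumed).
    Pixels far (in Mahalanobis distance) from a global background model are
    candidate cultural artifacts; the report's background model may be local
    or more elaborate than this.
    """
    rows, cols, bands = image.shape
    pixels = image.reshape(-1, bands).astype(float)
    mean = pixels.mean(axis=0)                              # background mean
    cov_inv = np.linalg.pinv(np.cov(pixels, rowvar=False))  # background covariance
    centered = pixels - mean
    scores = np.einsum('ij,jk,ik->i', centered, cov_inv, centered)
    return scores.reshape(rows, cols)

# scores = rx_anomaly_scores(lwir_cube)                       # hypothetical input cube
# candidate_mask = scores > np.percentile(scores, 99.5)       # hypothetical threshold
```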
Zhuge, Xiaodong; Jinnai, Hiroshi; Dunin-Borkowski, Rafal E; Migunov, Vadim; Bals, Sara; Cool, Pegie; Bons, Anton-Jan; Batenburg, Kees Joost
2017-04-01
Electron tomography is an essential imaging technique for the investigation of morphology and 3D structure of nanomaterials. This method, however, suffers from well-known missing wedge artifacts due to a restricted tilt range, which limits the objectiveness, repeatability and efficiency of quantitative structural analysis. Discrete tomography represents one of the promising reconstruction techniques for materials science, potentially capable of delivering higher fidelity reconstructions by exploiting the prior knowledge of the limited number of material compositions in a specimen. However, the application of discrete tomography to practical datasets remains a difficult task due to the underlying challenging mathematical problem. In practice, it is often hard to obtain consistent reconstructions from experimental datasets. In addition, numerous parameters need to be tuned manually, which can lead to bias and non-repeatability. In this paper, we present the application of a new iterative reconstruction technique, named TVR-DART, for discrete electron tomography. The technique is capable of consistently delivering reconstructions with significantly reduced missing wedge artifacts for a variety of challenging data and imaging conditions, and can automatically estimate its key parameters. We describe the principles of the technique and apply it to datasets from three different types of samples acquired under diverse imaging modes. By further reducing the available tilt range and number of projections, we show that the proposed technique can still produce consistent reconstructions with minimized missing wedge artifacts. This new development promises to provide the electron microscopy community with an easy-to-use and robust tool for high-fidelity 3D characterization of nanomaterials. Copyright © 2017 Elsevier B.V. All rights reserved.
Frequency compounding in multifrequency vibroacoustography
NASA Astrophysics Data System (ADS)
Urban, Matthew W.; Alizad, Azra; Fatemi, Mostafa
2009-02-01
Vibro-acoustography is a speckle-free ultrasound-based imaging modality that can visualize normal and abnormal soft tissue through mapping stimulated acoustic emission. The acoustic emission is generated by focusing two ultrasound beams of slightly different frequencies (Δf = f1-f2) to the same spatial location and vibrating the tissue as a result of ultrasound radiation force. Reverberation of the acoustic emission can create dark and bright areas in the image that affect overall image contrast and detectability of abnormal tissue. Using finite length tonebursts yields acoustic emission at Δf and at sidebands centered about Δf that originate from the temporal toneburst gating. Separate images are formed by bandpass filtering the acoustic emission at Δf and the associated sidebands. The data at these multiple frequencies are compounded through coherent or incoherent processes to reduce the artifacts associated with reverberation of the acoustic emission. Experimental results from a urethane breast phantom and in vivo human breast scans are shown. The reduction in reverberation artifacts is analyzed using a smoothness metric which uses the variances of the gray levels of the original images and those formed through coherent and incoherent compounding of image data. This smoothness metric is minimized when the overall image background is smooth while image features are still preserved. The smoothness metric indicates that the images improved by factors from 1.23-4.33 and 1.09-2.68 in phantom and in vivo studies, respectively. The coherent and incoherent compounding of multifrequency data demonstrate, both qualitatively and quantitatively, the efficacy of this method for reduction of reverberation artifacts.
Drisdelle, Brandi Lee; Aubin, Sébrina; Jolicoeur, Pierre
2017-01-01
The objective of the present study was to assess the robustness and reliability of independent component analysis (ICA) as a method for ocular artifact correction in electrophysiological studies of visual-spatial attention and memory. The N2pc and sustained posterior contralateral negativity (SPCN), electrophysiological markers of visual-spatial attention and memory, respectively, are lateralized posterior ERPs typically observed following the presentation of lateral stimuli (targets and distractors) along with instructions to maintain fixation on the center of the visual search for the entire trial. Traditionally, trials in which subjects may have displaced their gaze are rejected based on a cutoff threshold, minimizing electrophysiological contamination by saccades. Given the loss of data resulting from rejection, we examined ocular correction by comparing results using standard fixation instructions against a condition where subjects were instructed to shift their gaze toward possible targets. Both conditions were analyzed using a rejection threshold and ICA correction for saccade activity management. Results demonstrate that ICA conserves data that would have otherwise been removed and leaves the underlying neural activity intact, as demonstrated by experimental manipulations previously shown to modulate the N2pc and the SPCN. Not only does ICA salvage data without distorting it, but large eye movements also had only subtle effects. Overall, the findings provide convincing evidence for ICA correction not only in special cases (e.g., subjects who did not follow fixation instructions) but also as a candidate for standard ocular artifact management in electrophysiological studies interested in visual-spatial attention and memory. © 2016 Society for Psychophysiological Research.
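The study's ICA pipeline is not reproduced here; a generic component-based ocular-correction sketch using scikit-learn's FastICA, with component selection by correlation against an EOG trace as an assumed criterion, might look like this:

```python
import numpy as np
from sklearn.decomposition import FastICA

def remove_ocular_components(eeg, eog, corr_threshold=0.7, n_components=20):
    """Generic ICA-based ocular artifact correction sketch (not the authors' pipeline).

    eeg : (n_samples, n_channels) EEG data
    eog : (n_samples,) simultaneously recorded EOG, used only to pick components
    """
    ica = FastICA(n_components=n_components, random_state=0)
    sources = ica.fit_transform(eeg)                     # unmix into components
    # Flag components strongly correlated with the EOG trace (assumed criterion)
    corr = np.array([abs(np.corrcoef(sources[:, k], eog)[0, 1])
                     for k in range(sources.shape[1])])
    sources[:, corr > corr_threshold] = 0.0              # zero out ocular components
    return ica.inverse_transform(sources)                # back to channel space
```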
Yildiz, Yesna O; Eckersley, Robert J; Senior, Roxy; Lim, Adrian K P; Cosgrove, David; Tang, Meng-Xing
2015-07-01
Non-linear propagation of ultrasound creates artifacts in contrast-enhanced ultrasound images that significantly affect both qualitative and quantitative assessments of tissue perfusion. This article describes the development and evaluation of a new algorithm to correct for this artifact. The correction is a post-processing method that estimates and removes non-linear artifact in the contrast-specific image using the simultaneously acquired B-mode image data. The method is evaluated on carotid artery flow phantoms with large and small vessels containing microbubbles of various concentrations at different acoustic pressures. The algorithm significantly reduces non-linear artifacts while maintaining the contrast signal from bubbles to increase the contrast-to-tissue ratio by up to 11 dB. Contrast signal from a small vessel 600 μm in diameter buried in tissue artifacts before correction was recovered after the correction. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
Artifact correction in diffusion MRI of non-human primate brains on a clinical 3T scanner.
Zhang, Xiaodong; Kirsch, John E; Zhong, Xiaodong
2016-02-01
Smearing artifacts were observed and investigated in diffusion tensor imaging (DTI) studies of macaque monkeys on a clinical whole-body 3T scanner. Four adult macaques were utilized to evaluate DTI artifacts. DTI images were acquired with a single-shot echo-planar imaging (EPI) sequence using a parallel imaging technique. The smearing artifacts observed on the diffusion-weighted images and fractional anisotropy maps were caused by the incomplete fat suppression due to the irregular macaque frontal skull geometry and anatomy. The artifact can be reduced substantially using a novel three-dimensional (3D) shimming procedure. The smearing artifacts observed on diffusion weighted images and fractional anisotropy (FA) maps of macaque brains can be reduced substantially using a robust 3D shimming approach. The DTI protocol combined with the shimming procedure could be a robust approach to examine brain connectivity and white matter integrity of non-human primates using a conventional clinical setting. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Ultrafast Ultrasound Imaging Using Combined Transmissions With Cross-Coherence-Based Reconstruction.
Zhang, Yang; Guo, Yuexin; Lee, Wei-Ning
2018-02-01
Plane-wave-based ultrafast imaging has become the prevalent technique for non-conventional ultrasound imaging. The image quality, especially in terms of the suppression of artifacts, is generally compromised by reducing the number of transmissions for a higher frame rate. We hereby propose a new ultrafast imaging framework that reduces not only the side lobe artifacts but also the axial lobe artifacts using combined transmissions with a new coherence-based factor. The results from simulations, in vitro wire phantoms, the ex vivo porcine artery, and the in vivo porcine heart show that our proposed methodology greatly reduced the axial lobe artifact by 25±5 dB compared with coherent plane-wave compounding (CPWC), which was considered as the ultrafast imaging standard, and suppressed side lobe artifacts by 15±5 dB compared with CPWC and coherent spherical-wave compounding. The reduction of artifacts in our proposed ultrafast imaging framework led to a better boundary delineation of soft tissues than CPWC.
Artifacts Quantification of Metal Implants in MRI
NASA Astrophysics Data System (ADS)
Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.
2017-11-01
The presence of materials with different magnetic properties, such as metal implants, causes distortion of the magnetic field locally, resulting in signal voids and pile ups, i.e. susceptibility artifacts in MRI. Quantitative and unbiased measurement of the artifact is a prerequisite for optimization of acquisition parameters. In this study an image gradient based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculation of the image gradient. Then the artifact is quantified in terms of its extent by an automated cross entropy thresholding method as image area percentage. The proposed method for artifact quantification was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, considered as a reference, demonstrating moderate to good correlation (Spearman’s rho = 0.62 and 0.802 in the case of titanium and stainless steel implants, respectively). The automated character of the proposed quantification method seems promising for MRI acquisition parameter optimization.
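A minimal sketch of the gradient-plus-cross-entropy idea, assuming Li's minimum-cross-entropy threshold as the automated thresholding step and reporting artifact extent as an image-area percentage:

```python
import numpy as np
from scipy.ndimage import sobel
from skimage.filters import threshold_li

def susceptibility_artifact_extent(image):
    """Gradient-based artifact quantification sketch (see description above).

    Returns the artifact extent as a percentage of the image area; Li's
    minimum-cross-entropy threshold is used as the automated cross-entropy
    step, which is an assumed choice of method.
    """
    gx = sobel(image.astype(float), axis=0)   # capture abrupt signal alterations
    gy = sobel(image.astype(float), axis=1)
    grad_mag = np.hypot(gx, gy)
    mask = grad_mag > threshold_li(grad_mag)  # automated cross-entropy threshold
    return 100.0 * mask.mean()                # artifact extent as area percentage
```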
NASA Astrophysics Data System (ADS)
Zilber, Nicolas A.; Katayama, Yoshinori; Iramina, Keiji; Erich, Wintermantel
2010-05-01
A new approach is proposed to test the efficiency of methods, such as the Kalman filter and the independent component analysis (ICA), when applied to remove the artifacts induced by transcranial magnetic stimulation (TMS) from electroencephalography (EEG). By using EEG recordings corrupted by TMS induction, the shape of the artifacts is approximately described with a model based on an equivalent circuit simulation. These modeled artifacts are subsequently added to other EEG signals, this time not influenced by TMS. The resulting signals prove of interest since we also know their form without the pseudo-TMS artifacts. Therefore, they enable us to use a fit test to compare the signals we obtain after removing the artifacts with the original signals. This efficiency test turned out to be very useful in comparing the methods with each other, as well as in determining the parameters of the filtering that give satisfactory results with the automatic ICA.
Inoue, Yuuji; Yoneyama, Masami; Nakamura, Masanobu; Takemura, Atsushi
2018-06-01
The two-dimensional Cartesian turbo spin-echo (TSE) sequence is widely used in routine clinical studies, but it is sensitive to respiratory motion. We investigated the k-space orders in Cartesian TSE that can effectively reduce motion artifacts. The purpose of this study was to demonstrate the relationship between k-space order and degree of motion artifacts using a moving phantom. We compared the degree of motion artifacts between linear and asymmetric k-space orders. The actual spacing of ghost artifacts in the asymmetric order was doubled compared with that in the linear order in the free-breathing situation. The asymmetric order clearly showed less sensitivity to incomplete breath-hold at the latter half of the imaging period. Because of the actual number of partitions of the k-space and the temporal filling order, the asymmetric k-space order of Cartesian TSE was superior to the linear k-space order for reduction of ghosting motion artifacts.
A Practical Framework for Cartographic Design
NASA Astrophysics Data System (ADS)
Denil, Mark
2018-05-01
Creation of a map artifact that can be recognized, accepted, read, and absorbed is the cartographer's chief responsibility. This involves bringing coherence and order out of chaos and randomness through the construction of map artifacts that mediate processes of social communication. Maps are artifacts, first and foremost: they are artifacts with particular formal attributes. It is the formal aspects of the map artifact that allow it to invoke and sustain a reading as a map. This paper examines Cartographic Design as the sole means at the cartographer's disposal for constructing the meaning bearing artifacts we know as maps, by placing it at the center of a practical analytic framework. The framework draws together the Theoretic and Craft aspects of map making, and examines how Style and Taste operate through the rubric of a schema of Mapicity to produce high quality maps. The role of the Cartographic Canon, and the role of Critique, are also explored, and a few design resources are identified.
Distributed Cognition and Distributed Morality: Agency, Artifacts and Systems.
Heersmink, Richard
2017-04-01
There are various philosophical approaches and theories describing the intimate relation people have to artifacts. In this paper, I explore the relation between two such theories, namely distributed cognition and distributed morality theory. I point out a number of similarities and differences in these views regarding the ontological status they attribute to artifacts and the larger systems they are part of. Having evaluated and compared these views, I continue by focussing on the way cognitive artifacts are used in moral practice. I specifically conceptualise how such artifacts (a) scaffold and extend moral reasoning and decision-making processes, (b) have a certain moral status which is contingent on their cognitive status, and (c) whether responsibility can be attributed to distributed systems. This paper is primarily written for those interested in the intersection of cognitive and moral theory as it relates to artifacts, but also for those independently interested in philosophical debates in extended and distributed cognition and ethics of (cognitive) technology.
Correction of motion artifacts in OCT-AFI data collected in airways (Conference Presentation)
NASA Astrophysics Data System (ADS)
Abouei, Elham; Lane, Pierre M.; Pahlevaninezhad, Hamid; Lee, Anthony; Lam, Stephen; MacAulay, Calum E.
2016-03-01
Optical coherence tomography (OCT) provides in vivo imaging with near-histologic resolution of tissue morphology. OCT has been successfully employed in clinical practice in non-pulmonary fields of medicine such as ophthalmology and cardiology. Studies suggest that OCT has the potential to be a powerful tool for the detection and localization of malignant and non-malignant pulmonary diseases. The combination of OCT with autofluorescence imaging (AFI) provides valuable information about the structural and metabolic state of tissues. Successful application of OCT or OCT-AFI to the field of pulmonary medicine requires overcoming several challenges. This work addresses those associated with motion: cardiac cycle, breathing and non-uniform rotation distortion (NURD) artifacts. Mechanically rotated endoscopic probes often suffer from image degradation due to NURD. In addition, cardiac and breathing motion artifacts may be present in vivo that are not seen ex vivo. These motion artifacts can be problematic in OCT-AFI systems with slower acquisition rates and have been observed to generate identifiable prominent artifacts which make confident interpretation of observed structures (blood vessels, etc.) difficult. Understanding and correcting motion artifacts could improve image quality and interpretation. In this work, the motion artifacts in pulmonary OCT-AFI data sets are estimated in both AFI and OCT images using a locally adaptive registration algorithm that can be used to correct/reduce such artifacts. Performance of the algorithm is evaluated on images of a NURD phantom and on in-vivo OCT-AFI datasets of peripheral lung airways.
Nakamura, Akihiro; Tanizaki, Yasuo; Takeuchi, Miho; Ito, Shigeru; Sano, Yoshitaka; Sato, Mayumi; Kanno, Toshihiko; Okada, Hiroyuki; Torizuka, Tatsuo; Nishizawa, Sadahiko
2014-06-01
While point spread function (PSF)-based positron emission tomography (PET) reconstruction effectively improves the spatial resolution and image quality of PET, it may damage its quantitative properties by producing edge artifacts, or Gibbs artifacts, which appear to cause overestimation of regional radioactivity concentration. In this report, we investigated how edge artifacts produce negative effects on the quantitative properties of PET. Experiments with a National Electrical Manufacturers Association (NEMA) phantom, containing radioactive spheres of a variety of sizes and background filled with cold air or water, or radioactive solutions, showed that profiles modified by edge artifacts were reproducible regardless of background μ values, and the effects of edge artifacts increased with increasing sphere-to-background radioactivity concentration ratio (S/B ratio). Profiles were also affected by edge artifacts in complex fashion in response to variable combinations of sphere sizes and S/B ratios; and central single-peak overestimation up to 50% was occasionally noted in relatively small spheres with high S/B ratios. Effects of edge artifacts were obscured in spheres with low S/B ratios. In patient images with a variety of focal lesions, areas of higher radioactivity accumulation were generally more enhanced by edge artifacts, but the effects were variable depending on the size of and accumulation in the lesion. PET images generated using PSF-based reconstruction are therefore not appropriate for the evaluation of SUV.
Suppression of stimulus artifact contaminating electrically evoked electromyography.
Liu, Jie; Li, Sheng; Li, Xiaoyan; Klein, Cliff; Rymer, William Z; Zhou, Ping
2014-01-01
Electrical stimulation of muscle or nerve is a very useful technique for understanding muscle activity and its pathological changes for both diagnostic and therapeutic purposes. During electrical stimulation of a muscle, the recorded M wave is often contaminated by a stimulus artifact. The stimulus artifact must be removed for appropriate analysis and interpretation of M waves. The objective of this study was to develop a novel software-based method to remove stimulus artifacts contaminating or superimposed on electrically evoked surface electromyography (EMG) or M wave signals. The multiple stage method uses a series of signal processing techniques, including highlighting and detection of stimulus artifacts using Savitzky-Golay filtering, estimation of the artifact contaminated region with Otsu thresholding, and reconstruction of such region using signal interpolation and smoothing. The developed method was tested using M wave signals recorded from biceps brachii muscles by a linear surface electrode array. To evaluate the performance, a series of semi-synthetic signals were constructed from clean M wave and stimulus artifact recordings with different degrees of overlap between them. The effectiveness of the developed method was quantified by a significant increase in correlation coefficient and a significant decrease in root mean square error between the clean M wave and the reconstructed M wave, compared with those between the clean M wave and the originally contaminated signal. The validity of the developed method was also demonstrated when tested on each channel's M wave recording using a linear electrode array. The developed method can suppress stimulus artifacts contaminating M wave recordings.
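A minimal sketch of the multi-stage idea (Savitzky-Golay highlighting, Otsu localization, interpolation and smoothing of the contaminated region), with window length, padding, and filter orders as assumed parameters:

```python
import numpy as np
from scipy.signal import savgol_filter
from skimage.filters import threshold_otsu

def suppress_stimulus_artifact(m_wave, fs, window_ms=5, polyorder=3, pad=10):
    """Sketch of the multi-stage stimulus-artifact suppression described above.

    Stage 1: highlight the fast stimulus transient with a Savitzky-Golay
    derivative estimate; Stage 2: locate the contaminated region with Otsu
    thresholding; Stage 3: reconstruct it by interpolation and smoothing.
    """
    win = max(5, int(fs * window_ms / 1000) | 1)            # odd window length
    deriv = np.abs(savgol_filter(m_wave, win, polyorder, deriv=1))
    mask = deriv > threshold_otsu(deriv)                    # artifact-dominated samples
    idx = np.flatnonzero(mask)
    if idx.size:                                            # pad the detected region
        mask[max(idx.min() - pad, 0):idx.max() + pad] = True
    clean_idx = np.flatnonzero(~mask)
    reconstructed = m_wave.copy()
    reconstructed[mask] = np.interp(np.flatnonzero(mask), clean_idx,
                                    m_wave[clean_idx])      # interpolate over region
    smoothed = savgol_filter(reconstructed, win, polyorder)
    reconstructed[mask] = smoothed[mask]                    # smooth only the region
    return reconstructed
```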
Methods for artifact detection and removal from scalp EEG: A review.
Islam, Md Kafiul; Rastegarnia, Amir; Yang, Zhi
2016-11-01
Electroencephalography (EEG) is the most popular brain activity recording technique used in wide range of applications. One of the commonly faced problems in EEG recordings is the presence of artifacts that come from sources other than brain and contaminate the acquired signals significantly. Therefore, much research over the past 15 years has focused on identifying ways for handling such artifacts in the preprocessing stage. However, this is still an active area of research as no single existing artifact detection/removal method is complete or universal. This article presents an extensive review of the existing state-of-the-art artifact detection and removal methods from scalp EEG for all potential EEG-based applications and analyses the pros and cons of each method. First, a general overview of the different artifact types that are found in scalp EEG and their effect on particular applications are presented. In addition, the methods are compared based on their ability to remove certain types of artifacts and their suitability in relevant applications (only functional comparison is provided not performance evaluation of methods). Finally, the future direction and expected challenges of current research is discussed. Therefore, this review is expected to be helpful for interested researchers who will develop and/or apply artifact handling algorithm/technique in future for their applications as well as for those willing to improve the existing algorithms or propose a new solution in this particular area of research. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
NASA Astrophysics Data System (ADS)
Allec, N.; Abbaszadeh, S.; Scott, C. C.; Lewin, J. M.; Karim, K. S.
2012-12-01
In contrast-enhanced mammography (CEM), the dual-energy dual-exposure technique, which can leverage existing conventional mammography infrastructure, relies on acquiring the low- and high-energy images using two separate exposures. The finite time between image acquisition leads to motion artifacts in the combined image. Motion artifacts can lead to greater anatomical noise in the combined image due to increased mismatch of the background tissue in the images to be combined; however, the impact has not yet been quantified. In this study we investigate a method to include motion artifacts in the dual-energy noise and performance analysis. The motion artifacts are included via an extended cascaded systems model. To validate the model, noise power spectra of a previous dual-energy clinical study are compared to that of the model. The ideal observer detectability is used to quantify the effect of motion artifacts on tumor detectability. It was found that the detectability can be significantly degraded when motion is present (e.g., detectability of 2.5 mm radius tumor decreased by approximately a factor of 2 for translation motion on the order of 1000 μm). The method presented may be used for a more comprehensive theoretical noise and performance analysis and fairer theoretical performance comparison between dual-exposure techniques, where motion artifacts are present, and single-exposure techniques, where low- and high-energy images are acquired simultaneously and motion artifacts are absent.
Ma, Junshui; Bayram, Sevinç; Tao, Peining; Svetnik, Vladimir
2011-03-15
After a review of the ocular artifact reduction literature, a high-throughput method designed to reduce the ocular artifacts in multichannel continuous EEG recordings acquired at clinical EEG laboratories worldwide is proposed. The proposed method belongs to the category of component-based methods, and does not rely on any electrooculography (EOG) signals. Based on a concept that all ocular artifact components exist in a signal component subspace, the method can uniformly handle all types of ocular artifacts, including eye-blinks, saccades, and other eye movements, by automatically identifying ocular components from decomposed signal components. This study also proposes an improved strategy to objectively and quantitatively evaluate artifact reduction methods. The evaluation strategy uses real EEG signals to synthesize realistic simulated datasets with different amounts of ocular artifacts. The simulated datasets enable us to objectively demonstrate that the proposed method outperforms some existing methods when no high-quality EOG signals are available. Moreover, the results of the simulated datasets improve our understanding of the involved signal decomposition algorithms, and provide us with insights into the inconsistency regarding the performance of different methods in the literature. The proposed method was also applied to two independent clinical EEG datasets involving 28 volunteers and over 1000 EEG recordings. This effort further confirms that the proposed method can effectively reduce ocular artifacts in large clinical EEG datasets in a high-throughput fashion. Copyright © 2011 Elsevier B.V. All rights reserved.
Allec, N; Abbaszadeh, S; Scott, C C; Lewin, J M; Karim, K S
2012-12-21
In contrast-enhanced mammography (CEM), the dual-energy dual-exposure technique, which can leverage existing conventional mammography infrastructure, relies on acquiring the low- and high-energy images using two separate exposures. The finite time between image acquisition leads to motion artifacts in the combined image. Motion artifacts can lead to greater anatomical noise in the combined image due to increased mismatch of the background tissue in the images to be combined; however, the impact has not yet been quantified. In this study we investigate a method to include motion artifacts in the dual-energy noise and performance analysis. The motion artifacts are included via an extended cascaded systems model. To validate the model, noise power spectra of a previous dual-energy clinical study are compared to that of the model. The ideal observer detectability is used to quantify the effect of motion artifacts on tumor detectability. It was found that the detectability can be significantly degraded when motion is present (e.g., detectability of 2.5 mm radius tumor decreased by approximately a factor of 2 for translation motion on the order of 1000 μm). The method presented may be used for a more comprehensive theoretical noise and performance analysis and fairer theoretical performance comparison between dual-exposure techniques, where motion artifacts are present, and single-exposure techniques, where low- and high-energy images are acquired simultaneously and motion artifacts are absent.
Li, Qiao; Mark, Roger G; Clifford, Gari D
2009-01-01
Background: Within the intensive care unit (ICU), arterial blood pressure (ABP) is typically recorded at different (and sometimes uneven) sampling frequencies, and from different sensors, and is often corrupted by different artifacts and noise which are often non-Gaussian, nonlinear and nonstationary. Extracting robust parameters from such signals, and providing confidences in the estimates is therefore difficult and requires an adaptive filtering approach which accounts for artifact types. Methods: Using a large ICU database, and over 6000 hours of simultaneously acquired electrocardiogram (ECG) and ABP waveforms sampled at 125 Hz from a 437 patient subset, we documented six general types of ABP artifact. We describe a new ABP signal quality index (SQI), based upon the combination of two previously reported signal quality measures weighted together. One index measures morphological normality, and the other degradation due to noise. After extracting a 6084-hour subset of clean data using our SQI, we evaluated a new robust tracking algorithm for estimating blood pressure and heart rate (HR) based upon a Kalman Filter (KF) with an update sequence modified by the KF innovation sequence and the value of the SQI. In order to do this, we have created six novel models of different categories of artifacts that we have identified in our ABP waveform data. These artifact models were then injected into clean ABP waveforms in a controlled manner. Clinical blood pressure (systolic, mean and diastolic) estimates were then made from the ABP waveforms for both clean and corrupted data. The mean absolute error for systolic, mean and diastolic blood pressure was then calculated for different levels of artifact pollution to provide estimates of expected errors given a single value of the SQI. Results: Our artifact models demonstrate that artifact types have differing effects on systolic, diastolic and mean ABP estimates. We show that, for most artifact types, diastolic ABP estimates are less noise-sensitive than mean ABP estimates, which in turn are more robust than systolic ABP estimates. We also show that our SQI can provide error bounds for both HR and ABP estimates. Conclusion: The KF/SQI-fusion method described in this article was shown to provide an accurate estimate of blood pressure and HR derived from the ABP waveform even in the presence of high levels of persistent noise and artifact, and during extreme bradycardia and tachycardia. Differences in error between artifact types, measurement sensors and the quality of the source signal can be factored into physiological estimation using an unbiased adaptive filter, signal innovation and signal quality measures. PMID:19586547
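A simplified, hedged illustration of SQI-weighted Kalman tracking (not the paper's exact update rule): the measurement-noise variance is inflated when the SQI is low, so poor-quality samples barely move the estimate.

```python
class SQIWeightedKalman:
    """Scalar Kalman tracker for a blood-pressure (or heart-rate) value whose
    measurement-noise variance is scaled by the signal quality index (SQI).
    A random-walk state model and the scaling rule are illustrative assumptions."""

    def __init__(self, x0, q=1.0, r=4.0):
        self.x, self.p = x0, 1.0    # state estimate and its variance
        self.q, self.r = q, r       # process and nominal measurement variance

    def update(self, measurement, sqi):
        self.p += self.q                          # predict (random-walk model)
        r_eff = self.r / max(sqi, 1e-3)           # low SQI -> distrust the sample
        k = self.p / (self.p + r_eff)             # Kalman gain
        self.x += k * (measurement - self.x)      # innovation-weighted update
        self.p *= (1.0 - k)
        return self.x

# tracker = SQIWeightedKalman(x0=120.0)
# for sbp, sqi in [(121, 0.95), (180, 0.10), (119, 0.90)]:
#     print(round(tracker.update(sbp, sqi), 1))   # the artifactual 180 barely moves the estimate
```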
Sutherland, J G H; Miksys, N; Furutani, K M; Thomson, R M
2014-01-01
To investigate methods of generating accurate patient-specific computational phantoms for the Monte Carlo calculation of lung brachytherapy patient dose distributions. Four metallic artifact mitigation methods are applied to six lung brachytherapy patient computed tomography (CT) images: simple threshold replacement (STR) identifies high CT values in the vicinity of the seeds and replaces them with estimated true values; fan beam virtual sinogram replaces artifact-affected values in a virtual sinogram and performs a filtered back-projection to generate a corrected image; 3D median filter replaces voxel values that differ from the median value in a region of interest surrounding the voxel and then applies a second filter to reduce noise; and a combination of fan beam virtual sinogram and STR. Computational phantoms are generated from artifact-corrected and uncorrected images using several tissue assignment schemes: both lung-contour constrained and unconstrained global schemes are considered. Voxel mass densities are assigned based on voxel CT number or using the nominal tissue mass densities. Dose distributions are calculated using the EGSnrc user-code BrachyDose for (125)I, (103)Pd, and (131)Cs seeds and are compared directly as well as through dose volume histograms and dose metrics for target volumes surrounding surgical sutures. Metallic artifact mitigation techniques vary in ability to reduce artifacts while preserving tissue detail. Notably, images corrected with the fan beam virtual sinogram have reduced artifacts but residual artifacts near sources remain requiring additional use of STR; the 3D median filter removes artifacts but simultaneously removes detail in lung and bone. Doses vary considerably between computational phantoms with the largest differences arising from artifact-affected voxels assigned to bone in the vicinity of the seeds. Consequently, when metallic artifact reduction and constrained tissue assignment within lung contours are employed in generated phantoms, this erroneous assignment is reduced, generally resulting in higher doses. Lung-constrained tissue assignment also results in increased doses in regions of interest due to a reduction in the erroneous assignment of adipose to voxels within lung contours. Differences in dose metrics calculated for different computational phantoms are sensitive to radionuclide photon spectra with the largest differences for (103)Pd seeds and smallest but still considerable differences for (131)Cs seeds. Despite producing differences in CT images, dose metrics calculated using the STR, fan beam + STR, and 3D median filter techniques produce similar dose metrics. Results suggest that the accuracy of dose distributions for permanent implant lung brachytherapy is improved by applying lung-constrained tissue assignment schemes to metallic artifact corrected images.
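Two of the four mitigation methods lend themselves to short sketches; the thresholds, replacement HU, and filter sizes below are illustrative assumptions, not the study's values:

```python
import numpy as np
from scipy.ndimage import binary_dilation, median_filter

def simple_threshold_replacement(ct, seed_mask, hu_max=300, replacement_hu=40,
                                 margin=10):
    """Simple threshold replacement (STR) sketch: in the vicinity of the seeds,
    replace implausibly high CT numbers with an estimated soft-tissue value."""
    near_seeds = binary_dilation(seed_mask, iterations=margin)  # region around seeds
    corrected = ct.copy()
    corrected[near_seeds & (ct > hu_max)] = replacement_hu
    return corrected

def median_filter_correction(ct, size=3):
    """First pass of a 3D median filter; a second pass could be added for
    noise reduction, as in the two-stage filter described above."""
    return median_filter(ct, size=size)
```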
Deng, Zhimin; Tian, Tianhai
2014-07-29
The advances of systems biology have raised a large number of sophisticated mathematical models for describing the dynamic property of complex biological systems. One of the major steps in developing mathematical models is to estimate unknown parameters of the model based on experimentally measured quantities. However, experimental conditions limit the amount of data that is available for mathematical modelling. The number of unknown parameters in mathematical models may be larger than the number of observation data. The imbalance between the number of experimental data and number of unknown parameters makes reverse-engineering problems particularly challenging. To address the issue of inadequate experimental data, we propose a continuous optimization approach for making reliable inference of model parameters. This approach first uses a spline interpolation to generate continuous functions of system dynamics as well as the first and second order derivatives of continuous functions. The expanded dataset is the basis to infer unknown model parameters using various continuous optimization criteria, including the error of simulation only, error of both simulation and the first derivative, or error of simulation as well as the first and second derivatives. We use three case studies to demonstrate the accuracy and reliability of the proposed new approach. Compared with the corresponding discrete criteria using experimental data at the measurement time points only, numerical results of the ERK kinase activation module show that the continuous absolute-error criteria using both function and high order derivatives generate estimates with better accuracy. This result is also supported by the second and third case studies for the G1/S transition network and the MAP kinase pathway, respectively. This suggests that the continuous absolute-error criteria lead to more accurate estimates than the corresponding discrete criteria. We also study the robustness property of these three models to examine the reliability of estimates. Simulation results show that the models with estimated parameters using continuous fitness functions have better robustness properties than those using the corresponding discrete fitness functions. The inference studies and robustness analysis suggest that the proposed continuous optimization criteria are effective and robust for estimating unknown parameters in mathematical models.
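The continuous criteria combine the simulation error with errors in the spline-estimated first and second derivatives. The sketch below illustrates that combination for a one-variable decay model; the model itself, the weights on the derivative terms, and the use of scipy's CubicSpline and least_squares are illustrative assumptions rather than the setups used in the three case studies.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Sparse "experimental" data from a first-order decay dx/dt = -k*x with k = 0.8.
t_obs = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 5.0])
x_obs = 2.0 * np.exp(-0.8 * t_obs)

# Spline interpolation expands the dataset to a dense grid and supplies
# first- and second-derivative estimates of the system dynamics.
spline = CubicSpline(t_obs, x_obs)
t_dense = np.linspace(t_obs[0], t_obs[-1], 50)
x_s, dx_s, d2x_s = spline(t_dense), spline(t_dense, 1), spline(t_dense, 2)

def residuals(theta, w1=1.0, w2=0.5):
    k = theta[0]
    # Simulation error on the dense grid.
    sol = solve_ivp(lambda t, x: -k * x, (t_dense[0], t_dense[-1]),
                    [x_obs[0]], t_eval=t_dense)
    r_sim = sol.y[0] - x_s
    # First- and second-derivative errors: model-implied vs. spline-estimated.
    r_d1 = -k * x_s - dx_s
    r_d2 = k * k * x_s - d2x_s
    return np.concatenate([r_sim, w1 * r_d1, w2 * r_d2])

fit = least_squares(residuals, x0=[0.1], bounds=(0.0, 10.0))
print("estimated k:", fit.x[0])
```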
NASA Astrophysics Data System (ADS)
He, Jia; Xu, You-Lin; Zhan, Sheng; Huang, Qin
2017-03-01
When health monitoring system and vibration control system both are required for a building structure, it will be beneficial and cost-effective to integrate these two systems together for creating a smart building structure. Recently, on the basis of extended Kalman filter (EKF), a time-domain integrated approach was proposed for the identification of structural parameters of the controlled buildings with unknown ground excitations. The identified physical parameters and structural state vectors were then utilized to determine the control force for vibration suppression. In this paper, the possibility of establishing such a smart building structure with the function of simultaneous damage detection and vibration suppression was explored experimentally. A five-story shear building structure equipped with three magneto-rheological (MR) dampers was built. Four additional columns were added to the building model, and several damage scenarios were then simulated by symmetrically cutting off these columns in certain stories. Two sets of earthquakes, i.e. Kobe earthquake and Northridge earthquake, were considered as seismic input and assumed to be unknown during the tests. The structural parameters and the unknown ground excitations were identified during the tests by using the proposed identification method with the measured control forces. Based on the identified structural parameters and system states, a switching control law was employed to adjust the current applied to the MR dampers for the purpose of vibration attenuation. The experimental results show that the presented approach is capable of satisfactorily identifying structural damages and unknown excitations on one hand and significantly mitigating the structural vibration on the other hand.
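The abstract does not spell out the switching control law used with the MR dampers. A common choice in the MR-damper literature is the clipped-optimal rule, in which the command voltage is switched to its maximum only when the measured damper force must grow toward the desired force; the short sketch below shows that rule as an assumed stand-in for the law actually used in the experiments.

```python
def clipped_optimal_voltage(f_desired, f_measured, v_max=5.0):
    """Clipped-optimal switching law for an MR damper: apply full voltage
    only when the measured force is smaller than, and aligned with, the
    desired control force; otherwise command zero voltage."""
    if f_measured * (f_desired - f_measured) > 0.0:
        return v_max
    return 0.0

# Example: the damper currently produces 0.8 kN but the controller wants
# 1.5 kN in the same direction, so the voltage is switched on.
print(clipped_optimal_voltage(1.5, 0.8))   # -> 5.0
print(clipped_optimal_voltage(-1.5, 0.8))  # -> 0.0
```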
MoCha: Molecular Characterization of Unknown Pathways.
Lobo, Daniel; Hammelman, Jennifer; Levin, Michael
2016-04-01
Automated methods for the reverse-engineering of complex regulatory networks are paving the way for the inference of mechanistic comprehensive models directly from experimental data. These novel methods can infer not only the relations and parameters of the known molecules defined in their input datasets, but also unknown components and pathways identified as necessary by the automated algorithms. Identifying the molecular nature of these unknown components is a crucial step for making testable predictions and experimentally validating the models, yet no specific and efficient tools exist to aid in this process. To this end, we present here MoCha (Molecular Characterization), a tool optimized for the search of unknown proteins and their pathways from a given set of known interacting proteins. MoCha uses the comprehensive dataset of protein-protein interactions provided by the STRING database, which currently includes more than a billion interactions from over 2,000 organisms. MoCha is highly optimized, performing typical searches within seconds. We demonstrate the use of MoCha with the characterization of unknown components from reverse-engineered models from the literature. MoCha is useful for working on network models by hand or as a downstream step of a model inference engine workflow and represents a valuable and efficient tool for the characterization of unknown pathways using known data from thousands of organisms. MoCha and its source code are freely available online under the GPLv3 license.
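MoCha's core operation, ranking candidate identities for an unknown node by how strongly they interact with a set of known proteins, can be approximated in a few lines over a STRING protein-links table. The file name, the assumed column layout (protein1, protein2, combined_score), and the summed-score ranking below are assumptions for illustration; MoCha's own search and scoring are described in the paper.

```python
import csv
from collections import defaultdict

def rank_unknown_candidates(links_path, known_proteins, top_n=10):
    """Score every protein by the sum of its combined interaction scores to
    the known interactors and return the best candidates for the unknown node."""
    known = set(known_proteins)
    scores = defaultdict(int)
    with open(links_path) as fh:
        reader = csv.reader(fh, delimiter=" ")
        next(reader)                      # skip the assumed header line
        for p1, p2, score in reader:
            if p1 in known and p2 not in known:
                scores[p2] += int(score)
            elif p2 in known and p1 not in known:
                scores[p1] += int(score)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

# Hypothetical usage with a downloaded links file and two known interactors:
# print(rank_unknown_candidates("protein.links.txt", {"9606.ENSP000001", "9606.ENSP000002"}))
```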
Naturalistic Experience and the Early Use of Symbolic Artifacts
ERIC Educational Resources Information Center
Troseth, Georgene L.; Casey, Amy M.; Lawver, Kelly A.; Walker, Joan M. T.; Cole, David A.
2007-01-01
Experience with a variety of symbolic artifacts has been proposed as a mechanism underlying symbolic development. In this study, the parents of 120 2-year-old children who participated in symbolic object retrieval tasks completed a questionnaire regarding their children's naturalistic experience with symbolic artifacts and activities. In separate…
Camera artifacts in IUE spectra
NASA Technical Reports Server (NTRS)
Bruegman, O. W.; Crenshaw, D. M.
1994-01-01
This study of emission-line-mimicking features in the IUE cameras has produced an atlas of artifacts in high-dispersion images, with an accompanying table of prominent artifacts, a table of prominent artifacts in the raw images, and a median image of the sky background for each IUE camera.
Young Children's Rapid Learning about Artifacts
ERIC Educational Resources Information Center
Casler, Krista; Kelemen, Deborah
2005-01-01
Tool use is central to interdisciplinary debates about the evolution and distinctiveness of human intelligence, yet little is actually known about how human conceptions of artifacts develop. Results across these two studies show that even 2-year-olds approach artifacts in ways distinct from captive tool-using monkeys. Contrary to adult intuition,…
Experiments were completed to determine the extent of artifacts from sampling elemental carbon (EC) and organic carbon (OC) under sample conditions consistent with personal sampling. Two different types of experiments were completed; the first examined possible artifacts from oil...
Art·I/f/act·ology: Curricular Artifacts in Autoethnographic Research
ERIC Educational Resources Information Center
Brogden, Lace Marie
2008-01-01
Contemporary curriculum theorists conceptualize curriculum, schooling, and the teacher as sites of discursive production and as dwelling places for theory. Drawing on memory work around childhood report cards, this article uses commonplace artifacts to reassemble autoethnographic memory. In sifting through memories and artifacts, the author…
The influences of artifact formation and losses on Particulate Matter (PM) sampler collection surfaces are well documented, especially for nitrates (Hering and Cass, 1999) and SVOCs (McDow, 1999), and more recently for speciated carbon (Turpin and Lim, 2001). These artifact...
Historians/Artifacts/Learners: Working Papers.
ERIC Educational Resources Information Center
Nichols, Susan K., Ed.
This publication, an outcome of a 2-day colloquium in 1981, contains information about using artifacts (material culture evidence) as a primary source for teaching history at the graduate or advanced student seminar level. A purpose of the colloquium was to gather and disseminate this information for the Historians/Artifacts/Learners (HAL)…
Examining Student Digital Artifacts during a Year-Long Technology Integration Initiative
ERIC Educational Resources Information Center
Rodriguez, Prisca M.; Frey, Chris; Dawson, Kara; Liu, Feng; Ritzhaupt, Albert D.
2012-01-01
This study was situated within a year-long, statewide technology integration initiative designed to support technology integration within science, technology, engineering, and math classrooms. It examined the elements used in student artifacts in an attempt to investigate trends in digital artifact creation. Among several conclusions, this…
Presenting Cultural Artifacts in the Art Museum: A University-Museum Collaboration
ERIC Educational Resources Information Center
Chung, Sheng Kuan
2009-01-01
With increasing emphasis on multicultural art education and integrative pedagogy, educators have incorporated community resources, such as cultural artifacts exhibited in art museums, to enrich their programs. Cultural artifacts are human-made objects which generally reveal historic information about cultural values, beliefs, and traditions.…
Hakky, Michael; Pandey, Shilpa; Kwak, Ellie; Jara, Hernan; Erbay, Sami H
2013-08-01
This article outlines artifactual findings commonly encountered in neuroradiologic MRI studies and offers clues to differentiate them from true pathology on the basis of their physical properties. Basic MR physics concepts are used to shed light on the causes of these artifacts. MRI is one of the most commonly used techniques in neuroradiology. Unfortunately, MRI is prone to image distortion and artifacts that can be difficult to identify. Using the provided case illustrations, practical clues, and relevant physical applications, radiologists may devise algorithms to troubleshoot these artifacts.
ARTiiFACT: a tool for heart rate artifact processing and heart rate variability analysis.
Kaufmann, Tobias; Sütterlin, Stefan; Schulz, Stefan M; Vögele, Claus
2011-12-01
The importance of appropriate handling of artifacts in interbeat interval (IBI) data must not be underestimated. Even a single artifact may cause unreliable heart rate variability (HRV) results. Thus, a robust artifact detection algorithm and the option for manual intervention by the researcher form key components for confident HRV analysis. Here, we present ARTiiFACT, a software tool for processing electrocardiogram and IBI data. Both automated and manual artifact detection and correction are available in a graphical user interface. In addition, ARTiiFACT includes time- and frequency-based HRV analyses and descriptive statistics, thus offering the basic tools for HRV analysis. Notably, all program steps can be executed separately and allow for data export, thus offering high flexibility and interoperability with a whole range of applications.
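The two building blocks ARTiiFACT combines, beat-wise artifact detection on the IBI series and time-domain HRV statistics, can be illustrated in a few lines. The deviation-from-local-median rule and the 20% threshold below are common heuristics chosen here as assumptions; they are not the specific detection algorithm implemented in ARTiiFACT.

```python
import numpy as np

def detect_ibi_artifacts(ibi_ms, window=5, rel_thresh=0.2):
    """Flag interbeat intervals deviating more than rel_thresh (20%) from a
    running median of the surrounding beats."""
    ibi = np.asarray(ibi_ms, dtype=float)
    flags = np.zeros(ibi.size, dtype=bool)
    for i in range(ibi.size):
        lo, hi = max(0, i - window), min(ibi.size, i + window + 1)
        local = np.median(np.delete(ibi[lo:hi], i - lo))
        flags[i] = abs(ibi[i] - local) > rel_thresh * local
    return flags

def time_domain_hrv(ibi_ms):
    """Basic time-domain HRV measures from an artifact-free IBI series."""
    ibi = np.asarray(ibi_ms, dtype=float)
    sdnn = ibi.std(ddof=1)
    rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2))
    return {"SDNN_ms": sdnn, "RMSSD_ms": rmssd}

ibi = [820, 810, 805, 1600, 815, 800, 790, 795]   # one ectopic-like artifact
clean = np.asarray(ibi)[~detect_ibi_artifacts(ibi)]
print(time_domain_hrv(clean))
```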
Kellman, Peter; Dyke, Christopher K.; Aletras, Anthony H.; McVeigh, Elliot R.; Arai, Andrew E.
2007-01-01
Regions of the body with long T1, such as cerebrospinal fluid (CSF), may create ghost artifacts on gadolinium-hyperenhanced images of myocardial infarction when inversion recovery (IR) sequences are used with a segmented acquisition. Oscillations in the transient approach to steady state for regions with long T1 may cause ghosts, with the number of ghosts being equal to the number of segments. B1-weighted phased-array combining provides an inherent degree of ghost artifact suppression because the ghost artifact is weighted less than the desired signal intensity by the coil sensitivity profiles. Example images are shown that illustrate the suppression of CSF ghost artifacts by the use of B1-weighted phased-array combining of multiple receiver coils. PMID:14755669
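B1-weighted combining suppresses the ghost because each coil image is weighted by the conjugate of that coil's sensitivity, which is small where the ghost falls but the true signal does not. A minimal sketch of the combination step itself (essentially the standard matched-filter combination of coil images) is shown below; the array shapes and normalization are assumptions.

```python
import numpy as np

def b1_weighted_combine(coil_images, coil_sens, eps=1e-8):
    """Combine multi-coil complex images using the coil sensitivity (B1) maps.
    coil_images, coil_sens: arrays of shape (n_coils, ny, nx).
    Signal aliased (ghosted) into a location where a coil has low sensitivity
    receives a correspondingly low weight in the combined image."""
    num = np.sum(np.conj(coil_sens) * coil_images, axis=0)
    den = np.sum(np.abs(coil_sens) ** 2, axis=0) + eps
    return num / den
```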
Striping artifact reduction in lunar orbiter mosaic images
Mlsna, P.A.; Becker, T.
2006-01-01
Photographic images of the moon from the 1960s Lunar Orbiter missions are being processed into maps for visual use. The analog nature of the images has produced numerous artifacts, the chief of which causes a vertical striping pattern in mosaic images formed from a series of filmstrips. Previous methods of stripe removal tended to introduce ringing and aliasing problems in the image data. This paper describes a recently developed alternative approach that succeeds at greatly reducing the striping artifacts while avoiding the creation of ringing and aliasing artifacts. The algorithm uses a one dimensional frequency domain step to deal with the periodic component of the striping artifact and a spatial domain step to handle the aperiodic residue. Several variations of the algorithm have been explored. Results, strengths, and remaining challenges are presented. © 2006 IEEE.
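A simplified version of the two-step idea, a 1-D frequency-domain correction of the periodic stripe pattern followed by a spatial-domain touch-up of the aperiodic residue, can be sketched as follows. The thresholds, filter sizes, and the choice of column means as the 1-D profile are assumptions; the published algorithm is more elaborate.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d

def destripe(image, keep_fraction=0.3, smooth=31):
    """Reduce vertical striping: remove the periodic part of the per-column
    brightness profile in the frequency domain, then subtract the remaining
    slowly varying per-column offset in the spatial domain."""
    img = image.astype(float)
    profile = img.mean(axis=0)

    # 1-D frequency-domain step: isolate strong non-DC components of the
    # column profile and treat them as the periodic stripe signature.
    spec = np.fft.rfft(profile - profile.mean())
    spec[:5] = 0                                   # drop the slowly varying scene trend
    mags = np.abs(spec)
    stripe_spec = np.where(mags > keep_fraction * mags.max(), spec, 0)
    periodic = np.fft.irfft(stripe_spec, n=profile.size)
    out = img - periodic[np.newaxis, :]

    # Spatial-domain step: whatever aperiodic column offset remains is the
    # difference between each column's median and a smoothed version of it.
    resid = np.median(out, axis=0)
    out -= (resid - uniform_filter1d(resid, size=smooth))[np.newaxis, :]
    return out
```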
Blüthgen, Christian; Sanabria, Sergio; Frauenfelder, Thomas; Klingmüller, Volker; Rominger, Marga
2017-10-01
This project evaluated a low-cost sponge phantom setup for its capability to teach and study A- and B-line reverberation artifacts known from lung ultrasound and to numerically simulate sound wave interaction with the phantom using a finite-difference time-domain (FDTD) model. Both A- and B-line artifacts were reproducible on B-mode ultrasound imaging as well as in the FDTD-based simulation. The phantom was found to be an easy-to-set up and economical tool for understanding, teaching, and researching A- and B-line artifacts occurring in lung ultrasound. The FDTD method-based simulation was able to reproduce the artifacts and provides intuitive insight into the underlying physics. © 2017 by the American Institute of Ultrasound in Medicine.
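To give a flavour of the FDTD modelling used with the sponge phantom, the sketch below propagates a 1-D acoustic pulse through a two-layer medium on a staggered pressure/velocity grid and records the echoes returning to the transducer position. The layer properties, grid settings, and source pulse are assumptions for illustration; the study simulated the actual phantom geometry rather than this toy 1-D case.

```python
import numpy as np

# 1-D staggered-grid acoustic FDTD: dp/dt = -K dv/dx, dv/dt = -(1/rho) dp/dx
nx, nt = 600, 3000
dx = 1e-4                                    # 0.1 mm grid spacing
c = np.full(nx, 1540.0)                      # soft-tissue-like speed (m/s)
rho = np.full(nx, 1000.0)                    # density (kg/m^3)
c[300:], rho[300:] = 340.0, 1.2              # second layer: air-like, strong reflector
K = rho * c**2                               # bulk modulus
dt = 0.4 * dx / c.max()                      # CFL-stable time step

p = np.zeros(nx)                             # pressure at integer grid points
v = np.zeros(nx + 1)                         # velocity at half grid points
trace = np.zeros(nt)                         # "transducer" recording at x index 5

for n in range(nt):
    t = n * dt
    # Gaussian-derivative source pulse injected near the transducer.
    p[5] += 1e7 * (t - 3e-7) * np.exp(-((t - 3e-7) / 1e-7) ** 2)
    v[1:-1] -= dt / (0.5 * (rho[:-1] + rho[1:])) * (p[1:] - p[:-1]) / dx
    p -= dt * K * (v[1:] - v[:-1]) / dx
    trace[n] = p[5]

print("peak echo amplitude:", np.abs(trace[200:]).max())
```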
Adaptive noise canceling of electrocardiogram artifacts in single channel electroencephalogram.
Cho, Sung Pil; Song, Mi Hye; Park, Young Cheol; Choi, Ho Seon; Lee, Kyoung Joung
2007-01-01
A new method for estimating and eliminating electrocardiogram (ECG) artifacts from single-channel scalp electroencephalogram (EEG) is proposed. The proposed method consists of emphasizing the QRS complex in the EEG using a least squares acceleration (LSA) filter, generating a pulse synchronized with the R-peak, and estimating and eliminating the ECG artifacts using an adaptive filter. The performance of the proposed method was evaluated using simulated and real EEG recordings; the ECG artifacts were successfully estimated and eliminated in comparison with conventional multi-channel techniques, namely independent component analysis (ICA) and the ensemble average (EA) method. From this we conclude that the proposed method is useful for detecting and eliminating ECG artifacts from single-channel EEG and simple to use in ambulatory/portable EEG monitoring systems.
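The final stage described above is a reference-based adaptive filter: a pulse train synchronized to the detected R-peaks serves as the reference input, and the filter learns the ECG artifact waveform to subtract from the EEG. A compact LMS sketch of that stage follows; the filter length, step size, and toy signal construction are assumptions, and the LSA-based R-peak emphasis step is not reproduced here.

```python
import numpy as np

def lms_cancel(primary, reference, n_taps=64, mu=0.05):
    """Classic LMS adaptive noise canceller: estimate the artifact in
    `primary` (contaminated EEG) from `reference` (R-peak-synchronized pulse
    train) and return the cleaned signal."""
    w = np.zeros(n_taps)
    cleaned = np.zeros(len(primary))
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps:n][::-1]     # most recent reference samples
        y = w @ x                             # current artifact estimate
        e = primary[n] - y                    # error = artifact-free EEG estimate
        w += 2 * mu * e * x                   # LMS weight update
        cleaned[n] = e
    return cleaned

# Toy example: EEG-like noise plus a repeating "ECG" template every 200 samples.
rng = np.random.default_rng(0)
n = 5000
eeg = rng.standard_normal(n) * 5.0
template = np.exp(-np.linspace(-2, 2, 40) ** 2) * 30.0
ecg_artifact, pulses = np.zeros(n), np.zeros(n)
for start in range(50, n - 40, 200):
    ecg_artifact[start:start + 40] += template
    pulses[start] = 1.0                       # R-peak-synchronized reference
print(np.std(lms_cancel(eeg + ecg_artifact, pulses)[500:]))
```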
Pulmonary MRA: Differentiation of pulmonary embolism from truncation artifact
Bannas, Peter; Schiebler, Mark L; Motosugi, Utaroh; François, Christopher J; Reeder, Scott B; Nagle, Scott K
2015-01-01
Purpose Truncation artifact (Gibbs ringing) causes central signal drop within vessels in pulmonary MRA that can be mistaken for emboli, reducing the diagnostic accuracy for pulmonary embolism (PE). We propose a quantitative approach to differentiate truncation artifact from PE. Methods Twenty-eight patients who underwent pulmonary CTA for suspected PE were recruited for pulmonary MRA. Signal intensity drops within pulmonary arteries that persisted on both arterial-phase and delayed-phase MRA were identified. The percent signal loss between the vessel lumen and central drop was measured. CTA served as the reference standard for presence of pulmonary emboli. Results A total of 65 signal intensity drops were identified on MRA. Of these, 48 (74%) were artifact and 17 (26%) were PE, as confirmed by CTA. Truncation artifacts had a significantly lower median signal drop than PE at both arterial-phase (26% [range 12–58%] vs. 85% [range 53–91%]) and delayed-phase MRA (26% [range 11–55%] vs. 77% [range 47–89%]), p<0.0001 for both. ROC analyses revealed signal-drop thresholds of 51% (arterial phase) and 47% (delayed phase) for differentiating truncation artifact from PE with 100% sensitivity and >90% specificity. Conclusion Quantitative signal drop is an objective tool to help differentiate truncation artifact from pulmonary embolism in pulmonary MRA. PMID:24863886
Quality assurance in mammography: artifact analysis.
Hogge, J P; Palmer, C H; Muller, C C; Little, S T; Smith, D C; Fatouros, P P; de Paredes, E S
1999-01-01
Evaluation of mammograms for artifacts is essential for mammographic quality assurance. A variety of mammographic artifacts (i.e., variations in mammographic density not caused by true attenuation differences) can occur and can create pseudolesions or mask true abnormalities. Many artifacts are readily identified, whereas others present a true diagnostic challenge. Factors that create artifacts may be related to the processor (eg, static, dirt or excessive developer buildup on the rollers, excessive roller pressure, damp film, scrapes and scratches, incomplete fixing, power failure, contaminated developer), the technologist (eg, improper film handling and loading, improper use of the mammography unit and related equipment, positioning and darkroom errors), the mammography unit (eg, failure of the collimation mirror to rotate, grid inhomogeneity, failure of the reciprocating grid to move, material in the tube housing, compression failure, improper alignment of the compression paddle with the Bucky tray, defective compression paddle), or the patient (e.g., motion, superimposed objects or substances [jewelry, body parts, clothing, hair, implanted medical devices, foreign bodies, substances on the skin]). Familiarity with the broad range of artifacts and the measures required to eliminate them is vital. Careful attention to darkroom cleanliness, care in film handling, regularly scheduled processor maintenance and chemical replenishment, daily quality assurance activities, and careful attention to detail during patient positioning and mammography can reduce or eliminate most mammographic artifacts.
MARSAME Radiological Release Report for Archaeological Artifacts Excavated from Area L
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruedig, Elizabeth; Whicker, Jeffrey Jay; Gillis, Jessica Mcdonnel
In 1991 Los Alamos National Laboratory’s (LANL’s) cultural resources team excavated archaeological site LA 4618 located at Technical Area 54, within Material Disposal Area L (MDA L). MDA L received non-radioactive chemical waste from the early 1960s until 1985. Further development of the MDA required excavation of several cultural sites under National Historic Preservation Act requirements; artifacts from these sites have been subsequently stored at LANL. The LANL cultural resources group would now like to release these artifacts to the Museum of Indian Arts and Culture in Santa Fe for curation. The history of disposal at Area L suggests that the artifact pool is unlikely to be chemically contaminated and LANL staff washed each artifact at least once following excavation. Thus, it is unlikely that the artifacts present a chemical hazard. LANL’s Environmental Stewardship group (EPC-ES) has evaluated the radiological survey results for the Area L artifact pool and found that the items described in this report meet the criteria for unrestricted radiological release under Department of Energy (DOE) Order 458.1 Radiation Protection of the Public and the Environment and are candidates for release without restriction from LANL control. This conclusion is based on the known history of MDA L and on radiation survey data.
A hybrid intelligence approach to artifact recognition in digital publishing
NASA Astrophysics Data System (ADS)
Vega-Riveros, J. Fernando; Santos Villalobos, Hector J.
2006-02-01
The system presented integrates rule-based and case-based reasoning for artifact recognition in Digital Publishing. In Variable Data Printing (VDP), human proofing can be prohibitive since a job may contain millions of different instances, each of which may contain two types of artifacts: 1) evident defects, such as text overflow or overlapping elements, and 2) style-dependent artifacts, subtle defects that appear as inconsistencies with regard to the original job design. We designed a Knowledge-Based Artifact Recognition tool for document segmentation, layout understanding, artifact detection, and document design quality assessment. Document evaluation is constrained by reference to one instance of the VDP job proofed by a human expert against the remaining instances. Fundamental rules of document design are used in the rule-based component for document segmentation and layout understanding. Ambiguities in the design principles not covered by the rule-based system are analyzed by case-based reasoning, using the Nearest Neighbor Algorithm, where features from previous jobs are used to detect artifacts and inconsistencies within the document layout. We used a subset of XSL-FO and assembled a set of 44 document samples. The system detected all the job layout changes, while obtaining an overall average accuracy of 84.56%, with the highest accuracy (92.82%) for overlapping and the lowest (66.7%) for lack of white space.
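The case-based component boils down to nearest-neighbour classification over layout features extracted from previously proofed job instances. The hedged sketch below shows that step with scikit-learn; the feature set (text-overflow amount, overlap-area ratio, whitespace ratio) and the labels are invented placeholders, not the features used by the described system.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Each row: [overflow_px, overlap_area_ratio, whitespace_ratio] measured on a
# rendered VDP instance relative to the human-proofed reference instance.
X_train = np.array([
    [0.0, 0.00, 0.31],   # consistent with the proofed design
    [0.0, 0.01, 0.29],
    [12.0, 0.00, 0.30],  # text-overflow artifact
    [0.0, 0.18, 0.27],   # overlapping-elements artifact
    [0.0, 0.00, 0.05],   # lack-of-white-space artifact
])
y_train = ["ok", "ok", "overflow", "overlap", "no_whitespace"]

clf = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)
print(clf.predict([[0.0, 0.16, 0.28]]))   # -> ['overlap']
```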
Metal artifact reduction using a patch-based reconstruction for digital breast tomosynthesis
NASA Astrophysics Data System (ADS)
Borges, Lucas R.; Bakic, Predrag R.; Maidment, Andrew D. A.; Vieira, Marcelo A. C.
2017-03-01
Digital breast tomosynthesis (DBT) is rapidly emerging as the main clinical tool for breast cancer screening. Although several reconstruction methods for DBT are described by the literature, one common issue is the interplane artifacts caused by out-of-focus features. For breasts containing highly attenuating features, such as surgical clips and large calcifications, the artifacts are even more apparent and can limit the detection and characterization of lesions by the radiologist. In this work, we propose a novel method of combining backprojected data into tomographic slices using a patch-based approach, commonly used in denoising. Preliminary tests were performed on a geometry phantom and on an anthropomorphic phantom containing metal inserts. The reconstructed images were compared to a commercial reconstruction solution. Qualitative assessment of the reconstructed images provides evidence that the proposed method reduces artifacts while maintaining low noise levels. Objective assessment supports the visual findings. The artifact spread function shows that the proposed method is capable of suppressing artifacts generated by highly attenuating features. The signal difference to noise ratio shows that the noise levels of the proposed and commercial methods are comparable, even though the commercial method applies post-processing filtering steps, which were not implemented on the proposed method. Thus, the proposed method can produce tomosynthesis reconstructions with reduced artifacts and low noise levels.
Accelerated Slice Encoding for Metal Artifact Correction
Hargreaves, Brian A.; Chen, Weitian; Lu, Wenmiao; Alley, Marcus T.; Gold, Garry E.; Brau, Anja C. S.; Pauly, John M.; Pauly, Kim Butts
2010-01-01
Purpose To demonstrate accelerated imaging with artifact reduction near metallic implants and different contrast mechanisms. Materials and Methods Slice-encoding for metal artifact correction (SEMAC) is a modified spin echo sequence that uses view-angle tilting and slice-direction phase encoding to correct both in-plane and through-plane artifacts. Standard spin echo trains and short-TI inversion recovery (STIR) allow efficient PD-weighted imaging with optional fat suppression. A completely linear reconstruction allows incorporation of parallel imaging and partial Fourier imaging. The SNR effects of all reconstructions were quantified in one subject. 10 subjects with different metallic implants were scanned using SEMAC protocols, all with scan times below 11 minutes, as well as with standard spin echo methods. Results The SNR using standard acceleration techniques is unaffected by the linear SEMAC reconstruction. In all cases with implants, accelerated SEMAC significantly reduced artifacts compared with standard imaging techniques, with no additional artifacts from acceleration techniques. The use of different contrast mechanisms allowed differentiation of fluid from other structures in several subjects. Conclusion SEMAC imaging can be combined with standard echo-train imaging, parallel imaging, partial-Fourier imaging and inversion recovery techniques to offer flexible image contrast with a dramatic reduction of metal-induced artifacts in scan times under 11 minutes. PMID:20373445
Accelerated slice encoding for metal artifact correction.
Hargreaves, Brian A; Chen, Weitian; Lu, Wenmiao; Alley, Marcus T; Gold, Garry E; Brau, Anja C S; Pauly, John M; Pauly, Kim Butts
2010-04-01
To demonstrate accelerated imaging with both artifact reduction and different contrast mechanisms near metallic implants. Slice-encoding for metal artifact correction (SEMAC) is a modified spin echo sequence that uses view-angle tilting and slice-direction phase encoding to correct both in-plane and through-plane artifacts. Standard spin echo trains and short-TI inversion recovery (STIR) allow efficient PD-weighted imaging with optional fat suppression. A completely linear reconstruction allows incorporation of parallel imaging and partial Fourier imaging. The signal-to-noise ratio (SNR) effects of all reconstructions were quantified in one subject. Ten subjects with different metallic implants were scanned using SEMAC protocols, all with scan times below 11 minutes, as well as with standard spin echo methods. The SNR using standard acceleration techniques is unaffected by the linear SEMAC reconstruction. In all cases with implants, accelerated SEMAC significantly reduced artifacts compared with standard imaging techniques, with no additional artifacts from acceleration techniques. The use of different contrast mechanisms allowed differentiation of fluid from other structures in several subjects. SEMAC imaging can be combined with standard echo-train imaging, parallel imaging, partial-Fourier imaging, and inversion recovery techniques to offer flexible image contrast with a dramatic reduction of metal-induced artifacts in scan times under 11 minutes. (c) 2010 Wiley-Liss, Inc.
Adib, Mani; Cretu, Edmond
2013-01-01
We present a new method for removing artifacts in electroencephalography (EEG) records during Galvanic Vestibular Stimulation (GVS). The main challenge in exploiting GVS is to understand how the stimulus acts as an input to the brain. We used EEG to monitor the brain and elicit the GVS reflexes. However, GVS current distribution throughout the scalp generates an artifact on EEG signals. We need to eliminate this artifact to be able to analyze the EEG signals during GVS. We propose a novel method to estimate the contribution of the GVS current in the EEG signals at each electrode by combining time-series regression methods with wavelet decomposition methods. We use wavelet transform to project the recorded EEG signal into various frequency bands and then estimate the GVS current distribution in each frequency band. The proposed method was optimized using simulated signals, and its performance was compared to well-accepted artifact removal methods such as ICA-based methods and adaptive filters. The results show that the proposed method performs better at removing GVS artifacts than the others. Using the proposed method, a higher signal-to-artifact ratio of −1.625 dB was achieved, which outperformed other methods such as ICA-based methods, regression methods, and adaptive filters. PMID:23956786
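The proposed combination, estimating the GVS contribution separately in each wavelet band via regression and then subtracting it, can be sketched with PyWavelets as below. The wavelet family, decomposition depth, and single-gain-per-band regression are simplifying assumptions relative to the full method.

```python
import numpy as np
import pywt

def remove_gvs_artifact(eeg, gvs, wavelet="db4", level=5):
    """Estimate the GVS current contribution in each wavelet band of the EEG
    by least-squares regression against the GVS stimulus, then subtract the
    reconstructed artifact."""
    eeg_coeffs = pywt.wavedec(eeg, wavelet, level=level)
    gvs_coeffs = pywt.wavedec(gvs, wavelet, level=level)
    artifact_coeffs = []
    for ce, cg in zip(eeg_coeffs, gvs_coeffs):
        beta = np.dot(cg, ce) / (np.dot(cg, cg) + 1e-12)  # per-band gain
        artifact_coeffs.append(beta * cg)
    artifact = pywt.waverec(artifact_coeffs, wavelet)[: len(eeg)]
    return eeg - artifact

# Toy check: sinusoidal "GVS" leaking into noisy EEG with a gain of 0.7.
t = np.linspace(0, 10, 2500)
gvs = np.sin(2 * np.pi * 1.0 * t)
eeg = 0.7 * gvs + 0.2 * np.random.randn(t.size)
print(np.corrcoef(remove_gvs_artifact(eeg, gvs), gvs)[0, 1])  # near zero
```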
Use of Video Goggles to Distract Patients During PET/CT Studies of School-Aged Children.
Gelfand, Michael J; Harris, Jennifer M; Rich, Amanda C; Kist, Chelsea S
2016-12-01
This study was designed to evaluate the effectiveness of video goggles in distracting children undergoing PET/CT and to determine whether the goggles create CT and PET artifacts. Video goggles with small amounts of internal radiopaque material were used. During whole-body PET/CT imaging, 30 nonsedated patients aged 4-13 y watched videos of their choice using the goggles. Fifteen of the PET/CT studies were performed on a scanner installed in 2006, and the other 15 were performed on a scanner installed in 2013. The fused scans were reviewed for evidence of head movement, and the individual PET and CT scans of the head were reviewed for the presence and severity of streak artifact. The CT exposure settings were recorded for each scan at the anatomic level at which the goggles were worn. Only one of the 30 scans had evidence of significant head motion. Two of the 30 had minor coregistration problems due to motion, and 27 of the 30 had very good to excellent coregistration. For the 2006 scanner, 2 of the 14 evaluable localization CT scans of the head demonstrated no streak artifact in brain tissue, 6 of the 14 had mild streak artifact in brain tissue, and 6 of the 14 had moderate streak artifact in brain tissue. Mild streak artifact in bone was noted in 2 of the 14 studies. For the 2013 scanner, 7 of 15 studies had mild streak artifact in brain tissue and 8 of 15 had no streak artifact in brain tissue, whereas none of the 15 had streak artifact in bone. There were no artifacts attributable to the goggles on the (18)F-FDG PET brain images of any of the 29 evaluable studies. The average CT exposure parameters at the level of the orbits were 36% lower on the 2013 scanner than on the 2006 scanner. Video goggles may be used successfully to distract children undergoing PET with localization CT. The goggles cause no significant degradation of the PET brain images or the CT skull images. The degree of artifact on brain tissue images varies from none to moderate and depends on the CT equipment used. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
A convolutional neural network-based screening tool for X-ray serial crystallography
Ke, Tsung-Wei; Brewster, Aaron S.; Yu, Stella X.; Ushizima, Daniela; Yang, Chao; Sauter, Nicholas K.
2018-01-01
A new tool is introduced for screening macromolecular X-ray crystallography diffraction images produced at an X-ray free-electron laser light source. Based on a data-driven deep learning approach, the proposed tool executes a convolutional neural network to detect Bragg spots. Automatic image processing algorithms described can enable the classification of large data sets, acquired under realistic conditions consisting of noisy data with experimental artifacts. Outcomes are compared for different data regimes, including samples from multiple instruments and differing amounts of training data for neural network optimization. PMID:29714177
A convolutional neural network-based screening tool for X-ray serial crystallography.
Ke, Tsung Wei; Brewster, Aaron S; Yu, Stella X; Ushizima, Daniela; Yang, Chao; Sauter, Nicholas K
2018-05-01
A new tool is introduced for screening macromolecular X-ray crystallography diffraction images produced at an X-ray free-electron laser light source. Based on a data-driven deep learning approach, the proposed tool executes a convolutional neural network to detect Bragg spots. Automatic image processing algorithms described can enable the classification of large data sets, acquired under realistic conditions consisting of noisy data with experimental artifacts. Outcomes are compared for different data regimes, including samples from multiple instruments and differing amounts of training data for neural network optimization. open access.
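As an indication of what the screening network looks like in code, here is a minimal PyTorch sketch of a small convolutional classifier that labels a downsampled diffraction image as a hit (Bragg spots present) or a miss. The architecture, input size, and training step are illustrative assumptions and are far smaller than the network evaluated in the paper.

```python
import torch
import torch.nn as nn

class BraggSpotScreener(nn.Module):
    """Tiny CNN for hit/miss screening of downsampled diffraction images."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 32 * 32, 2)   # assumes 128x128 inputs

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# One illustrative training step on random stand-in data.
model = BraggSpotScreener()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
images = torch.randn(8, 1, 128, 128)          # batch of downsampled frames
labels = torch.randint(0, 2, (8,))            # 1 = Bragg spots present
loss = nn.CrossEntropyLoss()(model(images), labels)
loss.backward()
opt.step()
print(float(loss))
```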
A novel weighted-direction color interpolation
NASA Astrophysics Data System (ADS)
Tao, Jin-you; Yang, Jianfeng; Xue, Bin; Liang, Xiaofen; Qi, Yong-hong; Wang, Feng
2013-08-01
A digital camera captures images through a color filter array (CFA) covering the sensor surface, so only one color sample is obtained at each pixel location. Demosaicking is the process of estimating the missing color components of each pixel to obtain a full-resolution image. In this paper, a new algorithm based on edge-adaptive interpolation with different weighting factors is proposed. Our method can effectively suppress undesirable artifacts. Experimental results on the Kodak images show that the proposed algorithm obtains higher-quality images than other methods in both numerical and visual terms.
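The heart of a weighted-direction demosaicking method is the green-channel interpolation at red/blue sites, where horizontal and vertical estimates are blended with weights that shrink as the local gradient grows. A hedged single-pixel sketch is given below; the weight formula and the assumed RGGB Bayer layout are illustrative choices, not the specific weighting factors proposed in the paper.

```python
import numpy as np

def interpolate_green(mosaic, r, c):
    """Edge-adaptive green estimate at a red or blue site (r, c) of a raw
    Bayer (RGGB) mosaic. Neighbours at (r, c±1) and (r±1, c) are green."""
    gh = 0.5 * (mosaic[r, c - 1] + mosaic[r, c + 1])      # horizontal estimate
    gv = 0.5 * (mosaic[r - 1, c] + mosaic[r + 1, c])      # vertical estimate
    # Directional activity: green difference plus same-colour curvature.
    dh = abs(mosaic[r, c - 1] - mosaic[r, c + 1]) + \
         abs(2 * mosaic[r, c] - mosaic[r, c - 2] - mosaic[r, c + 2])
    dv = abs(mosaic[r - 1, c] - mosaic[r + 1, c]) + \
         abs(2 * mosaic[r, c] - mosaic[r - 2, c] - mosaic[r + 2, c])
    wh, wv = 1.0 / (1.0 + dh), 1.0 / (1.0 + dv)           # favour the quiet direction
    return (wh * gh + wv * gv) / (wh + wv)

# A vertical edge: the horizontal direction is busy, so the vertical estimate dominates.
patch = np.tile(np.array([10.0, 10.0, 200.0, 200.0, 200.0]), (5, 1))
print(interpolate_green(patch, 2, 2))   # close to 200, not the naive 152.5 average
```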
Lapierre-Landry, Maryse; Tucker-Schwartz, Jason M.; Skala, Melissa C.
2016-01-01
Photothermal OCT (PT-OCT) is an emerging molecular imaging technique that occupies a spatial imaging regime between microscopy and whole body imaging. PT-OCT would benefit from a theoretical model to optimize imaging parameters and test image processing algorithms. We propose the first analytical PT-OCT model to replicate an experimental A-scan in homogeneous and layered samples. We also propose the PT-CLEAN algorithm to reduce phase-accumulation and shadowing, two artifacts found in PT-OCT images, and demonstrate it on phantoms and in vivo mouse tumors. PMID:27446693
Confirming the Low-Mass, Sub-kpc Dual AGN Candidate in SDSS J0914+085
NASA Astrophysics Data System (ADS)
Gultekin, Kayhan
2016-09-01
The frequency of dual AGNs at low galaxy/black hole mass is poorly constrained. Thus we lack a full physical understanding of the connection between galaxy mergers and AGN activity and therefore merger-driven feedback. In particular, it is unknown whether or not LLAGN can be triggered by mergers instead of only by stochastic processes. We will address this with a 50 ksec observation to test for a dual AGN in SDSS J0914+0853, a low-mass (M_BH ≈ 10^6.3 M_⊙), dual LLAGN candidate based on serendipitous, shallow Chandra imaging. The 15-ksec data showed two X-ray sources, but the nature of the secondary source is ambiguous because of 10% pile-up and potential PSF artifacts. With deeper, short-frame-rate Chandra observations at a new roll angle, we can unambiguously determine if the secondary is real.
Validation of Regression-Based Myogenic Correction Techniques for Scalp and Source-Localized EEG
McMenamin, Brenton W.; Shackman, Alexander J.; Maxwell, Jeffrey S.; Greischar, Lawrence L.; Davidson, Richard J.
2008-01-01
EEG and EEG source-estimation are susceptible to electromyographic artifacts (EMG) generated by the cranial muscles. EMG can mask genuine effects or masquerade as a legitimate effect, even in low frequencies such as alpha (8–13 Hz). Although regression-based correction has been used previously, only cursory attempts at validation exist and the utility for source-localized data is unknown. To address this, EEG was recorded from 17 participants while neurogenic and myogenic activity were factorially varied. We assessed the sensitivity and specificity of four regression-based techniques: between-subjects, between-subjects using difference-scores, within-subject condition-wise, and within-subject epoch-wise on the scalp and in data modeled using the LORETA algorithm. Although within-subject epoch-wise showed superior performance on the scalp, no technique succeeded in the source-space. Aside from validating the novel epoch-wise methods on the scalp, we highlight methods requiring further development. PMID:19298626
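The regression-based correction being validated amounts to estimating, across epochs, how strongly EMG power predicts the EEG band power and then removing that EMG-related variance. A minimal sketch of a within-subject, epoch-wise variant for a single channel follows; regressing log band power on log EMG power, and the toy data, are assumptions for illustration rather than the exact procedure evaluated in the paper.

```python
import numpy as np

def epochwise_emg_correction(band_power, emg_power):
    """For one channel and frequency band, regress log band power on log EMG
    power across epochs and keep the residuals (plus the channel mean) as the
    EMG-corrected values. Inputs are 1-D arrays with one value per epoch."""
    y, x = np.log(band_power), np.log(emg_power)
    beta = np.polyfit(x, y, 1)[0]                  # least-squares slope
    corrected_log = y - beta * (x - x.mean())      # remove EMG-related variance
    return np.exp(corrected_log)

# Toy example: alpha power contaminated by EMG that varies across 100 epochs.
rng = np.random.default_rng(1)
emg = np.exp(rng.normal(0, 0.5, 100))
alpha_true = np.exp(rng.normal(1.0, 0.3, 100))
alpha_obs = alpha_true * emg ** 0.6                # myogenic contamination
alpha_corr = epochwise_emg_correction(alpha_obs, emg)
print(np.corrcoef(np.log(alpha_corr), np.log(emg))[0, 1])   # near zero
```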
Hediger, Hedy; Stevens, Richard L.; Brandenberger, Hans; Schmid, Karl
1973-01-01
A new procedure for the qualitative and quantitative determination of asparagine, glutamine and pyrrolidonecarboxylic acid in total enzymic hydrolysates of peptides and glycopeptides based on g.l.c. has been developed. Under the conditions of esterification and trifluoroacetylation N-trifluoroacetylaspartic acid mono-n-butyl ester was formed from asparagine and N-trifluoroacetylglutamic acid mono-n-butyl ester from both glutamine and pyrrolidonecarboxylic acid. To distinguish between the latter two compounds, the esterification was carried out at room temperature yielding 30% of esterified pyrrolidonecarboxylic acid but less than 1% of esterified glutamine. In extending the g.l.c. of amino acids, the previously unknown positions in the g.l.c. elution pattern of the following amino acids could also be reproducibly determined: carboxymethylcysteine, homoserine, hydroxylysine and ε-methyl-lysine. Further, certain glycopeptides were investigated and the artifacts due to their carbohydrate moieties were determined. PMID:4733240
Diffusion imaging quality control via entropy of principal direction distribution.
Farzinfar, Mahshid; Oguz, Ipek; Smith, Rachel G; Verde, Audrey R; Dietrich, Cheryl; Gupta, Aditya; Escolar, Maria L; Piven, Joseph; Pujol, Sonia; Vachet, Clement; Gouttard, Sylvain; Gerig, Guido; Dager, Stephen; McKinstry, Robert C; Paterson, Sarah; Evans, Alan C; Styner, Martin A
2013-11-15
Diffusion MR imaging has received increasing attention in the neuroimaging community, as it yields new insights into the microstructural organization of white matter that are not available with conventional MRI techniques. While the technology has enormous potential, diffusion MRI suffers from a unique and complex set of image quality problems, limiting the sensitivity of studies and reducing the accuracy of findings. Furthermore, the acquisition time for diffusion MRI is longer than conventional MRI due to the need for multiple acquisitions to obtain directionally encoded Diffusion Weighted Images (DWI). This leads to increased motion artifacts, reduced signal-to-noise ratio (SNR), and increased proneness to a wide variety of artifacts, including eddy-current and motion artifacts, "venetian blind" artifacts, as well as slice-wise and gradient-wise inconsistencies. Such artifacts mandate stringent Quality Control (QC) schemes in the processing of diffusion MRI data. Most existing QC procedures are conducted in the DWI domain and/or on a voxel level, but our own experiments show that these methods often do not fully detect and eliminate certain types of artifacts, often only visible when investigating groups of DWIs or a derived diffusion model, such as the most-employed diffusion tensor imaging (DTI). Here, we propose a novel regional QC measure in the DTI domain that employs the entropy of the regional distribution of the principal directions (PD). The PD entropy quantifies the scattering and spread of the principal diffusion directions and is invariant to the patient's position in the scanner. A high entropy value indicates that the PDs are distributed relatively uniformly, while a low entropy value indicates the presence of clusters in the PD distribution. The novel QC measure is intended to complement the existing set of QC procedures by detecting and correcting residual artifacts. Such residual artifacts cause directional bias in the measured PD and are here called dominant direction artifacts. Experiments show that our automatic method can reliably detect and potentially correct such artifacts, especially the ones caused by the vibrations of the scanner table during the scan. The results further indicate the usefulness of this method for general quality assessment in DTI studies. Copyright © 2013 Elsevier Inc. All rights reserved.
Diffusion imaging quality control via entropy of principal direction distribution
Oguz, Ipek; Smith, Rachel G.; Verde, Audrey R.; Dietrich, Cheryl; Gupta, Aditya; Escolar, Maria L.; Piven, Joseph; Pujol, Sonia; Vachet, Clement; Gouttard, Sylvain; Gerig, Guido; Dager, Stephen; McKinstry, Robert C.; Paterson, Sarah; Evans, Alan C.; Styner, Martin A.
2013-01-01
Diffusion MR imaging has received increasing attention in the neuroimaging community, as it yields new insights into the microstructural organization of white matter that are not available with conventional MRI techniques. While the technology has enormous potential, diffusion MRI suffers from a unique and complex set of image quality problems, limiting the sensitivity of studies and reducing the accuracy of findings. Furthermore, the acquisition time for diffusion MRI is longer than conventional MRI due to the need for multiple acquisitions to obtain directionally encoded Diffusion Weighted Images (DWI). This leads to increased motion artifacts, reduced signal-to-noise ratio (SNR), and increased proneness to a wide variety of artifacts, including eddy-current and motion artifacts, “venetian blind” artifacts, as well as slice-wise and gradient-wise inconsistencies. Such artifacts mandate stringent Quality Control (QC) schemes in the processing of diffusion MRI data. Most existing QC procedures are conducted in the DWI domain and/or on a voxel level, but our own experiments show that these methods often do not fully detect and eliminate certain types of artifacts, often only visible when investigating groups of DWIs or a derived diffusion model, such as the most-employed diffusion tensor imaging (DTI). Here, we propose a novel regional QC measure in the DTI domain that employs the entropy of the regional distribution of the principal directions (PD). The PD entropy quantifies the scattering and spread of the principal diffusion directions and is invariant to the patient's position in the scanner. A high entropy value indicates that the PDs are distributed relatively uniformly, while a low entropy value indicates the presence of clusters in the PD distribution. The novel QC measure is intended to complement the existing set of QC procedures by detecting and correcting residual artifacts. Such residual artifacts cause directional bias in the measured PD and are here called dominant direction artifacts. Experiments show that our automatic method can reliably detect and potentially correct such artifacts, especially the ones caused by the vibrations of the scanner table during the scan. The results further indicate the usefulness of this method for general quality assessment in DTI studies. PMID:23684874
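The proposed QC measure is the entropy of the regional distribution of principal diffusion directions. A compact sketch of that computation, binning the PDs by spherical angle and taking the Shannon entropy of the normalized histogram, is given below; the bin counts and region handling are assumptions, and the published measure includes additional regional processing.

```python
import numpy as np

def pd_entropy(principal_dirs, n_theta=18, n_phi=36):
    """Shannon entropy (in bits) of the distribution of principal diffusion
    directions within a region. `principal_dirs` is an (N, 3) array of unit
    vectors; antipodal directions are identified by forcing z >= 0."""
    v = np.asarray(principal_dirs, dtype=float)
    v = np.where(v[:, 2:3] < 0, -v, v)                   # fold to the upper hemisphere
    theta = np.arccos(np.clip(v[:, 2], -1.0, 1.0))       # polar angle
    phi = np.mod(np.arctan2(v[:, 1], v[:, 0]), 2 * np.pi)
    hist, _, _ = np.histogram2d(theta, phi, bins=[n_theta, n_phi],
                                range=[[0, np.pi / 2], [0, 2 * np.pi]])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Uniformly scattered PDs give high entropy; a dominant direction gives low entropy.
rng = np.random.default_rng(0)
uniform = rng.standard_normal((5000, 3))
uniform /= np.linalg.norm(uniform, axis=1, keepdims=True)
aligned = np.tile([0.0, 0.0, 1.0], (5000, 1)) + 0.05 * rng.standard_normal((5000, 3))
aligned /= np.linalg.norm(aligned, axis=1, keepdims=True)
print(pd_entropy(uniform), ">", pd_entropy(aligned))
```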
Acquiring an understanding of design: evidence from children's insight problem solving.
Defeyter, Margaret Anne; German, Tim P
2003-09-01
The human ability to make tools and use them to solve problems may not be zoologically unique, but it is certainly extraordinary. Yet little is known about the conceptual machinery that makes humans so competent at making and using tools. Do adults and children have concepts specialized for understanding human-made artifacts? If so, are these concepts deployed in attempts to solve novel problems? Here we present new data, derived from problem-solving experiments, which support the following. (i) The structure of the child's concept of artifact function changes profoundly between ages 5 and 7. At age 5, the child's conceptual machinery defines the function of an artifact as any goal a user might have; by age 7, its function is defined by the artifact's typical or intended use. (ii) This conceptual shift has a striking effect on problem-solving performance, i.e. the child's concept of artifact function appears to be deployed in problem solving. (iii) This effect on problem solving is not caused by differences in the amount of knowledge that children have about the typical use of a particular tool; it is mediated by the structure of the child's artifact concept (which organizes and deploys the child's knowledge). In two studies, children between 5 and 7 years of age were matched for their knowledge of what a particular artifact "is for", and then given a problem that can only be solved if that tool is used for an atypical purpose. All children performed well in a baseline condition. But when they were primed by a demonstration of the artifact's typical function, 5-year-old children solved the problem much faster than 6-7-year-old children. Because all children knew what the tools were for, differences in knowledge alone cannot explain the results. We argue that the older children were slower to solve the problem when the typical function was primed because (i) their artifact concept plays a role in problem solving, and (ii) intended purpose is central to their concept of artifact function, but not to that of the younger children.
Clustering-Constrained ICA for Ballistocardiogram Artifacts Removal in Simultaneous EEG-fMRI
Wang, Kai; Li, Wenjie; Dong, Li; Zou, Ling; Wang, Changming
2018-01-01
The combination of electroencephalogram (EEG) recording and functional magnetic resonance imaging (fMRI) plays a potential role in neuroimaging due to its high spatial and temporal resolution. However, EEG is easily influenced by ballistocardiogram (BCG) artifacts, which may cause false identification of related EEG features, such as epileptic spikes. Many methods exist to remove them; however, they do not consider the time-varying features of BCG artifacts. In this paper, a novel method that uses a clustering algorithm to capture the BCG artifacts' features, combined with constrained ICA (ccICA), is proposed to remove the BCG artifacts. We first applied this method to simulated data, constructed by adding BCG artifacts to EEG signals obtained under conventional conditions. Then, our method was tested during simultaneous EEG and fMRI experiments on 10 healthy subjects to demonstrate its effectiveness. In the simulated data analysis, the error in signal amplitude (Er) computed by the ccICA method was lower than those from other methods, including AAS, OBS, and cICA (p < 0.005). In the in vivo data analysis, the Improvement of Normalized Power Spectrum (INPS) calculated by the ccICA method in all electrodes was much higher than for the AAS, OBS, and cICA methods (p < 0.005). We also used other evaluation indices (e.g., power analysis) to compare our method with traditional methods. In conclusion, our novel method successfully and effectively removed BCG artifacts in both simulated and in vivo EEG data, showing its potential for removing artifacts in EEG-fMRI applications. PMID:29487499
ARTIST: A fully automated artifact rejection algorithm for single-pulse TMS-EEG data.
Wu, Wei; Keller, Corey J; Rogasch, Nigel C; Longwell, Parker; Shpigel, Emmanuel; Rolle, Camarin E; Etkin, Amit
2018-04-01
Concurrent single-pulse TMS-EEG (spTMS-EEG) is an emerging noninvasive tool for probing causal brain dynamics in humans. However, in addition to the common artifacts in standard EEG data, spTMS-EEG data suffer from enormous stimulation-induced artifacts, posing significant challenges to the extraction of neural information. Typically, neural signals are analyzed after a manual time-intensive and often subjective process of artifact rejection. Here we describe a fully automated algorithm for spTMS-EEG artifact rejection. A key step of this algorithm is to decompose the spTMS-EEG data into statistically independent components (ICs), and then train a pattern classifier to automatically identify artifact components based on knowledge of the spatio-temporal profile of both neural and artefactual activities. The autocleaned and hand-cleaned data yield qualitatively similar group evoked potential waveforms. The algorithm achieves a 95% IC classification accuracy referenced to expert artifact rejection performance, and does so across a large number of spTMS-EEG data sets (n = 90 stimulation sites), retains high accuracy across stimulation sites/subjects/populations/montages, and outperforms current automated algorithms. Moreover, the algorithm was superior to the artifact rejection performance of relatively novice individuals, who would be the likely users of spTMS-EEG as the technique becomes more broadly disseminated. In summary, our algorithm provides an automated, fast, objective, and accurate method for cleaning spTMS-EEG data, which can increase the utility of TMS-EEG in both clinical and basic neuroscience settings. © 2018 Wiley Periodicals, Inc.
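The key step, classifying independent components as neural versus artefactual from their spatio-temporal properties, can be illustrated with a generic scikit-learn pipeline. The three features used here (time-course kurtosis, fraction of energy in the first 30 ms after the pulse, and high-frequency power ratio) and the logistic-regression classifier are assumptions standing in for the feature set and classifier actually trained for ARTIST.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.linear_model import LogisticRegression

def ic_features(ic_timecourse, fs=1000, pulse_idx=500):
    """Simple spatio-temporal-style features for one IC time course."""
    x = np.asarray(ic_timecourse, dtype=float)
    early = x[pulse_idx:pulse_idx + int(0.03 * fs)]          # first 30 ms post-pulse
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    return [
        kurtosis(x),                                         # spikiness of the time course
        np.sum(early ** 2) / np.sum(x ** 2),                 # early post-pulse energy fraction
        np.sum(spec[freqs > 100]) / np.sum(spec),            # high-frequency power ratio
    ]

# Train on ICs previously hand-labelled by experts (random stand-in data here),
# then label new components automatically.
rng = np.random.default_rng(2)
train_ics = rng.standard_normal((40, 1500))
train_labels = rng.integers(0, 2, 40)                        # 1 = artifact component
clf = LogisticRegression().fit([ic_features(ic) for ic in train_ics], train_labels)
print(clf.predict([ic_features(rng.standard_normal(1500))]))
```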
The interobserver-validated relevance of intervertebral spacer materials in MRI artifacting
Heidrich, G.; Bruening, T.; Krefft, S.; Buchhorn, G.; Klinger, H.M.
2006-01-01
Intervertebral spacers for anterior spine fusion are made of different materials, such as titanium, carbon or cobalt-chrome, which can affect the post-fusion MRI scans. Implant-related susceptibility artifacts can decrease the quality of MRI scans, thwarting proper evaluation. This cadaver study aimed to demonstrate the extent that implant-related MRI artifacting affects the post-fusion evaluation of intervertebral spacers. In a cadaveric porcine spine, we evaluated the post-implantation MRI scans of three intervertebral spacers that differed in shape, material, surface qualities and implantation technique. A spacer made of human cortical bone was used as a control. The median sagittal MRI slice was divided into 12 regions of interest (ROI). No significant differences were found on 15 different MRI sequences read independently by an interobserver-validated team of specialists (P>0.05). Artifact-affected image quality was rated on a score of 0-1-2. A maximum score of 24 points (100%) was possible. Turbo spin echo sequences produced the best scores for all spacers and the control. Only the control achieved a score of 100%. The carbon, titanium and cobalt-chrome spacers scored 83.3, 62.5 and 50%, respectively. Our scoring system allowed us to create an implant-related ranking of MRI scan quality in reference to the control that was independent of artifact dimensions. The carbon spacer had the lowest percentage of susceptibility artifacts. Even with turbo spin echo sequences, the susceptibility artifacts produced by the metallic spacers showed a high degree of variability. Despite optimum sequencing, implant design and material are relevant factors in MRI artifacting. PMID:16463200
Nuclear artifacts in gastric endoscopic submucosal dissection specimens: A clinicopathological study
MATSUKUMA, SUSUMU; TAKEO, HIROAKI; SATO, KIMIYA
2014-01-01
To delineate the characteristics of nuclear artifacts associated with endoscopic submucosal dissection (ESD), we examined 97 gastric ESD specimens from 79 patients. In 69 of the specimens (71%), multinucleated figures and/or atypical mitotic-like figures, including tripolar-like and bizarre spindles, were found in the peripheral portions close to the marking areas. These nuclear figures were mostly recognizable as artifacts, but were infrequently (13 specimens) accompanied by other nuclear alterations and/or architectural abnormalities, mimicking dysplasia. However, in the deep cut sections, the dysplastic characteristics tended to disappear and coagulative or degenerative findings became more prominent. These nuclear artifacts were not found in 69 age- and gender-matched control gastrectomy specimens without ESD. Multinucleated artifacts were associated with the size of the ESD specimens (P=0.003), frequency of marking (P<0.001) and a history of ‘previous’ marking 1–6 days prior to ESD (P<0.001); however, they were not associated with age, ESD procedure time, or ‘fresh’ marking on the day of the ESD. Atypical mitosis-like characteristics were associated with a history of ‘fresh’ (P=0.007) as well as ‘previous’ (P=0.002) marking, but not with other variables. Dysplasia-like artifacts were associated with older age only (P=0.031). Follow-up data of all the patients with nuclear artifacts showed no aggressive behavior. Therefore, we concluded that these nuclear changes were ESD-related artifacts. Particularly in older patients, these changes may simulate dysplasia and must be distinguished from true dysplasia or neoplasia. PMID:25054062