Intelligent Agents for Design and Synthesis Environments: My Summary
NASA Technical Reports Server (NTRS)
Norvig, Peter
1999-01-01
This presentation gives a summary of intelligent agents for design and synthesis environments. We'll start with the conclusions and work backwards to justify them. First, an important assumption is that agents (whatever they are) are good for software engineering. This is especially true for software that operates in an uncertain, changing environment. The "real world" of physical artifacts is like that: uncertain in what we can measure, changing in that things are always breaking down, and we must interact with non-software entities. The second point is that software engineering techniques can contribute to good design. There may have been a time when we wanted to build simple artifacts containing little or no software. But modern aircraft and spacecraft are complex, and rely on a great deal of software. So better software engineering leads to better-designed artifacts, especially when we are designing a series of related artifacts and can amortize the costs of software development. The third point is that agents are especially useful for design tasks, above and beyond their general usefulness for software engineering, and the usefulness of software engineering to design.
Security Verification Techniques Applied to PatchLink COTS Software
NASA Technical Reports Server (NTRS)
Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer
2006-01-01
Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.
Assessing students' performance in software requirements engineering education using scoring rubrics
NASA Astrophysics Data System (ADS)
Mkpojiogu, Emmanuel O. C.; Hussain, Azham
2017-10-01
The study investigates how helpful the use of scoring rubrics is in the performance assessment of software requirements engineering students, and whether its use can lead to improvement in students' performance in the development of software requirements artifacts and models. Scoring rubrics were used by two instructors to assess the cognitive performance of a student in the design and development of software requirements artifacts. The study results indicate that the use of scoring rubrics is very helpful in objectively assessing the performance of software requirements or software engineering students. Furthermore, the results revealed that the use of scoring rubrics also gives a clear direction to achievement assessment, showing whether a student is improving or not across repeated or iterative assessments. In a nutshell, its use leads to the performance improvement of students. The results provide some insights for further investigation and will be beneficial to researchers, requirements engineers, system designers, developers and project managers.
A Software Architecture for Intelligent Synthesis Environments
NASA Technical Reports Server (NTRS)
Filman, Robert E.; Norvig, Peter (Technical Monitor)
2001-01-01
NASA's Intelligent Synthesis Environment (ISE) program is a grand attempt to develop a system that transforms the way complex artifacts are engineered. This paper discusses a "middleware" architecture for enabling the development of ISE. Desirable elements of such an Intelligent Synthesis Architecture (ISA) include remote invocation; plug-and-play applications; scripting of applications; management of design artifacts, tools, and artifact and tool attributes; common system services; system management; and systematic enforcement of policies. This paper argues that the ISA should extend conventional distributed object technology (DOT), such as CORBA and Product Data Managers, with flexible repositories of product and tool annotations and "plug-and-play" mechanisms for inserting "ility" or orthogonal concerns into the system. I describe the Object Infrastructure Framework, an Aspect Oriented Programming (AOP) environment for developing distributed systems that provides "ility" insertion and enables consistent annotation maintenance. This technology can be used to enforce policies such as maintaining the annotations of artifacts, particularly the provenance and access control rules of artifacts; performing automatic datatype transformations between representations; supplying alternative servers of the same service; reporting on the status of jobs and the system; conveying privileges throughout an application; supporting long-lived transactions; maintaining version consistency; and providing software redundancy and mobility.
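As a rough illustration of the proxy-and-injector idea behind such "ility" insertion, the sketch below wraps a service object so that orthogonal concerns (access control, provenance annotation) run around every call. It is a minimal Python analogue, not the Object Infrastructure Framework itself; all class, method, and artifact names are invented for the example.

```python
# Minimal sketch (not the Object Infrastructure Framework) of threading
# orthogonal "ility" concerns through service calls via a proxy and a
# chain of injectors. All names here are illustrative.

class Injector:
    """One orthogonal concern, run around every service invocation."""
    def before(self, method, args, kwargs, context): pass
    def after(self, method, result, context): return result

class AccessControlInjector(Injector):
    def __init__(self, allowed_roles):
        self.allowed_roles = allowed_roles
    def before(self, method, args, kwargs, context):
        if context.get("role") not in self.allowed_roles:
            raise PermissionError(f"role {context.get('role')!r} may not call {method}")

class ProvenanceInjector(Injector):
    def after(self, method, result, context):
        # Record which service calls produced the returned artifact.
        context.setdefault("provenance", []).append(method)
        return result

class ServiceProxy:
    """Wraps any service object and applies the injectors to each call."""
    def __init__(self, target, injectors, context):
        self._target, self._injectors, self._context = target, injectors, context
    def __getattr__(self, name):
        method = getattr(self._target, name)
        def wrapped(*args, **kwargs):
            for inj in self._injectors:
                inj.before(name, args, kwargs, self._context)
            result = method(*args, **kwargs)
            for inj in self._injectors:
                result = inj.after(name, result, self._context)
            return result
        return wrapped

# Usage: wrap a hypothetical design-repository service.
class DesignRepository:
    def fetch_artifact(self, artifact_id):
        return {"id": artifact_id, "data": "..."}

repo = ServiceProxy(DesignRepository(),
                    [AccessControlInjector({"engineer"}), ProvenanceInjector()],
                    context={"role": "engineer"})
artifact = repo.fetch_artifact("wing-spar-007")
```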
A workflow learning model to improve geovisual analytics utility
Roth, Robert E; MacEachren, Alan M; McCabe, Craig A
2011-01-01
Introduction This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. Objectives The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. Methodology The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. Results/Conclusions In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009. PMID:21983545
Yuki, I; Kambayashi, Y; Ikemura, A; Abe, Y; Kan, I; Mohamed, A; Dahmani, C; Suzuki, T; Ishibashi, T; Takao, H; Urashima, M; Murayama, Y
2016-02-01
Combination of high-resolution C-arm CT and novel metal artifact reduction software may contribute to the assessment of aneurysms treated with stent-assisted coil embolization. This study aimed to evaluate the efficacy of a novel Metal Artifact Reduction prototype software combined with the currently available high spatial-resolution C-arm CT prototype implementation by using an experimental aneurysm model treated with stent-assisted coil embolization. Eight experimental aneurysms were created in 6 swine. Coil embolization of each aneurysm was performed by using a stent-assisted technique. High-resolution C-arm CT with intra-arterial contrast injection was performed immediately after the treatment. The obtained images were processed with Metal Artifact Reduction. Five neurointerventional specialists reviewed the image quality before and after Metal Artifact Reduction. Observational and quantitative analyses (via image analysis software) were performed. Every aneurysm was successfully created and treated with stent-assisted coil embolization. Before Metal Artifact Reduction, coil loops protruding through the stent lumen were not visualized due to the prominent metal artifacts produced by the coils. These became visible after Metal Artifact Reduction processing. Contrast filling in the residual aneurysm was also visualized after Metal Artifact Reduction in every aneurysm. Both the observational (P < .0001) and quantitative (P < .001) analyses showed significant reduction of the metal artifacts after application of the Metal Artifact Reduction prototype software. The combination of high-resolution C-arm CT and Metal Artifact Reduction enables differentiation of the coil mass, stent, and contrast material on the same image by significantly reducing the metal artifacts produced by the platinum coils. This novel image technique may improve the assessment of aneurysms treated with stent-assisted coil embolization. © 2016 by American Journal of Neuroradiology.
ERIC Educational Resources Information Center
Zheng, Yongjie
2012-01-01
Software architecture plays an increasingly important role in complex software development. Its further application, however, is challenged by the fact that software architecture, over time, is often found not conformant to its implementation. This is usually caused by frequent development changes made to both artifacts. Against this background,…
Workflow-Based Software Development Environment
NASA Technical Reports Server (NTRS)
Izygon, Michel E.
2013-01-01
The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (a collaborative web environment).
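The core orchestration behavior described above, releasing a task only when its prerequisites are complete and notifying the assignee, can be sketched in a few lines. This is an illustrative Python toy, not SDA or TieFlow; the task names and the notification mechanism are invented.

```python
# Illustrative sketch of dependency-driven task orchestration of the kind a
# workflow engine performs; it is not SDA/TieFlow code.

from collections import defaultdict

class Workflow:
    def __init__(self):
        self.tasks = {}                 # task name -> assignee
        self.deps = defaultdict(set)    # task name -> prerequisite task names
        self.done = set()

    def add_task(self, name, assignee, depends_on=()):
        self.tasks[name] = assignee
        self.deps[name] = set(depends_on)

    def ready_tasks(self):
        """Tasks whose prerequisites are all complete and that are not yet done."""
        return [t for t in self.tasks
                if t not in self.done and self.deps[t] <= self.done]

    def complete(self, name):
        self.done.add(name)
        for t in self.ready_tasks():
            print(f"notify {self.tasks[t]}: task '{t}' is ready to start")

wf = Workflow()
wf.add_task("write requirements", "alice")
wf.add_task("design", "bob", depends_on=["write requirements"])
wf.add_task("code review", "carol", depends_on=["design"])
wf.complete("write requirements")   # -> notify bob: task 'design' is ready to start
```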
CHIME: A Metadata-Based Distributed Software Development Environment
2005-01-01
structures by using typography, graphics, and animation. The Software Immersion in our conceptual model for CHIME can be seen as a form of Software... Even small- to medium-sized development efforts may involve hundreds of artifacts -- design documents, change requests, test cases and results, code... for managing and organizing information from all phases of the software lifecycle. CHIME is designed around an XML-based metadata architecture, in
On Open and Collaborative Software Development in the DoD
2010-04-30
of this community and the larger F/OSS communities to make changes (and commit those changes) to the artifact base. This churning effect... Succinctly, it is this churning and frequent updates (i.e., "release early, release often") to the artifacts that spark innovation through... the entire project. Artifacts are frequently updated and churned over by the F/OSS community, resulting in better quality and innovation. It is up
Tool Use Within NASA Software Quality Assurance
NASA Technical Reports Server (NTRS)
Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel
2013-01-01
As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.
Component Models for Semantic Web Languages
NASA Astrophysics Data System (ADS)
Henriksson, Jakob; Aßmann, Uwe
Intelligent applications and agents on the Semantic Web typically need to be specified with, or interact with specifications written in, many different kinds of formal languages. Such languages include ontology languages, data and metadata query languages, as well as transformation languages. As learnt from years of experience in development of complex software systems, languages need to support some form of component-based development. Components enable higher software quality, better understanding and reusability of already developed artifacts. Any component approach contains an underlying component model, a description detailing what valid components are and how components can interact. With the multitude of languages developed for the Semantic Web, what are their underlying component models? Do we need to develop one for each language, or is a more general and reusable approach achievable? We present a language-driven component model specification approach. This means that a component model can be (automatically) generated from a given base language (actually, its specification, e.g. its grammar). As a consequence, we can provide components for different languages and simplify the development of software artifacts used on the Semantic Web.
A study on a pedicle-screw-based reduction method for artificially reduced artifacts
NASA Astrophysics Data System (ADS)
Kim, Hyun-Ju; Lee, Hae-Kag; Cho, Jae-Hwan
2017-09-01
The purpose of this study is a quantitative analysis of the degree of reduction of the artifacts induced by pedicle screws through application of the recently developed iterative metal artifact reduction (iMAR) software. Screw-type implants composed of 4.5 g/cm3 titanium (Ti), with an approximate average computed tomography (CT) value of 6500 Hounsfield units (HU), of the type used for the treatment of spinal diseases, were placed in paraffin, a tissue-equivalent material, and then dried. After the insertion, the scanning conditions were fixed at 120 kVp and 250 mA using multidetector computed tomography (MDCT) (Enlarge, Siemens, Germany). The slice thickness and increment were set at 3 mm and the field of view (FOV) at 120 mm; the pitch was 0.8 and the rotation time 1 s. The iMAR software was applied to the raw data of the acquired images to compare the CT-value changes of the resulting images. When the iMAR software was applied to animal vertebrae, the image loss from the black-hole effect could be reduced by 65.7%. For the high-intensity image loss associated with the white-streak effect, the reduction achieved by applying the iMAR software was 91.34%. In conclusion, a metal artifact due to a high-density material can be reduced more effectively when the iMAR algorithm is applied than with the conventional MAR algorithm. iMAR can provide information on the various tissues that form around the artifact and the reduced metal structures, which can be helpful to radiologists and clinicians in determining an accurate diagnosis.
Software Reuse Within the Earth Science Community
NASA Technical Reports Server (NTRS)
Marshall, James J.; Olding, Steve; Wolfe, Robert E.; Delnore, Victor E.
2006-01-01
Scientific missions in the Earth sciences frequently require cost-effective, highly reliable, and easy-to-use software, which can be a challenge for software developers to provide. The NASA Earth Science Enterprise (ESE) spends a significant amount of resources developing software components and other software development artifacts that may also be of value if reused in other projects requiring similar functionality. In general, software reuse is often defined as utilizing existing software artifacts. Software reuse can improve productivity and quality while decreasing the cost of software development, as documented by case studies in the literature. Since large software systems are often the results of the integration of many smaller and sometimes reusable components, ensuring reusability of such software components becomes a necessity. Indeed, designing software components with reusability as a requirement can increase the software reuse potential within a community such as the NASA ESE community. The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group is chartered to oversee the development of a process that will maximize the reuse potential of existing software components while recommending strategies for maximizing the reusability potential of yet-to-be-designed components. As part of this work, two surveys of the Earth science community were conducted. The first was performed in 2004 and distributed among government employees and contractors. A follow-up survey was performed in 2005 and distributed among a wider community, to include members of industry and academia. The surveys were designed to collect information on subjects such as the current software reuse practices of Earth science software developers, why they choose to reuse software, and what perceived barriers prevent them from reusing software. In this paper, we compare the results of these surveys, summarize the observed trends, and discuss the findings. The results are very similar, with the second, larger survey confirming the basic results of the first, smaller survey. The results suggest that reuse of ESE software can drive down the cost and time of system development, increase flexibility and responsiveness of these systems to new technologies and requirements, and increase effective and accountable community participation.
A software development and evolution model based on decision-making
NASA Technical Reports Server (NTRS)
Wild, J. Christian; Dong, Jinghuan; Maly, Kurt
1991-01-01
Design is a complex activity whose purpose is to construct an artifact that satisfies a set of constraints and requirements. However, the design process is not well understood. The software design and evolution process is the focus of interest, and a three-dimensional software development space organized around a decision-making paradigm is presented. An initial instantiation of this model, called 3DPM_p, which was partly implemented, is presented. The use of this model in software reuse and process management is discussed.
IMART software for correction of motion artifacts in images collected in intravital microscopy
Dunn, Kenneth W; Lorenz, Kevin S; Salama, Paul; Delp, Edward J
2014-01-01
Intravital microscopy is a uniquely powerful tool, providing the ability to characterize cell and organ physiology in the natural context of the intact, living animal. With the recent development of high-resolution microscopy techniques such as confocal and multiphoton microscopy, intravital microscopy can now characterize structures at subcellular resolution and capture events at sub-second temporal resolution. However, realizing the potential for high resolution requires remarkable stability in the tissue. Whereas the rigid structure of the skull facilitates high-resolution imaging of the brain, organs of the viscera are free to move with respiration and heartbeat, requiring additional apparatus for immobilization. In our experience, these methods are variably effective, so that many studies are compromised by residual motion artifacts. Here we demonstrate the use of IMART, a software tool for removing motion artifacts from intravital microscopy images collected in time series or in three dimensions. PMID:26090271
Software electron counting for low-dose scanning transmission electron microscopy.
Mittelberger, Andreas; Kramberger, Christian; Meyer, Jannik C
2018-05-01
The performance of the detector is of key importance for low-dose imaging in transmission electron microscopy, and counting every single electron can be considered the ultimate goal. In scanning transmission electron microscopy, low-dose imaging can be realized by very fast scanning; however, this also introduces artifacts and a loss of resolution in the scan direction. We have developed a software approach to correct for artifacts introduced by fast scans, making use of a scintillator and photomultiplier response that extends over several pixels. The parameters for this correction can be directly extracted from the raw image. Finally, the images can be converted into electron counts. This approach enables low-dose imaging in the scanning transmission electron microscope via high scan speeds while retaining the image quality of artifact-free slower scans. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
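To make the idea concrete, here is a purely conceptual sketch, not the authors' published algorithm, of correcting a fast-scan STEM frame by deconvolving an assumed detector response along the scan direction and quantizing the result to electron counts. The kernel shape, decay value, and single-electron signal level are assumptions made up for the example.

```python
# Conceptual sketch only -- not the published algorithm. The scintillator/PMT
# response smears each electron event over several pixels along the fast-scan
# direction, so an assumed response kernel is deconvolved row by row and the
# result is quantized to counts.

import numpy as np

def correct_and_count(raw, decay=0.6, single_electron_signal=1.0):
    """raw: 2D STEM frame scanned row by row (fast axis = columns).
    decay: assumed per-pixel decay of the detector response (illustrative).
    single_electron_signal: assumed signal produced by one electron."""
    # Assumed response kernel: exponential tail over a few pixels.
    kernel = decay ** np.arange(5)
    kernel /= kernel.sum()

    corrected = np.empty_like(raw, dtype=float)
    for i, row in enumerate(raw):
        # Regularized (Wiener-like) deconvolution in Fourier space.
        K = np.fft.rfft(kernel, n=row.size)
        R = np.fft.rfft(row)
        corrected[i] = np.fft.irfft(R * np.conj(K) / (np.abs(K) ** 2 + 1e-3),
                                    n=row.size)

    # Quantize the corrected signal to integer electron counts.
    counts = np.clip(np.rint(corrected / single_electron_signal), 0, None)
    return counts.astype(int)

frame = np.random.poisson(0.05, size=(64, 64)).astype(float)  # synthetic sparse frame
counts = correct_and_count(frame)
```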
Suppression of stimulus artifact contaminating electrically evoked electromyography.
Liu, Jie; Li, Sheng; Li, Xiaoyan; Klein, Cliff; Rymer, William Z; Zhou, Ping
2014-01-01
Electrical stimulation of muscle or nerve is a very useful technique for understanding muscle activity and its pathological changes, for both diagnostic and therapeutic purposes. During electrical stimulation of a muscle, the recorded M wave is often contaminated by a stimulus artifact. The stimulus artifact must be removed for appropriate analysis and interpretation of M waves. The objective of this study was to develop a novel software-based method to remove stimulus artifacts contaminating or superimposed on electrically evoked surface electromyography (EMG) or M wave signals. The multiple-stage method uses a series of signal processing techniques, including highlighting and detection of stimulus artifacts using Savitzky-Golay filtering, estimation of the artifact-contaminated region with Otsu thresholding, and reconstruction of that region using signal interpolation and smoothing. The developed method was tested using M wave signals recorded from biceps brachii muscles by a linear surface electrode array. To evaluate the performance, a series of semi-synthetic signals were constructed from clean M wave and stimulus artifact recordings with different degrees of overlap between them. The effectiveness of the developed method was quantified by a significant increase in correlation coefficient and a significant decrease in root mean square error between the clean M wave and the reconstructed M wave, compared with those between the clean M wave and the originally contaminated signal. The validity of the developed method was also demonstrated when tested on each channel's M wave recording from the linear electrode array. The developed method can suppress stimulus artifacts contaminating M wave recordings.
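The three processing stages named above (Savitzky-Golay filtering, Otsu thresholding, interpolation and smoothing) can be sketched roughly as follows. This is an illustrative reading of the pipeline, not the authors' code; the window length, polynomial order, and mask dilation are invented parameters.

```python
# Hedged sketch of the multi-stage idea described above; parameter values are
# illustrative, not those of the published method.

import numpy as np
from scipy.signal import savgol_filter
from skimage.filters import threshold_otsu

def suppress_stimulus_artifact(emg, win=21, order=3):
    # 1. Highlight fast artifact transients with a Savitzky-Golay derivative.
    deriv = np.abs(savgol_filter(emg, window_length=win, polyorder=order, deriv=1))
    # 2. Estimate the artifact-contaminated region with Otsu's threshold.
    mask = deriv > threshold_otsu(deriv)
    # Dilate the mask slightly so artifact edges are included (assumption).
    mask = np.convolve(mask.astype(float), np.ones(5), mode="same") > 0
    # 3. Reconstruct the contaminated samples by interpolation, then smooth.
    t = np.arange(emg.size)
    clean = emg.copy()
    clean[mask] = np.interp(t[mask], t[~mask], emg[~mask])
    return savgol_filter(clean, window_length=win, polyorder=order)

# usage: cleaned = suppress_stimulus_artifact(raw_emg)   # raw_emg: 1-D NumPy array
```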
Towards automated traceability maintenance
Mäder, Patrick; Gotel, Orlena
2012-01-01
Traceability relations support stakeholders in understanding the dependencies between artifacts created during the development of a software system and thus enable many development-related tasks. To ensure that the anticipated benefits of these tasks can be realized, it is necessary to have an up-to-date set of traceability relations between the established artifacts. This goal requires the creation of traceability relations during the initial development process. Furthermore, the goal also requires the maintenance of traceability relations over time as the software system evolves in order to prevent their decay. In this paper, an approach is discussed that supports the (semi-) automated update of traceability relations between requirements, analysis and design models of software systems expressed in the UML. This is made possible by analyzing change events that have been captured while working within a third-party UML modeling tool. Within the captured flow of events, development activities comprised of several events are recognized. These are matched with predefined rules that direct the update of impacted traceability relations. The overall approach is supported by a prototype tool and empirical results on the effectiveness of tool-supported traceability maintenance are provided. PMID:23471308
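A toy illustration of rule-directed traceability updating, not the prototype tool described above, might look like the following: captured change events are matched against predefined rules that rewrite or flag the affected trace links. Event kinds, rules, and element names are all invented for the example.

```python
# Small illustrative sketch of rule-directed traceability maintenance: captured
# model-change events are matched against predefined rules that say how the
# impacted trace links should be updated.

from dataclasses import dataclass

@dataclass
class ChangeEvent:
    kind: str        # e.g. "rename", "delete"
    element: str     # model element the event applies to

# Trace links: (requirement, model element)
trace_links = {("REQ-12", "ClassOrder"), ("REQ-12", "ClassInvoice")}

def on_rename(links, event, new_name):
    # Rewrite links that reference the renamed element.
    return {(req, new_name if el == event.element else el) for req, el in links}

def on_delete(links, event, _):
    # Mark links to a deleted element as suspect rather than dropping them silently.
    return {(req, el + " [SUSPECT]") if el == event.element else (req, el)
            for req, el in links}

RULES = {"rename": on_rename, "delete": on_delete}

def apply_event(links, event, payload=None):
    rule = RULES.get(event.kind)
    return rule(links, event, payload) if rule else links

trace_links = apply_event(trace_links, ChangeEvent("rename", "ClassOrder"), "ClassPurchaseOrder")
trace_links = apply_event(trace_links, ChangeEvent("delete", "ClassInvoice"))
```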
GSC configuration management plan
NASA Technical Reports Server (NTRS)
Withers, B. Edward
1990-01-01
The tools and methods used for the configuration management of the artifacts (including software and documentation) associated with the Guidance and Control Software (GCS) project are described. The GCS project is part of a software error studies research program. Three implementations of GCS are being produced in order to study the fundamental characteristics of the software failure process. The Code Management System (CMS) is used to track and retrieve versions of the documentation and software. Application of the CMS for this project is described and the numbering scheme is delineated for the versions of the project artifacts.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.
1991-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.
Shinohara, Yuki; Sakamoto, Makoto; Iwata, Naoki; Kishimoto, Junichi; Kuya, Keita; Fujii, Shinya; Kaminou, Toshio; Watanabe, Takashi; Ogawa, Toshihide
2014-10-01
A newly developed fast-kV-switching dual-energy CT scanner with a gemstone detector generates virtual high-keV images as monochromatic imaging (MI). Each MI can be reconstructed with metal artifact reduction software (MARS) to reduce metal artifact. The aim was to evaluate the degree of metal artifact reduction and vessel visualization around platinum coils using dual-energy CT with MARS. Dual-energy CT was performed using a Discovery CT750 HD scanner (GE Healthcare, Milwaukee, WI, USA). In a phantom study, we measured the mean standard deviation within regions of interest around a 10-mm-diameter platinum coil mass on MI with and without MARS. Thirteen patients who underwent CTA after endovascular embolization for cerebral aneurysm with platinum coils were included in a clinical study. We visually assessed the arteries around the platinum coil mass on MI with and without MARS. Each standard deviation near the coil mass on MI with MARS was significantly lower than that without MARS in the phantom study. On CTA in the clinical study, better visibility of neighboring arteries was obtained in 11 of 13 patients on MI with MARS compared to without MARS, owing to metal artifact reduction. Dual-energy CT with MARS reduces the metal artifact of platinum coils, resulting in favorable vessel visualization around the coil mass on CTA after embolization. © The Foundation Acta Radiologica 2013.
Analyzing and Predicting Effort Associated with Finding and Fixing Software Faults
NASA Technical Reports Server (NTRS)
Hamill, Maggie; Goseva-Popstojanova, Katerina
2016-01-01
Context: Software developers spend a significant amount of time fixing faults. However, not many papers have addressed the actual effort needed to fix software faults. Objective: The objective of this paper is twofold: (1) analysis of the effort needed to fix software faults and how it was affected by several factors and (2) prediction of the level of fix implementation effort based on the information provided in software change requests. Method: The work is based on data related to 1200 failures, extracted from the change tracking system of a large NASA mission. The analysis includes descriptive and inferential statistics. Predictions are made using three supervised machine learning algorithms and three sampling techniques aimed at addressing the imbalanced data problem. Results: Our results show that (1) 83% of the total fix implementation effort was associated with only 20% of failures. (2) Both safety critical failures and post-release failures required three times more effort to fix compared to non-critical and pre-release counterparts, respectively. (3) Failures with fixes spread across multiple components or across multiple types of software artifacts required more effort. The spread across artifacts was more costly than spread across components. (4) Surprisingly, some types of faults associated with later life-cycle activities did not require significant effort. (5) The level of fix implementation effort was predicted with 73% overall accuracy using the original, imbalanced data. Using oversampling techniques improved the overall accuracy up to 77%. More importantly, oversampling significantly improved the prediction of the high level effort, from 31% to around 85%. Conclusions: This paper shows the importance of tying software failures to changes made to fix all associated faults, in one or more software components and/or in one or more software artifacts, and the benefit of studying how the spread of faults and other factors affect the fix implementation effort.
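As a hedged sketch of the kind of prediction experiment described, a supervised classifier for the level of fix effort combined with oversampling to counter class imbalance, the following uses scikit-learn and imbalanced-learn on synthetic stand-in data. The features, labels, and model choice are illustrative; the paper's actual algorithms and sampling techniques may differ.

```python
# Hedged sketch: supervised prediction of fix-effort level with oversampling
# for class imbalance. Features and labels are invented stand-ins, not the
# paper's data.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from imblearn.over_sampling import RandomOverSampler
from imblearn.pipeline import Pipeline as ImbPipeline

rng = np.random.default_rng(0)
# Synthetic change-request features, e.g. spread across components/artifact types.
X = rng.integers(0, 5, size=(300, 4))
y = (X.sum(axis=1) > 11).astype(int)   # minority class = "high effort" fixes

model = ImbPipeline([
    ("oversample", RandomOverSampler(random_state=0)),   # applied only during fit
    ("clf", RandomForestClassifier(n_estimators=200, random_state=0)),
])
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"cross-validated accuracy: {scores.mean():.2f}")
```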
CARDS: A blueprint and environment for domain-specific software reuse
NASA Technical Reports Server (NTRS)
Wallnau, Kurt C.; Solderitsch, Anne Costa; Smotherman, Catherine
1992-01-01
CARDS (Central Archive for Reusable Defense Software) exploits advances in domain analysis and domain modeling to identify, specify, develop, archive, retrieve, understand, and reuse domain-specific software components. An important element of CARDS is to provide visibility into the domain model artifacts produced by, and services provided by, commercial computer-aided software engineering (CASE) technology. The use of commercial CASE technology is important to provide rich, robust support for the varied roles involved in a reuse process. We refer to this kind of use of knowledge representation systems as supporting 'knowledge-based integration.'
Systems engineering: A formal approach. Part 1: System concepts
NASA Astrophysics Data System (ADS)
Vanhee, K. M.
1993-03-01
Engineering is the scientific discipline focused on the creation of new artifacts that are supposed to be of some use to our society. Different types of artifacts require different engineering approaches. However, in all these disciplines the development of a new artifact is divided into stages. Three stages can always be recognized: Analysis, Design, and Realization. The book considers only the first two stages of the development process. It focuses on a specific type of artifact, called discrete dynamic systems. These systems consist of active components, or actors, that consume and produce passive components, or tokens. Three subtypes are studied in more detail: business systems (like a factory or restaurant), information systems (whether automated or not), and automated systems (systems that are controlled by an automated information system). The first subtype is studied by industrial engineers, the last by software engineers and electrical engineers, whereas the second is a battlefield for all three disciplines. The union of these disciplines is called systems engineering.
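The actors-and-tokens view of a discrete dynamic system is essentially a Petri-net-style model, which the small sketch below illustrates: actors fire when enough tokens are available in the places they consume from. The restaurant example and class names are invented; the book's own formalism is richer than this.

```python
# Tiny Petri-net-style sketch of a discrete dynamic system: active components
# (actors) consume and produce passive components (tokens) held in places.

class Place:
    def __init__(self, name, tokens=0):
        self.name, self.tokens = name, tokens

class Actor:
    def __init__(self, name, consumes, produces):
        self.name, self.consumes, self.produces = name, consumes, produces
    def enabled(self):
        return all(p.tokens >= n for p, n in self.consumes)
    def fire(self):
        if not self.enabled():
            return False
        for p, n in self.consumes: p.tokens -= n
        for p, n in self.produces: p.tokens += n
        return True

orders  = Place("orders", tokens=2)
kitchen = Place("free cooks", tokens=1)
meals   = Place("meals", tokens=0)

cook = Actor("cook", consumes=[(orders, 1), (kitchen, 1)],
                     produces=[(meals, 1), (kitchen, 1)])
while cook.fire():      # fire until no order tokens remain
    pass
print(meals.tokens)     # -> 2
```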
Quality Assurance in the Presence of Variability
NASA Astrophysics Data System (ADS)
Lauenroth, Kim; Metzger, Andreas; Pohl, Klaus
Software Product Line Engineering (SPLE) is a reuse-driven development paradigm that has been applied successfully in information system engineering and other domains. Quality assurance of the reusable artifacts of the product line (e.g. requirements, design, and code artifacts) is essential for successful product line engineering. As those artifacts are reused in several products, a defect in a reusable artifact can affect several products of the product line. A central challenge for quality assurance in product line engineering is how to consider product line variability. Since the reusable artifacts contain variability, quality assurance techniques from single-system engineering cannot directly be applied to those artifacts. Therefore, different strategies and techniques have been developed for quality assurance in the presence of variability. In this chapter, we describe those strategies and discuss in more detail one of those strategies, the so called comprehensive strategy. The comprehensive strategy aims at checking the quality of all possible products of the product line and thus offers the highest benefits, since it is able to uncover defects in all possible products of the product line. However, the central challenge for applying the comprehensive strategy is the complexity that results from the product line variability and the large number of potential products of a product line. In this chapter, we present one concrete technique that we have developed to implement the comprehensive strategy that addresses this challenge. The technique is based on model checking technology and allows for a comprehensive verification of domain artifacts against temporal logic properties.
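To give a feel for the comprehensive strategy, the sketch below brute-forces every valid product of a tiny, invented feature model and checks an example property in each one. The chapter's actual technique verifies domain artifacts against temporal-logic properties with model checking rather than enumerating products like this, so this is only a conceptual illustration of "check all possible products".

```python
# Illustrative brute-force version of the comprehensive strategy: enumerate
# every valid product allowed by a (made-up) variability model and check an
# example property in each one.

from itertools import product as cartesian

optional_features = ["encryption", "logging", "remote_access"]

def valid(config):
    # Example cross-tree constraint: remote access requires encryption.
    return not (config["remote_access"] and not config["encryption"])

def property_holds(config):
    # Example property: every product with remote access must also log.
    return (not config["remote_access"]) or config["logging"]

violations = []
for choices in cartesian([False, True], repeat=len(optional_features)):
    config = dict(zip(optional_features, choices))
    if valid(config) and not property_holds(config):
        violations.append(config)

print(f"{len(violations)} product(s) violate the property:", violations)
```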
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jani, S
Purpose: CT simulation for patients with metal implants can often be challenging due to artifacts that obscure tumor/target delineation and normal organ definition. Our objective was to evaluate the effectiveness of Orthopedic Metal Artifact Reduction (OMAR), a commercially available software package, in reducing metal-induced artifacts and its effect on computed dose during treatment planning. Methods: CT images of water surrounding metallic cylindrical rods made of aluminum, copper and iron were studied in terms of Hounsfield unit (HU) spread. Metal-induced artifacts were characterized in terms of an HU/Volume Histogram (HVH) using the Pinnacle treatment planning system. Effects of OMAR on enhancing our ability to delineate organs on CT and on subsequent dose computation were examined in nine (9) patients with hip implants and two (2) patients with breast tissue expanders. Results: Our study characterized water at 1000 HU with a standard deviation (SD) of about 20 HU. The HVHs allowed us to evaluate how the presence of metal changed the HU spread. For example, introducing a 2.54 cm diameter copper rod in water increased the SD in HU of the surrounding water from 20 to 209, representing an increase in artifacts. Subsequent use of OMAR brought the SD down to 78. Aluminum produced the least artifacts whereas iron showed the largest amount of artifacts. In general, an increase in kVp and mA during CT scanning showed better effectiveness of OMAR in reducing artifacts. Our dose analysis showed that some isodose contours shifted by several mm with OMAR, but infrequently, and the shifts were nonsignificant in the planning process. Computed volumes of various dose levels showed <2% change. Conclusions: In our experience, OMAR software greatly reduced the metal-induced CT artifacts for the majority of patients with implants, thereby improving our ability to delineate tumor and surrounding organs. OMAR had a clinically negligible effect on computed dose within tissues. Partially funded by an unrestricted educational grant from Philips.
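The HU-spread measurement at the heart of this analysis is simple to express in code. The sketch below computes the mean and standard deviation of Hounsfield units inside a region of interest for an uncorrected and a corrected image; the arrays are synthetic stand-ins generated with the SD values quoted above, not actual exported CT slices.

```python
# Minimal sketch: compare the spread of Hounsfield units in a water region
# before and after artifact reduction. The arrays are synthetic stand-ins.

import numpy as np

def hu_spread(ct_slice, roi_mask):
    """Mean and standard deviation of HU inside a region of interest."""
    roi = ct_slice[roi_mask]
    return roi.mean(), roi.std()

rng = np.random.default_rng(1)
mask = np.ones((128, 128), dtype=bool)

water_with_artifact = rng.normal(loc=1000, scale=209, size=(128, 128))  # streaky water near copper
water_corrected     = rng.normal(loc=1000, scale=78,  size=(128, 128))  # after artifact reduction

for label, img in [("uncorrected", water_with_artifact), ("corrected", water_corrected)]:
    mean, sd = hu_spread(img, mask)
    print(f"{label}: mean HU = {mean:.0f}, SD = {sd:.0f}")
```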
ERIC Educational Resources Information Center
Reyes Alamo, Jose M.
2010-01-01
The Service Oriented Computing (SOC) paradigm defines services as software artifacts whose implementations are separated from their specifications. Application developers rely on services to simplify the design and reduce development time and cost. Within the SOC paradigm, different Service Oriented Architectures (SOAs) have been developed.…
Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing
NASA Technical Reports Server (NTRS)
Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.
2010-01-01
The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure the software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations. A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development and throughout the life of the Orion project.
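The following toy sketch shows what "data-driven sequencing" means in the general sense used above: the modes, their transition triggers, and the parameter reconfigurations live in a data structure that could be exported from a database tool, so changing them requires no recompilation. The state names, triggers, and parameters are invented and bear no relation to Orion's actual sequences.

```python
# Hedged sketch of a data-driven automated sequence: transitions and parameter
# reconfigurations live in data, so sequences can be updated without rebuilding
# the software. All names and fields are illustrative.

sequence_config = {
    "coast":     {"next": "burn_prep", "trigger": "burn_window_open",
                  "params": {"attitude_mode": "inertial_hold"}},
    "burn_prep": {"next": "burn", "trigger": "engines_ready",
                  "params": {"attitude_mode": "burn_attitude"}},
    "burn":      {"next": "coast", "trigger": "delta_v_achieved",
                  "params": {"attitude_mode": "thrust_vector"}},
}

class SequenceEngine:
    def __init__(self, config, initial_state):
        self.config, self.state = config, initial_state
        self.active_params = config[initial_state]["params"]

    def on_event(self, event):
        entry = self.config[self.state]
        if event == entry["trigger"]:
            self.state = entry["next"]
            self.active_params = self.config[self.state]["params"]
        return self.state, self.active_params

engine = SequenceEngine(sequence_config, "coast")
print(engine.on_event("burn_window_open"))   # -> ('burn_prep', {'attitude_mode': 'burn_attitude'})
```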
Usability and Culture as Two of the Value Criteria for Evaluating the Artifact
NASA Astrophysics Data System (ADS)
Kurosu, Masaaki
In this paper, the conceptual framework of Artifact Development Analysis (ADA) and its relationship to usability engineering are outlined. ADA analyses the significance of all artifacts, including hardware, software, humanware and systems. Its viewpoint extends in both temporal and spatial dimensions. In short, it deals with the diversity of the artifact and casts the questions "why is it so" and "why is it not so". In this respect, ADA is related to usability engineering as one of the value attitudes. Usability engineering puts emphasis on effectiveness and efficiency. Usability is not always the value criterion of highest importance, and some people sometimes put more emphasis on other criteria such as the aesthetic aspect, the cost, etc. Based on the findings of ADA, we should focus on the extent to which usability can provide the core satisfaction, and we should also summarize guidelines on how the artifact should be designed.
de Oliveira, Marcus Vinicius Linhares; Santos, António Carvalho; Paulo, Graciano; Campos, Paulo Sergio Flores; Santos, Joana
2017-06-01
The purpose of this study was to apply a newly developed free software program, at low cost and with minimal time, to evaluate the quality of dental and maxillofacial cone-beam computed tomography (CBCT) images. A polymethyl methacrylate (PMMA) phantom, CQP-IFBA, was scanned in 3 CBCT units with 7 protocols. A macro program was developed, using the free software ImageJ, to automatically evaluate the image quality parameters. The image quality evaluation was based on 8 parameters: uniformity, the signal-to-noise ratio (SNR), noise, the contrast-to-noise ratio (CNR), spatial resolution, the artifact index, geometric accuracy, and low-contrast resolution. The image uniformity and noise depended on the protocol that was applied. Regarding the CNR, high-density structures were more sensitive to the effect of scanning parameters. There were no significant differences between SNR and CNR in centered and peripheral objects. The geometric accuracy assessment showed that all the distance measurements were lower than the real values. Low-contrast resolution was influenced by the scanning parameters, and the 1-mm rod present in the phantom was not depicted in any of the 3 CBCT units. Smaller voxel sizes presented higher spatial resolution. There were no significant differences among the protocols regarding artifact presence. This software package provided a fast, low-cost, and feasible method for the evaluation of image quality parameters in CBCT.
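Two of the metrics listed above, SNR and CNR, reduce to simple ROI statistics. The original tool is an ImageJ macro; the NumPy snippet below only sketches the arithmetic on synthetic ROIs, and the particular CNR definition (difference of means over pooled noise) is one common choice, not necessarily the one implemented in the macro.

```python
# Sketch of two image quality metrics computed from ROI statistics (SNR, CNR).
# Synthetic ROIs stand in for regions extracted from CBCT phantom slices.

import numpy as np

def snr(roi):
    return roi.mean() / roi.std()

def cnr(roi_object, roi_background):
    contrast = abs(roi_object.mean() - roi_background.mean())
    noise = np.sqrt((roi_object.std() ** 2 + roi_background.std() ** 2) / 2)
    return contrast / noise

rng = np.random.default_rng(2)
insert     = rng.normal(300, 20, size=(40, 40))   # synthetic high-density insert ROI
background = rng.normal(100, 20, size=(40, 40))   # synthetic PMMA background ROI
print(f"SNR = {snr(insert):.1f}, CNR = {cnr(insert, background):.1f}")
```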
De Crop, An; Casselman, Jan; Van Hoof, Tom; Dierens, Melissa; Vereecke, Elke; Bossu, Nicolas; Pamplona, Jaime; D'Herde, Katharina; Thierens, Hubert; Bacher, Klaus
2015-08-01
Metal artifacts may negatively affect radiologic assessment in the oral cavity. The aim of this study was to evaluate different metal artifact reduction techniques for metal artifacts induced by dental hardware in CT scans of the oral cavity. Clinical image quality was assessed using a Thiel-embalmed cadaver. A Catphan phantom and a polymethylmethacrylate (PMMA) phantom were used to evaluate physical-technical image quality parameters such as artifact area, artifact index (AI), and contrast detail (IQFinv). Metal cylinders were inserted in each phantom to create metal artifacts. CT images of both phantoms and the Thiel-embalmed cadaver were acquired on a multislice CT scanner using 80, 100, 120, and 140 kVp; model-based iterative reconstruction (Veo); and synthesized monochromatic keV images with and without metal artifact reduction software (MARs). Four radiologists assessed the clinical image quality, using an image criteria score (ICS). Significant influence of increasing kVp and the use of Veo was found on clinical image quality (p = 0.007 and p = 0.014, respectively). Application of MARs resulted in a smaller artifact area (p < 0.05). However, MARs reconstructed images resulted in lower ICS. Of all investigated techniques, Veo shows to be most promising, with a significant improvement of both the clinical and physical-technical image quality without adversely affecting contrast detail. MARs reconstruction in CT images of the oral cavity to reduce dental hardware metallic artifacts is not sufficient and may even adversely influence the image quality.
Han, Seung Chol; Chung, Yong Eun; Lee, Young Han; Park, Kwan Kyu; Kim, Myeong Jin; Kim, Ki Whang
2014-10-01
The objective of our study was to determine the feasibility of using Metal Artifact Reduction (MAR) software for abdominopelvic dual-energy CT in patients with metal hip prostheses. This retrospective study included 33 patients (male-female ratio, 19:14; mean age, 63.7 years) who received total hip replacements and 20 patients who did not have metal prostheses as the control group. All of the patients underwent dual-energy CT. The quality of the images reconstructed using the MAR algorithm and of those reconstructed using the standard reconstruction was evaluated in terms of the visibility of the bladder wall, pelvic sidewall, rectal shelf, and bone-prosthesis interface and the overall diagnostic image quality with a 4-point scale. The mean and SD attenuation values in Hounsfield units were measured in the bladder, pelvic sidewall, and rectal shelf. For validation of the MAR interpolation algorithm, pelvis phantoms with small bladder "lesions" and metal hip prostheses were made, and images of the phantoms both with and without MAR reconstruction were evaluated. Image quality was significantly better with MAR reconstruction than without at all sites except the rectal shelf, where the image quality either had not changed or had worsened after MAR reconstruction. The mean attenuation value was changed after MAR reconstruction to its original expected value at the pelvic sidewall (p < 0.001) and inside the bladder (p < 0.001). The SD attenuation value was significantly decreased after MAR reconstruction at the pelvic sidewall (p = 0.019) but did not show significant differences at the bladder (p = 0.173) or rectal shelf (p = 0.478). In the phantom study, all lesions obscured by metal artifacts on the standard reconstruction images were visualized after MAR reconstruction; however, new artifacts had developed in other parts of the MAR reconstruction images. The use of MAR software with dual-energy CT decreases metal artifacts and increases diagnostic confidence in the assessment of the pelvic cavity but also introduces new artifacts that can obscure pelvic structures.
Software Reviews Since Acquisition Reform - The Artifact Perspective
2004-01-01
Briefing slides; only fragments survive extraction. Recoverable topics include: risk management (old vs. new), acquisition of software-intensive systems (2004, Peter Hantos), the shift away from a single basic software paradigm on a single processor, the need to make software risk mitigation trade-offs together, integral software engineering activities, and process maturity and quality frameworks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeong, K; Kuo, H; Ritter, J
Purpose: To evaluate the feasibility of using a metal artifact reduction technique to deplete metal artifacts and its application in improving dose calculation in external radiation therapy planning. Methods: A CIRS electron density phantom was scanned with and without steel drill bits placed in some plug holes. Metal artifact reduction software using the Metal Deletion Technique (MDT) was used to remove metal artifacts from the scanned image with metal. Hounsfield units of electron density plugs from the artifact-free reference image and the MDT-processed images were compared. To test the dose calculation improvement with the MDT-processed images, a clinically approved head and neck plan with manual dental artifact correction was tested. Patient images were exported and processed with MDT, and the plan was recalculated on the new MDT image without manual correction. Dose profiles near the metal artifacts were compared. Results: The MDT used in this study effectively reduced the metal artifact caused by beam hardening and scatter. The windmill artifact around the metal drill was greatly improved, with a smooth rounded appearance. The difference of the mean HU in each density plug between the reference and MDT images was less than 10 HU in most of the plugs. The dose difference between the original plan and the MDT images was minimal. Conclusion: Most metal artifact reduction methods were developed for diagnostic improvement purposes, and Hounsfield unit accuracy has therefore not been rigorously tested before. In our test, MDT effectively eliminated metal artifacts with good HU reproducibility. However, it can introduce new mild artifacts, so the MDT images should be checked against the original images.
Software Safety Risk in Legacy Safety-Critical Computer Systems
NASA Technical Reports Server (NTRS)
Hill, Janice; Baggs, Rhoda
2007-01-01
Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with only meeting certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for ascertaining formally, a software safety risk assessment, that provides measurements for software safety for legacy systems which may or may not have a suite of software engineering documentation that is now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.
Software design by reusing architectures
NASA Technical Reports Server (NTRS)
Bhansali, Sanjay; Nii, H. Penny
1992-01-01
Abstraction fosters reuse by providing a class of artifacts that can be instantiated or customized to produce a set of artifacts meeting different specific requirements. It is proposed that significant leverage can be obtained by abstracting software system designs and the design process. The result of such an abstraction is a generic architecture and a set of knowledge-based, customization tools that can be used to instantiate the generic architecture. An approach for designing software systems based on the above idea are described. The approach is illustrated through an implemented example, and the advantages and limitations of the approach are discussed.
Model Driven Engineering with Ontology Technologies
NASA Astrophysics Data System (ADS)
Staab, Steffen; Walter, Tobias; Gröner, Gerd; Parreiras, Fernando Silva
Ontologies constitute formal models of some aspect of the world that may be used for drawing interesting logical conclusions even for large models. Software models capture relevant characteristics of a software artifact to be developed, yet, most often these software models have limited formal semantics, or the underlying (often graphical) software language varies from case to case in a way that makes it hard if not impossible to fix its semantics. In this contribution, we survey the use of ontology technologies for software modeling in order to carry over advantages from ontology technologies to the software modeling domain. It will turn out that ontology-based metamodels constitute a core means for exploiting expressive ontology reasoning in the software modeling domain while remaining flexible enough to accommodate varying needs of software modelers.
NASA Technical Reports Server (NTRS)
Rasmussen, Robert; Bennett, Matthew
2006-01-01
The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are kept consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.
Conjunctive programming: An interactive approach to software system synthesis
NASA Technical Reports Server (NTRS)
Tausworthe, Robert C.
1992-01-01
This report introduces a technique of software documentation called conjunctive programming and discusses its role in the development and maintenance of software systems. The report also describes the conjoin tool, an adjunct to assist practitioners. Aimed at supporting software reuse while conforming with conventional development practices, conjunctive programming is defined as the extraction, integration, and embellishment of pertinent information obtained directly from an existing database of software artifacts, such as specifications, source code, configuration data, link-edit scripts, utility files, and other relevant information, into a product that achieves desired levels of detail, content, and production quality. Conjunctive programs typically include automatically generated tables of contents, indexes, cross references, bibliographic citations, tables, and figures (including graphics and illustrations). This report presents an example of conjunctive programming by documenting the use and implementation of the conjoin program.
Ohira, Shingo; Kanayama, Naoyuki; Wada, Kentaro; Karino, Tsukasa; Nitta, Yuya; Ueda, Yoshihiro; Miyazaki, Masayoshi; Koizumi, Masahiko; Teshima, Teruki
2018-04-02
The objective of this study was to assess the accuracy of the quantitative measurements obtained using dual-energy computed tomography with metal artifact reduction software (MARS). Dual-energy computed tomography scans (fast kV-switching) were performed on a phantom while varying the number of metal rods (Ti and Pb) and reference iodine materials. Objective and subjective image analyses were performed on retro-reconstructed virtual monochromatic images (VMIs) at 70 keV. The maximum artifact indices for VMI-Ti and VMI-Pb (5 metal rods) with MARS (without MARS) were 17.4 (166.7) and 34.6 (810.6), respectively; MARS significantly improved the mean subjective 5-point score (P < 0.05). The maximum differences between the measured Hounsfield unit and theoretical values for 5 mg/mL iodine and 2-mm core rods were -42.2% and -68.5% for VMI-Ti and VMI-Pb (5 metal rods), respectively, and the corresponding differences in the iodine concentration were -64.7% and -73.0%, respectively. Metal artifact reduction software improved the objective and subjective image quality; however, the quantitative values were underestimated.
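The artifact index reported above is not defined in the abstract; a definition commonly used in the CT literature, and assumed here, is AI = sqrt(max(SD_roi^2 - SD_ref^2, 0)), where SD_roi is the standard deviation of Hounsfield units in an artifact-affected region of interest and SD_ref is that of an artifact-free reference region. The sketch below illustrates that computation on synthetic ROI values; the numbers and variable names are hypothetical.

```python
import numpy as np

def artifact_index(roi_hu: np.ndarray, ref_hu: np.ndarray) -> float:
    """Artifact index under the common definition
    AI = sqrt(SD_roi^2 - SD_ref^2), clipped at zero.

    roi_hu: HU values from an artifact-affected region of interest.
    ref_hu: HU values from an artifact-free reference region.
    """
    sd_roi = roi_hu.std(ddof=1)
    sd_ref = ref_hu.std(ddof=1)
    return float(np.sqrt(max(sd_roi**2 - sd_ref**2, 0.0)))

# Hypothetical example: compare a VMI reconstructed with and without MARS.
rng = np.random.default_rng(0)
ref = rng.normal(40.0, 8.0, size=500)           # artifact-free soft-tissue ROI
roi_no_mars = rng.normal(40.0, 60.0, size=500)  # streak-degraded ROI
roi_mars = rng.normal(40.0, 15.0, size=500)     # same ROI after MARS

print("AI without MARS:", round(artifact_index(roi_no_mars, ref), 1))
print("AI with MARS:   ", round(artifact_index(roi_mars, ref), 1))
```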
Evaluation of image registration in PET/CT of the liver and recommendations for optimized imaging.
Vogel, Wouter V; van Dalen, Jorn A; Wiering, Bas; Huisman, Henkjan; Corstens, Frans H M; Ruers, Theo J M; Oyen, Wim J G
2007-06-01
Multimodality PET/CT of the liver can be performed with an integrated (hybrid) PET/CT scanner or with software fusion of dedicated PET and CT. Accurate anatomic correlation and good image quality of both modalities are important prerequisites, regardless of the applied method. Registration accuracy is influenced by breathing motion differences on PET and CT, which may also have impact on (attenuation correction-related) artifacts, especially in the upper abdomen. The impact of these issues was evaluated for both hybrid PET/CT and software fusion, focused on imaging of the liver. Thirty patients underwent hybrid PET/CT, 20 with CT during expiration breath-hold (EB) and 10 with CT during free breathing (FB). Ten additional patients underwent software fusion of dedicated PET and dedicated expiration breath-hold CT (SF). The image registration accuracy was evaluated at the location of liver borders on CT and uncorrected PET images and at the location of liver lesions. Attenuation-correction artifacts were evaluated by comparison of liver borders on uncorrected and attenuation-corrected PET images. CT images were evaluated for the presence of breathing artifacts. In EB, 40% of patients had an absolute registration error of the diaphragm in the craniocaudal direction of >1 cm (range, -16 to 44 mm), and 45% of lesions were mispositioned >1 cm. In 50% of cases, attenuation-correction artifacts caused a deformation of the liver dome on PET of >1 cm. Poor compliance to breath-hold instructions caused CT artifacts in 55% of cases. In FB, 30% had registration errors of >1 cm (range, -4 to 16 mm) and PET artifacts were less extensive, but all CT images had breathing artifacts. As SF allows independent alignment of PET and CT, no registration errors or artifacts of >1 cm of the diaphragm occurred. Hybrid PET/CT of the liver may have significant registration errors and artifacts related to breathing motion. The extent of these issues depends on the selected breathing protocol and the speed of the CT scanner. No protocol or scanner can guarantee perfect image fusion. On the basis of these findings, recommendations were formulated with regard to scanner requirements, breathing protocols, and reporting.
Cha, Jihoon; Kim, Hyung-Jin; Kim, Sung Tae; Kim, Yi Kyung; Kim, Ha Youn; Park, Gyeong Min
2017-11-01
Background Metallic dental prostheses may degrade image quality on head and neck computed tomography (CT). However, there is little information available on the use of dual-energy CT (DECT) and metal artifact reduction software (MARS) in the head and neck regions to reduce metallic dental artifacts. Purpose To assess the usefulness of DECT with virtual monochromatic imaging and MARS to reduce metallic dental artifacts. Material and Methods DECT was performed using fast kilovoltage (kV)-switching between 80 kV and 140 kV in 20 patients with metallic dental prostheses. CT data were reconstructed with and without MARS, and with synthesized monochromatic energy in the range of 40-140 kiloelectron volts (keV). For quantitative analysis, the artifact index of the tongue, buccal, and parotid areas was calculated for each scan. For qualitative analysis, two radiologists evaluated 70-keV and 100-keV images with and without MARS for the tongue, buccal, and parotid areas, and the metallic denture. The locations and characteristics of the MARS-related artifacts, if any, were also recorded. Results DECT with MARS markedly reduced metallic dental artifacts and improved image quality in the buccal area (P < 0.001) and the tongue (P < 0.001), but not in the parotid area. The margin and internal architecture of the metallic dentures were more clearly delineated with MARS (P < 0.001) and in the higher-energy images than in the lower-energy images (P = 0.042). MARS-related artifacts most commonly occurred in the deep center of the neck. Conclusion DECT with MARS can reduce metallic dental artifacts and improve delineation of the metallic prosthesis and periprosthetic region.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, P; Cheng, S; Chao, C
Purpose: Respiratory motion artifacts are commonly seen in abdominal and thoracic CT images. A Real-time Position Management (RPM) system is integrated with the CT simulator, using the abdominal surface as a surrogate for tracking the patient's respiratory motion. The respiratory-correlated four-dimensional computed tomography (4DCT) is then reconstructed by GE Advantage software. However, there are still artifacts due to inaccurate respiratory motion detection and sorting methods. We developed an Ultrasonography Respiration Monitoring (URM) system which can directly monitor diaphragm motion to detect respiratory cycles. We also developed a new 4DCT sorting and motion estimation method to reduce the respiratory motion artifacts. The new 4DCT system was compared with the RPM and GE 4DCT system. Methods: Imaging from a GE CT scanner was simultaneously correlated with both the RPM and URM to detect respiratory motion. A radiation detector, Blackcat GM-10, recorded the X-ray on/off status and was synchronized with the URM. The diaphragm images were acquired with an Ultrasonix RP system. The respiratory wave was derived from the diaphragm images and synchronized with the CT scanner. A more precise peak and valley detection tool was developed and compared with the RPM. The motion is estimated for the slices which are not in the predefined respiratory phases by using block matching and optical flow methods. The CT slices were then sorted into different phases and reconstructed, and compared with the images reconstructed by GE Advantage software using the respiratory wave produced by the RPM system. Results: The 4DCT images were reconstructed for eight patients. The discontinuity at the diaphragm level due to inaccurate identification of phases by the RPM was significantly improved by the URM system. Conclusion: Our URM 4DCT system was evaluated and compared with the RPM and GE 4DCT system. The new system is user friendly and able to reduce motion artifacts. It also has the potential to monitor organ motion during therapy.
A simple system for detection of EEG artifacts in polysomnographic recordings.
Durka, P J; Klekowicz, H; Blinowska, K J; Szelenberger, W; Niemcewicz, Sz
2003-04-01
We present an efficient parametric system for automatic detection of electroencephalogram (EEG) artifacts in polysomnographic recordings. For each of the selected types of artifacts, a relevant parameter was calculated for a given epoch. If any of these parameters exceeded a threshold, the epoch was marked as an artifact. Performance of the system, evaluated on 18 overnight polysomnographic recordings, revealed concordance with decisions of human experts close to the interexpert agreement and the repeatability of expert's decisions, assessed via a double-blind test. Complete software (Matlab source code) for the presented system is freely available from the Internet at http://brain.fuw.edu.pl/artifacts.
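The detection rule described above (one parameter per artifact type per epoch, with the epoch marked as an artifact if any parameter exceeds its threshold) can be sketched in a few lines. The parameters and thresholds below are illustrative placeholders, not the ones used by the authors.

```python
import numpy as np

def epoch_parameters(epoch: np.ndarray, fs: float) -> dict:
    """Compute simple per-epoch parameters, one per artifact type.
    These are illustrative stand-ins for the paper's parameters."""
    diff = np.diff(epoch)
    return {
        "amplitude": np.ptp(epoch),              # e.g. movement / electrode pops
        "slope": np.abs(diff).max() * fs,        # e.g. sharp transients
        "low_freq_power": np.abs(epoch.mean()),  # e.g. slow drift / sweating
    }

# Hypothetical thresholds (microvolts / microvolts-per-second).
THRESHOLDS = {"amplitude": 300.0, "slope": 5000.0, "low_freq_power": 50.0}

def is_artifact(epoch: np.ndarray, fs: float) -> bool:
    """Epoch is marked as an artifact if ANY parameter exceeds its threshold."""
    params = epoch_parameters(epoch, fs)
    return any(params[name] > THRESHOLDS[name] for name in THRESHOLDS)

# Usage on synthetic data: 20-second epochs at 128 Hz.
fs = 128.0
rng = np.random.default_rng(1)
epochs = rng.normal(0.0, 20.0, size=(10, int(20 * fs)))
epochs[3] += 500.0  # inject a large drift into one epoch
flags = [is_artifact(e, fs) for e in epochs]
print("artifact epochs:", [i for i, f in enumerate(flags) if f])
```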
Robust water fat separated dual-echo MRI by phase-sensitive reconstruction.
Romu, Thobias; Dahlström, Nils; Leinhard, Olof Dahlqvist; Borga, Magnus
2017-09-01
The purpose of this work was to develop and evaluate a robust water-fat separation method for T1-weighted symmetric two-point Dixon data. A method for water-fat separation by phase unwrapping of the opposite-phase images using phase-sensitive reconstruction (PSR) is introduced. PSR consists of three steps: (1) identification of clusters of tissue voxels; (2) unwrapping of the phase in each cluster by solving Poisson's equation; and (3) finding the correct sign of each unwrapped opposite-phase cluster, so that the water and fat images are assigned the correct identities. Robustness was evaluated by counting the number of water-fat swap artifacts in a total of 733 image volumes. The method was also compared to commercial software. In the water-fat separated image volumes, the PSR method failed to unwrap the phase of one cluster and misclassified 10. One swap was observed in areas affected by motion and was confined to the affected area. Twenty swaps were observed surrounding susceptibility artifacts, none of which spread outside the artifact-affected regions. The PSR method had fewer swaps than the commercial software. The PSR method can robustly produce water-fat separated whole-body images based on symmetric two-echo spoiled gradient echo images, under both ideal conditions and in the presence of common artifacts. Magn Reson Med 78:1208-1216, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
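For context, the final water-fat recombination of symmetric two-point Dixon data is compact once the opposite-phase image has been unwrapped and its sign resolved (PSR steps 1-3, which are not reproduced here). The sketch below assumes a signed, real-valued opposite-phase image is already available; it is not the authors' Poisson-equation implementation.

```python
import numpy as np

def dixon_combine(in_phase: np.ndarray, opposed_signed: np.ndarray):
    """Combine symmetric two-point Dixon echoes into water and fat images.

    in_phase:       magnitude of the in-phase echo (water + fat).
    opposed_signed: opposite-phase image after phase unwrapping and sign
                    resolution (positive where water dominates, negative
                    where fat dominates), i.e. the output of PSR steps 1-3.
    """
    water = 0.5 * (in_phase + opposed_signed)
    fat = 0.5 * (in_phase - opposed_signed)
    # Clip small negatives caused by noise.
    return np.clip(water, 0, None), np.clip(fat, 0, None)

# Toy example: a voxel with 80% water / 20% fat, and one with the reverse.
ip = np.array([1.0, 1.0])   # water + fat
op = np.array([0.6, -0.6])  # water - fat, sign already resolved
w, f = dixon_combine(ip, op)
print("water:", w, "fat:", f)  # -> water: [0.8 0.2] fat: [0.2 0.8]
```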
Temporal motifs reveal collaboration patterns in online task-oriented networks
NASA Astrophysics Data System (ADS)
Xuan, Qi; Fang, Huiting; Fu, Chenbo; Filkov, Vladimir
2015-05-01
Real networks feature layers of interactions and complexity. In them, different types of nodes can interact with each other via a variety of events. Examples of this complexity are task-oriented social networks (TOSNs), where teams of people share tasks towards creating a quality artifact, such as academic research papers or software development in commercial or open source environments. Accomplishing those tasks involves both work, e.g., writing the papers or code, and communication, to discuss and coordinate. Taking into account the different types of activities and how they alternate over time can result in much more precise understanding of the TOSNs behaviors and outcomes. That calls for modeling techniques that can accommodate both node and link heterogeneity as well as temporal change. In this paper, we report on methodology for finding temporal motifs in TOSNs, limited to a system of two people and an artifact. We apply the methods to publicly available data of TOSNs from 31 Open Source Software projects. We find that these temporal motifs are enriched in the observed data. When applied to software development outcome, temporal motifs reveal a distinct dependency between collaboration and communication in the code writing process. Moreover, we show that models based on temporal motifs can be used to more precisely relate both individual developer centrality and team cohesion to programmer productivity than models based on aggregated TOSNs.
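As a rough illustration of temporal motif counting in a two-person-plus-artifact system, the sketch below takes a stream of (time, event type) pairs and counts ordered length-3 type sequences that fall within a fixed time window. The event labels, window length, and motif definition are simplifying assumptions and may differ from those used in the paper.

```python
from collections import Counter
from itertools import combinations

# Event types in a hypothetical two-developer + artifact system:
# "A_commit"/"B_commit" (work on the artifact), "A_msg"/"B_msg" (communication).
events = [
    (0.0, "A_commit"), (1.5, "B_msg"), (2.0, "A_msg"),
    (3.0, "B_commit"), (3.5, "A_commit"), (9.0, "B_msg"),
]

def count_temporal_motifs(events, window=5.0, k=3):
    """Count ordered k-event type sequences whose first and last events
    are at most `window` time units apart."""
    events = sorted(events)  # sort by time
    counts = Counter()
    for idx in combinations(range(len(events)), k):
        t_first, t_last = events[idx[0]][0], events[idx[-1]][0]
        if t_last - t_first <= window:
            motif = tuple(events[i][1] for i in idx)
            counts[motif] += 1
    return counts

for motif, n in count_temporal_motifs(events).most_common(3):
    print(n, motif)
```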
A first-generation software product line for data acquisition systems in astronomy
NASA Astrophysics Data System (ADS)
López-Ruiz, J. C.; Heradio, Rubén; Cerrada Somolinos, José Antonio; Coz Fernandez, José Ramón; López Ramos, Pablo
2008-07-01
This article presents a case study on developing a software product line for data acquisition systems in astronomy based on the Exemplar Driven Development methodology and the Exemplar Flexibilization Language tool. The main strategies for building the software product line are based on the domain commonality and variability, the incremental scope, and the use of existing artifacts. It consists of a lean methodology with little impact on the organization, suitable for small projects, which reduces product line start-up time. The software product line approach focuses on creating a family of products instead of individual products. This approach has spectacular benefits in reducing the time to market, maintaining the know-how, reducing the development costs, and increasing the quality of new products. The maintenance of the products is also enhanced, since all the data acquisition systems share the same product line architecture.
Validation and Verification of LADEE Models and Software
NASA Technical Reports Server (NTRS)
Gundy-Burlet, Karen
2013-01-01
The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the Moon in order to measure the density, composition, and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a model-based software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The generated software is then tested in real-time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational sciences models are used to characterize the spacecraft simulation. A lightweight set of formal methods analysis, static analysis, formal inspection, and code coverage analyses is utilized to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.
Ye, Xin
2018-01-01
The awareness of others’ activities has been widely recognized as essential in facilitating coordination in a team among Computer-Supported Cooperative Work communities. Several field studies of software developers in large software companies such as Microsoft have shown that coworker and artifact awareness are the most common information needs for software developers; however, they are also two of the seven most frequently unsatisfied information needs. To address this problem, we built a workspace awareness tool named TeamWATCH to visualize developer activities using a 3-D city metaphor. In this paper, we discuss the importance of awareness in software development, review existing workspace awareness tools, present the design and implementation of TeamWATCH, and evaluate how it could help detect and resolve conflicts earlier and better maintain group awareness via a controlled experiment. The experimental results showed that the subjects using TeamWATCH performed significantly better with respect to early conflict detection and resolution. PMID:29558519
Implications of Responsive Space on the Flight Software Architecture
NASA Technical Reports Server (NTRS)
Wilmot, Jonathan
2006-01-01
The Responsive Space initiative has several implications for flight software that need to be addressed not only within the run-time element, but in the development infrastructure and software life-cycle process elements as well. The runtime element must at a minimum support Plug & Play, while the development and process elements need to incorporate methods to quickly generate the needed documentation, code, tests, and all of the artifacts required of flight quality software. Very rapid response times go even further, and imply little or no new software development, requiring instead using only predeveloped and certified software modules that can be integrated and tested through automated methods. These elements have typically been addressed individually with significant benefits, but it is when they are combined that they can have the greatest impact on Responsive Space. The Flight Software Branch at NASA's Goddard Space Flight Center has been developing the runtime, infrastructure, and process elements needed for rapid integration with the Core Flight software System (CFS) architecture. The CFS architecture consists of three main components: the core Flight Executive (cFE), the component catalog, and the Integrated Development Environment (IDE). This paper will discuss the design of the components, how they facilitate rapid integration, and lessons learned as the architecture is utilized for an upcoming spacecraft.
Software for Generating Troposphere Corrections for InSAR Using GPS and Weather Model Data
NASA Technical Reports Server (NTRS)
Moore, Angelyn W.; Webb, Frank H.; Fishbein, Evan F.; Fielding, Eric J.; Owen, Susan E.; Granger, Stephanie L.; Bjoerndahl, Fredrik; Loefgren, Johan; Fang, Peng; Means, James D.;
2013-01-01
Atmospheric errors due to the troposphere are a limiting error source for spaceborne interferometric synthetic aperture radar (InSAR) imaging. This software generates tropospheric delay maps that can be used to correct atmospheric artifacts in InSAR data. The software automatically acquires all needed GPS (Global Positioning System), weather, and Digital Elevation Map data, and generates a tropospheric correction map using a novel algorithm for combining GPS and weather information while accounting for terrain. Existing JPL software was prototypical in nature, required a MATLAB license, required additional steps to acquire and ingest needed GPS and weather data, and did not account for topography in interpolation. Previous software did not achieve a level of automation suitable for integration in a Web portal. This software overcomes these issues. GPS estimates of tropospheric delay are a source of corrections that can be used to form correction maps to be applied to InSAR data, but the spacing of GPS stations is insufficient to remove short-wavelength tropospheric artifacts. This software combines interpolated GPS delay with weather model precipitable water vapor (PWV) and a digital elevation model to account for terrain, increasing the spatial resolution of the tropospheric correction maps and thus removing short-wavelength tropospheric artifacts to a greater extent. It will be integrated into a Web portal request system, allowing use in a future L-band SAR Earth radar mission data system. This will be a significant contribution to its technology readiness, building on existing investments in in situ space geodetic networks, and improving the timeliness, quality, and science value of the collected data.
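A much-simplified version of the combination step might interpolate GPS zenith delays horizontally, rescale them to the terrain height from the DEM with an exponential scale height, and blend in a weather-model wet-delay estimate. Everything below (station values, scale height, blending weight) is an assumption for illustration, not the algorithm implemented in the JPL software.

```python
import numpy as np

SCALE_HEIGHT_M = 7000.0  # assumed exponential scale height for the wet delay

def idw(x, y, xs, ys, vs, power=2.0):
    """Inverse-distance-weighted interpolation of station values vs at (x, y)."""
    d = np.hypot(xs - x, ys - y) + 1e-6
    w = 1.0 / d**power
    return float(np.sum(w * vs) / np.sum(w))

def tropo_delay_map(grid_xy, dem_heights, stations, weather_delay=None, blend=0.5):
    """Build a zenith tropospheric delay map on a terrain-following grid.

    grid_xy:       (N, 2) array of grid coordinates (same units as stations).
    dem_heights:   (N,) terrain heights in metres from the DEM.
    stations:      list of (x, y, height_m, zenith_delay_m) GPS estimates.
    weather_delay: optional (N,) wet delay from a weather model (e.g. from PWV).
    """
    xs, ys, hs, ds = (np.array(c, dtype=float) for c in zip(*stations))
    # Reduce station delays to sea level, interpolate, then re-scale to terrain.
    ds_sea = ds * np.exp(hs / SCALE_HEIGHT_M)
    out = np.empty(len(grid_xy))
    for i, ((x, y), h) in enumerate(zip(grid_xy, dem_heights)):
        gps = idw(x, y, xs, ys, ds_sea) * np.exp(-h / SCALE_HEIGHT_M)
        out[i] = gps if weather_delay is None else blend * gps + (1 - blend) * weather_delay[i]
    return out

# Hypothetical stations and grid points (coordinates in metres, delays in metres).
stations = [(0, 0, 100.0, 2.40), (10_000, 0, 900.0, 2.15), (0, 10_000, 50.0, 2.42)]
grid = np.array([[5_000, 5_000], [2_000, 8_000]])
dem = np.array([300.0, 1200.0])
print(tropo_delay_map(grid, dem, stations))
```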
The Need for V&V in Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1997-01-01
V&V is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
Model-Driven Architecture for Agent-Based Systems
NASA Technical Reports Server (NTRS)
Gradanin, Denis; Singh, H. Lally; Bohner, Shawn A.; Hinchey, Michael G.
2004-01-01
The Model Driven Architecture (MDA) approach uses a platform-independent model to define system functionality, or requirements, using some specification language. The requirements are then translated to a platform-specific model for implementation. An agent architecture based on the human cognitive model of planning, the Cognitive Agent Architecture (Cougaar), is selected as the implementation platform. The resulting Cougaar MDA prescribes certain kinds of models to be used, how those models may be prepared, and the relationships of the different kinds of models. Using the existing Cougaar architecture, the level of application composition is elevated from individual components to domain-level model specifications in order to generate software artifacts. The generation of software artifacts is based on a metamodel. Each component maps to a UML structured component, which is then converted into multiple artifacts: Cougaar/Java code, documentation, and test cases.
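The mapping from a component specification to multiple generated artifacts can be illustrated with a toy template-based generator. The component fields and templates below are invented for illustration and are much simpler than the actual Cougaar MDA metamodel.

```python
# Toy illustration of generating several artifacts (code, docs, test stub)
# from one component specification, in the spirit of metamodel-driven generation.
component = {
    "name": "RoutePlanner",
    "inputs": ["VehicleState", "Waypoints"],
    "outputs": ["PlanTask"],
    "doc": "Builds movement plan tasks from vehicle state and waypoints.",
}

JAVA_TEMPLATE = """public class {name} extends ComponentPlugin {{
    // inputs : {inputs}
    // outputs: {outputs}
    protected void execute() {{
        // TODO: generated skeleton
    }}
}}"""

def generate_artifacts(comp: dict) -> dict:
    """Return generated artifact texts keyed by artifact kind."""
    return {
        "java": JAVA_TEMPLATE.format(
            name=comp["name"],
            inputs=", ".join(comp["inputs"]),
            outputs=", ".join(comp["outputs"]),
        ),
        "doc": f"# {comp['name']}\n\n{comp['doc']}\n",
        "test": f"public class {comp['name']}Test {{ /* generated test stub */ }}\n",
    }

for kind, text in generate_artifacts(component).items():
    print(f"--- {kind} ---\n{text}")
```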
CrossTalk: The Journal of Defense Software Engineering. Volume 23, Number 3, May/June 2010
2010-06-01
Truncated excerpt from this issue: articles discuss optimization and ROI of MBTI training programs (the MBTI assessment is based on the work of Carl Jung, a Swiss psychiatrist) and argue that producing software artifacts with an untrained team is analogous to preparing and selling raw, frozen burgers.
ERIC Educational Resources Information Center
Sandler, Heidi J.
2016-01-01
The purpose of this grounded theory study was to examine the relationship between corporate culture (artifacts, values, and assumptions) and the creative endeavor of innovation in the software development industry. Innovation, the active implementation of creative ideas, is a widespread enterprise in the corporate world, especially in the areas of…
Sharing Research Models: Using Software Engineering Practices for Facilitation
Bryant, Stephanie P.; Solano, Eric; Cantor, Susanna; Cooley, Philip C.; Wagener, Diane K.
2011-01-01
Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems’ behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations—such as nonintuitive user interface features and data input specifications—may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices— the iterative software development process, object-oriented methodology, and Unified Modeling Language—and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kemp, B.
2016-06-15
Digital radiography, CT, PET, and MR are complicated imaging modalities which are composed of many hardware and software components. These components work together in a highly coordinated chain of events with the intent to produce high quality images. Acquisition, processing and reconstruction of data must occur in a precise way for optimum image quality to be achieved. Any error or unexpected event in the entire process can produce unwanted pixel intensities in the final images which may contribute to visible image artifacts. The diagnostic imaging physicist is uniquely qualified to investigate and contribute to resolution of image artifacts. This course will teach the participant to identify common artifacts found clinically in digital radiography, CT, PET, and MR, to determine the causes of artifacts, and to make recommendations for how to resolve artifacts. Learning Objectives: Identify common artifacts found clinically in digital radiography, CT, PET and MR. Determine causes of various clinical artifacts from digital radiography, CT, PET and MR. Describe how to resolve various clinical artifacts from digital radiography, CT, PET and MR.
Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO
NASA Technical Reports Server (NTRS)
Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael
2014-01-01
For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. Continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been largely improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the process for development of flight algorithms. The benefits seen by employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.
Partial Automation of Requirements Tracing
NASA Technical Reports Server (NTRS)
Hayes, Jane; Dekhtyar, Alex; Sundaram, Senthil; Vadlamudi, Sravanthi
2006-01-01
Requirements Tracing on Target (RETRO) is software for after-the-fact tracing of textual requirements to support independent verification and validation of software. RETRO applies one of three user-selectable information-retrieval techniques: (1) term frequency/inverse document frequency (TF/IDF) vector retrieval, (2) TF/IDF vector retrieval with simple thesaurus, or (3) keyword extraction. One component of RETRO is the graphical user interface (GUI) for use in initiating a requirements-tracing project (a pair of artifacts to be traced to each other, such as a requirements spec and a design spec). Once the artifacts have been specified and the IR technique chosen, another component constructs a representation of the artifact elements and stores it on disk. Next, the IR technique is used to produce a first list of candidate links (potential matches between the two artifact levels). This list, encoded in Extensible Markup Language (XML), is optionally processed by a filtering component designed to make the list somewhat smaller without sacrificing accuracy. Through the GUI, the user examines a number of links and returns decisions (yes, these are links; no, these are not links). Coded in XML, these decisions are provided to a "feedback processor" component that prepares the data for the next application of the IR technique. The feedback reduces the incidence of erroneous candidate links. Unlike related prior software, RETRO does not require the user to assign keywords, and automatically builds a document index.
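A minimal version of the TF/IDF candidate-link generation that RETRO's first option describes can be sketched with scikit-learn: vectorize both artifact levels, compute cosine similarities, and keep pairs above a cutoff as candidate links. The example artifacts and the cutoff are placeholders, and the thesaurus, filtering, and analyst-feedback components are omitted.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical high-level requirements and design elements.
requirements = {
    "R1": "The system shall log all telemetry packets to persistent storage.",
    "R2": "The system shall reject commands that fail checksum validation.",
}
design_elements = {
    "D1": "TelemetryLogger writes incoming packets to the flash file system.",
    "D2": "CommandValidator computes checksums and drops invalid commands.",
    "D3": "ThermalController adjusts heater duty cycles.",
}

def candidate_links(reqs, designs, cutoff=0.1):
    """Return (req_id, design_id, score) pairs above the similarity cutoff."""
    vec = TfidfVectorizer(stop_words="english")
    tfidf = vec.fit_transform(list(reqs.values()) + list(designs.values()))
    sims = cosine_similarity(tfidf[: len(reqs)], tfidf[len(reqs):])
    links = []
    for i, rid in enumerate(reqs):
        for j, did in enumerate(designs):
            if sims[i, j] >= cutoff:
                links.append((rid, did, round(float(sims[i, j]), 3)))
    return sorted(links, key=lambda t: -t[2])

for link in candidate_links(requirements, design_elements):
    print(link)
```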
Enomoto, Yukiko; Yamauchi, Keita; Asano, Takahiko; Otani, Katharina; Iwama, Toru
2018-01-01
Background and purpose C-arm cone-beam computed tomography (CBCT) has the drawback that image quality is degraded by artifacts caused by implanted metal objects. We evaluated whether metal artifact reduction (MAR) prototype software can improve the subjective image quality of CBCT images of patients with intracranial aneurysms treated with coils or clips. Materials and methods Forty-four patients with intracranial aneurysms implanted with coils (40 patients) or clips (four patients) underwent one CBCT scan, from which uncorrected and MAR-corrected CBCT image datasets were reconstructed. Three blinded readers evaluated the image quality of the image sets using a four-point scale (1: Excellent, 2: Good, 3: Poor, 4: Bad). The median scores of the three readers for uncorrected and MAR-corrected images were compared with the paired Wilcoxon signed-rank test, and inter-reader agreement of change scores was assessed with weighted kappa statistics. The readers also recorded new clinical findings, such as intracranial hemorrhage, air, or surrounding anatomical structures, on MAR-corrected images. Results The image quality of MAR-corrected CBCT images was significantly improved compared with the uncorrected CBCT images (p < 0.001). Additional clinical findings were seen on CBCT images of 70.4% of patients after MAR correction. Conclusion MAR software improved the image quality of CBCT images degraded by metal artifacts.
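The statistical comparison described (a paired Wilcoxon signed-rank test on the 4-point scores and weighted kappa for inter-reader agreement) can be reproduced with standard libraries; the sketch below uses made-up scores purely to show the calls.

```python
import numpy as np
from scipy.stats import wilcoxon
from sklearn.metrics import cohen_kappa_score

# Made-up per-patient median quality scores (1 = excellent ... 4 = bad).
uncorrected = np.array([3, 4, 3, 3, 4, 3, 3, 4, 2, 3])
mar_corrected = np.array([2, 2, 1, 2, 3, 2, 1, 2, 1, 2])

# Paired Wilcoxon signed-rank test on the score change.
stat, p = wilcoxon(uncorrected, mar_corrected)
print(f"Wilcoxon statistic={stat}, p={p:.4f}")

# Weighted kappa between two readers' change scores (illustrative values).
reader1_change = [1, 2, 2, 0, 1, 1, 2, 2, 0, 1]
reader2_change = [1, 2, 1, 0, 1, 2, 2, 2, 0, 1]
kappa = cohen_kappa_score(reader1_change, reader2_change, weights="linear")
print(f"linear weighted kappa={kappa:.2f}")
```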
Yue, Dong; Fan Rong, Cheng; Ning, Cai; Liang, Hu; Ai Lian, Liu; Ru Xin, Wang; Ya Hong, Luo
2018-07-01
Background The evaluation of hip arthroplasty is a challenge in computed tomography (CT). The virtual monochromatic spectral (VMS) images with metal artifact reduction software (MARs) in spectral CT can reduce the artifacts and improve the image quality. Purpose To evaluate the effects of VMS images and MARs for metal artifact reduction in patients with unilateral hip arthroplasty. Material and Methods Thirty-five patients underwent dual-energy CT. Four sets of VMS images without MARs and four sets of VMS images with MARs were obtained. Artifact index (AI), CT number, and SD value were assessed at the periprosthetic region and the pelvic organs. The scores of two observers for different images and the inter-observer agreement were evaluated. Results The AIs in 120 and 140 keV images were significantly lower than those in 80 and 100 keV images. The AIs of the periprosthetic region in VMS images with MARs were significantly lower than those in VMS images without MARs, while the AIs of pelvic organs were not significantly different. VMS images with MARs improved the accuracy of CT numbers for the periprosthetic region. The inter-observer agreements were good for all the images. VMS images with MARs at 120 and 140 keV had higher subjective scores and could improve the image quality, leading to reliable diagnosis of prosthesis-related problems. Conclusion VMS images with MARs at 120 and 140 keV could significantly reduce the artifacts from hip arthroplasty and improve the image quality at the periprosthetic region but had no obvious advantage for pelvic organs.
WE-G-209-00: Identifying Image Artifacts, Their Causes, and How to Fix Them
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Digital radiography, CT, PET, and MR are complicated imaging modalities which are composed of many hardware and software components. These components work together in a highly coordinated chain of events with the intent to produce high quality images. Acquisition, processing and reconstruction of data must occur in a precise way for optimum image quality to be achieved. Any error or unexpected event in the entire process can produce unwanted pixel intensities in the final images which may contribute to visible image artifacts. The diagnostic imaging physicist is uniquely qualified to investigate and contribute to resolution of image artifacts. This course will teach the participant to identify common artifacts found clinically in digital radiography, CT, PET, and MR, to determine the causes of artifacts, and to make recommendations for how to resolve artifacts. Learning Objectives: Identify common artifacts found clinically in digital radiography, CT, PET and MR. Determine causes of various clinical artifacts from digital radiography, CT, PET and MR. Describe how to resolve various clinical artifacts from digital radiography, CT, PET and MR.
Reduction of metal artifacts from alloy hip prostheses in computer tomography.
Wang, Fengdan; Xue, Huadan; Yang, Xianda; Han, Wei; Qi, Bing; Fan, Yu; Qian, Wenwei; Wu, Zhihong; Zhang, Yan; Jin, Zhengyu
2014-01-01
The objective of this study was to evaluate the feasibility of reducing artifacts from large metal implants with gemstone spectral imaging (GSI) and metal artifact reduction software (MARS). Twenty-three in-vivo cobalt-chromium-molybdenum alloy total hip prostheses were prospectively scanned by fast kV-switching GSI between 80 and 140 kVp. The computed tomography images were reconstructed with monochromatic energy and with/without MARS. Both subjective and objective measurements were performed to assess the severity of metal artifacts. Increasing photon energy was associated with reduced metal artifacts in GSI images (P < 0.001). Combination of GSI with MARS further diminished the metal artifacts (P < 0.001). Artifact reduction at 3 anatomical levels (femoral head, neck, and shaft) were evaluated, with data showing that GSI and MARS could reduce metal artifacts at all 3 levels (P = 0.011, P < 0.001, and P = 0.003, respectively). Nevertheless, in certain cases, GSI without MARS produced more realistic images for the clinical situation. Proper usage of GSI with/without MARS could reduce the computed tomography artifacts of large metal parts and improve the radiological evaluation of postarthroplasty patients.
Consistent Evolution of Software Artifacts and Non-Functional Models
2014-11-14
Truncated report excerpt addressing whether the evolution of software artifacts can induce bad software performance. Keywords: Model-Driven Engineering (MDE), Software Performance Engineering (SPE), Change Propagation, Performance Antipatterns. Contact: Vittorio Cortellessa, Università degli Studi dell'Aquila, Via Vetoio, 67100 L'Aquila, Italy (vittorio.cortellessa@univaq.it).
A Predictive Approach to Eliminating Errors in Software Code
NASA Technical Reports Server (NTRS)
2006-01-01
NASA's Metrics Data Program Data Repository is a database that stores problem, product, and metrics data. The primary goal of this data repository is to provide project data to the software community. In doing so, the Metrics Data Program collects artifacts from a large NASA dataset, generates metrics on the artifacts, and then generates reports that are made available to the public at no cost. The data that are made available to general users have been sanitized and authorized for publication through the Metrics Data Program Web site by officials representing the projects from which the data originated. The data repository is operated by NASA's Independent Verification and Validation (IV&V) Facility, which is located in Fairmont, West Virginia, a high-tech hub for emerging innovation in the Mountain State. The IV&V Facility was founded in 1993, under the NASA Office of Safety and Mission Assurance, as a direct result of recommendations made by the National Research Council and the Report of the Presidential Commission on the Space Shuttle Challenger Accident. Today, under the direction of Goddard Space Flight Center, the IV&V Facility continues its mission to provide the highest achievable levels of safety and cost-effectiveness for mission-critical software. By extending its data to public users, the facility has helped improve the safety, reliability, and quality of complex software systems throughout private industry and other government agencies. Integrated Software Metrics, Inc., is one of the organizations that has benefited from studying the metrics data. As a result, the company has evolved into a leading developer of innovative software-error prediction tools that help organizations deliver better software, on time and on budget.
A Framework for Performing Verification and Validation in Reuse Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1997-01-01
Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission- critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
ARTiiFACT: a tool for heart rate artifact processing and heart rate variability analysis.
Kaufmann, Tobias; Sütterlin, Stefan; Schulz, Stefan M; Vögele, Claus
2011-12-01
The importance of appropriate handling of artifacts in interbeat interval (IBI) data must not be underestimated. Even a single artifact may cause unreliable heart rate variability (HRV) results. Thus, a robust artifact detection algorithm and the option for manual intervention by the researcher form key components for confident HRV analysis. Here, we present ARTiiFACT, a software tool for processing electrocardiogram and IBI data. Both automated and manual artifact detection and correction are available in a graphical user interface. In addition, ARTiiFACT includes time- and frequency-based HRV analyses and descriptive statistics, thus offering the basic tools for HRV analysis. Notably, all program steps can be executed separately and allow for data export, thus offering high flexibility and interoperability with a whole range of applications.
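The core of such a workflow (flag implausible interbeat intervals, replace them by interpolation, then compute time-domain HRV) can be sketched as follows. The 25% deviation rule, linear interpolation, and the chosen statistics are common conventions assumed here, not necessarily ARTiiFACT's exact algorithms.

```python
import numpy as np

def detect_ibi_artifacts(ibi_ms: np.ndarray, tol: float = 0.25) -> np.ndarray:
    """Flag IBIs deviating more than `tol` (25% by default) from the series median."""
    med = np.median(ibi_ms)
    return np.abs(ibi_ms - med) > tol * med

def correct_ibi(ibi_ms: np.ndarray) -> np.ndarray:
    """Replace flagged IBIs by linear interpolation over their neighbours."""
    bad = detect_ibi_artifacts(ibi_ms)
    idx = np.arange(len(ibi_ms))
    fixed = ibi_ms.astype(float).copy()
    fixed[bad] = np.interp(idx[bad], idx[~bad], ibi_ms[~bad])
    return fixed

def time_domain_hrv(ibi_ms: np.ndarray) -> dict:
    """SDNN and RMSSD in milliseconds."""
    diffs = np.diff(ibi_ms)
    return {"SDNN": float(ibi_ms.std(ddof=1)),
            "RMSSD": float(np.sqrt(np.mean(diffs**2)))}

# Usage on synthetic data: ~800 ms IBIs with one ectopic-like artifact.
rng = np.random.default_rng(2)
ibi = rng.normal(800, 30, size=120)
ibi[60] = 400  # artifact: missed/extra beat
print("raw:      ", time_domain_hrv(ibi))
print("corrected:", time_domain_hrv(correct_ibi(ibi)))
```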
A Model-Driven, Science Data Product Registration Service
NASA Astrophysics Data System (ADS)
Hardman, S.; Ramirez, P.; Hughes, J. S.; Joyner, R.; Cayanan, M.; Lee, H.; Crichton, D. J.
2011-12-01
The Planetary Data System (PDS) has undertaken an effort to overhaul the PDS data architecture (including the data model, data structures, data dictionary, etc.) and to deploy an upgraded software system (including data services, distributed data catalog, etc.) that fully embraces the PDS federation as an integrated system while taking advantage of modern innovations in information technology (including networking capabilities, processing speeds, and software breakthroughs). A core component of this new system is the Registry Service that will provide functionality for tracking, auditing, locating, and maintaining artifacts within the system. These artifacts can range from data files and label files, schemas, dictionary definitions for objects and elements, documents, services, etc. This service offers a single reference implementation of the registry capabilities detailed in the Consultative Committee for Space Data Systems (CCSDS) Registry Reference Model White Book. The CCSDS Reference Model in turn relies heavily on the Electronic Business using eXtensible Markup Language (ebXML) standards for registry services and the registry information model, managed by the OASIS consortium. Registries are pervasive components in most information systems. For example, data dictionaries, service registries, LDAP directory services, and even databases provide registry-like services. These all include an account of informational items that are used in large-scale information systems ranging from data values such as names and codes, to vocabularies, services and software components. The problem is that many of these registry-like services were designed with their own data models associated with the specific type of artifact they track. Additionally these services each have their own specific interface for interacting with the service. This Registry Service implements the data model specified in the ebXML Registry Information Model (RIM) specification that supports the various artifacts above as well as offering the flexibility to support customer-defined artifacts. Key features for the Registry Service include: - Model-based configuration specifying customer-defined artifact types, metadata attributes to capture for each artifact type, supported associations and classification schemes. - A REST-based external interface that is accessible via the Hypertext Transfer Protocol (HTTP). - Federation of Registry Service instances allowing associations between registered artifacts across registries as well as queries for artifacts across those same registries. A federation also enables features such as replication and synchronization if desired for a given deployment. In addition to its use as a core component of the PDS, the generic implementation of the Registry Service facilitates its applicability as a core component in any science data archive or science data system.
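As an illustration of the kind of REST interaction such a registry exposes, the sketch below posts an artifact registration as JSON. The endpoint path, field names, and payload structure are hypothetical and are not taken from the PDS Registry Service or the ebXML RIM specification.

```python
import json
import urllib.request

# Hypothetical registry endpoint and artifact description; field names are
# illustrative only, not the actual PDS Registry Service or ebXML RIM schema.
REGISTRY_URL = "http://localhost:8080/registry/artifacts"

artifact = {
    "lid": "urn:example:data:observation_001",  # logical identifier (assumed)
    "objectType": "Product",
    "name": "Observation 001 label",
    "slots": {"mimeType": "application/xml", "version": "1.0"},
}

request = urllib.request.Request(
    REGISTRY_URL,
    data=json.dumps(artifact).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# The call is commented out so the sketch does not require a live registry.
# with urllib.request.urlopen(request) as response:
#     print(response.status, response.read().decode())
print(request.full_url, request.get_method())
```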
WE-G-209-01: Digital Radiography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schueler, B.
Digital radiography, CT, PET, and MR are complicated imaging modalities which are composed of many hardware and software components. These components work together in a highly coordinated chain of events with the intent to produce high quality images. Acquisition, processing and reconstruction of data must occur in a precise way for optimum image quality to be achieved. Any error or unexpected event in the entire process can produce unwanted pixel intensities in the final images which may contribute to visible image artifacts. The diagnostic imaging physicist is uniquely qualified to investigate and contribute to resolution of image artifacts. This course will teach the participant to identify common artifacts found clinically in digital radiography, CT, PET, and MR, to determine the causes of artifacts, and to make recommendations for how to resolve artifacts. Learning Objectives: Identify common artifacts found clinically in digital radiography, CT, PET and MR. Determine causes of various clinical artifacts from digital radiography, CT, PET and MR. Describe how to resolve various clinical artifacts from digital radiography, CT, PET and MR.
Frauscher, Birgit; Gabelia, David; Biermayr, Marlene; Stefani, Ambra; Hackner, Heinz; Mitterling, Thomas; Poewe, Werner; Högl, Birgit
2014-10-01
Rapid eye movement sleep without atonia (RWA) is the polysomnographic hallmark of REM sleep behavior disorder (RBD). To partially overcome the disadvantages of manual RWA scoring, which is time consuming but essential for the accurate diagnosis of RBD, we aimed to validate software specifically developed and integrated with polysomnography for RWA detection against the gold standard of manual RWA quantification. The study was conducted in an academic referral center sleep laboratory. Polysomnographic recordings of 20 patients with RBD and 60 healthy volunteers were analyzed. Motor activity during REM sleep was quantified manually and computer assisted (with and without artifact detection) according to the Sleep Innsbruck Barcelona (SINBAR) criteria for the mentalis ("any," phasic, tonic electromyographic [EMG] activity) and the flexor digitorum superficialis (FDS) muscle (phasic EMG activity). Computer-derived indices (with and without artifact correction) for "any," phasic, and tonic mentalis EMG activity, phasic FDS EMG activity, and the SINBAR index ("any" mentalis + phasic FDS) correlated well with the manually derived indices (all Spearman rhos 0.66-0.98). In contrast with computerized scoring alone, computerized scoring plus manual artifact correction (median duration 5.4 min) led to a significant reduction of false positives for "any" mentalis (40%), phasic mentalis (40.6%), and the SINBAR index (41.2%). Quantification of tonic mentalis and phasic FDS EMG activity was not influenced by artifact correction. The computer algorithm used here appears to be a promising tool for REM sleep behavior disorder detection in both research and clinical routine. A short check for plausibility of the automatic detection should be a basic prerequisite for this and all other available computer algorithms. © 2014 Associated Professional Sleep Societies, LLC.
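A greatly simplified version of computer-assisted RWA scoring might split REM sleep into short mini-epochs and report the percentage whose EMG amplitude exceeds a baseline-derived threshold. The mini-epoch length and threshold factor below are assumptions; the SINBAR rules and the validated software are considerably more elaborate (separate phasic and tonic criteria, artifact correction).

```python
import numpy as np

def rwa_any_index(emg: np.ndarray, fs: float, baseline_rms: float,
                  mini_epoch_s: float = 3.0, factor: float = 2.0) -> float:
    """Percentage of REM mini-epochs with 'any' EMG activity.

    emg:          rectified mentalis EMG during REM sleep (uV).
    fs:           sampling rate in Hz.
    baseline_rms: RMS of the quietest REM baseline segment (uV).
    A mini-epoch counts as active if its RMS exceeds factor * baseline_rms.
    """
    n = int(mini_epoch_s * fs)
    n_epochs = len(emg) // n
    active = 0
    for i in range(n_epochs):
        seg = emg[i * n:(i + 1) * n]
        if np.sqrt(np.mean(seg**2)) > factor * baseline_rms:
            active += 1
    return 100.0 * active / max(n_epochs, 1)

# Synthetic example: 5 minutes of REM EMG at 256 Hz with one burst.
fs = 256.0
rng = np.random.default_rng(3)
emg = np.abs(rng.normal(0, 2.0, size=int(300 * fs)))
emg[int(40 * fs):int(43 * fs)] += 20.0  # one phasic-like burst
print(f"'any' RWA index: {rwa_any_index(emg, fs, baseline_rms=2.0):.1f}%")
```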
NASA Astrophysics Data System (ADS)
Collins, J.; Riegler, G.; Schrader, H.; Tinz, M.
2015-04-01
The Geo-intelligence division of Airbus Defence and Space and the German Aerospace Center (DLR) have partnered to produce the first fully global, high-accuracy Digital Surface Model (DSM) using SAR data from the twin-satellite constellation TerraSAR-X and TanDEM-X. The DLR is responsible for the processing and distribution of the TanDEM-X elevation model for the world's scientific community, while Airbus DS is responsible for the commercial production and distribution of the data, under the brand name WorldDEM. For the provision of a consumer-ready product, Airbus DS undertakes several steps to reduce the effect of radar-specific artifacts in the WorldDEM data. These artifacts can be divided into two categories: terrain and hydrological. Airbus DS has developed proprietary software and processes to detect and correct these artifacts in the most efficient manner. Some processes are fully automatic, while others require manual or semi-automatic control by operators.
A CMMI-based approach for medical software project life cycle study.
Chen, Jui-Jen; Su, Wu-Chen; Wang, Pei-Wen; Yen, Hung-Chi
2013-01-01
In terms of medical techniques, Taiwan has gained international recognition in recent years. However, the medical information system industry in Taiwan is still at a developing stage compared with the software industries in other nations. In addition, systematic development processes are indispensable elements of software development. They can help developers increase their productivity and efficiency and also avoid unnecessary risks arising during the development process. Thus, this paper presents an application of Light-Weight Capability Maturity Model Integration (LW-CMMI) to the Chang Gung Medical Research Project (CMRP) in the nuclear medicine field. This application was intended to integrate the user requirements, system design, and testing of software development processes into a three-layer (Domain, Concept, and Instance) model. The model is expressed in structured Systems Modeling Language (SysML) diagrams and converts part of the manual effort necessary for project management maintenance into computational effort, for example, (semi-)automatic delivery of traceability management. In this application, it supports establishing the artifacts of the "requirement specification document", "project execution plan document", "system design document", and "system test document", and can deliver a prototype of a lightweight project management tool for the nuclear medicine software project. The results of this application can be a reference for other medical institutions in developing medical information systems and supporting project management to achieve the aim of patient safety.
ARCHAEO-SCAN: Portable 3D shape measurement system for archaeological field work
NASA Astrophysics Data System (ADS)
Knopf, George K.; Nelson, Andrew J.
2004-10-01
Accurate measurement and thorough documentation of excavated artifacts are the essential tasks of archaeological fieldwork. The on-site recording and long-term preservation of fragile evidence can be improved using 3D spatial data acquisition and computer-aided modeling technologies. Once the artifact is digitized and geometry created in a virtual environment, the scientist can manipulate the pieces in a virtual reality environment to develop a "realistic" reconstruction of the object without physically handling or gluing the fragments. The ARCHAEO-SCAN system is a flexible, affordable 3D coordinate data acquisition and geometric modeling system for acquiring surface and shape information of small to medium sized artifacts and bone fragments. The shape measurement system is being developed to enable the field archaeologist to manually sweep the non-contact sensor head across the relic or artifact surface. A series of unique data acquisition, processing, registration and surface reconstruction algorithms are then used to integrate 3D coordinate information from multiple views into a single reference frame. A novel technique for automatically creating a hexahedral mesh of the recovered fragments is presented. The 3D model acquisition system is designed to operate from a standard laptop with minimal additional hardware and proprietary software support. The captured shape data can be pre-processed and displayed on site, stored digitally on a CD, or transmitted via the Internet to the researcher's home institution.
Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R
2018-01-01
Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867-file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and recording lengths. HAPPE software is freely available under the terms of the GNU General Public License at https://github.com/lcnhappe/happe.
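HAPPE itself is distributed as MATLAB code built on EEGLAB; the following Python sketch is only a rough, language-agnostic illustration of the kind of steps such a pipeline chains together (band-pass filtering, threshold-based channel rejection, average re-referencing). The filter settings, rejection threshold, and synthetic data are illustrative assumptions, not HAPPE's actual algorithm.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def clean_eeg(data, fs, band=(1.0, 40.0), reject_uv=500.0):
    """Toy EEG cleaning: band-pass filter, amplitude-based channel
    rejection, and average re-referencing. `data` is channels x samples."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, data, axis=1)
    # Flag channels whose peak-to-peak amplitude exceeds the threshold.
    ptp = filtered.max(axis=1) - filtered.min(axis=1)
    good = ptp < reject_uv
    kept = filtered[good]
    # Re-reference the remaining channels to their common average.
    return kept - kept.mean(axis=0, keepdims=True), good

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    eeg = rng.normal(0, 20, (32, 2500))            # 32 channels, 10 s at 250 Hz
    eeg[5] += 1000 * np.sin(np.arange(2500) / 25)  # one artifact-laden channel
    cleaned, good = clean_eeg(eeg, fs=250)
    print(cleaned.shape, int(good.sum()), "channels kept")
```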
SIMA: Python software for analysis of dynamic fluorescence imaging data.
Kaifosh, Patrick; Zaremba, Jeffrey D; Danielson, Nathan B; Losonczy, Attila
2014-01-01
Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.
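SIMA's own API is not reproduced here; as a minimal stand-in for the signal-extraction step the abstract describes, the sketch below computes the mean fluorescence per frame for boolean ROI masks and a simple dF/F, using plain NumPy. The array shapes and the baseline percentile are illustrative assumptions.

```python
import numpy as np

def extract_roi_signals(frames, roi_masks):
    """Mean fluorescence per frame for each ROI.
    frames: (T, H, W) imaging stack; roi_masks: iterable of (H, W) boolean masks.
    Returns an (N, T) array of raw ROI time series."""
    signals = np.empty((len(roi_masks), frames.shape[0]))
    for i, mask in enumerate(roi_masks):
        signals[i] = frames[:, mask].mean(axis=1)   # average over ROI pixels
    return signals

def delta_f_over_f(signal, baseline_percentile=20):
    """Simple dF/F using a low percentile of the trace as the baseline F0."""
    f0 = np.percentile(signal, baseline_percentile)
    return (signal - f0) / f0
```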
Reynoso, Exequiel; Capunay, Carlos; Rasumoff, Alejandro; Vallejos, Javier; Carpio, Jimena; Lago, Karen; Carrascosa, Patricia
2016-01-01
The aim of this study was to explore the usefulness of combined virtual monochromatic imaging and metal artifact reduction software (MARS) for the evaluation of musculoskeletal periprosthetic tissue. Measurements were performed in periprosthetic and remote regions in 80 patients using a high-definition scanner. Polychromatic images with and without MARS and virtual monochromatic images were obtained. Periprosthetic polychromatic imaging (PI) showed significant differences compared with remote areas among the 3 tissues explored (P < 0.0001). No significant differences were observed between periprosthetic and remote tissues using monochromatic imaging with MARS (P = 0.053 bone, P = 0.32 soft tissue, and P = 0.13 fat). However, such differences were significant using PI with MARS among bone (P = 0.005) and fat (P = 0.02) tissues. All periprosthetic areas were noninterpretable using PI, compared with 11 (9%) using monochromatic imaging. The combined use of virtual monochromatic imaging and MARS reduced periprosthetic artifacts, achieving attenuation levels comparable to implant-free tissue.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, V; Kohli, K
Purpose: Metal artifact reduction (MAR) software in computed tomography (CT) was previously evaluated with phantoms demonstrating the algorithm is capable of reducing metal artifacts without affecting the overall image quality. The goal of this study is to determine the dosimetric impact when calculating with CT datasets reconstructed with and without MAR software. Methods: Twelve head and neck cancer patients with dental fillings and four pelvic cancer patients with hip prosthesis were scanned with a GE Optima RT 580 CT scanner. Images were reconstructed with and without the MAR software. 6MV IMRT and VMAT plans were calculated with AAA on the MAR dataset until all constraints met our clinic’s guidelines. Contours from the MAR dataset were copied to the non-MAR dataset. Next, dose calculation on the non-MAR dataset was performed using the same field arrangements and fluence as the MAR plan. Conformality index, D99% and V100% to PTV were compared between MAR and non-MAR plans. Results: Differences between MAR and non-MAR plans were evaluated. For head and neck plans, the largest variations in conformality index, D99% and V100% were −3.8%, −0.9% and −2.1% respectively whereas for pelvic plans, the biggest discrepancies were −32.7%, −0.4% and −33.5% respectively. The dosimetric impact from hip prosthesis is greater because it produces more artifacts compared to dental fillings. Coverage to PTV can increase or decrease depending on the artifacts since dark streaks reduce the HU whereas bright streaks increase the HU. In the majority of the cases, PTV dose in the non-MAR plans is higher than MAR plans. Conclusion: With the presence of metals, MAR algorithm can allow more accurate delineation of targets and OARs. Dose difference between MAR and non-MAR plans depends on the proximity of the organ to the high density material, the streaking artifacts and the beam arrangements of the plan.
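The abstract reports conformality index, D99% and V100% without defining them; the sketch below computes the two PTV metrics under their usual definitions (D99% as the dose covering 99% of the PTV, V100% as the fraction of the PTV at or above the prescription dose), using a made-up dose grid and mask rather than the study's treatment-planning data.

```python
import numpy as np

def ptv_metrics(dose, ptv_mask, prescription):
    """D99% (dose received by 99% of the PTV) and V100% (fraction of the
    PTV at or above the prescription dose) from a dose grid and a mask."""
    ptv_dose = dose[ptv_mask]
    d99 = np.percentile(ptv_dose, 1)          # 99% of PTV voxels get at least this dose
    v100 = float(np.mean(ptv_dose >= prescription))
    return d99, v100

def percent_difference(mar_value, non_mar_value):
    """Relative difference of a non-MAR metric with respect to the MAR plan."""
    return 100.0 * (non_mar_value - mar_value) / mar_value

# Illustrative only: a random dose grid and a box-shaped PTV.
rng = np.random.default_rng(0)
dose = rng.normal(60.0, 2.0, (40, 40, 40))
ptv = np.zeros_like(dose, dtype=bool); ptv[15:25, 15:25, 15:25] = True
print(ptv_metrics(dose, ptv, prescription=60.0))
```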
A code inspection process for security reviews
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garzoglio, Gabriele; /Fermilab
2009-05-01
In recent years, it has become more and more evident that software threat communities are taking an increasing interest in Grid infrastructures. To mitigate the security risk associated with the increased numbers of attacks, the Grid software development community needs to scale up effort to reduce software vulnerabilities. This can be achieved by introducing security review processes as a standard project management practice. The Grid Facilities Department of the Fermilab Computing Division has developed a code inspection process, tailored to reviewing security properties of software. The goal of the process is to identify technical risks associated with an application and their impact. This is achieved by focusing on the business needs of the application (what it does and protects), on understanding threats and exploit communities (what an exploiter gains), and on uncovering potential vulnerabilities (what defects can be exploited). The desired outcome of the process is an improvement of the quality of the software artifact and an enhanced understanding of possible mitigation strategies for residual risks. This paper describes the inspection process and lessons learned on applying it to Grid middleware.
A code inspection process for security reviews
NASA Astrophysics Data System (ADS)
Garzoglio, Gabriele
2010-04-01
In recent years, it has become more and more evident that software threat communities are taking an increasing interest in Grid infrastructures. To mitigate the security risk associated with the increased numbers of attacks, the Grid software development community needs to scale up effort to reduce software vulnerabilities. This can be achieved by introducing security review processes as a standard project management practice. The Grid Facilities Department of the Fermilab Computing Division has developed a code inspection process, tailored to reviewing security properties of software. The goal of the process is to identify technical risks associated with an application and their impact. This is achieved by focusing on the business needs of the application (what it does and protects), on understanding threats and exploit communities (what an exploiter gains), and on uncovering potential vulnerabilities (what defects can be exploited). The desired outcome of the process is an improvement of the quality of the software artifact and an enhanced understanding of possible mitigation strategies for residual risks. This paper describes the inspection process and lessons learned on applying it to Grid middleware.
Automated Generation of Fault Management Artifacts from a Simple System Model
NASA Technical Reports Server (NTRS)
Kennedy, Andrew K.; Day, John C.
2013-01-01
Our understanding of off-nominal behavior - failure modes and fault propagation - in complex systems is often based purely on engineering intuition; specific cases are assessed in an ad hoc fashion as a (fallible) fault management engineer sees fit. This work is an attempt to provide a more rigorous approach to this understanding and assessment by automating the creation of a fault management artifact, the Failure Modes and Effects Analysis (FMEA) through querying a representation of the system in a SysML model. This work builds off the previous development of an off-nominal behavior model for the upcoming Soil Moisture Active-Passive (SMAP) mission at the Jet Propulsion Laboratory. We further developed the previous system model to more fully incorporate the ideas of State Analysis, and it was restructured in an organizational hierarchy that models the system as layers of control systems while also incorporating the concept of "design authority". We present software that was developed to traverse the elements and relationships in this model to automatically construct an FMEA spreadsheet. We further discuss extending this model to automatically generate other typical fault management artifacts, such as Fault Trees, to efficiently portray system behavior, and depend less on the intuition of fault management engineers to ensure complete examination of off-nominal behavior.
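The actual tool queries a SysML model of the SMAP system; as a toy illustration of the traversal idea only (walk components, enumerate their failure modes, emit FMEA rows), the sketch below substitutes a hand-written dictionary for the model and writes a CSV file. All component names, failure modes, and relationships are hypothetical.

```python
import csv

# Hypothetical stand-in for what would be queried from the SysML model.
system_model = {
    "Battery": {
        "controls": ["Power Bus"],
        "failure_modes": [("Cell short", "Loss of bus voltage")],
    },
    "Star Tracker": {
        "controls": ["Attitude Estimate"],
        "failure_modes": [("Sensor blinding", "Degraded attitude knowledge")],
    },
}

def write_fmea(model, path):
    """Emit one FMEA row per (component, failure mode), listing the
    controlled states the failure can affect."""
    with open(path, "w", newline="") as handle:
        writer = csv.writer(handle)
        writer.writerow(["Component", "Failure Mode", "Local Effect", "Affected States"])
        for component, info in model.items():
            for mode, effect in info["failure_modes"]:
                writer.writerow([component, mode, effect, "; ".join(info["controls"])])

write_fmea(system_model, "fmea.csv")
```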
Comparison of quality control software tools for diffusion tensor imaging.
Liu, Bilan; Zhu, Tong; Zhong, Jianhui
2015-04-01
Image quality of diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools have been developed and are widely used, each with its own tradeoffs, there is still no general agreement on an image quality control routine for DTI, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each of the QC tools will help users make the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools: DTI studio (Johns Hopkins University), DTIprep (University of North Carolina at Chapel Hill, University of Iowa and University of Utah) and TORTOISE (National Institutes of Health). Both synthetic and in vivo human brain data were used to quantify the adverse effects of major DTI artifacts on tensor calculation as well as the effectiveness of the different QC tools in identifying and correcting these artifacts. The technical basis of each tool was discussed, and the ways in which particular techniques affect the output of each of the tools were analyzed. The different functions and I/O formats that the three QC tools provide for building a general DTI processing pipeline and integrating with other popular image processing tools were also discussed. Copyright © 2015 Elsevier Inc. All rights reserved.
[Design of a pulse oximeter for low perfusion and low oxygen saturation].
Tan, Shuangping; Ai, Zhiguang; Yang, Yuxing; Xie, Qingguo
2013-05-01
This paper presents a new pulse oximeter designed for perfusion as low as 0.125% and a wide oxygen saturation range from 35% to 100%. In order to acquire the best PPG signals, a variable gain amplifier (VGA) is adopted in hardware. A self-developed auto-correlation modeling method is adopted in software; it can extract the pulse wave from low-perfusion signals and partly remove motion artifacts.
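The authors' auto-correlation modeling method is proprietary; the sketch below only illustrates the generic idea of recovering a pulse period from a weak, noisy PPG segment via the autocorrelation peak within a physiological lag range. The sampling rate, heart-rate bounds, and synthetic signal are assumptions.

```python
import numpy as np

def pulse_rate_from_ppg(ppg, fs, hr_range=(40, 200)):
    """Estimate heart rate (bpm) from a noisy PPG segment by locating the
    dominant autocorrelation peak within a physiological lag range."""
    x = ppg - np.mean(ppg)
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    lo = int(fs * 60.0 / hr_range[1])      # shortest plausible pulse period (samples)
    hi = int(fs * 60.0 / hr_range[0])      # longest plausible pulse period (samples)
    lag = lo + int(np.argmax(acf[lo:hi]))
    return 60.0 * fs / lag

if __name__ == "__main__":
    fs = 100.0
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(0)
    ppg = 0.02 * np.sin(2 * np.pi * 1.2 * t) + 0.02 * rng.normal(size=t.size)
    print(round(pulse_rate_from_ppg(ppg, fs), 1), "bpm")   # close to 72 bpm
```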
Dunet, Vincent; Bernasconi, Martine; Hajdu, Steven David; Meuli, Reto Antoine; Daniel, Roy Thomas; Zerlauth, Jean-Baptiste
2017-09-01
We aimed to assess the impact of metal artifact reduction software (MARs) on image quality of gemstone spectral imaging (GSI) dual-energy (DE) cerebral CT angiography (CTA) after intracranial aneurysm clipping. This retrospective study was approved by the institutional review board, which waived patient written consent. From January 2013 to September 2016, single source DE cerebral CTA were performed in 45 patients (mean age: 60 ± 9 years, male 9) after intracranial aneurysm clipping and reconstructed with and without MARs. Signal-to-noise (SNR), contrast-to-noise (CNR), and relative CNR (rCNR) ratios were calculated from attenuation values measured in the internal carotid artery (ICA) and middle cerebral artery (MCA). Volume of clip and artifacts and relative clip blurring reduction (rCBR) ratios were also measured at each energy level with/without MARs. Variables were compared between GSI and GSI-MARs using the paired Wilcoxon signed-rank test. MARs significantly reduced metal artifacts at all energy levels but 130 and 140 keV, regardless of clips' location and number. The optimal rCBR was obtained at 110 and 80 keV, respectively, on GSI and GSI-MARs images, with up to 96% rCNR increase on GSI-MARs images. The best compromise between metal artifact reduction and rCNR was obtained at 70-75 and 65-70 keV for GSI and GSI-MARs images, respectively, with up to 15% rCBR and rCNR increase on GSI-MARs images. MARs significantly reduces metal artifacts on DE cerebral CTA after intracranial aneurysm clipping regardless of clips' location and number. It may be used to reduce radiation dose while increasing CNR.
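The abstract quotes SNR, CNR and relative CNR without giving formulas; a minimal sketch under the commonly used ROI-based definitions (mean vessel attenuation over image noise, and vessel-tissue contrast over noise) follows, with illustrative HU values rather than the study's measurements.

```python
def snr(vessel_mean, noise_sd):
    """Signal-to-noise ratio of a vessel ROI against image noise (SD in HU)."""
    return vessel_mean / noise_sd

def cnr(vessel_mean, tissue_mean, noise_sd):
    """Contrast-to-noise ratio between the vessel and adjacent brain tissue."""
    return (vessel_mean - tissue_mean) / noise_sd

def relative_cnr(cnr_with_mars, cnr_without_mars):
    """Relative CNR change (%) of MARs images over non-MARs images."""
    return 100.0 * (cnr_with_mars - cnr_without_mars) / cnr_without_mars

# Illustrative HU values only, not the study's measurements.
print(cnr(vessel_mean=420.0, tissue_mean=40.0, noise_sd=12.0))
```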
Comparison of BrainTool to other UML modeling and model transformation tools
NASA Astrophysics Data System (ADS)
Nikiforova, Oksana; Gusarovs, Konstantins
2017-07-01
Over the last 30 years, numerous model-driven software development approaches have been offered to address problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims regarding Unified Modeling Language (UML) models at different levels of abstraction, arguing that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones, and extending to a code generator (which may use a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor to define new transformations for the most advanced ones. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.
Capturing Requirements for Autonomous Spacecraft with Autonomy Requirements Engineering
NASA Astrophysics Data System (ADS)
Vassev, Emil; Hinchey, Mike
2014-08-01
The Autonomy Requirements Engineering (ARE) approach has been developed by Lero - the Irish Software Engineering Research Center - within the mandate of a joint project with ESA, the European Space Agency. The approach is intended to help engineers develop missions for unmanned exploration, often with limited or no human control. Such robotic space missions rely on the most recent advances in automation and robotic technologies, where autonomy and autonomic computing principles drive the design and implementation of unmanned spacecraft [1]. To tackle the integration and promotion of autonomy in software-intensive systems, ARE combines generic autonomy requirements (GAR) with goal-oriented requirements engineering (GORE). Using this approach, software engineers can determine what autonomic features to develop for a particular system (e.g., a space mission) as well as what artifacts the process might generate (e.g., goal models, requirements specifications, etc.). The inputs required by this approach are the mission goals and the domain-specific GAR reflecting the specifics of the mission class (e.g., interplanetary missions).
Pike, William A; Riensche, Roderick M; Best, Daniel M; Roberts, Ian E; Whyatt, Marie V; Hart, Michelle L; Carr, Norman J; Thomas, James J
2012-09-18
Systems and computer-implemented processes for storage and management of information artifacts collected by information analysts using a computing device. The processes and systems can capture a sequence of interactive operation elements that are performed by the information analyst, who is collecting an information artifact from at least one of the plurality of software applications. The information artifact can then be stored together with the interactive operation elements as a snippet on a memory device, which is operably connected to the processor. The snippet comprises a view from an analysis application, data contained in the view, and the sequence of interactive operation elements stored as a provenance representation comprising operation element class, timestamp, and data object attributes for each interactive operation element in the sequence.
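As a rough illustration of the data structure the patent describes (a snippet holding a view, its data, and a provenance trail of operation elements with class, timestamp and data-object attributes), the following Python sketch uses dataclasses; the field names and example values are assumptions, not the patented implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Any, List

@dataclass
class OperationElement:
    """One interactive step in the provenance trail."""
    element_class: str        # e.g. "copy", "query", "annotate"
    timestamp: datetime
    data_object: Any          # attributes of the object acted upon

@dataclass
class Snippet:
    """An information artifact stored together with how it was collected."""
    view: str                 # the analysis-application view it came from
    data: Any                 # the data contained in that view
    provenance: List[OperationElement] = field(default_factory=list)

    def record(self, element_class: str, data_object: Any) -> None:
        """Append one interactive operation element to the provenance."""
        self.provenance.append(
            OperationElement(element_class, datetime.now(), data_object))

snippet = Snippet(view="document browser", data="excerpt text")
snippet.record("copy", {"source": "report.pdf", "page": 3})
```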
Baghaie, Ahmadreza; Yu, Zeyun; D'Souza, Roshan M
2017-04-01
In this paper, we review state-of-the-art techniques to correct eye motion artifacts in Optical Coherence Tomography (OCT) imaging. The methods for eye motion artifact reduction can be categorized into two major classes: (1) hardware-based techniques and (2) software-based techniques. In the first class, additional hardware is mounted onto the OCT scanner to gather information about the eye motion patterns during OCT data acquisition. This information is later processed and applied to the OCT data to create an anatomically correct representation of the retina, either in an offline or online manner. In software-based techniques, the motion patterns are approximated either by comparing the acquired data to a reference image, or by considering some prior assumptions about the nature of the eye motion. Careful investigation of the most common methods in the field provides invaluable insight regarding future directions of the research in this area. The challenge in hardware-based techniques lies in the implementation aspects of particular devices. However, the results of these techniques are superior to those obtained from software-based techniques because they are capable of capturing secondary data related to eye motion during OCT acquisition. Software-based techniques, on the other hand, achieve moderate success and their performance is highly dependent on the quality of the OCT data in terms of the amount of motion artifacts contained in them. However, they are still relevant to the field since they are the sole class of techniques with the ability to be applied to legacy data acquired using systems that do not have extra hardware to track eye motion. Copyright © 2017 Elsevier B.V. All rights reserved.
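For the reference-image comparison idea used by software-based methods, a minimal sketch of rigid shift estimation by phase correlation is shown below (pure NumPy, integer-pixel shifts only; real OCT motion correction must also handle rotation, scan distortion, and subpixel motion).

```python
import numpy as np

def estimate_shift(reference, frame):
    """Integer (row, col) translation that maps `reference` onto `frame`,
    estimated by phase correlation of the two images."""
    cross = np.fft.fft2(frame) * np.conj(np.fft.fft2(reference))
    cross /= np.abs(cross) + 1e-12            # keep phase only
    corr = np.fft.ifft2(cross).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    size = np.array(corr.shape)
    wrap = peak > size // 2                   # map wrapped peaks to negative shifts
    peak[wrap] -= size[wrap]
    return peak

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    ref = rng.normal(size=(128, 128))
    moved = np.roll(ref, shift=(5, -3), axis=(0, 1))
    print(estimate_shift(ref, moved))         # [ 5. -3.]
```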
Brook, Olga R; Gourtsoyianni, Sofia; Brook, Alexander; Mahadevan, Anand; Wilcox, Carol; Raptopoulos, Vassilios
2012-06-01
To evaluate spectral computed tomography (CT) with metal artifacts reduction software (MARS) for reduction of metal artifacts associated with gold fiducial seeds. Thirteen consecutive patients with 37 fiducial seeds implanted for radiation therapy of abdominal lesions were included in this HIPAA-compliant, institutional review board-approved prospective study. Six patients were women (46%) and seven were men (54%). The mean age was 61.1 years (median, 58 years; range, 29-78 years). Spectral imaging was used for arterial phase CT. Images were reconstructed with and without MARS in axial, coronal, and sagittal planes. Two radiologists independently reviewed reconstructions and selected the best image, graded the visibility of the tumor, and assessed the amount of artifacts in all planes. A linear-weighted κ statistic and Wilcoxon signed-rank test were used to assess interobserver variability. Histogram analysis with the Kolmogorov-Smirnov test was used for objective evaluation of artifacts reduction. Fiducial seeds were placed in pancreas (n = 5), liver (n = 7), periportal lymph nodes (n = 1), and gallbladder bed (n = 1). MARS-reconstructed images received a better grade than those with standard reconstruction in 60% and 65% of patients by the first and second radiologist, respectively. Tumor visibility was graded higher with standard versus MARS reconstruction (grade, 3.7 ± 1.0 vs 2.8 ± 1.1; P = .001). Reduction of blooming was noted on MARS-reconstructed images (P = .01). Amount of artifacts, for both any and near field, was significantly smaller on sagittal and coronal MARS-reconstructed images than on standard reconstructions (P < .001 for all comparisons). Far-field artifacts were more prominent on axial MARS-reconstructed images than on standard reconstructions (P < .01). Linear-weighted κ statistic showed moderate to perfect agreement between radiologists. CT number distribution was narrower with MARS than with standard reconstruction in 35 of 37 patients (P < .001). Spectral CT with use of MARS improved tumor visibility in the vicinity of gold fiducial seeds.
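The objective evaluation rests on comparing CT-number distributions with and without MARS via the Kolmogorov-Smirnov test; a minimal sketch of that comparison with scipy.stats.ks_2samp follows, using synthetic HU samples in place of the patient ROIs.

```python
import numpy as np
from scipy import stats

def compare_ct_distributions(hu_standard, hu_mars):
    """Two-sample Kolmogorov-Smirnov test between CT-number samples from the
    same ROI on standard and MARS reconstructions, plus each sample's SD as a
    simple surrogate for artifact-induced spread."""
    ks_stat, p_value = stats.ks_2samp(hu_standard, hu_mars)
    return ks_stat, p_value, float(np.std(hu_standard)), float(np.std(hu_mars))

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    standard = rng.normal(60, 80, 5000)   # wide spread: streaks around the seed
    mars = rng.normal(60, 25, 5000)       # narrower CT-number distribution after MARS
    print(compare_ct_distributions(standard, mars))
```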
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fusella, M; Loi, G; Fiandra, C
Purpose: To investigate the accuracy and robustness, against image noise and artifacts (typical of CBCT images), of a commercial algorithm for deformable image registration (DIR), to propagate regions of interest (ROIs) in computational phantoms based on real prostate patient images. Methods: The Anaconda DIR algorithm, implemented in RayStation, was tested. Two specific Deformation Vector Fields (DVFs) were applied to the reference data set (CTref) using the ImSimQA software, obtaining two deformed CTs. For each dataset, twenty-four different levels of noise and/or capping artifacts were applied to simulate CBCT images. DIR was performed between CTref and each of the deformed CTs and CBCTs. In order to investigate the relationship between image quality parameters and the DIR results (expressed by a logit transform of the Dice index), a bilinear regression was defined. Results: More than 550 DIR-mapped ROIs were analyzed. The statistical analysis showed that deformation strength and artifacts were significant prognostic factors of DIR performance, while noise appeared to play a minor role in the DIR process as implemented in RayStation, as expected from the image similarity metric built into the registration algorithm. Capping artifacts played a determinant role in the accuracy of DIR results. Two optimal values for capping artifacts were found to obtain acceptable DIR results (Dice > 0.75/0.85). Various clinical CBCT acquisition protocols were reviewed to evaluate the significance of the study. Conclusion: This work illustrates the impact of image quality on DIR performance. Clinical issues like Adaptive Radiation Therapy (ART) and Dose Accumulation need accurate and robust DIR software. The RayStation DIR algorithm proved robust against noise, but sensitive to image artifacts. This result highlights the need for robustness quality assurance against image noise and artifacts in the commissioning of a commercial DIR system and underlines the importance of adopting optimized protocols for CBCT image acquisition in ART clinical implementation.
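The analysis is built on the Dice index of each propagated ROI and its logit transform; a short sketch of both quantities is given below (the toy masks are illustrative, and the bilinear regression itself is not reproduced).

```python
import numpy as np

def dice_index(mask_a, mask_b):
    """Dice similarity coefficient between two boolean masks."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def logit(dice, eps=1e-6):
    """Logit transform of a Dice value, mapping (0, 1) onto the real line."""
    d = np.clip(dice, eps, 1.0 - eps)
    return float(np.log(d / (1.0 - d)))

# Toy example: a propagated contour that misses part of the reference ROI.
ref = np.zeros((50, 50), dtype=bool); ref[10:40, 10:40] = True
prop = np.zeros((50, 50), dtype=bool); prop[13:40, 10:40] = True
d = dice_index(ref, prop)
print(round(d, 3), round(logit(d), 3))
```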
A graphical user interface for infant ERP analysis.
Kaatiala, Jussi; Yrttiaho, Santeri; Forssman, Linda; Perdue, Katherine; Leppänen, Jukka
2014-09-01
Recording of event-related potentials (ERPs) is one of the best-suited technologies for examining brain function in human infants. Yet the existing software packages are not optimized for the unique requirements of analyzing artifact-prone ERP data from infants. We developed a new graphical user interface that enables an efficient implementation of a two-stage approach to the analysis of infant ERPs. In the first stage, video records of infant behavior are synchronized with ERPs at the level of individual trials to reject epochs with noncompliant behavior and other artifacts. In the second stage, the interface calls MATLAB and EEGLAB (Delorme & Makeig, Journal of Neuroscience Methods 134(1):9-21, 2004) functions for further preprocessing of the ERP signal itself (i.e., filtering, artifact removal, interpolation, and rereferencing). Finally, methods are included for data visualization and analysis by using bootstrapped group averages. Analyses of simulated and real EEG data demonstrated that the proposed approach can be effectively used to establish task compliance, remove various types of artifacts, and perform representative visualizations and statistical comparisons of ERPs. The interface is available for download from http://www.uta.fi/med/icl/methods/eeg.html in a format that is widely applicable to ERP studies with special populations and open for further editing by users.
Cardiac gating with a pulse oximeter for dual-energy imaging
NASA Astrophysics Data System (ADS)
Shkumat, N. A.; Siewerdsen, J. H.; Dhanantwari, A. C.; Williams, D. B.; Paul, N. S.; Yorkston, J.; Van Metter, R.
2008-11-01
The development and evaluation of a prototype cardiac gating system for double-shot dual-energy (DE) imaging is described. By acquiring both low- and high-kVp images during the resting phase of the cardiac cycle (diastole), heart misalignment between images can be reduced, thereby decreasing the magnitude of cardiac motion artifacts. For this initial implementation, a fingertip pulse oximeter was employed to measure the peripheral pulse waveform ('plethysmogram'), offering potential logistic, cost and workflow advantages compared to an electrocardiogram. A gating method was developed that accommodates temporal delays due to physiological pulse propagation, oximeter waveform processing and the imaging system (software, filter-wheel, anti-scatter Bucky-grid and flat-panel detector). Modeling the diastolic period allowed the calculation of an implemented delay, timp, required to trigger correctly during diastole at any patient heart rate (HR). The model suggests a triggering scheme characterized by two HR regimes, separated by a threshold, HRthresh. For rates at or below HRthresh, sufficient time exists to expose on the same heartbeat as the plethysmogram pulse [timp(HR) = 0]. Above HRthresh, a characteristic timp(HR) delays exposure to the subsequent heartbeat, accounting for all fixed and variable system delays. Performance was evaluated in terms of accuracy and precision of diastole-trigger coincidence and quantitative evaluation of artifact severity in gated and ungated DE images. Initial implementation indicated 85% accuracy in diastole-trigger coincidence. Through the identification of an improved HR estimation method (modified temporal smoothing of the oximeter waveform), trigger accuracy of 100% could be achieved with improved precision. To quantify the effect of the gating system on DE image quality, human observer tests were conducted to measure the magnitude of cardiac artifact under conditions of successful and unsuccessful diastolic gating. Six observers independently measured the artifact in 111 patient DE images. The data indicate that successful diastolic gating results in a statistically significant reduction (p < 0.001) in the magnitude of cardiac motion artifact, with residual artifact attributed primarily to gross patient motion.
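As a rough sketch of the two-regime triggering scheme described (same-beat exposure at or below HRthresh, next-beat exposure above it after accounting for fixed system delays), the function below uses purely illustrative values for the threshold, fixed latency and diastolic offset; the paper's actual model of the diastolic period is not reproduced.

```python
def implemented_delay(hr_bpm, hr_thresh=75.0, fixed_delay=0.35, diastole_offset=0.45):
    """Delay (s) from the detected plethysmogram pulse to the exposure trigger.
    At or below hr_thresh there is time to expose on the same beat; above it,
    exposure is deferred to the next beat, subtracting the fixed hardware and
    software latency. All constants here are illustrative, not the paper's."""
    beat_period = 60.0 / hr_bpm
    if hr_bpm <= hr_thresh:
        return 0.0                                  # same-beat triggering
    wait = beat_period + diastole_offset * beat_period - fixed_delay
    return max(wait, 0.0)                           # next-beat triggering

for hr in (60, 75, 90, 110):
    print(hr, "bpm ->", round(implemented_delay(hr), 3), "s")
```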
Quantitative quality assurance in a multicenter HARDI clinical trial at 3T.
Zhou, Xiaopeng; Sakaie, Ken E; Debbins, Josef P; Kirsch, John E; Tatsuoka, Curtis; Fox, Robert J; Lowe, Mark J
2017-01-01
A phantom-based quality assurance (QA) protocol was developed for a multicenter clinical trial including high angular resolution diffusion imaging (HARDI). A total of 27 3T MR scanners from 2 major manufacturers, GE (Discovery and Signa scanners) and Siemens (Trio and Skyra scanners), were included in this trial. With this protocol, agar phantoms doped to mimic relaxation properties of brain tissue are scanned on a monthly basis, and quantitative procedures are used to detect spiking and to evaluate eddy current and Nyquist ghosting artifacts. In this study, simulations were used to determine alarm thresholds for minimal acceptable signal-to-noise ratio (SNR). Our results showed that spiking artifact was the most frequently observed type of artifact. Overall, Trio scanners exhibited less eddy current distortion than GE scanners, which in turn showed less distortion than Skyra scanners. This difference was mainly caused by the different sequences used on these scanners. The SNR for phantom scans was closely correlated with the SNR from volunteers. Nearly all of the phantom measurements with artifact-free images were above the alarm threshold, suggesting that the scanners are stable longitudinally. Software upgrades and hardware replacement sometimes affected SNR substantially but sometimes did not. In light of these results, it is important to monitor longitudinal SNR with phantom QA to help interpret potential effects on in vivo measurements. Our phantom QA procedure for HARDI scans was successful in tracking scanner performance and detecting unwanted artifacts. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Grycewicz, Thomas J.; Tan, Bin; Isaacson, Peter J.; De Luccia, Frank J.; Dellomo, John
2016-01-01
In developing software for independent verification and validation (IVV) of the Image Navigation and Registration (INR) capability for the Geostationary Operational Environmental Satellite R Series (GOES-R) Advanced Baseline Imager (ABI), we have encountered an image registration artifact which limits the accuracy of image offset estimation at the subpixel scale using image correlation. Where the two images to be registered have the same pixel size, subpixel image registration preferentially selects registration values where the image pixel boundaries are close to lined up. Because of the shape of a curve plotting input displacement to estimated offset, we call this a stair-step artifact. When one image is at a higher resolution than the other, the stair-step artifact is minimized by correlating at the higher resolution. For validating ABI image navigation, GOES-R images are correlated with Landsat-based ground truth maps. To create the ground truth map, the Landsat image is first transformed to the perspective seen from the GOES-R satellite, and then is scaled to an appropriate pixel size. Minimizing processing time motivates choosing the map pixels to be the same size as the GOES-R pixels. At this pixel size image processing of the shift estimate is efficient, but the stair-step artifact is present. If the map pixel is very small, stair-step is not a problem, but image correlation is computation-intensive. This paper describes simulation-based selection of the scale for truth maps for registering GOES-R ABI images.
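Subpixel offset estimation of the kind described is often done by interpolating around a correlation peak; the sketch below refines a 1-D peak with a three-point parabolic fit. This is a generic estimator, not the GOES-R IVV code, and it is exactly the regime in which the stair-step bias can appear when both images share a pixel grid.

```python
import numpy as np

def subpixel_peak(corr_values):
    """Refine a 1-D correlation peak to subpixel precision by fitting a
    parabola through the maximum and its two neighbours."""
    p = int(np.argmax(corr_values))
    if p == 0 or p == len(corr_values) - 1:
        return float(p)                 # peak on the edge: no refinement possible
    c_m, c_0, c_p = corr_values[p - 1], corr_values[p], corr_values[p + 1]
    delta = 0.5 * (c_m - c_p) / (c_m - 2.0 * c_0 + c_p)
    return p + delta

# Toy correlation curve whose true peak lies between samples 4 and 5.
corr = np.array([0.1, 0.3, 0.6, 0.85, 0.99, 0.97, 0.7, 0.4])
print(subpixel_peak(corr))
```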
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1992-01-01
The concepts of quality improvements have permeated many businesses. It is clear that the nineties will be the quality era for software and there is a growing need to develop or adapt quality improvement approaches to the software business. Thus we must understand software as an artifact and software as a business. Since the business we are dealing with is software, we must understand the nature of software and software development. The software discipline is evolutionary and experimental; it is a laboratory science. Software is development, not production. The technologies of the discipline are human based. There is a lack of models that allow us to reason about the process and the product. All software is not the same; process is a variable, goals are variable, etc. Packaged, reusable experiences require additional resources in the form of organization, processes, people, etc. A variety of organizational frameworks have been proposed to improve quality for various businesses. The ones discussed in this presentation include: Plan-Do-Check-Act, a quality improvement process based upon a feedback cycle for optimizing a single process model/production line; the Experience Factory/Quality Improvement Paradigm, continuous improvement through the experimentation, packaging, and reuse of experiences based upon a business's needs; Total Quality Management, a management approach to long term success through customer satisfaction based on the participation of all members of an organization; the SEI Capability Maturity Model, a staged process improvement based upon assessment with regard to a set of key process areas until level 5 is reached, which represents continuous process improvement; and Lean (software) Development, a principle supporting the concentration of production on 'value added' activities and the elimination or reduction of 'not value added' activities.
Fiberfox: facilitating the creation of realistic white matter software phantoms.
Neher, Peter F; Laun, Frederik B; Stieltjes, Bram; Maier-Hein, Klaus H
2014-11-01
Phantom-based validation of diffusion-weighted image processing techniques is an important key to innovation in the field and is widely used. Openly available and user friendly tools for the flexible generation of tailor-made datasets for the specific tasks at hand can greatly facilitate the work of researchers around the world. We present an open-source framework, Fiberfox, that enables (1) the intuitive definition of arbitrary artificial white matter fiber tracts, (2) signal generation from those fibers by means of the most recent multi-compartment modeling techniques, and (3) simulation of the actual MR acquisition that allows for the introduction of realistic MRI-related effects into the final image. We show that real acquisitions can be closely approximated by simulating the acquisition of the well-known FiberCup phantom. We further demonstrate the advantages of our framework by evaluating the effects of imaging artifacts and acquisition settings on the outcome of 12 tractography algorithms. Our findings suggest that experiments on a realistic software phantom might change the conclusions drawn from earlier hardware phantom experiments. Fiberfox may find application in validating and further developing methods such as tractography, super-resolution, diffusion modeling or artifact correction. Copyright © 2013 Wiley Periodicals, Inc.
Ring artifact reduction in synchrotron x-ray tomography through helical acquisition
NASA Astrophysics Data System (ADS)
Pelt, Daniël M.; Parkinson, Dilworth Y.
2018-03-01
In synchrotron x-ray tomography, systematic defects in certain detector elements can result in arc-shaped artifacts in the final reconstructed image of the scanned sample. These ring artifacts are commonly found in many applications of synchrotron tomography, and can make it difficult or impossible to use the reconstructed image in further analyses. The severity of ring artifacts is often reduced in practice by applying pre-processing on the acquired data, or post-processing on the reconstructed image. However, such additional processing steps can introduce additional artifacts as well, and rely on specific choices of hyperparameter values. In this paper, a different approach to reducing the severity of ring artifacts is introduced: a helical acquisition mode. By moving the sample parallel to the rotation axis during the experiment, the sample is detected at different detector positions in each projection, reducing the effect of systematic errors in detector elements. Alternatively, helical acquisition can be viewed as a way to transform ring artifacts to helix-like artifacts in the reconstructed volume, reducing their severity. We show that data acquired with the proposed mode can be transformed to data acquired with a virtual circular trajectory, enabling further processing of the data with existing software packages for circular data. Results for both simulated data and experimental data show that the proposed method is able to significantly reduce ring artifacts in practice, even compared with popular existing methods, without introducing additional artifacts.
Yang, Qiuxia; Peng, Sheng; Wu, Jing; Ban, Xiaohua; He, Mingyan; Xie, Chuanmiao; Zhang, Rong
2015-11-01
To investigate the optimal monochromatic energy for artifact reduction from (125)I seeds as well as image improvement in the vicinity of seeds on monochromatic images with and without metal artifacts reduction software (MARS), and to compare this with traditional 120-kVp images, so as to evaluate the application value of gemstone spectral imaging for reducing artifacts from (125)I seeds in liver brachytherapy. A total of 45 tumors from 25 patients treated with (125)I seed brachytherapy in the liver were enrolled in this study. Multiphasic spectral computed tomography (CT) scanning was performed for each patient. After a delay time of 15 s following the portal vein phase, a traditional 120-kVp scan was performed, focusing on several planes of (125)I seeds only. The artifact index (AI) in the vicinity of seeds and the standard deviation (SD) of the CT density of a region of interest in the outside liver parenchyma were calculated. Artifact appearance was evaluated and classified on reconstructed monochromatic and 120-kVp images. Image quality in the vicinity of seeds in the three data sets was evaluated using a 1-5 scoring scale. The Friedman rank-sum test was used to estimate the scoring results of image quality. The greatest noise in monochromatic images was found at 40 keV (SD = 27.38, AI = 206.40). The optimal monochromatic energy was found at 75 keV, which provided almost the least image noise (SD = 10.01) and good performance in artifact reduction (AI = 102.73). Image noise and AI at 75 keV were decreased by 63.44 and 50.23%, respectively, compared with 40 keV. Near-field thick artifacts were obvious in all 45 lesions on 120-kVp images and 75-keV images, but were largely reduced on 75-keV MARS images, with artifacts completely invisible in 7 lesions. The number of diagnosable images (score ≥3) was significantly greater in the 75-keV MARS group (28/45) and the 75-keV group (22/45) than in the 120-kVp group (11/45) (p < 0.0167 for both). Compared with 120-kVp images alone, 75-keV images plus 75-keV MARS images can increase tumor visibility around seeds and increase the proportion of diagnostic images to 84.4% (38/45). Spectral CT producing 75-keV MARS images could substantially reduce near-field thick artifacts caused by (125)I seeds and improve image quality, even to a state of being completely free from artifacts. Spectral CT imaging (with and without MARS) can provide more accurate CT images for estimating efficacy after (125)I seed brachytherapy in the liver.
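The abstract does not define the artifact index (AI); assuming the definition commonly used in CT artifact studies, AI = sqrt(SD_artifact^2 - SD_reference^2), a minimal sketch is given below with illustrative ROI standard deviations rather than the study's measurements.

```python
import math

def artifact_index(sd_near_seed, sd_reference):
    """Artifact index from ROI standard deviations, assuming the common
    definition AI = sqrt(SD_artifact^2 - SD_reference^2)."""
    diff = sd_near_seed ** 2 - sd_reference ** 2
    return math.sqrt(max(diff, 0.0))

# Illustrative ROI standard deviations (HU), not the study's measurements.
print(round(artifact_index(sd_near_seed=105.0, sd_reference=12.0), 2))
```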
NASA Astrophysics Data System (ADS)
Robinson, Wayne D.; Patt, Frederick S.; Franz, Bryan A.; Turpie, Kevin R.; McClain, Charles R.
2009-08-01
One of the roles of the VIIRS Ocean Science Team (VOST) is to assess the performance of the instrument and scientific processing software that generates ocean color parameters such as normalized water-leaving radiances and chlorophyll. A VIIRS data simulator is being developed to aid in this work. The simulator will create a sufficient set of simulated Sensor Data Records (SDR) so that the ocean component of the VIIRS processing system can be tested. It will also have the ability to study the impact of instrument artifacts on the derived parameter quality. The simulator will use existing resources available to generate the geolocation information and to transform calibrated radiances to geophysical parameters and vice versa. In addition, the simulator will be able to introduce land features, cloud fields, and expected VIIRS instrument artifacts. The design of the simulator and its progress will be presented.
A SQL-Database Based Meta-CASE System and its Query Subsystem
NASA Astrophysics Data System (ADS)
Eessaar, Erki; Sgirka, Rünno
Meta-CASE systems simplify the creation of CASE (Computer Aided System Engineering) systems. In this paper, we present a meta-CASE system that provides a web-based user interface and uses an object-relational database system (ORDBMS) as its basis. The use of ORDBMSs allows us to integrate different parts of the system and simplify the creation of meta-CASE and CASE systems. ORDBMSs provide a powerful query mechanism. The proposed system allows developers to use queries to evaluate and gradually improve artifacts and to calculate values of software measures. We illustrate the use of the system by using the SimpleM modeling language and discuss the use of SQL in the context of queries about artifacts. We have created a prototype of the meta-CASE system by using the PostgreSQL™ ORDBMS and the PHP scripting language.
New software for raw data mask processing increases diagnostic ability of myocardial SPECT imaging.
Tanaka, Ryo; Yoshioka, Katsunori; Seino, Kazue; Ohba, Muneo; Nakamura, Tomoharu; Shimada, Katsuhiko
2011-05-01
Increased activity of the myocardial perfusion tracer technetium-99m in the liver and hepatobiliary system causes streak artifacts, which may affect clinical diagnosis. We developed a mask-processing tool for raw data generated using technetium-99m as a myocardial perfusion tracer. Here, we describe improvements in image quality under the influence of artifacts caused by high accumulation in other organs. A heart phantom (RH-2) containing 15 MBq of pertechnetate was defined as model A. Model B was designed in the same phantom containing ten times the cardiac radioactivity overlapping with other organs. Variances in the vertical profile count in the lower part of the myocardial inferior wall and in the myocardial circumferential profile curve were investigated in a phantom and in clinical cases using our raw data masking (RDM) software. The profile variances at the lower parts of myocardial inferior walls were 965.43 in model A, 1390.11 in model B and 815.85 in B-RDM. The mean ± SD of myocardial circumferential profile curves were 83.91 ± 7.39 in model A, 69.61 ± 11.45 in model B and 82.68 ± 9.71 in model B-RDM. For 11 clinical images with streak artifacts, the average variance significantly differed with and without RDM (3.95 vs. 21.05; P < 0.05). For 50 clinical images with hepatic accumulation artifacts, the average variance of the vertical profiles on images with and without RDM significantly differed (5.99 vs. 15.59; P < 0.01). Furthermore, when a segment with <60% uptake in polar maps was defined as abnormal, the average extent scores at 1 h (Tc-1h), at 5 min with RDM (Tc-0h-RDM) and at 5 min without RDM (Tc-0h-non-RDM) were 2.25 ± 3.12, 2.35 ± 3.16, and 1.37 ± 2.41, respectively. Differences were significant between Tc-1h and Tc-0h-non-RDM (P < 0.005) but not between Tc-1h and Tc-0h-RDM. Batch processing was enabled in all frames by shifting the myocardium to the center of rotation using this software. The waiting time between infusion and image acquisition can thus be decreased, reducing patient burden and improving the diagnostic ability of the procedure.
2011-09-01
Fragments of two demo Bourne shell scripts for cryptographic signature generation and verification on a UNIX system: both set SHA=/bin/sha256 and a signature database CSDB=/tmp/csdb, and walk the code base (CODEBASE=.) with find "$CODEBASE" -type f to hash each file; the first script generates signatures for the code base and the second verifies the artifacts generated earlier.
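Only fragments of the demo scripts survive above; as a stand-in for the same idea, the Python sketch below builds and checks a SHA-256 digest database over a code base (hypothetical paths; plain digests stand in for true cryptographic signatures, which would additionally involve a signing key).

```python
import hashlib
import pathlib

def build_signature_db(codebase, db_path):
    """Record the SHA-256 digest of every regular file under `codebase`."""
    root = pathlib.Path(codebase)
    with open(db_path, "w") as db:
        for path in sorted(p for p in root.rglob("*") if p.is_file()):
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            db.write(f"{digest}  {path}\n")

def verify_signature_db(db_path):
    """Re-hash each recorded file and return the paths whose digests changed."""
    mismatches = []
    with open(db_path) as db:
        for line in db:
            digest, name = line.rstrip("\n").split("  ", 1)
            current = hashlib.sha256(pathlib.Path(name).read_bytes()).hexdigest()
            if current != digest:
                mismatches.append(name)
    return mismatches

if __name__ == "__main__":
    build_signature_db(".", "/tmp/csdb")
    print(verify_signature_db("/tmp/csdb"))   # [] if nothing changed since generation
```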
Astigmatism corrected common path probe for optical coherence tomography.
Singh, Kanwarpal; Yamada, Daisuke; Tearney, Guillermo
2017-03-01
Optical coherence tomography (OCT) catheters for intraluminal imaging are subject to various artifacts due to reference-sample arm dispersion imbalances and sample arm beam astigmatism. The goal of this work was to develop a probe that minimizes such artifacts. Our probe was fabricated using a single mode fiber at the tip of which a glass spacer and graded index objective lens were spliced to achieve the desired focal distance. The signal was reflected using a curved reflector to correct for astigmatism caused by the thin, protective, transparent sheath that surrounds the optics. The probe design was optimized using Zemax, a commercially available optical design software. Common path interferometric operation was achieved using Fresnel reflection from the tip of the focusing graded index objective lens. The performance of the probe was tested using a custom designed spectrometer-based OCT system. The probe achieved an axial resolution of 15.6 μm in air, a lateral resolution of 33 μm, and a sensitivity of 103 dB. A scattering tissue phantom was imaged to test the performance of the probe for astigmatism correction. Images of the phantom confirmed that this common-path, astigmatism-corrected OCT imaging probe had minimal artifacts in the axial and lateral dimensions. In this work, we developed an astigmatism-corrected, common path probe that minimizes artifacts associated with standard OCT probes. This design may be useful for OCT applications that require high axial and lateral resolutions. Lasers Surg. Med. 49:312-318, 2017. © 2016 Wiley Periodicals, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, P; Schreibmann, E; Fox, T
2014-06-15
Purpose: Severe CT artifacts can impair our ability to accurately calculate proton range, thereby resulting in a clinically unacceptable treatment plan. In this work, we investigated a novel CT artifact correction method based on a coregistered MRI and investigated its ability to estimate CT HU and proton range in the presence of severe CT artifacts. Methods: The proposed method corrects corrupted CT data using a coregistered MRI to guide the mapping of CT values from a nearby artifact-free region. First, patient MRI and CT images were registered using 3D deformable image registration software based on B-spline and mutual information. The CT slice with severe artifacts was selected as well as a nearby slice free of artifacts (e.g. 1cm away from the artifact). The two sets of paired MRI and CT images at different slice locations were further registered by applying 2D deformable image registration. Based on the artifact-free paired MRI and CT images, a comprehensive geospatial analysis was performed to predict the correct CT HU of the CT image with severe artifacts. For a proof of concept, a known artifact was introduced that changed the ground truth CT HU value by up to 30% and introduced up to 5cm error in proton range. The ability of the proposed method to recover the ground truth was quantified using a selected head and neck case. Results: A significant improvement in image quality was observed visually. Our proof of concept study showed that 90% of the area that had 30% errors in CT HU was corrected to within 3% of its ground truth value. Furthermore, the maximum proton range error of up to 5cm was reduced to a 4mm error. Conclusion: MRI based CT artifact correction method can improve CT image quality and proton range calculation for patients with severe CT artifacts.
Variability extraction and modeling for product variants.
Linsbauer, Lukas; Lopez-Herrejon, Roberto Erick; Egyed, Alexander
2017-01-01
Fast-changing hardware and software technologies in addition to larger and more specialized customer bases demand software tailored to meet very diverse requirements. Software development approaches that aim at capturing this diversity on a single consolidated platform often require large upfront investments, e.g., time or budget. Alternatively, companies resort to developing one variant of a software product at a time by reusing as much as possible from already-existing product variants. However, identifying and extracting the parts to reuse is an error-prone and inefficient task compounded by the typically large number of product variants. Hence, more disciplined and systematic approaches are needed to cope with the complexity of developing and maintaining sets of product variants. Such approaches require detailed information about the product variants, the features they provide and their relations. In this paper, we present an approach to extract such variability information from product variants. It identifies traces from features and feature interactions to their implementation artifacts, and computes their dependencies. This work can be useful in many scenarios ranging from ad hoc development approaches such as clone-and-own to systematic reuse approaches such as software product lines. We applied our variability extraction approach to six case studies and provide a detailed evaluation. The results show that the extracted variability information is consistent with the variability in our six case study systems given by their variability models and available product variants.
O'Daniel, Jennifer C; Rosenthal, David I; Garden, Adam S; Barker, Jerry L; Ahamad, Anesa; Ang, K Kian; Asper, Joshua A; Blanco, Angel I; de Crevoisier, Renaud; Holsinger, F Christopher; Patel, Chirag B; Schwartz, David L; Wang, He; Dong, Lei
2007-04-01
To investigate interobserver variability in the delineation of head-and-neck (H&N) anatomic structures on CT images, including the effects of image artifacts and observer experience. Nine observers (7 radiation oncologists, 1 surgeon, and 1 physician assistant) with varying levels of H&N delineation experience independently contoured H&N gross tumor volumes and critical structures on radiation therapy treatment planning CT images alongside reference diagnostic CT images for 4 patients with oropharynx cancer. Image artifacts from dental fillings partially obstructed 3 images. Differences in the structure volumes, center-of-volume positions, and boundary positions (1 SD) were measured. In-house software created three-dimensional overlap distributions, including all observers. The effects of dental artifacts and observer experience on contouring precision were investigated, and the need for contrast media was assessed. In the absence of artifacts, all 9 participants achieved reasonable precision (1 SD ≤ 3 mm, all boundaries). The structures obscured by dental image artifacts had larger variations when measured by the 3 metrics (1 SD = 8 mm cranial/caudal boundary). Experience improved the interobserver consistency of contouring for structures obscured by artifacts (1 SD = 2 mm cranial/caudal boundary). Interobserver contouring variability for anatomic H&N structures, specifically oropharyngeal gross tumor volumes and parotid glands, was acceptable in the absence of artifacts. Dental artifacts increased the contouring variability, but experienced participants achieved reasonable precision even with artifacts present. With a staging contrast CT image as a reference, delineation on a noncontrast treatment planning CT image can achieve acceptable precision.
Demystifying Kepler Data: A Primer for Systematic Artifact Mitigation
NASA Astrophysics Data System (ADS)
Kinemuchi, K.; Barclay, T.; Fanelli, M.; Pepper, J.; Still, M.; Howell, Steve B.
2012-09-01
The Kepler spacecraft has collected data of high photometric precision and cadence almost continuously since operations began on 2009 May 2. Primarily designed to detect planetary transits and asteroseismological signals from solar-like stars, Kepler has provided high-quality data for many areas of investigation. Unconditioned simple aperture time-series photometry is, however, affected by systematic structure. Examples of these systematics include differential velocity aberration, thermal gradients across the spacecraft, and pointing variations. While exhibiting some impact on Kepler’s primary science, these systematics can critically handicap potentially ground-breaking scientific gains in other astrophysical areas, especially over long timescales greater than 10 days. As the data archive grows to provide light curves for 105 stars of many years in length, Kepler will only fulfill its broad potential for stellar astrophysics if these systematics are understood and mitigated. Post-launch developments in the Kepler archive, data reduction pipeline and open source data analysis software have helped to remove or reduce systematic artifacts. This paper provides a conceptual primer to help users of the Kepler data archive understand and recognize systematic artifacts within light curves and some methods for their removal. Specific examples of artifact mitigation are provided using data available within the archive. Through the methods defined here, the Kepler community will find a road map to maximizing the quality and employment of the Kepler legacy archive.
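The Kepler pipeline's own systematics removal (e.g., cotrending against instrumental basis vectors) is not reproduced here; as a minimal illustration of one simple mitigation step for slow systematics in a simple-aperture light curve, the sketch below divides out a running-median trend. The window length and the zero-padding guard are implementation assumptions.

```python
import numpy as np
from scipy.signal import medfilt

def flatten_light_curve(time, flux, window_days=2.0):
    """Divide out a running-median trend to suppress slow systematics while
    preserving short-timescale signals such as transits."""
    flux = np.asarray(flux, dtype=float)
    cadence = np.median(np.diff(time))
    window = int(window_days / cadence)
    if window % 2 == 0:
        window += 1                          # medfilt requires an odd kernel
    trend = medfilt(flux, kernel_size=window)
    trend[trend == 0] = np.median(flux)      # guard against medfilt's zero padding at the edges
    return flux / trend
```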
d'Entremont, Agnes G; Kolind, Shannon H; Mädler, Burkhard; Wilson, David R; MacKay, Alexander L
2014-03-01
To evaluate the effect of metal artifact reduction techniques on dGEMRIC T(1) calculation with surgical hardware present. We examined the effect of stainless-steel and titanium hardware on dGEMRIC T(1) maps. We tested two strategies to reduce metal artifact in dGEMRIC: (1) saturation recovery (SR) instead of inversion recovery (IR) and (2) applying the metal artifact reduction sequence (MARS), in a gadolinium-doped agarose gel phantom and in vivo with titanium hardware. T(1) maps were obtained using custom curve-fitting software, and phantom ROIs were defined to compare conditions (metal, MARS, IR, SR). A large area of artifact appeared in phantom IR images with metal when T(I) ≤ 700 ms. IR maps with metal had additional artifact both in vivo and in the phantom (shifted null points, increased mean T(1) (+151% IR ROI(artifact)), and decreased mean inversion efficiency (f; 0.45 ROI(artifact), versus 2 for perfect inversion)) compared to the SR maps (ROI(artifact): +13% T(1) SR, 0.95 versus 1 for perfect excitation); however, SR produced noisier T(1) maps than IR (phantom SNR: 118 SR, 212 IR). MARS subtly reduced the extent of artifact in the phantom (IR and SR). dGEMRIC measurement in the presence of surgical hardware at 3T is possible with appropriately applied strategies. Measurements may work best in the presence of titanium and are severely limited with stainless steel. For regions near hardware where IR produces large artifacts making dGEMRIC analysis impossible, SR-MARS may allow dGEMRIC measurements. The position and size of the IR artifact is variable and must be assessed for each implant/imaging set-up.
A Novel Method for Characterizing Beam Hardening Artifacts in Cone-beam Computed Tomographic Images.
Fox, Aaron; Basrani, Bettina; Kishen, Anil; Lam, Ernest W N
2018-05-01
The beam hardening (BH) artifact produced by root filling materials in cone-beam computed tomographic (CBCT) images is influenced by their radiologic K absorption edge values. The purpose of this study was to describe a novel technique to characterize BH artifacts in CBCT images produced by 3 root canal filling materials and to evaluate the effects of a zirconium (Zr)-based root filling material with a lower K edge (17.99 keV) on the production of BH artifacts. The palatal root canals of 3 phantom model teeth were prepared and root filled with gutta-percha (GP), a Zr root filling material, and calcium hydroxide paste. Each phantom tooth was individually imaged using the CS 9000 CBCT unit (Carestream, Atlanta, GA). The "light" and "dark" components of the BH artifacts were quantified separately using ImageJ software (National Institutes of Health, Bethesda, MD) in 3 regions of the root. Mixed-design analysis of variance was used to evaluate differences in the artifact area for the light and dark elements of the BH artifacts. A statistically significant difference in the area of the dark portion of the BH artifact was found between all fill materials and in all regions of the phantom tooth root (P < .05). GP generated a significantly greater dark but not light artifact area compared with Zr (P < .05). Moreover, statistically significant differences between the areas of both the light and dark artifacts were observed within all regions of the tooth root, with the greatest artifact being generated in the coronal third of the root (P < .001). Root canal filling materials with lower K edge material properties reduce BH artifacts along the entire length of the root canal and reduce the contribution of the dark artifact. Copyright © 2018 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
(Quickly) Testing the Tester via Path Coverage
NASA Technical Reports Server (NTRS)
Groce, Alex
2009-01-01
The configuration complexity and code size of an automated testing framework may grow to a point that the tester itself becomes a significant software artifact, prone to poor configuration and implementation errors. Unfortunately, testing the tester by using old versions of the software under test (SUT) may be impractical or impossible: test framework changes may have been motivated by interface changes in the tested system, or fault detection may become too expensive in terms of computing time to justify running until errors are detected on older versions of the software. We propose the use of path coverage measures as a "quick and dirty" method for detecting many faults in complex test frameworks. We also note the possibility of using techniques developed to diversify state-space searches in model checking to diversify test focus, and an associated classification of tester changes into focus-changing and non-focus-changing modifications.
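A hedged sketch of the idea (not the author's implementation): record which lines of the test framework execute during a run so that coverage signatures from two framework versions or configurations can be diffed cheaply. All names in the commented usage example are hypothetical.

```python
import sys

def collect_line_coverage(run_framework):
    """Record the set of (filename, line) pairs executed while the given
    zero-argument callable runs. A crude stand-in for richer path coverage."""
    covered = set()

    def tracer(frame, event, arg):
        if event == "line":
            covered.add((frame.f_code.co_filename, frame.f_lineno))
        return tracer

    sys.settrace(tracer)
    try:
        run_framework()
    finally:
        sys.settrace(None)
    return covered

# Example: flag suspicious coverage shifts after a framework change
# old_cov = collect_line_coverage(lambda: run_tester(config_v1))  # hypothetical
# new_cov = collect_line_coverage(lambda: run_tester(config_v2))  # hypothetical
# print("paths lost:", len(old_cov - new_cov), "paths gained:", len(new_cov - old_cov))
```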
NASAL-Geom, a free upper respiratory tract 3D model reconstruction software
NASA Astrophysics Data System (ADS)
Cercos-Pita, J. L.; Cal, I. R.; Duque, D.; de Moreta, G. Sanjuán
2018-02-01
NASAL-Geom, a free software tool for 3D reconstruction of the upper respiratory tract, is described here. As free software, it can be obtained, analyzed, improved, and redistributed by researchers and professionals, potentially increasing the rate of development while reducing the ethical concerns raised by medical applications that cannot be inspected. Additionally, the tool has been optimized for the specific task of reading upper respiratory tract computed tomography scans and producing 3D geometries. The reconstruction process is divided into three stages: preprocessing (including metal artifact reduction, noise removal, and feature enhancement), segmentation (where the nasal cavity is identified), and 3D geometry reconstruction. The tool has been automated (i.e., no human intervention is required), a critical feature for avoiding bias in the reconstructed geometries. The applied methodology is discussed, as well as the program's robustness and precision.
NASA Technical Reports Server (NTRS)
McComas, David; Stark, Michael; Leake, Stephen; White, Michael; Morisio, Maurizio; Travassos, Guilherme H.; Powers, Edward I. (Technical Monitor)
2000-01-01
The NASA Goddard Space Flight Center Flight Software Branch (FSB) is developing a Guidance, Navigation, and Control (GNC) Flight Software (FSW) product line. The demand for increasingly complex flight software in less time, while maintaining the same level of quality, has motivated us to look for better FSW development strategies. The GNC FSW product line has been planned to address the core GNC FSW functionality, which has been very similar across many low/near-Earth missions in the last ten years. Unfortunately, these missions have not achieved significant drops in development cost, since a systematic approach to reuse has not been adopted. In addition, new demands are continually being placed upon the FSW, which means the FSB must become more adept at providing the core GNC FSW functionality so it can accommodate additional requirements. These domain features, together with engineering concepts, are influencing the specification, description, and evaluation of the FSW product line. Domain engineering is the foundation for emerging product line software development approaches. A product line is 'a family of products designed to take advantage of their common aspects and predicted variabilities'. In our product line approach, domain engineering includes the engineering activities needed to produce reusable artifacts for a domain. Application engineering refers to developing an application in the domain starting from reusable artifacts. The focus of this paper is the software process, lessons learned, and how the GNC FSW product line manages variability. Existing domain engineering approaches do not enforce any specific notation for domain analysis or commonality and variability analysis. Usually, natural language text is the preferred tool. The advantage is the flexibility and adaptability of natural language. However, one also has to accept its well-known drawbacks, such as ambiguity, inconsistency, and contradiction. While most domain analysis approaches are functionally oriented, the idea of applying the object-oriented approach in domain analysis is not new. Some authors propose using UML as the notation underlying domain analysis. Our work is based on the same idea of merging UML and domain analysis. Further, we propose a few extensions to UML to express variability, and we define their semantics precisely so that a tool can support them. The extensions are designed to be implemented on the API of a popular industrial CASE tool, with obvious advantages in cost and availability of tool support. The paper outlines the product line processes and identifies where variability must be addressed. It then describes the product line products with respect to how they accommodate variability. The Celestial Body subdomain is used as a working example. Our results to date are summarized and plans for the future are described.
Cardiac gating with a pulse oximeter for dual-energy imaging.
Shkumat, N A; Siewerdsen, J H; Dhanantwari, A C; Williams, D B; Paul, N S; Yorkston, J; Van Metter, R
2008-11-07
The development and evaluation of a prototype cardiac gating system for double-shot dual-energy (DE) imaging is described. By acquiring both low- and high-kVp images during the resting phase of the cardiac cycle (diastole), heart misalignment between images can be reduced, thereby decreasing the magnitude of cardiac motion artifacts. For this initial implementation, a fingertip pulse oximeter was employed to measure the peripheral pulse waveform ('plethysmogram'), offering potential logistic, cost and workflow advantages compared to an electrocardiogram. A gating method was developed that accommodates temporal delays due to physiological pulse propagation, oximeter waveform processing and the imaging system (software, filter-wheel, anti-scatter Bucky-grid and flat-panel detector). Modeling the diastolic period allowed the calculation of an implemented delay, t(imp), required to trigger correctly during diastole at any patient heart rate (HR). The model suggests a triggering scheme characterized by two HR regimes, separated by a threshold, HR(thresh). For rates at or below HR(thresh), sufficient time exists to expose on the same heartbeat as the plethysmogram pulse [t(imp)(HR) = 0]. Above HR(thresh), a characteristic t(imp)(HR) delays exposure to the subsequent heartbeat, accounting for all fixed and variable system delays. Performance was evaluated in terms of accuracy and precision of diastole-trigger coincidence and quantitative evaluation of artifact severity in gated and ungated DE images. Initial implementation indicated 85% accuracy in diastole-trigger coincidence. Through the identification of an improved HR estimation method (modified temporal smoothing of the oximeter waveform), trigger accuracy of 100% could be achieved with improved precision. To quantify the effect of the gating system on DE image quality, human observer tests were conducted to measure the magnitude of cardiac artifact under conditions of successful and unsuccessful diastolic gating. Six observers independently measured the artifact in 111 patient DE images. The data indicate that successful diastolic gating results in a statistically significant reduction (p < 0.001) in the magnitude of cardiac motion artifact, with residual artifact attributed primarily to gross patient motion.
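A minimal sketch of the two-regime triggering logic described above; the default heart-rate threshold, system delay, and diastole offset are illustrative assumptions, not the published calibration.

```python
def implemented_delay(hr_bpm, hr_thresh_bpm=75.0, fixed_system_delay_s=0.35,
                      diastole_offset_s=0.40):
    """Sketch of the two-regime gating scheme: expose on the current beat at or
    below HR_thresh, otherwise delay the exposure so it lands in diastole of the
    subsequent beat, net of known system delays. Parameter values are assumed."""
    rr_s = 60.0 / hr_bpm                      # beat-to-beat interval (s)
    if hr_bpm <= hr_thresh_bpm:
        return 0.0                            # trigger on the same heartbeat
    # wait for the next beat's diastole, accounting for fixed delays
    return max(rr_s + diastole_offset_s - fixed_system_delay_s, 0.0)
```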
Yuan, Fu-song; Sun, Yu-chun; Xie, Xiao-yan; Wang, Yong; Lv, Pei-jun
2013-12-18
To quantitatively evaluate the appearance of artifacts produced by eight kinds of common dental restorative materials, such as zirconia. For a full-crown preparation of a mandibular first molar, eight kinds of full crowns were fabricated: a zirconia all-ceramic crown, a glass ceramic crown, a ceramage crown, an Au-Pt based porcelain-fused-to-metal (PFM) crown, a pure titanium PFM crown, a Co-Cr PFM crown, a Ni-Cr PFM crown, and an Au-Pd metal crown. Natural teeth in vitro were used as controls. The full crowns and the natural teeth in vitro were mounted on an ultraviolet-curable resin fixed plate. High-resolution cone beam computed tomography (CBCT) was used to scan all of the crowns and natural teeth in vitro, and the DICOM data were imported into the software MIMICS 10.0. The number of stripes and the maximum diameters of the artifacts around the full crowns were then evaluated quantitatively in two-dimensional tomography images. In the two-dimensional tomography images, no artifacts appeared around the natural teeth in vitro, the glass ceramic crown, or the ceramage crown, but artifacts did appear around the zirconia all-ceramic and metal crowns. The number of artifact stripes was five to nine per crown. The maximum diameters of the artifacts were 2.4 to 2.6 cm and 2.2 to 2.7 cm. In the two-dimensional CBCT tomography images, stripe-like and radial artifacts appeared around the zirconia all-ceramic crown and the metal-based PFM crowns. These artifacts could greatly degrade the imaging quality of the full-crown shape. No artifacts appeared around the natural teeth in vitro, the glass ceramic crown, or the ceramage crown.
Reduction of artifacts in computer simulation of breast Cooper's ligaments
NASA Astrophysics Data System (ADS)
Pokrajac, David D.; Kuperavage, Adam; Maidment, Andrew D. A.; Bakic, Predrag R.
2016-03-01
Anthropomorphic software breast phantoms have been introduced as a tool for quantitative validation of breast imaging systems. The efficacy of the validation results depends on the realism of the phantom images. The recursive partitioning algorithm based upon octree simulation has been demonstrated to be versatile and capable of efficiently generating a large number of phantoms to support virtual clinical trials of breast imaging. Previously, we have observed specific artifacts (here labeled "dents") on the boundaries of simulated Cooper's ligaments. In this work, we demonstrate that these dents result from the approximate determination of the closest simulated ligament to an examined subvolume (i.e., octree node) of the phantom. We propose a modification of the algorithm that determines the closest ligament by considering a pre-specified number of neighboring ligaments, selected based upon the functions that govern the shape of the ligaments simulated in the subvolume. We have qualitatively and quantitatively demonstrated that the modified algorithm can eliminate or reduce dent artifacts in software phantoms. In a proof-of-concept example, we simulated a 450 ml phantom with 333 compartments at 100 micrometer resolution. After the proposed modification, we corrected 148,105 dents, with an average size of 5.27 voxels (5.27 nl). We have also qualitatively analyzed the corresponding improvement in the appearance of simulated mammographic images. The proposed algorithm reduces the linear and star-like artifacts in simulated phantom projections that can be attributed to dents. Analysis of a larger number of phantoms is ongoing.
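A loose sketch of the modified closest-ligament test described above. The octree data structures and the actual ligament shape functions are not specified in the abstract, so the sphere-like stand-in class, the neighbor count k, and all method names are illustrative assumptions.

```python
from dataclasses import dataclass
import math

@dataclass
class Ligament:
    """Illustrative stand-in: a ligament approximated by a sphere-like shape
    function (the real phantom uses more general shape functions)."""
    cx: float
    cy: float
    cz: float
    radius: float

    def center_distance(self, p):
        return math.dist(p, (self.cx, self.cy, self.cz))

    def shape_distance(self, p):
        # distance of the point from the ligament's boundary surface
        return abs(self.center_distance(p) - self.radius)

def closest_ligament(node_center, ligaments, k=8):
    """Evaluate a pre-specified number k of neighboring candidates and pick the
    one whose shape function places the octree node closest to its boundary,
    rather than relying on a single approximate nearest ligament."""
    nearest = sorted(ligaments, key=lambda l: l.center_distance(node_center))[:k]
    return min(nearest, key=lambda l: l.shape_distance(node_center))
```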
NASA Astrophysics Data System (ADS)
Al, Can Mert; Yaman, Ulas
2018-05-01
In the scope of this study, an alternative automated method to the conventional design and fabrication pipeline of 3D printers is developed using an integrated CAD/CAE/CAM approach. It increases the load-carrying capacity of parts by constructing heterogeneous infill structures. Traditional CAM software for additive manufacturing machinery starts with a design model in STL file format, which only includes data about the outer boundary as a triangular mesh. Given an infill percentage, the underlying algorithm constructs the interior of the artifact using homogeneous infill structures. In contrast to current CAM software, the proposed method constructs heterogeneous infill structures according to the von Mises stress field obtained from a finite element analysis. Throughout the work, Rhinoceros3D is used for the design of the parts, along with Grasshopper3D, an algorithmic design tool for Rhinoceros3D. In addition, finite element analyses are performed using Karamba3D, a plug-in for Grasshopper3D. According to the results of tensile tests, the method improves load-carrying capacity by about 50% compared to traditional slicing algorithms for 3D printing.
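The mapping from a stress field to a heterogeneous infill could look roughly like the sketch below. This is a generic Python illustration rather than the Grasshopper3D/Karamba3D definition used in the study; the density bounds and the synthetic stress grid are assumptions.

```python
import numpy as np

def infill_density_from_stress(von_mises, rho_min=0.15, rho_max=0.8):
    """Map per-cell von Mises stress to a local infill density: highly stressed
    regions receive denser infill, lightly loaded regions stay sparse.
    Bounds are illustrative assumptions."""
    s = np.asarray(von_mises, dtype=float)
    normalized = (s - s.min()) / max(s.max() - s.min(), 1e-12)
    return rho_min + (rho_max - rho_min) * normalized

# e.g. a coarse 3D grid of stresses exported from an FE analysis (fake data here)
stress_grid = np.random.rand(20, 20, 10) * 40e6   # Pa
density_grid = infill_density_from_stress(stress_grid)
```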
ERIC Educational Resources Information Center
Benoit-Barne, Chantal
2007-01-01
This essay investigates the rhetorical practices of socio-technical deliberation about free and open source (F/OS) software, providing support for the idea that a public sphere is a socio-technical ensemble that is discursive and fluid, yet tangible and organized because it is enacted by both humans and non-humans. In keeping with the empirical…
Simulation-To-Flight (STF-1): A Mission to Enable CubeSat Software-Based Validation and Verification
NASA Technical Reports Server (NTRS)
Morris, Justin; Zemerick, Scott; Grubb, Matt; Lucas, John; Jaridi, Majid; Gross, Jason N.; Ohi, Nicholas; Christian, John A.; Vassiliadis, Dimitris; Kadiyala, Anand;
2016-01-01
The Simulation-to-Flight 1 (STF-1) CubeSat mission aims to demonstrate how legacy simulation technologies may be adapted for flexible and effective use on missions using the CubeSat platform. These technologies, named the NASA Operational Simulator (NOS), have demonstrated significant value on several missions, such as the James Webb Space Telescope, Global Precipitation Measurement, Juno, and the Deep Space Climate Observatory, in the areas of software development, mission operations/training, verification and validation (V&V), test procedure development, and software systems check-out. STF-1 will demonstrate a highly portable simulation and test platform that allows seamless transition of mission development artifacts to flight products. This environment will decrease the development time of future CubeSat missions by lessening the dependency on hardware resources. In addition, through a partnership between NASA GSFC, the West Virginia Space Grant Consortium, and West Virginia University, the STF-1 CubeSat will host payloads for three secondary objectives that aim to advance engineering and physical-science research in the area of navigation systems for small satellites, provide useful data for understanding magnetosphere-ionosphere coupling and space weather, and verify the performance and durability of III-V nitride-based materials.
The Core Flight System (cFS) Community: Providing Low Cost Solutions for Small Spacecraft
NASA Technical Reports Server (NTRS)
McComas, David; Wilmot, Jonathan; Cudmore, Alan
2016-01-01
In February 2015 the NASA Goddard Space Flight Center (GSFC) completed the open source release of the entire Core Flight Software (cFS) suite. After the open source release a multi-NASA center Configuration Control Board (CCB) was established that has managed multiple cFS product releases. The cFS was developed and is being maintained in compliance with the NASA Class B software development process requirements and the open source release includes all Class B artifacts. The cFS is currently running on three operational science spacecraft and is being used on multiple spacecraft and instrument development efforts. While the cFS itself is a viable flight software (FSW) solution, we have discovered that the cFS community is a continuous source of innovation and growth that provides products and tools that serve the entire FSW lifecycle and future mission needs. This paper summarizes the current state of the cFS community, the key FSW technologies being pursued, the development/verification tools and opportunities for the small satellite community to become engaged. The cFS is a proven high quality and cost-effective solution for small satellites with constrained budgets.
Rogasch, Nigel C; Sullivan, Caley; Thomson, Richard H; Rose, Nathan S; Bailey, Neil W; Fitzgerald, Paul B; Farzan, Faranak; Hernandez-Pavon, Julio C
2017-02-15
The concurrent use of transcranial magnetic stimulation with electroencephalography (TMS-EEG) is growing in popularity as a method for assessing various cortical properties such as excitability, oscillations and connectivity. However, this combination of methods is technically challenging, resulting in artifacts both during recording and following typical EEG analysis methods, which can distort the underlying neural signal. In this article, we review the causes of artifacts in EEG recordings resulting from TMS, as well as artifacts introduced during analysis (e.g. as the result of filtering over high-frequency, large amplitude artifacts). We then discuss methods for removing artifacts, and ways of designing pipelines to minimise analysis-related artifacts. Finally, we introduce the TMS-EEG signal analyser (TESA), an open-source extension for EEGLAB, which includes functions that are specific for TMS-EEG analysis, such as removing and interpolating the TMS pulse artifact, removing and minimising TMS-evoked muscle activity, and analysing TMS-evoked potentials. The aims of TESA are to provide users with easy access to current TMS-EEG analysis methods and to encourage direct comparisons of these methods and pipelines. It is hoped that providing open-source functions will aid in both improving and standardising analysis across the field of TMS-EEG research. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
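As a generic illustration of one step mentioned above, removing and interpolating the TMS pulse artifact, the sketch below cuts out a short window around the pulse and linearly interpolates across it. It is not TESA's actual implementation, and the window bounds are assumptions rather than TESA defaults.

```python
import numpy as np

def interpolate_tms_pulse(epoch, fs, pulse_sample, pre_ms=2.0, post_ms=10.0):
    """Replace the data in a window around the TMS pulse with a linear
    interpolation, channel by channel. 'epoch' is a channels x samples array,
    'fs' the sampling rate in Hz, 'pulse_sample' the pulse onset sample."""
    start = int(pulse_sample - pre_ms * fs / 1000.0)
    stop = int(pulse_sample + post_ms * fs / 1000.0)
    cleaned = epoch.copy()
    for ch in range(cleaned.shape[0]):
        cleaned[ch, start:stop] = np.linspace(cleaned[ch, start - 1],
                                              cleaned[ch, stop],
                                              stop - start)
    return cleaned
```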
Property-Based Software Engineering Measurement
NASA Technical Reports Server (NTRS)
Briand, Lionel; Morasca, Sandro; Basili, Victor R.
1995-01-01
Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysis, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. This framework defines several important measurement concepts (size, length, complexity, cohesion, coupling). It is not intended to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalism and properties we introduce are convenient and intuitive. In addition, we have reviewed the literature on this subject and compared it with our work. This framework contributes constructively to a firmer theoretical ground of software measurement.
Property-Based Software Engineering Measurement
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.
1997-01-01
Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts, regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysts, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact and rigorous, because it is based on precise mathematical concepts. We use this framework to propose definitions of several important measurement concepts (size, length, complexity, cohesion, coupling). It does not intend to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalisms and properties we introduce are convenient and intuitive. This framework contributes constructively to a firmer theoretical ground of software measurement.
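To give one concrete flavor of such properties (a paraphrase of the additivity-style requirement typically placed on size measures in this line of work; the paper's exact formal statement may differ), a size measure is expected to add up over disjoint modules:

```latex
% Module additivity for a size measure (paraphrased, not the paper's exact wording):
% if a system S is composed of modules m_1 and m_2 with no elements in common, then
\mathit{Size}(S) \;=\; \mathit{Size}(m_1) + \mathit{Size}(m_2)
\qquad \text{whenever } m_1 \cap m_2 = \emptyset \text{ and } S = m_1 \cup m_2 .
```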
A Scientific Software Product Line for the Bioinformatics domain.
Costa, Gabriella Castro B; Braga, Regina; David, José Maria N; Campos, Fernanda
2015-08-01
Most specialized users (scientists) who use bioinformatics applications do not have suitable training in software development. A Software Product Line (SPL) employs the concept of reuse: it is defined as a set of systems developed from a common set of base artifacts. In some contexts, such as bioinformatics applications, it is advantageous to develop a collection of related software products using an SPL approach. If software products are similar enough, it is possible to predict their commonalities and differences and then reuse the common features to support the development of new applications in the bioinformatics area. This paper presents the PL-Science approach, which combines SPL and ontology to assist scientists in defining a scientific experiment and specifying a workflow that encompasses the bioinformatics applications of a given experiment. The paper also focuses on the use of ontologies to enable the use of Software Product Lines in biological domains. In the context of this paper, a Scientific Software Product Line (SSPL) differs from a Software Product Line in that the SSPL uses an abstract scientific workflow model. This workflow is defined according to a scientific domain, and using this abstract workflow model the products (scientific applications/algorithms) are instantiated. Through the use of ontology as a knowledge representation model, we can provide domain restrictions as well as add semantic aspects in order to facilitate the selection and organization of bioinformatics workflows in a Scientific Software Product Line. The use of ontologies enables not only the expression of formal restrictions but also inferences on these restrictions, considering that a scientific domain needs a formal specification. The paper presents the development of the PL-Science approach, encompassing a methodology and an infrastructure, and also presents an evaluation of the approach through case studies in bioinformatics conducted at two renowned research institutions in Brazil. Copyright © 2015 Elsevier Inc. All rights reserved.
Ali, Amir Monir
2018-01-01
The aim of the study was to evaluate the commercially available orthopedic metal artifact reduction (OMAR) technique in postoperative three-dimensional computed tomography (3DCT) reconstruction studies after spinal instrumentation and to investigate its clinical application. One hundred and twenty (120) patients with spinal metallic implants were included in the study. All underwent 3DCT reconstruction examinations using the OMAR software after informed consent was obtained and the study was approved by the institutional ethics committee. The degree of the artifacts, the density of the adjacent muscles, the clearness of the intermuscular fat planes, and the definition of the adjacent vertebrae were qualitatively evaluated. The diagnostic satisfaction and quality of the 3D reconstruction images were thoroughly assessed. The majority (96.7%) of the 3DCT reconstruction images were considered satisfactory to excellent for diagnosis; only 3.3% were of unacceptable diagnostic quality. OMAR can effectively reduce metallic artifacts in patients with spinal instrumentation, yielding highly diagnostic 3DCT reconstruction images.
Space Telecommunications Radio System (STRS) Application Repository Design and Analysis
NASA Technical Reports Server (NTRS)
Handler, Louis M.
2013-01-01
The Space Telecommunications Radio System (STRS) Application Repository Design and Analysis document describes the STRS application repository for software-defined radio (SDR) applications intended to be compliant with the STRS Architecture Standard. The document explains how artifacts are submitted to the STRS application repository, provides information to potential users of those artifacts, and helps the systems engineer understand the requirements, concepts, and approach of the repository. The STRS application repository is intended to capture knowledge, documents, and other artifacts for each waveform application or other application outside of its project, so that when the project ends the knowledge is retained. The document describes the transfer of technology from mission to mission, capturing lessons learned that are used for continuous improvement across projects and supporting the NASA Procedural Requirements (NPRs) for performing software engineering projects and NASA's release process.
Applying program comprehension techniques to improve software inspections
NASA Technical Reports Server (NTRS)
Rifkin, Stan; Deimel, Lionel
1994-01-01
Software inspections are widely regarded as a cost-effective mechanism for removing defects in software, though performing them does not always reduce the number of customer-discovered defects. We present a case study in which an attempt was made to reduce such defects through inspection training that introduced program comprehension ideas. The training was designed to address the problem of understanding the artifact being reviewed, as well as other perceived deficiencies of the inspection process itself. Measures, both formal and informal, suggest that explicit training in program understanding may improve inspection effectiveness.
An EEG Data Investigation Using Only Artifacts
2017-02-22
A mediation approach, called artifact separation, was developed to enable the consumer of the EEG data to decide how to handle artifacts. Having the spectral results flagged as containing an artifact means that the consumer of the data has the freedom to decide how to handle the contaminated segments.
Perez-Garcia, H; Barquero, R
The correct determination and delineation of tumor/organ size is crucial in 2-D imaging in 131I therapy. These images are usually obtained using a system composed of a gamma camera and a high-energy (HE) collimator, although the system can produce artifacts in the image. This article analyses these artifacts and describes a correction filter that can eliminate the collimator artifacts. Using the free software ImageJ, a central profile through the image is obtained and analyzed. Two components can be seen in the fluctuation of the profile: one associated with the stochastic nature of the radiation plus electronic noise, and the other varying periodically with spatial position due to the collimator. These frequencies are obtained analytically and compared with the frequencies in the Fourier transform of the profile. A specially developed filter removes the artifacts in the 2D Fourier transform of the DICOM image. This filter is tested using an image of a 15-cm-diameter Petri dish containing 131I radioactive water (large object size), an image of a 131I clinical pill (small object size), and images of the remnant lesions of two patients treated with 3.7 GBq (100 mCi) and 4.44 GBq (120 mCi) of 131I, respectively, after thyroidectomy. The artifact is due to the hexagonal periodic structure of the collimator. The use of the filter on large-sized images reduces the fluctuation from 5.8% to 3.5%. In small-sized images, the FWHM can be determined in the filtered image, whereas this is impossible in the unfiltered image. The definition of the tumor boundary and the visualization of the activity distribution inside patient lesions improve drastically when the filter is applied to the corresponding images obtained with the HE gamma camera. The HURRA filter removes high-energy collimator artifacts in planar images obtained with a gamma camera without reducing the image resolution. It can be applied in any patient quantification study because the number of counts remains invariant. The filter makes possible the definition and delimitation of small uptakes, such as those present in treatments with 131I. Copyright © 2016 Elsevier España, S.L.U. y SEMNIM. All rights reserved.
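A rough sketch of the kind of Fourier-domain notch filtering described above, using generic numpy. The peak positions, notch radius, and the assumption that the collimator-related peaks are known in advance (e.g., from the analytic hexagonal pattern) are illustrative; this is not the published HURRA implementation.

```python
import numpy as np

def suppress_collimator_peaks(image, peak_coords, notch_radius=2):
    """Zero out small neighborhoods around known periodic-artifact peaks in the
    shifted 2D FFT of a planar gamma-camera image, then invert the transform.
    'peak_coords' are (row, col) positions of the collimator-related peaks."""
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    rows, cols = np.ogrid[:image.shape[0], :image.shape[1]]
    for (pr, pc) in peak_coords:
        mask = (rows - pr) ** 2 + (cols - pc) ** 2 <= notch_radius ** 2
        spectrum[mask] = 0.0
    return np.fft.ifft2(np.fft.ifftshift(spectrum)).real
```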
Formal Analysis of BPMN Models Using Event-B
NASA Astrophysics Data System (ADS)
Bryans, Jeremy W.; Wei, Wei
The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de-facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform which offers a range of simulation and verification technologies.
Sack, Lawren; Caringella, Marissa; Scoffoni, Christine; Mason, Chase; Rawls, Michael; Markesteijn, Lars; Poorter, Lourens
2014-10-01
Leaf vein length per unit leaf area (VLA; also known as vein density) is an important determinant of water and sugar transport, photosynthetic function, and biomechanical support. A range of software methods are in use to visualize and measure vein systems in cleared leaf images; typically, users locate veins by digital tracing, but recent articles introduced software by which users can locate veins using thresholding (i.e. based on the contrasting of veins in the image). Based on the use of this method, a recent study argued against the existence of a fixed VLA value for a given leaf, proposing instead that VLA increases with the magnification of the image due to intrinsic properties of the vein system, and recommended that future measurements use a common, low image magnification for measurements. We tested these claims with new measurements using the software LEAFGUI in comparison with digital tracing using ImageJ software. We found that the apparent increase of VLA with magnification was an artifact of (1) using low-quality and low-magnification images and (2) errors in the algorithms of LEAFGUI. Given the use of images of sufficient magnification and quality, and analysis with error-free software, the VLA can be measured precisely and accurately. These findings point to important principles for improving the quantity and quality of important information gathered from leaf vein systems. © 2014 American Society of Plant Biologists. All Rights Reserved.
User Interactive Software for Analysis of Human Physiological Data
NASA Technical Reports Server (NTRS)
Cowings, Patricia S.; Toscano, William; Taylor, Bruce C.; Acharya, Soumydipta
2006-01-01
Ambulatory physiological monitoring has been used to study human health and performance in space and in a variety of Earth-based environments (e.g., military aircraft, armored vehicles, small groups in isolation, and patients). Large, multi-channel data files are typically recorded in these environments, and these files often require the removal of contaminated data prior to processing and analysis. Physiological data processing can now be performed with user-friendly, interactive software developed by the Ames Psychophysiology Research Laboratory. This software, which runs on a Windows platform, contains various signal-processing routines for both time- and frequency-domain data analyses (e.g., peak detection, differentiation and integration, digital filtering, adaptive thresholds, Fast Fourier Transform power spectrum, auto-correlation, etc.). Data acquired with any ambulatory monitoring system that provides text or binary file formats are easily imported into the processing software. The application provides a graphical user interface where one can manually select and correct data artifacts using linear and zero interpolation and by adding trigger points for missed peaks. Block and moving average routines are also provided for data reduction. Processed data in numeric and graphic format can be exported to Excel. This software, PostProc (for post-processing), requires the Dadisp engineering spreadsheet (DSP Development Corp), or equivalent, for implementation. Specific processing routines were written for electrocardiography, electroencephalography, electromyography, blood pressure, skin conductance level, impedance cardiography (cardiac output, stroke volume, thoracic fluid volume), temperature, and respiration.
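As a small illustration of one of the data-reduction routines mentioned above, the sketch below averages a physiological time series over non-overlapping blocks. It is a generic example, not the PostProc/Dadisp implementation, and the sampling rates in the usage comment are assumptions.

```python
import numpy as np

def block_average(signal, block_size):
    """Reduce a physiological time series by averaging non-overlapping blocks.
    Trailing samples that do not fill a whole block are dropped."""
    x = np.asarray(signal, dtype=float)
    usable = (x.size // block_size) * block_size
    return x[:usable].reshape(-1, block_size).mean(axis=1)

# e.g. reduce 1000 Hz skin-conductance data to 1-second means (hypothetical input):
# scl_1hz = block_average(scl_1000hz, block_size=1000)
```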
Improvement in Recursive Hierarchical Segmentation of Data
NASA Technical Reports Server (NTRS)
Tilton, James C.
2006-01-01
A further modification has been made in the algorithm and implementing software reported in Modified Recursive Hierarchical Segmentation of Data (GSC-14681-1), NASA Tech Briefs, Vol. 30, No. 6 (June 2006), page 51. That software performs recursive hierarchical segmentation of data having spatial characteristics (e.g., spectral-image data). The output of a prior version of the software contained artifacts, including spurious segmentation-image regions bounded by processing-window edges. The modification for suppressing the artifacts, mentioned in the cited article, was the addition of a subroutine that analyzes data in the vicinities of seams to find pairs of regions that tend to lie adjacent to each other on opposite sides of the seams. Within each such pair, pixels in one region that are more similar to pixels in the other region are reassigned to the other region. The present modification provides a parameter ranging from 0 to 1 for controlling the relative priority of merges between spatially adjacent and spatially non-adjacent regions. At 1, spatially-adjacent- and spatially-non-adjacent-region merges have equal priority. At 0, only spatially-adjacent-region merges (no spectral clustering) are allowed. Between 0 and 1, spatially-adjacent-region merges have priority over spatially-non-adjacent ones.
NASA Astrophysics Data System (ADS)
Yuan, Fusong; Lv, Peijun; Yang, Huifang; Wang, Yong; Sun, Yuchun
2015-07-01
Objectives: To establish a beam-hardening artifact index for cone-beam CT tomographic images based on pixel gray-value measurements, and to preliminarily evaluate its applicability. Methods: A 5 mm-diameter metal ball and a resin ball were each fixed on a light-cured resin base plate, and four extracted molars were fixed above, below, to the left, and to the right of the ball, each at a distance of 10 mm from it. Cone beam CT was then used to scan the fixed base plate twice. Tomographic images of the same layer were selected from the two data sets and imported into Photoshop. A circular boundary was constructed by determining the center and radius of the circle from the artifact-free image. Grayscale measurement tools were used to measure the gray value G0 just inside the boundary, the gray values G1 and G2 of artifacts 1 mm and 20 mm outside the circular boundary, the length L1 of the arc of the boundary affected by artifacts, and the circumference L2. The hardening-artifact index was defined as A = (G1/G0) * 0.5 + (G2/G1) * 0.4 + (L2/L1) * 0.1. The A values of the metal and resin materials were then calculated. Results: The A value of the cobalt-chromium alloy material was 1, and that of the resin material was 0. Conclusion: The A value comprehensively reflects the three factors by which hardening artifacts affect the sharpness of normal oral tissue in cone-beam CT images: the relative gray value, the decay rate, and the extent of the artifacts.
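Written out directly, the index defined above can be computed as follows. This is a transcription of the formula as given; the variable names follow the abstract.

```python
def hardening_artifact_index(g0, g1, g2, l1, l2):
    """Beam-hardening artifact index A from the gray value just inside the
    boundary (G0), the gray values 1 mm (G1) and 20 mm (G2) outside it, the
    artifact-affected arc length (L1), and the circumference (L2)."""
    return (g1 / g0) * 0.5 + (g2 / g1) * 0.4 + (l2 / l1) * 0.1
```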
Rolston, John D.; Gross, Robert E.; Potter, Steve M.
2009-01-01
Commercially available data acquisition systems for multielectrode recording from freely moving animals are expensive, often rely on proprietary software, and do not provide detailed, modifiable circuit schematics. When used in conjunction with electrical stimulation, they are prone to prolonged, saturating stimulation artifacts that prevent the recording of short-latency evoked responses. Yet electrical stimulation is integral to many experimental designs, and critical for emerging brain-computer interfacing and neuroprosthetic applications. To address these issues, we developed an easy-to-use, modifiable, and inexpensive system for multielectrode neural recording and stimulation. Setup costs are less than US$10,000 for 64 channels, an order of magnitude lower than comparable commercial systems. Unlike commercial equipment, the system recovers rapidly from stimulation and allows short-latency action potentials (<1 ms post-stimulus) to be detected, facilitating closed-loop applications and exposing neural activity that would otherwise remain hidden. To illustrate this capability, evoked activity from microstimulation of the rodent hippocampus is presented. System noise levels are similar to existing platforms, and extracellular action potentials and local field potentials can be recorded simultaneously. The system is modular, in banks of 16 channels, and flexible in usage: while primarily designed for in vivo use, it can be combined with commercial preamplifiers to record from in vitro multielectrode arrays. The system's open-source control software, NeuroRighter, is implemented in C#, with an easy-to-use graphical interface. As C# functions in a managed code environment, which may impact performance, analysis was conducted to ensure comparable speed to C++ for this application. Hardware schematics, layout files, and software are freely available. Since maintaining wired headstage connections with freely moving animals is difficult, we describe a new method of electrode-headstage coupling using neodymium magnets. PMID:19668698
Detecting Inconsistencies in Multi-View Models with Variability
NASA Astrophysics Data System (ADS)
Lopez-Herrejon, Roberto Erick; Egyed, Alexander
Multi-View Modeling (MVM) is a common modeling practice that advocates the use of multiple, different and yet related models to represent the needs of diverse stakeholders. Of crucial importance in MVM is consistency checking - the description and verification of semantic relationships amongst the views. Variability is the capacity of software artifacts to vary, and its effective management is a core tenet of the research in Software Product Lines (SPL). MVM has proven useful for developing one-of-a-kind systems; however, to reap the potential benefits of MVM in SPL it is vital to provide consistency checking mechanisms that cope with variability. In this paper we describe how to address this need by applying Safe Composition - the guarantee that all programs of a product line are type safe. We evaluate our approach with a case study.
Holographic radar imaging privacy techniques utilizing dual-frequency implementation
NASA Astrophysics Data System (ADS)
McMakin, Douglas L.; Hall, Thomas E.; Sheen, David M.
2008-04-01
Over the last 15 years, the Pacific Northwest National Laboratory has performed significant research and development activities to enhance the state of the art of holographic radar imaging systems to be used at security checkpoints for screening people for concealed threats hidden under their garments. These enhancement activities included improvements to privacy techniques to remove human features and providing automatic detection of body-worn concealed threats. The enhanced privacy and detection methods used both physical and software imaging techniques. The physical imaging techniques included polarization-diversity illumination and reception, dual-frequency implementation, and high-frequency imaging at 60 GHz. Software imaging techniques to enhance the privacy of the person under surveillance included extracting concealed threat artifacts from the imagery to automatically detect the threat. This paper will focus on physical privacy techniques using dual-frequency implementation.
V & V Within Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1996-01-01
Verification and validation (V&V) is used to increase the level of assurance of critical software, particularly that of safety-critical and mission critical software. This paper describes the working group's success in identifying V&V tasks that could be performed in the domain engineering and transition levels of reuse-based software engineering. The primary motivation for V&V at the domain level is to provide assurance that the domain requirements are correct and that the domain artifacts correctly implement the domain requirements. A secondary motivation is the possible elimination of redundant V&V activities at the application level. The group also considered the criteria and motivation for performing V&V in domain engineering.
Software for pre-processing Illumina next-generation sequencing short read sequences
2014-01-01
Background When compared to Sanger sequencing technology, next-generation sequencing (NGS) technologies are hindered by shorter sequence read length, higher base-call error rate, non-uniform coverage, and platform-specific sequencing artifacts. These characteristics lower the quality of their downstream analyses, e.g. de novo and reference-based assembly, by introducing sequencing artifacts and errors that may contribute to incorrect interpretation of data. Although many tools have been developed for quality control and pre-processing of NGS data, none of them provide flexible and comprehensive trimming options in conjunction with parallel processing to expedite pre-processing of large NGS datasets. Methods We developed ngsShoRT (next-generation sequencing Short Reads Trimmer), a flexible and comprehensive open-source software package written in Perl that provides a set of algorithms commonly used for pre-processing NGS short read sequences. We compared the features and performance of ngsShoRT with existing tools: CutAdapt, NGS QC Toolkit and Trimmomatic. We also compared the effects of using pre-processed short read sequences generated by different algorithms on de novo and reference-based assembly for three different genomes: Caenorhabditis elegans, Saccharomyces cerevisiae S288c, and Escherichia coli O157 H7. Results Several combinations of ngsShoRT algorithms were tested on publicly available Illumina GA II, HiSeq 2000, and MiSeq eukaryotic and bacteria genomic short read sequences with the focus on removing sequencing artifacts and low-quality reads and/or bases. Our results show that across three organisms and three sequencing platforms, trimming improved the mean quality scores of trimmed sequences. Using trimmed sequences for de novo and reference-based assembly improved assembly quality as well as assembler performance. In general, ngsShoRT outperformed comparable trimming tools in terms of trimming speed and improvement of de novo and reference-based assembly as measured by assembly contiguity and correctness. Conclusions Trimming of short read sequences can improve the quality of de novo and reference-based assembly and assembler performance. The parallel processing capability of ngsShoRT reduces trimming time and improves the memory efficiency when dealing with large datasets. We recommend combining sequencing artifacts removal, and quality score based read filtering and base trimming as the most consistent method for improving sequence quality and downstream assemblies. ngsShoRT source code, user guide and tutorial are available at http://research.bioinformatics.udel.edu/genomics/ngsShoRT/. ngsShoRT can be incorporated as a pre-processing step in genome and transcriptome assembly projects. PMID:24955109
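As a generic illustration of quality-score-based 3' trimming, one of the pre-processing operations discussed above, the sketch below trims low-quality bases from the end of a read. This is not ngsShoRT's algorithm; the Phred offset and quality threshold are assumptions.

```python
def trim_3prime_by_quality(seq, qual, threshold=20, phred_offset=33):
    """Trim bases from the 3' end of a read while their Phred quality is below
    the threshold. 'qual' is the FASTQ quality string; returns the trimmed read
    and quality strings."""
    keep = len(seq)
    while keep > 0 and (ord(qual[keep - 1]) - phred_offset) < threshold:
        keep -= 1
    return seq[:keep], qual[:keep]

# e.g. trim_3prime_by_quality("ACGTACGT", "IIIIII#!") returns ("ACGTAC", "IIIIII")
```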
Pessis, Eric; Campagna, Raphaël; Sverzut, Jean-Michel; Bach, Fabienne; Rodallec, Mathieu; Guerini, Henri; Feydy, Antoine; Drapé, Jean-Luc
2013-01-01
With arthroplasty being increasingly used to relieve joint pain, imaging of patients with metal implants can represent a significant part of the clinical work load in the radiologist's daily practice. Computed tomography (CT) plays an important role in the postoperative evaluation of patients who are suspected of having metal prosthesis-related problems such as aseptic loosening, bone resorption or osteolysis, infection, dislocation, metal hardware failure, or periprosthetic bone fracture. Despite advances in detector technology and computer software, artifacts from metal implants can seriously degrade the quality of CT images, sometimes to the point of making them diagnostically unusable. Several factors may help reduce the number and severity of artifacts at multidetector CT, including decreasing the detector collimation and pitch, increasing the kilovolt peak and tube charge, and using appropriate reconstruction algorithms and section thickness. More recently, dual-energy CT has been proposed as a means of reducing beam-hardening artifacts. The use of dual-energy CT scanners allows the synthesis of virtual monochromatic spectral (VMS) images. Monochromatic images depict how the imaged object would look if the x-ray source produced x-ray photons at only a single energy level. For this reason, VMS imaging is expected to provide improved image quality by reducing beam-hardening artifacts.
Eddy current compensation for delta relaxation enhanced MR by dynamic reference phase modulation.
Hoelscher, Uvo Christoph; Jakob, Peter M
2013-04-01
Eddy current compensation by dynamic reference phase modulation (eDREAM) is a compensation method for the eddy current fields induced by B0 field-cycling that occur in delta relaxation enhanced MR (dreMR) imaging. The presented method is based on a dynamic frequency adjustment and prevents eddy-current-related artifacts. It is easy to implement and can be realized completely in software for any imaging sequence. In this paper, the theory of eDREAM is derived and two applications are demonstrated. The theory describes how to model the behavior of the eddy currents and how to implement the compensation. Phantom and in vivo measurements are carried out and demonstrate the benefits of eDREAM. A comparison of images acquired with and without eDREAM shows a significant improvement in dreMR image quality. Images without eDREAM suffer from severe artifacts and do not allow proper interpretation, while images with eDREAM are artifact free. In vivo experiments demonstrate that dreMR imaging without eDREAM is not feasible, as artifacts completely change the image contrast. eDREAM is a flexible eddy current compensation for dreMR. It is capable of completely removing the influence of eddy currents so that the dreMR images do not suffer from artifacts.
Ahmed, Abdulghani Ali; Xue Li, Chua
2018-01-01
Cloud storage services allow users to store their data online so that they can remotely access, maintain, manage, and back up data from anywhere via the Internet. Although helpful, such storage creates a challenge for digital forensic investigators and practitioners in collecting, identifying, acquiring, and preserving evidential data. This study proposes an investigation scheme for analyzing data remnants and determining probative artifacts in a cloud environment. Using pCloud as a case study, this research collected the data remnants available on end-user device storage following the storing, uploading, and accessing of data in the cloud storage. Data remnants were collected from several sources, including client software files, directory listings, prefetch, the registry, network PCAP captures, the browser, and memory and link files. The results demonstrate that the collected data remnants are beneficial in determining a sufficient number of artifacts about the investigated cybercrime. © 2017 American Academy of Forensic Sciences.
Towards a mature measurement environment: Creating a software engineering research environment
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1990-01-01
Software engineering researchers are building tools and defining methods and models; however, there are problems with the nature and style of the research. The research is typically bottom-up and done in isolation, so the pieces cannot easily be logically or physically integrated. A great deal of the research is essentially the packaging of a particular piece of technology with little indication of how the work would be integrated with other pieces of research. The research is not aimed at solving the real problems of software engineering, i.e., the development and maintenance of quality systems in a productive manner. The research results are not evaluated or analyzed via experimentation, or refined and tailored to the application environment. Thus, they cannot be easily transferred into practice. Because of these limitations we have not been able to understand the components of the discipline as a coherent whole, or the relationships between various models of the process and product. What is needed is a top-down, experimental, evolutionary framework in which research can be focused, logically and physically integrated to produce quality software productively, and evaluated and tailored to the application environment. This implies the need for experimentation, which in turn implies the need for a laboratory that is associated with the artifact we are studying. This laboratory can only exist in an environment where software is being built, i.e., as part of a real software development and maintenance organization. Thus, we propose that Software Engineering Laboratory (SEL) type activities exist in all organizations to support software engineering research. We describe the SEL from a researcher's point of view and discuss the corporate and government benefits of the SEL. The discussion focuses on the benefits to the research community.
Feature-based component model for design of embedded systems
NASA Astrophysics Data System (ADS)
Zha, Xuan Fang; Sriram, Ram D.
2004-11-01
An embedded system is a hybrid of hardware and software, combining the flexibility of software with the real-time performance of hardware. Embedded systems can be considered as assemblies of hardware and software components. An Open Embedded System Model (OESM) is currently being developed at NIST to provide a standard representation and exchange protocol for embedded systems and system-level design, simulation, and testing information. This paper proposes an approach to representing an embedded system feature-based model in OESM, i.e., an Open Embedded System Feature Model (OESFM), addressing models of embedded system artifacts, embedded system components, embedded system features, and embedded system configuration/assembly. The approach provides an object-oriented UML (Unified Modeling Language) representation for the embedded system feature model and defines an extension to the NIST Core Product Model. The model provides a feature-based component framework allowing the designer to develop a virtual embedded system prototype by assembling virtual components. The framework not only provides a formal, precise model of the embedded system prototype but also offers the possibility of designing variations of the prototype whose members are derived by replacing certain virtual components with ones having different features. A case study example is discussed to illustrate the embedded system model.
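To make the idea of a feature-based component model concrete, here is a small Python sketch of components carrying features and a system deriving design variants by swapping virtual components. The class and attribute names are illustrative assumptions and do not reproduce the NIST OESM/OESFM schema or the Core Product Model extension.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Feature:
        name: str
        value: str = ""

    @dataclass
    class Component:
        name: str
        kind: str                          # "hardware" or "software"
        features: List[Feature] = field(default_factory=list)

    @dataclass
    class EmbeddedSystem:
        name: str
        components: List[Component] = field(default_factory=list)

        def variant(self, old_component: str, new_component: Component) -> "EmbeddedSystem":
            """Derive a design variant by swapping one virtual component for another."""
            swapped = [new_component if c.name == old_component else c for c in self.components]
            return EmbeddedSystem(self.name + "-variant", swapped)

    controller = EmbeddedSystem("cruise-control", [
        Component("mcu", "hardware", [Feature("clock", "48 MHz")]),
        Component("pid-task", "software", [Feature("period", "10 ms")]),
    ])
    faster = controller.variant("mcu", Component("mcu", "hardware", [Feature("clock", "120 MHz")]))
    print([c.features[0].value for c in faster.components])  # ['120 MHz', '10 ms']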
A taxonomy and discussion of software attack technologies
NASA Astrophysics Data System (ADS)
Banks, Sheila B.; Stytz, Martin R.
2005-03-01
Software is a complex thing. It is not an engineering artifact that springs forth from a design by simply following software coding rules; creativity and the human element are at the heart of the process. Software development is part science, part art, and part craft. Design, architecture, and coding are equally important activities, and in each of these activities errors may be introduced that lead to security vulnerabilities. Therefore, inevitably, errors enter into the code. Some of these errors are discovered during testing; however, some are not. The best way to find security errors, whether they are introduced as part of the architecture development effort or the coding effort, is to automate the security testing process to the maximum extent possible and add this class of tools to those already available for aiding compilation, testing, test analysis, and software distribution. Recent technological advances, improvements in computer-generated forces (CGFs), and results in research in information assurance and software protection indicate that we can build a semi-intelligent software security testing tool. However, before we can undertake the security testing automation effort, we must understand the scope of the required testing, the security failures that need to be uncovered during testing, and the characteristics of the failures. Therefore, we undertook the research reported in this paper: the development of a taxonomy and a discussion of software attacks generated from the point of view of the security tester, with the goal of using the taxonomy to guide the development of the knowledge base for the automated security testing tool. The representation for attacks and threat cases yielded by this research captures the strategies, tactics, and other considerations that come into play during the planning and execution of attacks upon application software. The paper is organized as follows. Section one contains an introduction to our research and a discussion of the motivation for our work. Section two presents our taxonomy of software attacks and a discussion of the strategies employed and general weaknesses exploited for each attack. Section three contains a summary and suggestions for further research.
Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C; Wong, Willy; Daskalakis, Zafiris J; Farzan, Faranak
2016-01-01
Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprised of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research.
2011-05-27
[Presentation slide fragments, partially recovered] Quality assurance frameworks: CMMI-DEV and the IEEE/ISO/IEC 15288 and 12207 life cycle processes and artifacts (©2011 Walz); the presenter's background spans the TAG to ISO TC 176 (quality management), ASQ quality work, software (three books, consulting), and systems (telecom and DoD) experience. The life cycle standards and IEEE 730 SQA need to align; the IEEE P730 standards working group has expanded the scope of the SQA process standard to align with ISO/IEC 12207.
Kirberger, R M; Roos, C J
1995-06-01
Radiographic artifacts commonly occur, particularly with hand processing. The artifacts may originate between the X-ray tube and the cassette as extraneous material on the patient or contamination of positioning aids, or result from debris within the cassette, or damage to, or staining of the screens. These artifacts are white to grey, may have a constant or different position on follow-up radiographs, and their size and shape are reflective of the inciting cause. A number of artifacts may occur in the darkroom during handling, developing, fixing and drying of the film. White to shiny artifacts are caused by the contamination of films with fixer, inability of developer to reach parts of the film or loss of emulsion from the developed film. Black artifacts result from improper handling or storage of films, resulting in exposure to light, or from pressure marks or static electricity discharges. Dropped levels of hand-processing chemicals may result in a variety of tide-marks on films. Most radiographic artifacts can be prevented by proper storage and handling of films and by optimal darkroom technique.
Mansouri, Kaweh; Medeiros, Felipe A.; Tatham, Andrew J.; Marchase, Nicholas; Weinreb, Robert N.
2017-01-01
PURPOSE To determine the repeatability of automated retinal and choroidal thickness measurements with swept-source optical coherence tomography (SS OCT) and the frequency and type of scan artifacts. DESIGN Prospective evaluation of new diagnostic technology. METHODS Thirty healthy subjects were recruited prospectively and underwent imaging with a prototype SS OCT instrument. Undilated scans of 54 eyes of 27 subjects (mean age, 35.1 ± 9.3 years) were obtained. Each subject had 4 SS OCT protocols repeated 3 times: 3-dimensional (3D) 6 × 6-mm raster scan of the optic disc and macula, radial, and line scan. Automated measurements were obtained through segmentation software. Interscan repeatability was assessed by intraclass correlation coefficients (ICCs). RESULTS ICCs for choroidal measurements were 0.92, 0.98, 0.80, and 0.91, respectively, for 3D macula, 3D optic disc, radial, and line scans. ICCs for retinal measurements were 0.39, 0.49, 0.71, and 0.69, respectively. Artifacts were present in up to 9% of scans. Signal loss because of blinking was the most common artifact on 3D scans (optic disc scan, 7%; macula scan, 9%), whereas segmentation failure occurred in 4% of radial and 3% of line scans. When scans with image artifacts were excluded, ICCs for choroidal thickness increased to 0.95, 0.99, 0.87, and 0.93 for 3D macula, 3D optic disc, radial, and line scans, respectively. ICCs for retinal thickness increased to 0.88, 0.83, 0.89, and 0.76, respectively. CONCLUSIONS Improved repeatability of automated choroidal and retinal thickness measurements was found with the SS OCT after correction of scan artifacts. Recognition of scan artifacts is important for correct interpretation of SS OCT measurements. PMID:24531020
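Interscan repeatability in studies like this is typically summarized with an intraclass correlation coefficient. The abstract does not state which ICC form was used, so the Python sketch below shows ICC(3,1), one common choice, computed from a subjects-by-repeats table; the example values are hypothetical.

    import numpy as np

    def icc_3_1(measurements):
        """Two-way mixed-effects, single-measurement, consistency ICC(3,1).

        measurements: (n_subjects, k_repeats) array, e.g. one thickness value per
        eye (rows) for each of the three repeated scans (columns).
        """
        y = np.asarray(measurements, dtype=float)
        n, k = y.shape
        grand = y.mean()
        ss_rows = k * np.sum((y.mean(axis=1) - grand) ** 2)    # between-subject SS
        ss_cols = n * np.sum((y.mean(axis=0) - grand) ** 2)    # between-scan SS
        ss_err = np.sum((y - grand) ** 2) - ss_rows - ss_cols  # residual SS
        ms_rows = ss_rows / (n - 1)
        ms_err = ss_err / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

    # Toy example: 5 eyes, 3 repeated scans, hypothetical thickness values in microns.
    scans = [[310, 312, 309], [285, 288, 284], [340, 338, 341], [295, 299, 296], [320, 318, 322]]
    print(f"ICC(3,1) = {icc_3_1(scans):.3f}")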
Easy Leaf Area: Automated digital image analysis for rapid and accurate measurement of leaf area.
Easlon, Hsien Ming; Bloom, Arnold J
2014-07-01
Measurement of leaf areas from digital photographs has traditionally required significant user input unless backgrounds are carefully masked. Easy Leaf Area was developed to batch process hundreds of Arabidopsis rosette images in minutes, removing background artifacts and saving results to a spreadsheet-ready CSV file. Easy Leaf Area uses the color ratios of each pixel to distinguish leaves and calibration areas from their background and compares leaf pixel counts to a red calibration area to eliminate the need for camera distance calculations or manual ruler scale measurement that other software methods typically require. Leaf areas estimated by this software from images taken with a camera phone were more accurate than ImageJ estimates from flatbed scanner images. Easy Leaf Area provides an easy-to-use method for rapid measurement of leaf area and nondestructive estimation of canopy area from digital images.
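The core of the approach is per-pixel color classification plus scaling against a red calibration square of known size. The Python sketch below illustrates that logic with made-up color-ratio thresholds and calibration size; it is not the Easy Leaf Area implementation.

    import numpy as np

    def leaf_area_from_image(rgb, red_calibration_cm2=4.0, green_ratio=1.1, red_ratio=1.2):
        """Estimate leaf area (cm^2) from an RGB image (HxWx3, uint8).

        Greenish pixels are counted as leaf, reddish pixels as the calibration
        square; the known area of the red square converts pixel counts to cm^2,
        so no camera distance or ruler measurement is needed. Thresholds here
        are illustrative, not the values used by Easy Leaf Area.
        """
        r, g, b = (rgb[..., i].astype(float) for i in range(3))
        leaf_px = int(((g > green_ratio * r) & (g > green_ratio * b)).sum())
        calib_px = int(((r > red_ratio * g) & (r > red_ratio * b)).sum())
        if calib_px == 0:
            raise ValueError("no red calibration area detected")
        return leaf_px * red_calibration_cm2 / calib_px

    # usage (hypothetical): area_cm2 = leaf_area_from_image(imageio.imread("rosette.jpg"))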
A novel approach to TEM preparation with a (7-axis stage) triple-beam FIB-SEM system
NASA Astrophysics Data System (ADS)
Clarke, Jamil J.
2015-10-01
Preparation of lamellae from bulk to grid for Cs-corrected Transmission Electron Microscope (TEM) observation has become mostly routine work on the latest FIB-SEM systems, with standardized techniques that are often left to automation for the initial steps. The finalization of lamellae, however, has largely remained non-routine and non-repeatable, and producing a high-quality, damage-free cross section is often driven by the user's experience level. Materials processing of the latest technologies, with ever-shrinking nano-sized structures, poses challenges to modern FIB-SEM systems. This often leads to specialized techniques and hyper-specific functions for producing ultra-thin, high-quality lamellae that are lab specific, preventing practical use of such techniques across multiple materials and applications. Several factors should be considered for successful processing of fine-structured materials: how electron and ion scan conditions can affect a thin section during ion milling; the ion species used during finalization of the lamella, whether gallium or a smaller species such as Ar/Xe; the orientation of the lamella during thinning, which is directly linked to the ion beam incident angle and hence to the creation of waterfall or curtain effects; and how software can be employed to reduce these artifacts with reproducible results, regardless of FIB-SEM experience, for site-specific lift-outs. A traditional TEM preparation of a fine-structure specimen was performed in pursuit of a process technique that produces a high-quality TEM lamella while addressing all of these factors. These capabilities were refined and improved during the FIB-SEM design and development stages, resulting in a new approach that improves quality by reducing common ion milling artifacts such as curtain effects and amorphous material, better pinpoints the area of interest, reduces overall processing time for TEM sample preparation, and enhances repeatability through ease of use via software controls. The new technologies are explained: a third Ar/Xe ion beam column operating in conjunction with the electron and gallium ion beam columns, a 7-axis stage for enhanced sample orientation with tilt functions in two axes and automated swing control, and a host of additional functions addressing the factors mentioned above, such as electron and ion scan techniques and curtain effect removal through hardware and software components that are key to reducing typical FIB-related artifacts; together these are called "ACE [Anti Curtaining Effect] Technologies". The overall aim of these developments is to address a significant point: productivity, throughput, and repeatability depend on the synergy between the user, application, software, and hardware within a FIB-SEM system. The latest Hitachi FIB-SEM platform offers these innovations for reliable, repeatable, high-quality lamella preparation for Cs-corrected (S)TEMs.
Agile IT: Thinking in User-Centric Models
NASA Astrophysics Data System (ADS)
Margaria, Tiziana; Steffen, Bernhard
We advocate a new teaching direction for modern CS curricula: extreme model-driven development (XMDD), a new development paradigm designed to continuously involve the customer/application expert throughout the whole systems' life cycle. Based on the `One-Thing Approach', which works by successively enriching and refining one single artifact, system development becomes in essence a user-centric orchestration of intuitive service functionality. XMDD differs radically from classical software development, which, in our opinion is no longer adequate for the bulk of application programming - in particular when it comes to heterogeneous, cross organizational systems which must adapt to rapidly changing market requirements. Thus there is a need for new curricula addressing this model-driven, lightweight, and cooperative development paradigm that puts the user process in the center of the development and the application expert in control of the process evolution.
Real-time motion analytics during brain MRI improve data quality and reduce costs.
Dosenbach, Nico U F; Koller, Jonathan M; Earl, Eric A; Miranda-Dominguez, Oscar; Klein, Rachel L; Van, Andrew N; Snyder, Abraham Z; Nagel, Bonnie J; Nigg, Joel T; Nguyen, Annie L; Wesevich, Victoria; Greene, Deanna J; Fair, Damien A
2017-11-01
Head motion systematically distorts clinical and research MRI data. Motion artifacts have biased findings from many structural and functional brain MRI studies. An effective way to remove motion artifacts is to exclude MRI data frames affected by head motion. However, such post-hoc frame censoring can lead to data loss rates of 50% or more in our pediatric patient cohorts. Hence, many scanner operators collect additional 'buffer data', an expensive practice that, by itself, does not guarantee sufficient high-quality MRI data for a given participant. Therefore, we developed an easy-to-setup, easy-to-use Framewise Integrated Real-time MRI Monitoring (FIRMM) software suite that provides scanner operators with head motion analytics in real-time, allowing them to scan each subject until the desired amount of low-movement data has been collected. Our analyses show that using FIRMM to identify the ideal scan time for each person can reduce total brain MRI scan times and associated costs by 50% or more. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
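One way to picture the kind of analytics involved is framewise displacement (FD) computed from rigid-body motion parameters, with frames above an FD threshold treated as unusable. The sketch below uses the common Power et al. style FD definition; FIRMM's actual internals are not described in the abstract, so treat the parameters and thresholds as assumptions.

    import numpy as np

    HEAD_RADIUS_MM = 50.0   # conventional radius for converting rotations to mm

    def framewise_displacement(motion_params):
        """Framewise displacement (mm) per frame from 6 rigid-body parameters.

        motion_params: (n_frames, 6) array of [tx, ty, tz, rx, ry, rz] with
        translations in mm and rotations in radians.
        """
        p = np.asarray(motion_params, dtype=float).copy()
        p[:, 3:] *= HEAD_RADIUS_MM                  # rotations -> arc length in mm
        fd = np.abs(np.diff(p, axis=0)).sum(axis=1)
        return np.concatenate([[0.0], fd])          # first frame has no predecessor

    def low_motion_seconds(motion_params, tr_s=2.0, fd_threshold_mm=0.2):
        """Seconds of usable data if frames with FD above threshold are censored."""
        fd = framewise_displacement(motion_params)
        return float((fd <= fd_threshold_mm).sum() * tr_s)

    # An operator could keep scanning until low_motion_seconds(...) reaches the
    # amount of low-movement data required by the study protocol.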
NASA Operational Simulator for Small Satellites (NOS3)
NASA Technical Reports Server (NTRS)
Zemerick, Scott
2015-01-01
The Simulation-to-Flight 1 (STF-1) CubeSat mission aims to demonstrate how legacy simulation technologies may be adapted for flexible and effective use on missions using the CubeSat platform. These technologies, named NASA Operational Simulator (NOS), have demonstrated significant value on several missions such as James Webb Space Telescope, Global Precipitation Measurement, Juno, and Deep Space Climate Observatory in the areas of software development, mission operations training, verification and validation (V&V), test procedure development, and software systems check-out. STF-1 will demonstrate a highly portable simulation and test platform that allows seamless transition of mission development artifacts to flight products. This environment will decrease the development time of future CubeSat missions by lessening the dependency on hardware resources. In addition, through a partnership between NASA GSFC, the West Virginia Space Grant Consortium, and West Virginia University, the STF-1 CubeSat will host payloads for three secondary objectives: advancing engineering and physical-science research on navigation systems for small satellites, providing useful data for understanding magnetosphere-ionosphere coupling and space weather, and verifying the performance and durability of III-V nitride-based materials.
NASA Astrophysics Data System (ADS)
Doerr, Martin; Freitas, Fred; Guizzardi, Giancarlo; Han, Hyoil
Ontology is a cross-disciplinary field concerned with the study of concepts and theories that can be used for representing shared conceptualizations of specific domains. Ontological Engineering is a discipline in computer and information science concerned with the development of techniques, methods, languages and tools for the systematic construction of concrete artifacts capturing these representations, i.e., models (e.g., domain ontologies) and metamodels (e.g., upper-level ontologies). In recent years, there has been a growing interest in the application of formal ontology and ontological engineering to solve modeling problems in diverse areas in computer science such as software and data engineering, knowledge representation, natural language processing, information science, among many others.
Children as Educational Computer Game Designers: An Exploratory Study
ERIC Educational Resources Information Center
Baytak, Ahmet; Land, Susan M.; Smith, Brian K.
2011-01-01
This study investigated how children designed computer games as artifacts that reflected their understanding of nutrition. Ten 5th grade students were asked to design computer games with the software "Game Maker" for the purpose of teaching 1st graders about nutrition. The results from the case study show that students were able to…
ERIC Educational Resources Information Center
Mioduser, David; Levy, Sharona T.
2010-01-01
This study explores young children's ability to construct and explain adaptive behaviors of a behaving artifact, an autonomous mobile robot with sensors. A central component of the behavior construction environment is the RoboGan software that supports children's construction of spatiotemporal events with an a-temporal rule structure. Six…
NASA Technical Reports Server (NTRS)
1990-01-01
The papers presented at the conference on hypermedia and information reconstruction are compiled. The following subject areas are covered: hypertext, typographic man, and the notion of literacy; a knowledge base browser using hypermedia; Ai GERM - a logic programming front end for GERM; and HEAVENS system for software artifacts.
An interactive dynamic analysis and decision support software for MR mammography.
Ertaş, Gökhan; Gülçür, H Ozcan; Tunaci, Mehtap
2008-06-01
A fully automated software tool is introduced to facilitate MR mammography (MRM) examinations and overcome subjectiveness in diagnosis using normalized maximum intensity-time ratio (nMITR) maps. These maps inherently suppress enhancements due to normal parenchyma and blood vessels that surround lesions and have a natural tolerance to small field inhomogeneities and motion artifacts. The classifier embedded within the software is trained with the normalized complexity and maximum nMITR of 22 lesions and tested with the features of the remaining 22 lesions. The achieved diagnostic performance is 92% sensitivity, 90% specificity, 91% accuracy, 92% positive predictive value, and 90% negative predictive value. DynaMammoAnalyst shortens evaluation time considerably and reduces inter- and intra-observer variability by providing decision support.
Program Facilitates CMMI Appraisals
NASA Technical Reports Server (NTRS)
Sweetser, Wesley
2005-01-01
A computer program has been written to facilitate appraisals according to the methodology of Capability Maturity Model Integration (CMMI). [CMMI is a government/industry standard, maintained by the Software Engineering Institute at Carnegie Mellon University, for objectively assessing the engineering capability and maturity of an organization (especially, an organization that produces software)]. The program assists in preparation for a CMMI appraisal by providing drop-down lists suggesting required artifacts or evidence. It identifies process areas for which similar evidence is required and includes a copy feature that reduces or eliminates repetitive data entry. It generates reports to show the entire framework for reference, the appraisal artifacts to determine readiness for an appraisal, and lists of interviewees and questions to ask them during the appraisal. During an appraisal, the program provides screens for entering observations and ratings, and reviewing evidence provided thus far. Findings concerning strengths and weaknesses can be exported for use in a report or a graphical presentation. The program generates a chart showing capability level ratings of the organization. A context-sensitive Windows help system enables a novice to use the program and learn about the CMMI appraisal process.
Certification of COTS Software in NASA Human Rated Flight Systems
NASA Technical Reports Server (NTRS)
Goforth, Andre
2012-01-01
Adoption of commercial off-the-shelf (COTS) products in safety critical systems has been seen as a promising acquisition strategy to improve mission affordability and, yet, has come with significant barriers and challenges. Attempts to integrate COTS software components into NASA human rated flight systems have been, for the most part, complicated by verification and validation (V&V) requirements necessary for flight certification per NASA's own standards. For software from COTS sources and, more generally, from 3rd party sources (commercial, government, modified, or open source), the expectation is that it meets the same certification criteria as in-house software and that it does so as if it were built in-house. The latter is a critical and hidden issue. This paper examines the longstanding barriers and challenges in the use of 3rd party software in safety critical systems and covers recent efforts to use COTS software in NASA's Multi-Purpose Crew Vehicle (MPCV) project. It identifies core artifacts without which the use of COTS and 3rd party software is, for all practical purposes, a nonstarter for affordable and timely insertion into flight critical systems. The paper covers the first use in a flight critical system by NASA of COTS software that has prior FAA certification heritage, which was shown to meet the RTCA-DO-178B standard, and how this certification may, in some cases, be leveraged to allow the use of analysis in lieu of testing. Finally, the paper proposes the establishment of an open source forum for development of safety critical 3rd party software.
Manabe, Sho; Morimoto, Chie; Hamano, Yuya; Fujimoto, Shuntaro; Tamaki, Keiji
2017-01-01
In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed a new open-source software "Kongoh" for interpreting DNA mixtures based on a quantitative continuous model. The model uses quantitative information of peak heights in the DNA profile and considers the effect of artifacts and allelic drop-out. By using this software, the likelihoods of 1-4 persons' contributions are calculated, and the optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI's contribution in true contributors and non-contributors by using 2-4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI's contribution even for small amounts or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy of the estimated number of contributors than other software based on the quantitative continuous model. Therefore, Kongoh is useful in accurately interpreting DNA evidence such as mixtures and small amounts or degraded DNA samples.
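For orientation, the quantities reported by such software combine into a likelihood ratio comparing the prosecution and defense hypotheses, and the number of contributors can be chosen as the count whose model best explains the profile. The Python sketch below shows only that outer logic with hypothetical numbers; the peak-height, drop-out, and artifact models that constitute Kongoh itself are not reproduced.

    def likelihood_ratio(log10_lik_hp, log10_lik_hd):
        """LR = P(evidence | Hp: POI plus unknowns) / P(evidence | Hd: unknowns only)."""
        return 10.0 ** (log10_lik_hp - log10_lik_hd)

    def best_number_of_contributors(log10_liks_by_noc):
        """Pick the contributor count whose model maximizes the likelihood.

        log10_liks_by_noc maps number of contributors (1-4) to the log10
        likelihood produced by evaluating the continuous model for that count.
        """
        return max(log10_liks_by_noc, key=log10_liks_by_noc.get)

    # Hypothetical values for illustration only.
    print(best_number_of_contributors({1: -42.1, 2: -30.5, 3: -31.2, 4: -33.0}))  # -> 2
    print(f"LR = {likelihood_ratio(-30.5, -36.8):.3g}")                           # -> 2e+06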
SU-E-I-13: Evaluation of Metal Artifact Reduction (MAR) Software On Computed Tomography (CT) Images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, V; Kohli, K
2015-06-15
Purpose: A new commercially available metal artifact reduction (MAR) software in computed tomography (CT) imaging was evaluated with phantoms in the presence of metals. The goal was to assess the ability of the software to restore the CT number in the vicinity of the metals without impacting the image quality. Methods: A Catphan 504 was scanned with a GE Optima RT 580 CT scanner (GE Healthcare, Milwaukee, WI) and the images were reconstructed with and without the MAR software. Both datasets were analyzed with Image Owl QA software (Image Owl Inc, Greenwich, NY). CT number sensitometry, MTF, low contrast, uniformity, noise and spatial accuracy were compared for scans with and without MAR software. In addition, an in-house made phantom was scanned with and without a stainless steel insert at three different locations. The accuracy of the CT number and metal insert dimension were investigated as well. Results: Comparisons between scans with and without the MAR algorithm on the Catphan phantom demonstrate similar results for image quality. However, noise was slightly higher for the MAR algorithm. Evaluation of the CT number at various locations of the in-house made phantom was also performed. The baseline HU, obtained from the scan without the metal insert, was compared to scans with the stainless steel insert at 3 different locations. The HU difference between the baseline scan versus the metal scan was improved when the MAR algorithm was applied. In addition, the physical diameter of the stainless steel rod was over-estimated by the MAR algorithm by 0.9 mm. Conclusion: This work indicates that, in the presence of metal in CT scans, the MAR algorithm is capable of providing a more accurate CT number without compromising the overall image quality. Future work will include the dosimetric impact of the MAR algorithm.
Panning artifacts in digital pathology images
NASA Astrophysics Data System (ADS)
Avanaki, Ali R. N.; Lanciault, Christian; Espig, Kathryn S.; Xthona, Albert; Kimpe, Tom R. L.
2017-03-01
In making a pathologic diagnosis, a pathologist uses cognitive processes: perception, attention, memory, and search (Pena and Andrade-Filho, 2009). Typically, this involves focus while panning from one region of a slide to another, using either a microscope in a traditional workflow or a software program and display in a digital pathology workflow (DICOM Standard Committee, 2010). We theorize that during the panning operation, the pathologist receives information important to diagnosis efficiency and/or correctness. As compared to an optical microscope, panning in a digital pathology image involves some visual artifacts due to the following: (i) the frame rate is finite; (ii) time-varying visual signals are reconstructed using an imperfect zero-order hold. Specifically, after a pixel's digital drive is changed, it takes time for the pixel to emit the expected amount of light. Previous work suggests that 49% of navigation is conducted in low-power/overview with digital pathology (Molin et al., 2015), but the influence of display factors has not been measured. We conducted a reader study to establish a relationship between display frame rate, panel response time, and threshold panning speed (above which the artifacts become noticeable). Our results suggest that visual tasks that involve tissue structure are more impacted by the simulated panning artifacts than those that only involve color (e.g., staining intensity estimation), and that the panning artifact visibility versus normalized panning speed exhibits a peak behavior, which is surprising and may change for a diagnostic task. This is work in progress and our final findings should be considered in designing future digital pathology systems.
Wang, Yang; Qian, Bangping; Li, Baoxin; Qin, Guochu; Zhou, Zhengyang; Qiu, Yong; Sun, Xizhao; Zhu, Bin
2013-08-01
To evaluate the effectiveness of spectral CT in reducing metal artifacts caused by pedicle screws in patients with scoliosis. Institutional review committee approval and written informed consents from patients were obtained. 18 scoliotic patients with a total of 228 pedicle screws who underwent spectral CT imaging were included in this study. Monochromatic image sets with and without the additional metal artifacts reduction software (MARS) correction were generated with photon energy at 65 keV and from 70 to 140 keV with 10 keV interval using the 80 kVp and 140 kVp projection sets. Polychromatic images corresponding to conventional 140 kVp imaging were also generated from the same scan data as a control group. Both objective evaluation (screw width and quantitative artifacts index measurements) and subjective evaluation (depiction of pedicle screws, surrounding structures and their relationship) were performed. Image quality of monochromatic images in the range from 110 to 140 keV (0.97±0.28) was rated superior to the conventional polychromatic images (2.53±0.54) and also better than monochromatic images with lower energy. Images of energy above 100 keV also give accurate measurement of the width of screws and relatively low artifacts index. The form of screws was slightly distorted in MARS reconstruction. Compared to conventional polychromatic images, monochromatic images acquired from dual-energy CT provided superior image quality with much reduced metal artifacts of pedicle screws in patients with scoliosis. The optimal energy range was found between 110 and 140 keV. Crown Copyright © 2013. Published by Elsevier Ireland Ltd. All rights reserved.
Kubios HRV--heart rate variability analysis software.
Tarvainen, Mika P; Niskanen, Juha-Pekka; Lipponen, Jukka A; Ranta-Aho, Perttu O; Karjalainen, Pasi A
2014-01-01
Kubios HRV is an advanced and easy-to-use software package for heart rate variability (HRV) analysis. The software supports several input data formats for electrocardiogram (ECG) data and beat-to-beat RR interval data. It includes an adaptive QRS detection algorithm and tools for artifact correction, trend removal and analysis sample selection. The software computes all the commonly used time-domain and frequency-domain HRV parameters and several nonlinear parameters. There are several adjustable analysis settings through which the analysis methods can be optimized for different data. The ECG-derived respiratory frequency is also computed, which is important for reliable interpretation of the analysis results. The analysis results can be saved as an ASCII text file (easy to import into MS Excel or SPSS), Matlab MAT-file, or as a PDF report. The software is easy to use through its compact graphical user interface. The software is available free of charge for Windows and Linux operating systems at http://kubios.uef.fi. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
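The standard time-domain HRV parameters mentioned above are simple functions of the RR interval series. The Python sketch below computes a few of them (mean RR, SDNN, RMSSD, pNN50) from a short hypothetical series; Kubios HRV's preprocessing (artifact correction, detrending) and its frequency-domain and nonlinear measures are not reproduced here.

    import numpy as np

    def time_domain_hrv(rr_ms):
        """Common time-domain HRV parameters from a beat-to-beat RR series (ms)."""
        rr = np.asarray(rr_ms, dtype=float)
        diff = np.diff(rr)
        return {
            "mean_rr_ms": rr.mean(),
            "mean_hr_bpm": 60000.0 / rr.mean(),
            "sdnn_ms": rr.std(ddof=1),
            "rmssd_ms": np.sqrt(np.mean(diff ** 2)),
            "pnn50_pct": 100.0 * np.mean(np.abs(diff) > 50.0),
        }

    # Example with a short, hypothetical RR series.
    print(time_domain_hrv([812, 790, 805, 821, 799, 835, 810, 795]))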
The Value of Social Software in School Library Instruction, Communication, and Collaboration
ERIC Educational Resources Information Center
Summers, Laura L.
2009-01-01
As budget cuts loom in school districts across the nation, school librarians are expected to show artifacts and share data to cement their credibility as instructional leaders, since according to Zmuda (2007) and many others, the effectiveness of the school library media program must be measured by what students learn as a result of their…
SU-F-J-115: Target Volume and Artifact Evaluation of a New Device-Less 4D CT Algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, R; Pan, T
2016-06-15
Purpose: 4DCT is often used in radiation therapy treatment planning to define the extent of motion of the visible tumor (IGTV). Recently available software allows 4DCT images to be created without the use of an external motion surrogate. This study aims to compare this device-less algorithm to a standard device-driven technique (RPM) in regards to artifacts and the creation of treatment volumes. Methods: 34 lung cancer patients who had previously received a cine 4DCT scan on a GE scanner with an RPM determined respiratory signal were selected. Cine images were sorted into 10 phases based on both the RPM signal and the device-less algorithm. Contours were created on standard and device-less maximum intensity projection (MIP) images using a region growing algorithm and manual adjustment to remove other structures. Variations in measurements due to intra-observer differences in contouring were assessed by repeating a subset of 6 patients 2 additional times. Artifacts in each phase image were assessed using normalized cross correlation at each bed position transition. A score between +1 (artifacts "better" in all phases for device-less) and −1 (RPM similarly better) was assigned for each patient based on these results. Results: Device-less IGTV contours were 2.1 ± 1.0% smaller than standard IGTV contours (not significant, p = 0.15). The Dice similarity coefficient (DSC) was 0.950 ± 0.006 indicating good similarity between the contours. Intra-observer variation resulted in standard deviations of 1.2 percentage points in percent volume difference and 0.005 in DSC measurements. Only two patients had improved artifacts with RPM, and the average artifact score (0.40) was significantly greater than zero. Conclusion: Device-less 4DCT can be used in place of the standard method for target definition due to no observed difference between standard and device-less IGTVs. Phase image artifacts were significantly reduced with the device-less method.
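Two of the agreement metrics used above are easy to state directly: the Dice similarity coefficient, 2|A∩B|/(|A|+|B|), and the percent volume difference between contours. A minimal Python sketch with a toy pair of binary volumes:

    import numpy as np

    def dice_coefficient(mask_a, mask_b):
        """Dice similarity coefficient between two binary volumes: 2|A∩B| / (|A|+|B|)."""
        a = np.asarray(mask_a, dtype=bool)
        b = np.asarray(mask_b, dtype=bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

    def percent_volume_difference(mask_a, mask_b):
        """Signed volume difference of B relative to A, in percent."""
        a, b = np.asarray(mask_a, dtype=bool), np.asarray(mask_b, dtype=bool)
        return 100.0 * (b.sum() - a.sum()) / a.sum()

    # Toy comparison of a 'standard' and a 'device-less' IGTV on a small grid.
    std = np.zeros((20, 20, 20), dtype=bool); std[5:15, 5:15, 5:15] = True
    dl = np.zeros((20, 20, 20), dtype=bool); dl[5:15, 5:15, 5:14] = True
    print(f"DSC = {dice_coefficient(std, dl):.3f}, "
          f"volume diff = {percent_volume_difference(std, dl):+.1f}%")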
NASA Astrophysics Data System (ADS)
Carey, Austin M.; Paige, Ginger B.; Carr, Bradley J.; Dogan, Mine
2017-10-01
Time-lapse electrical resistivity tomography (ERT) is commonly used as a minimally invasive tool to study infiltration processes. In 2014, we conducted field studies coupling variable intensity rainfall simulation with high-resolution ERT to study the real-time partitioning of rainfall into surface and subsurface response. The significant contrast in resistivity in the subsurface from large changes in subsurface moisture resulted in artifacts during the inversion process of the time-lapse ERT data collected using a dipole-dipole electrode array. These artifacts, which are not representative of real subsurface moisture dynamics, have been shown to arise during time-lapse inversion of ERT data and may be subject to misinterpretation. Forward modeling of the infiltration process post field experiments using a two-layer system (saprolite overlain by a soil layer) was used to generate synthetic datasets. The synthetic data were used to investigate the influence of both changes in volumetric moisture content and electrode configuration on the development of the artifacts identified in the field datasets. For the dipole-dipole array, we found that a decrease in the resistivity of the bottom layer by 67% resulted in a 50% reduction in artifact development. Artifacts for the seven additional array configurations tested, ranged from a 19% increase in artifact development (using an extended dipole-dipole array) to as much as a 96% decrease in artifact development (using a wenner-alpha array), compared to that of the dipole-dipole array. Moreover, these arrays varied in their ability to accurately delineate the infiltration front. Model results showed that the modified pole-dipole array was able to accurately image the infiltration zone and presented fewer artifacts for our experiments. In this study, we identify an optimal array type for imaging rainfall-infiltration dynamics that reduces artifacts. The influence of moisture contrast between the infiltrating water and the bulk subsurface material was characterized and shown to be a major factor in contributing to artifact development. Through forward modeling, this study highlights the importance of considering array type and subsurface moisture conditions when using time-lapse resistivity to obtain reliable estimates of vadose zone flow processes during rainfall-infiltration events.
Teaching and Learning the Nature of Technical Artifacts
ERIC Educational Resources Information Center
Frederik, Ineke; Sonneveld, Wim; de Vries, Marc J.
2011-01-01
Artifacts are probably our most obvious everyday encounter with technology. Therefore, a good understanding of the nature of technical artifacts is a relevant part of technological literacy. In this article we draw from the philosophy of technology to develop a conceptualization of technical artifacts that can be used for educational purposes.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, J; Followill, D; Howell, R
2015-06-15
Purpose: To investigate two strategies for reducing dose calculation errors near metal implants: use of CT metal artifact reduction methods and implementation of metal-based energy deposition kernels in the convolution/superposition (C/S) method. Methods: Radiochromic film was used to measure the dose upstream and downstream of titanium and Cerrobend implants. To assess the dosimetric impact of metal artifact reduction methods, dose calculations were performed using baseline, uncorrected images and metal artifact reduction methods: Philips O-MAR, GE's monochromatic gemstone spectral imaging (GSI) using dual-energy CT, and GSI imaging with metal artifact reduction software applied (MARs). To assess the impact of metal kernels, titanium and silver kernels were implemented into a commercial collapsed cone C/S algorithm. Results: The CT artifact reduction methods were more successful for titanium than Cerrobend. Interestingly, for beams traversing the metal implant, we found that errors in the dimensions of the metal in the CT images were more important for dose calculation accuracy than reduction of imaging artifacts. The MARs algorithm caused a distortion in the shape of the titanium implant that substantially worsened the calculation accuracy. In comparison to water kernel dose calculations, metal kernels resulted in better modeling of the increased backscatter dose at the upstream interface but decreased accuracy directly downstream of the metal. We also found that the success of metal kernels was dependent on dose grid size, with smaller calculation voxels giving better accuracy. Conclusion: Our study yielded mixed results, with neither the metal artifact reduction methods nor the metal kernels being globally effective at improving dose calculation accuracy. However, some successes were observed. The MARs algorithm decreased errors downstream of Cerrobend by a factor of two, and metal kernels resulted in more accurate backscatter dose upstream of metals. Thus, these two strategies do have the potential to improve accuracy for patients with metal implants in certain scenarios. This work was supported by Public Health Service grants CA 180803 and CA 10953 awarded by the National Cancer Institute, United States Department of Health and Human Services, and in part by Mobius Medical Systems.
MorphoGraphX: A platform for quantifying morphogenesis in 4D.
Barbier de Reuille, Pierre; Routier-Kierzkowska, Anne-Lise; Kierzkowski, Daniel; Bassel, George W; Schüpbach, Thierry; Tauriello, Gerardo; Bajpai, Namrata; Strauss, Sören; Weber, Alain; Kiss, Annamaria; Burian, Agata; Hofhuis, Hugo; Sapala, Aleksandra; Lipowczan, Marcin; Heimlicher, Maria B; Robinson, Sarah; Bayer, Emmanuelle M; Basler, Konrad; Koumoutsakos, Petros; Roeder, Adrienne H K; Aegerter-Wilmsen, Tinri; Nakayama, Naomi; Tsiantis, Miltos; Hay, Angela; Kwiatkowska, Dorota; Xenarios, Ioannis; Kuhlemeier, Cris; Smith, Richard S
2015-05-06
Morphogenesis emerges from complex multiscale interactions between genetic and mechanical processes. To understand these processes, the evolution of cell shape, proliferation and gene expression must be quantified. This quantification is usually performed either in full 3D, which is computationally expensive and technically challenging, or on 2D planar projections, which introduces geometrical artifacts on highly curved organs. Here we present MorphoGraphX ( www.MorphoGraphX.org), a software that bridges this gap by working directly with curved surface images extracted from 3D data. In addition to traditional 3D image analysis, we have developed algorithms to operate on curved surfaces, such as cell segmentation, lineage tracking and fluorescence signal quantification. The software's modular design makes it easy to include existing libraries, or to implement new algorithms. Cell geometries extracted with MorphoGraphX can be exported and used as templates for simulation models, providing a powerful platform to investigate the interactions between shape, genes and growth.
An overview of turbulence compensation
NASA Astrophysics Data System (ADS)
Schutte, Klamer; van Eekeren, Adam W. M.; Dijk, Judith; Schwering, Piet B. W.; van Iersel, Miranda; Doelman, Niek J.
2012-09-01
In general, long range visual detection, recognition and identification are hampered by turbulence caused by atmospheric conditions. Much research has been devoted to the field of turbulence compensation. One of the main advantages of turbulence compensation is that it enables visual identification over larger distances. In many (military) scenarios this is of crucial importance. In this paper we give an overview of several software and hardware approaches to compensate for the visual artifacts caused by turbulence. These approaches are very diverse and range from the use of dedicated hardware, such as adaptive optics, to the use of software methods, such as deconvolution and lucky imaging. For each approach the pros and cons are given and it is indicated for which type of scenario this approach is useful. In more detail we describe the turbulence compensation methods TNO has developed in the last years and place them in the context of the different turbulence compensation approaches and TNO's turbulence compensation roadmap. Furthermore we look forward and indicate the upcoming challenges in the field of turbulence compensation.
Turbulence compensation: an overview
NASA Astrophysics Data System (ADS)
van Eekeren, Adam W. M.; Schutte, Klamer; Dijk, Judith; Schwering, Piet B. W.; van Iersel, Miranda; Doelman, Niek J.
2012-06-01
In general, long range visual detection, recognition and identification are hampered by turbulence caused by atmospheric conditions. Much research has been devoted to the field of turbulence compensation. One of the main advantages of turbulence compensation is that it enables visual identification over larger distances. In many (military) scenarios this is of crucial importance. In this paper we give an overview of several software and hardware approaches to compensate for the visual artifacts caused by turbulence. These approaches are very diverse and range from the use of dedicated hardware, such as adaptive optics, to the use of software methods, such as deconvolution and lucky imaging. For each approach the pros and cons are given and it is indicated for which scenario this approach is useful. In more detail we describe the turbulence compensation methods TNO has developed in the last years and place them in the context of the different turbulence compensation approaches and TNO's turbulence compensation roadmap. Furthermore we look forward and indicate the upcoming challenges in the field of turbulence compensation.
Somatic Point Mutation Calling in Low Cellularity Tumors
Kassahn, Karin S.; Holmes, Oliver; Nones, Katia; Patch, Ann-Marie; Miller, David K.; Christ, Angelika N.; Harliwong, Ivon; Bruxner, Timothy J.; Xu, Qinying; Anderson, Matthew; Wood, Scott; Leonard, Conrad; Taylor, Darrin; Newell, Felicity; Song, Sarah; Idrisoglu, Senel; Nourse, Craig; Nourbakhsh, Ehsan; Manning, Suzanne; Wani, Shivangi; Steptoe, Anita; Pajic, Marina; Cowley, Mark J.; Pinese, Mark; Chang, David K.; Gill, Anthony J.; Johns, Amber L.; Wu, Jianmin; Wilson, Peter J.; Fink, Lynn; Biankin, Andrew V.; Waddell, Nicola; Grimmond, Sean M.; Pearson, John V.
2013-01-01
Somatic mutation calling from next-generation sequencing data remains a challenge due to the difficulties of distinguishing true somatic events from artifacts arising from PCR, sequencing errors or mis-mapping. Tumor cellularity or purity, sub-clonality and copy number changes also confound the identification of true somatic events against a background of germline variants. We have developed a heuristic strategy and software (http://www.qcmg.org/bioinformatics/qsnp/) for somatic mutation calling in samples with low tumor content and we show the superior sensitivity and precision of our approach using a previously sequenced cell line, a series of tumor/normal admixtures, and 3,253 putative somatic SNVs verified on an orthogonal platform. PMID:24250782
Identifying biologically relevant differences between metagenomic communities.
Parks, Donovan H; Beiko, Robert G
2010-03-15
Metagenomics is the study of genetic material recovered directly from environmental samples. Taxonomic and functional differences between metagenomic samples can highlight the influence of ecological factors on patterns of microbial life in a wide range of habitats. Statistical hypothesis tests can help us distinguish ecological influences from sampling artifacts, but knowledge of only the P-value from a statistical hypothesis test is insufficient to make inferences about biological relevance. Current reporting practices for pairwise comparative metagenomics are inadequate, and better tools are needed for comparative metagenomic analysis. We have developed a new software package, STAMP, for comparative metagenomics that supports best practices in analysis and reporting. Examination of a pair of iron mine metagenomes demonstrates that deeper biological insights can be gained using statistical techniques available in our software. An analysis of the functional potential of 'Candidatus Accumulibacter phosphatis' in two enhanced biological phosphorus removal metagenomes identified several subsystems that differ between the A. phosphatis strains in these related communities, including phosphate metabolism, secretion and metal transport. Availability: Python source code and binaries are freely available from our website at http://kiwi.cs.dal.ca/Software/STAMP. Contact: beiko@cs.dal.ca. Supplementary data are available at Bioinformatics online.
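In the spirit of the reporting practices STAMP promotes (a p-value together with an effect size and confidence interval), the Python sketch below compares one functional category between two metagenomes using Fisher's exact test and a normal-approximation interval for the difference between proportions. The counts are hypothetical and this is not STAMP's implementation.

    import math
    from scipy.stats import fisher_exact

    def compare_feature(count_a, total_a, count_b, total_b):
        """p-value, difference between proportions, and ~95% CI for one category."""
        table = [[count_a, total_a - count_a], [count_b, total_b - count_b]]
        _, p_value = fisher_exact(table)
        p1, p2 = count_a / total_a, count_b / total_b
        diff = p1 - p2
        se = math.sqrt(p1 * (1 - p1) / total_a + p2 * (1 - p2) / total_b)
        z = 1.959964  # ~97.5th percentile of the standard normal
        return p_value, diff, (diff - z * se, diff + z * se)

    # Hypothetical read counts assigned to one subsystem in two metagenomes.
    p, effect, ci = compare_feature(250, 10000, 180, 12000)
    print(f"p = {p:.2e}, effect = {effect:.4f}, 95% CI = ({ci[0]:.4f}, {ci[1]:.4f})")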
PQLX: A seismic data quality control system description, applications, and users manual
McNamara, Daniel E.; Boaz, Richard I.
2011-01-01
We present a detailed description and users manual for a new tool to evaluate seismic station performance and characteristics by providing quick and easy transitions between visualizations of the frequency and time domains. The software is based on the probability density functions (PDF) of power spectral densities (PSD) (McNamara and Buland, 2004) and builds on the original development of the PDF stand-alone software system (McNamara and Boaz, 2005) and the seismological data viewer application PQL (IRIS-PASSCAL Quick Look) and PQLII (available through the IRIS PASSCAL program: http://www.passcal.nmt.edu/content/pql-ii-program-viewing-data). With PQLX (PQL eXtended), computed PSDs are stored in a MySQL database, allowing a user to access specific time periods of PSDs (PDF subsets) and time series segments through a GUI-driven interface. The power of the method and software lies in the fact that there is no need to screen the data for system transients, earthquakes, or general data artifacts, because they map into a background probability level. In fact, examination of artifacts related to station operation and episodic cultural noise allow us to estimate both the overall station quality and a baseline level of Earth noise at each site. The output of this analysis tool is useful for both operational and scientific applications. Operationally, it is useful for characterizing the current and past performance of existing broadband stations, for conducting tests on potential new seismic station locations, for evaluating station baseline noise levels (McNamara and others, 2009), for detecting problems with the recording system or sensors, and for evaluating the overall quality of data and metadata. Scientifically, the tool allows for mining of PSDs for investigations on the evolution of seismic noise (for example, Aster and others, 2008; and Aster and others, 2010) and other phenomena. Currently, PQLX is operational at several organizations including the USGS National Earthquake Information Center (NEIC), the USGS Albuquerque Seismological Laboratory (ASL), and the Incorporated Research Institutions in Seismology (IRIS) Data Management Center (DMC) for station monitoring and instrument response quality control. The PQLX system is available to the community at large through the U.S. Geological Survey (USGS) (http://ehpm-earthquake.wr.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx). Also provided is a fully searchable website for bug reporting and enhancement requests (http://wush.net/bugzilla/PQLX). The first part of this document aims to describe and illustrate some of the features and capabilities of the software. The second part of this document is a detailed users manual that covers installation procedures, system requirements, operations, bug reporting, and software components (Appendix).
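The PSD probability density functions at the heart of this approach can be pictured as a histogram of many segment PSDs accumulated per frequency bin. The Python sketch below builds a simplified version from already response-corrected segments using Welch PSDs; PQLX's octave-band smoothing, segment overlap, and database storage are not reproduced, and the bin choices are assumptions.

    import numpy as np
    from scipy.signal import welch

    def psd_pdf(segments, fs, nperseg=4096, power_bins=None):
        """PSD probability density function per frequency bin (McNamara-style).

        segments: iterable of equal-length 1-D arrays (e.g. hourly traces of one
        channel, already corrected for instrument response -- a step omitted here).
        Returns (frequencies, bin_edges, pdf) with pdf[i, j] the probability of
        observing power bin i (dB) at frequency j.
        """
        if power_bins is None:
            power_bins = np.arange(-200.0, -50.0, 1.0)
        hist = None
        for seg in segments:
            f, pxx = welch(np.asarray(seg, dtype=float), fs=fs, nperseg=nperseg)
            p_db = 10.0 * np.log10(pxx[1:] + 1e-30)      # skip the DC bin
            idx = np.digitize(p_db, power_bins)
            h = np.zeros((len(power_bins) + 1, len(p_db)))
            h[idx, np.arange(len(p_db))] = 1.0
            hist = h if hist is None else hist + h
        return f[1:], power_bins, hist / hist.sum(axis=0, keepdims=True)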
Automatic identification of artifacts in electrodermal activity data.
Taylor, Sara; Jaques, Natasha; Chen, Weixuan; Fedor, Szymon; Sano, Akane; Picard, Rosalind
2015-01-01
Recently, wearable devices have allowed for long term, ambulatory measurement of electrodermal activity (EDA). Despite the fact that ambulatory recording can be noisy, and recording artifacts can easily be mistaken for a physiological response during analysis, to date there is no automatic method for detecting artifacts. This paper describes the development of a machine learning algorithm for automatically detecting EDA artifacts, and provides an empirical evaluation of classification performance. We have encoded our results into a freely available web-based tool for artifact and peak detection.
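A bare-bones version of such a detector is a supervised classifier over short-window features of the EDA signal. The Python sketch below (scikit-learn) is only a schematic: the window length, features, and labels are assumptions, not the feature set or trained model of the published tool.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def window_features(eda, fs=4.0, win_s=5.0):
        """Amplitude and first-difference statistics for consecutive 5 s windows."""
        w = int(fs * win_s)
        feats = []
        for i in range(0, len(eda) - w + 1, w):
            seg = np.asarray(eda[i:i + w], dtype=float)
            d = np.diff(seg)
            feats.append([seg.mean(), seg.std(), seg.max() - seg.min(),
                          d.mean(), d.std(), np.abs(d).max()])
        return np.array(feats)

    def train_artifact_classifier(windows, labels):
        """Fit a classifier flagging each window as artifact (1) or clean (0)."""
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        clf.fit(windows, labels)
        return clf  # clf.predict(window_features(new_signal)) labels new windows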
Stone, David B.; Tamburro, Gabriella; Fiedler, Patrique; Haueisen, Jens; Comani, Silvia
2018-01-01
Data contamination due to physiological artifacts such as those generated by eyeblinks, eye movements, and muscle activity continues to be a central concern in the acquisition and analysis of electroencephalographic (EEG) data. This issue is further compounded in EEG sports science applications where the presence of artifacts is notoriously difficult to control because behaviors that generate these interferences are often the behaviors under investigation. Therefore, there is a need to develop effective and efficient methods to identify physiological artifacts in EEG recordings during sports applications so that they can be isolated from cerebral activity related to the activities of interest. We have developed an EEG artifact detection model, the Fingerprint Method, which identifies different spatial, temporal, spectral, and statistical features indicative of physiological artifacts and uses these features to automatically classify artifactual independent components in EEG based on a machine learning approach. Here, we optimized our method using artifact-rich training data and a procedure to determine which features were best suited to identify eyeblinks, eye movements, and muscle artifacts. We then applied our model to an experimental dataset collected during endurance cycling. Results reveal that unique sets of features are suitable for the detection of distinct types of artifacts and that the Optimized Fingerprint Method was able to correctly identify over 90% of the artifactual components with physiological origin present in the experimental data. These results represent a significant advancement in the search for effective means to address artifact contamination in EEG sports science applications. PMID:29618975
A posteriori operation detection in evolving software models
Langer, Philip; Wimmer, Manuel; Brosch, Petra; Herrmannsdörfer, Markus; Seidl, Martina; Wieland, Konrad; Kappel, Gerti
2013-01-01
Like every software artifact, software models are subject to continuous evolution. The operations applied between two successive versions of a model are crucial for understanding its evolution. Generic approaches for detecting operations a posteriori identify atomic operations, but neglect composite operations, such as refactorings, which leads to cluttered difference reports. To tackle this limitation, we present an orthogonal extension of existing atomic operation detection approaches that also detects composite operations. Our approach searches for occurrences of composite operations within a set of detected atomic operations in a post-processing manner. One major benefit is that the specifications available for executing composite operations can be reused for detecting applications of them. We evaluate the accuracy of the approach in a real-world case study and investigate the scalability of our implementation in an experiment.
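To make the post-processing idea concrete, here is a small hypothetical sketch: atomic operations from a model diff are represented as dictionaries, and a composite-operation specification (an invented "rename-and-move" refactoring) is expressed as a predicate that is searched for over combinations of atomic operations. The data layout and the example specification are illustrative, not the authors' tooling.

```python
from itertools import combinations

# Atomic operations as produced by a generic a-posteriori diff (illustrative layout).
atomic_ops = [
    {"kind": "setFeature", "element": "ClassA.name", "old": "Foo", "new": "Bar"},
    {"kind": "move", "element": "ClassA", "from": "pkg1", "to": "pkg2"},
    {"kind": "add", "element": "ClassB"},
]

def is_rename_and_move(op1, op2):
    """Hypothetical composite-operation specification: the same element is renamed and moved."""
    rename = next((o for o in (op1, op2)
                   if o["kind"] == "setFeature" and o["element"].endswith(".name")), None)
    move = next((o for o in (op1, op2) if o["kind"] == "move"), None)
    if rename is None or move is None:
        return False
    return rename["element"].split(".")[0] == move["element"]

def detect_composites(ops, spec, arity=2):
    """Search every combination of atomic operations for occurrences of a composite spec."""
    return [combo for combo in combinations(ops, arity) if spec(*combo)]

print(detect_composites(atomic_ops, is_rename_and_move))
```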
Keep Your Eye on the Ball: Investigating Artifacts-in-Use in Physical Education
ERIC Educational Resources Information Center
Quennerstedt, Mikael; Almqvist, Jonas; Ohman, Marie
2011-01-01
The purpose of this article is to develop a method of approach that can be used to explore the meaning and use of artifacts in education by applying a socio-cultural perspective to learning and artifacts. An empirical material of video recorded physical education lessons in Sweden is used to illustrate the approach in terms of how artifacts in…
NASA Astrophysics Data System (ADS)
Baytak, Ahmet
Among educational researchers and practitioners, there is a growing interest in employing computer games for pedagogical purposes. The present research integrated a technology education class and a science class where 5 th graders learned about environmental issues by designing games that involved environmental concepts. The purposes of this study were to investigate how designing computer games affected the development of students' environmental knowledge, programming knowledge, environmental awareness and interest in computers. It also explored the nature of the artifacts developed and the types of knowledge represented therein. A case study (Yin, 2003) was employed within the context of a 5 th grade elementary science classroom. Fifth graders designed computer games about environmental issues to present to 2nd graders by using Scratch software. The analysis of this study was based on multiple data sources: students' pre- and post-test scores on environmental awareness, their environmental knowledge, their interest in computer science, and their game design. Included in the analyses were also data from students' computer games, participant observations, and structured interviews. The results of the study showed that students were able to successfully design functional games that represented their understanding of environment, even though the gain between pre- and post-environmental knowledge test and environmental awareness survey were minimal. The findings indicate that all students were able to use various game characteristics and programming concepts, but their prior experience with the design software affected their representations. The analyses of the interview transcriptions and games show that students improved their programming skills and that they wanted to do similar projects for other subject areas in the future. Observations showed that game design appeared to lead to knowledge-building, interaction and collaboration among students. This, in turn, encouraged students to test and improve their designs. Sharing the games, it was found, has both positive and negative effects on the students' game design process and the representation of students' understandings of the domain subject.
Chen, Yang; Budde, Adam; Li, Ke; Li, Yinsheng; Hsieh, Jiang; Chen, Guang-Hong
2017-01-01
When the scan field of view (SFOV) of a CT system is not large enough to enclose the entire cross-section of the patient, or the patient needs to be positioned partially outside the SFOV for certain clinical applications, truncation artifacts often appear in the reconstructed CT images. Many truncation artifact correction methods perform extrapolations of the truncated projection data based on certain a priori assumptions. The purpose of this work was to develop a novel CT truncation artifact reduction method that directly operates on DICOM images. The blooming of pixel values associated with truncation was modeled using exponential decay functions, and based on this model, a discriminative dictionary was constructed to represent truncation artifacts and nonartifact image information in a mutually exclusive way. The discriminative dictionary consists of a truncation artifact subdictionary and a nonartifact subdictionary. The truncation artifact subdictionary contains 1000 atoms with different decay parameters, while the nonartifact subdictionary contains 1000 independent realizations of Gaussian white noise that are exclusive with the artifact features. By sparsely representing an artifact-contaminated CT image with this discriminative dictionary, the image was separated into a truncation artifact-dominated image and a complementary image with reduced truncation artifacts. The artifact-dominated image was then subtracted from the original image with an appropriate weighting coefficient to generate the final image with reduced artifacts. This proposed method was validated via physical phantom studies and retrospective human subject studies. Quantitative image evaluation metrics including the relative root-mean-square error (rRMSE) and the universal image quality index (UQI) were used to quantify the performance of the algorithm. For both phantom and human subject studies, truncation artifacts at the peripheral region of the SFOV were effectively reduced, revealing soft tissue and bony structure once buried in the truncation artifacts. For the phantom study, the proposed method reduced the relative RMSE from 15% (original images) to 11%, and improved the UQI from 0.34 to 0.80. A discriminative dictionary representation method was developed to mitigate CT truncation artifacts directly in the DICOM image domain. Both phantom and human subject studies demonstrated that the proposed method can effectively reduce truncation artifacts without access to projection data. © 2016 American Association of Physicists in Medicine.
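A minimal sketch of the discriminative-dictionary separation is given below, applied to a single image row near the truncated edge rather than a full DICOM image; the decay-parameter range, the atom counts, the sparsity level, and the weighting coefficient are assumptions rather than the published parameters.

```python
import numpy as np
from sklearn.linear_model import orthogonal_mp

rng = np.random.default_rng(0)
n = 128                                   # pixels in a row segment near the SFOV edge
x_axis = np.arange(n)

# Discriminative dictionary: exponential-decay atoms model truncation blooming,
# white-noise atoms absorb everything that is not artifact-like.
decays = np.linspace(0.02, 0.5, 200)      # assumed range of decay parameters
D_art = np.stack([np.exp(-d * x_axis) for d in decays], axis=1)
D_non = rng.standard_normal((n, 200))
D = np.hstack([D_art, D_non])
D /= np.linalg.norm(D, axis=0)            # unit-norm atoms

# Synthetic row: smooth anatomy plus truncation blooming at the edge.
anatomy = 1000.0 * np.exp(-((x_axis - 80.0) / 30.0) ** 2)
artifact = 800.0 * np.exp(-0.08 * x_axis)
row = anatomy + artifact

# Sparse code, then keep only the contribution of the artifact sub-dictionary.
code = orthogonal_mp(D, row, n_nonzero_coefs=10)
artifact_est = D[:, :D_art.shape[1]] @ code[:D_art.shape[1]]
corrected = row - 0.9 * artifact_est      # weighting coefficient (assumed value)
```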
Rodríguez-Olivares, Ramón; El Faquir, Nahid; Rahhab, Zouhair; Maugenest, Anne-Marie; Van Mieghem, Nicolas M; Schultz, Carl; Lauritsch, Guenter; de Jaegere, Peter P T
2016-07-01
To study the determinants of image quality of rotational angiography using dedicated research prototype software for motion compensation without rapid ventricular pacing after the implantation of four commercially available catheter-based valves. Prospective observational study including 179 consecutive patients who underwent transcatheter aortic valve implantation (TAVI) with either the Medtronic CoreValve (MCS), Edward-SAPIEN Valve (ESV), Boston Sadra Lotus (BSL) or Saint-Jude Portico Valve (SJP) in whom rotational angiography (R-angio) with motion compensation 3D image reconstruction was performed. Image quality was evaluated from grade 1 (excellent image quality) to grade 5 (strongly degraded). Distinction was made between good (grades 1, 2) and poor image quality (grades 3-5). Clinical (gender, body mass index, Agatston score, heart rate and rhythm, artifacts), procedural (valve type) and technical variables (isocentricity) were related with the image quality assessment. Image quality was good in 128 (72 %) and poor in 51 (28 %) patients. By univariable analysis only valve type (BSL) and the presence of an artefact negatively affected image quality. By multivariate analysis (in which BMI was forced into the model) BSL valve (Odds 3.5, 95 % CI [1.3-9.6], p = 0.02), presence of an artifact (Odds 2.5, 95 % CI [1.2-5.4], p = 0.02) and BMI (Odds 1.1, 95 % CI [1.0-1.2], p = 0.04) were independent predictors of poor image quality. Rotational angiography with motion compensation 3D image reconstruction using a dedicated research prototype software offers good image quality for the evaluation of frame geometry after TAVI in the majority of patients. Valve type, presence of artifacts and higher BMI negatively affect image quality.
Correction of Bowtie-Filter Normalization and Crescent Artifacts for a Clinical CBCT System.
Zhang, Hong; Kong, Vic; Huang, Ke; Jin, Jian-Yue
2017-02-01
To present our experiences in understanding and minimizing bowtie-filter crescent artifacts and bowtie-filter normalization artifacts in a clinical cone beam computed tomography system. Bowtie-filter position and profile variations during gantry rotation were studied. Two previously proposed strategies (A and B) were applied to the clinical cone beam computed tomography system to correct bowtie-filter crescent artifacts. Physical calibration and analytical approaches were used to minimize the norm phantom misalignment and to correct for bowtie-filter normalization artifacts. A combined procedure to reduce bowtie-filter crescent artifacts and bowtie-filter normalization artifacts was proposed and tested on a norm phantom, CatPhan, and a patient and evaluated using standard deviation of Hounsfield unit along a sampling line. The bowtie-filter exhibited not only a translational shift but also an amplitude variation in its projection profile during gantry rotation. Strategy B was better than strategy A slightly in minimizing bowtie-filter crescent artifacts, possibly because it corrected the amplitude variation, suggesting that the amplitude variation plays a role in bowtie-filter crescent artifacts. The physical calibration largely reduced the misalignment-induced bowtie-filter normalization artifacts, and the analytical approach further reduced bowtie-filter normalization artifacts. The combined procedure minimized both bowtie-filter crescent artifacts and bowtie-filter normalization artifacts, with Hounsfield unit standard deviation being 63.2, 45.0, 35.0, and 18.8 Hounsfield unit for the best correction approaches of none, bowtie-filter crescent artifacts, bowtie-filter normalization artifacts, and bowtie-filter normalization artifacts + bowtie-filter crescent artifacts, respectively. The combined procedure also demonstrated reduction of bowtie-filter crescent artifacts and bowtie-filter normalization artifacts in a CatPhan and a patient. We have developed a step-by-step procedure that can be directly used in clinical cone beam computed tomography systems to minimize both bowtie-filter crescent artifacts and bowtie-filter normalization artifacts.
Lawhern, Vernon; Hairston, W David; McDowell, Kaleb; Westerfield, Marissa; Robbins, Kay
2012-07-15
We examine the problem of accurate detection and classification of artifacts in continuous EEG recordings. Manual identification of artifacts, by means of an expert or panel of experts, can be tedious, time-consuming and infeasible for large datasets. We use autoregressive (AR) models for feature extraction and characterization of EEG signals containing several kinds of subject-generated artifacts. AR model parameters are scale-invariant features that can be used to develop models of artifacts across a population. We use a support vector machine (SVM) classifier to discriminate among artifact conditions using the AR model parameters as features. Results indicate reliable classification among several different artifact conditions across subjects (approximately 94%). These results suggest that AR modeling can be a useful tool for discriminating among artifact signals both within and across individuals. Copyright © 2012 Elsevier B.V. All rights reserved.
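A compact sketch of this pipeline is shown below, using statsmodels' Yule-Walker estimator for the AR coefficients and scikit-learn's SVC for classification; the model order, kernel, and placeholder data are assumptions.

```python
import numpy as np
from statsmodels.regression.linear_model import yule_walker
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def ar_features(epochs, order=6):
    """Fit an AR model to each EEG epoch and use the coefficients as scale-invariant features."""
    feats = []
    for x in epochs:
        rho, _sigma = yule_walker(x, order=order, method="mle")
        feats.append(rho)
    return np.asarray(feats)

# Placeholder data: 200 one-second epochs (256 Hz), each labeled with an artifact class.
rng = np.random.default_rng(1)
epochs = rng.standard_normal((200, 256))
labels = rng.integers(0, 4, 200)          # e.g., eye blink / jaw clench / movement / clean

X = ar_features(epochs)
clf = SVC(kernel="rbf", C=10.0)           # kernel and C are assumptions
print(cross_val_score(clf, X, labels, cv=5).mean())
```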
Landsat TM memory effect characterization and correction
Helder, D.; Boncyk, W.; Morfitt, R.
1997-01-01
Before radiometric calibration of Landsat Thematic Mapper (TM) data can be done accurately, it is necessary to minimize the effects of artifacts present in the data that originate in the instrument's signal processing path. These artifacts have been observed in downlinked image data since shortly after launch of Landsat 4 and 5. However, no comprehensive work has been done to characterize all the artifacts and develop methods for their correction. In this paper, the most problematic artifact is discussed: memory effect (ME). Characterization of this artifact is presented, including the parameters necessary for its correction. In addition, a correction algorithm is described that removes the artifact from TM imagery. It will be shown that this artifact causes significant radiometry errors, but the effect can be removed in a straightforward manner.
Operational Based Vision Assessment
2014-02-01
…expensive than other developers’ software. The sources for the GPUs (Nvidia) and the host computer (Concurrent’s iHawk) were identified. The…boundaries, which is a distracting artifact when performing visual tests. The problem has been isolated by the OBVA team to the Nvidia GPUs. The OBVA system…
Liquid argon TPC signal formation, signal processing and reconstruction techniques
NASA Astrophysics Data System (ADS)
Baller, B.
2017-07-01
This document describes a reconstruction chain that was developed for the ArgoNeuT and MicroBooNE experiments at Fermilab. These experiments study accelerator neutrino interactions that occur in a Liquid Argon Time Projection Chamber. Reconstructing the properties of particles produced in these interactions benefits from the knowledge of the micro-physics processes that affect the creation and transport of ionization electrons to the readout system. A wire signal deconvolution technique was developed to convert wire signals to a standard form for hit reconstruction, to remove artifacts in the electronics chain and to remove coherent noise. A unique clustering algorithm reconstructs line-like trajectories and vertices in two dimensions, which are then matched to create 3D objects. These techniques and algorithms are available to all experiments that use the LArSoft suite of software.
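The following sketch illustrates the general idea of wire-signal deconvolution (it is not the LArSoft implementation): a single wire waveform is deconvolved by an assumed field/electronics response in the frequency domain, with a Gaussian low-pass filter standing in for the usual noise regularization; the response shape and cutoff are assumptions.

```python
import numpy as np

def deconvolve_wire(waveform, response, f_cut=0.1, eps=1e-3):
    """Frequency-domain deconvolution of a single wire waveform by the combined
    field/electronics response, regularized by a Gaussian low-pass filter."""
    n = len(waveform)
    W = np.fft.rfft(waveform)
    R = np.fft.rfft(response, n)
    freqs = np.fft.rfftfreq(n)                    # cycles per sample (0 to 0.5)
    lowpass = np.exp(-0.5 * (freqs / f_cut) ** 2)
    deconv = W * np.conj(R) / (np.abs(R) ** 2 + eps) * lowpass
    return np.fft.irfft(deconv, n)

# Toy example: a unipolar charge deposit smeared by an exponential electronics response.
n = 2048
t = np.arange(n)
response = np.exp(-t / 20.0) * (t < 200)          # assumed response shape
signal = np.zeros(n)
signal[500:510] = 1.0
waveform = np.convolve(signal, response)[:n] + 0.01 * np.random.randn(n)
hits = deconvolve_wire(waveform, response)
```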
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, J; Kerns, J; Nute, J
Purpose: To evaluate three commercial metal artifact reduction methods (MAR) in the context of radiation therapy treatment planning. Methods: Three MAR strategies were evaluated: Philips O-MAR, monochromatic imaging using Gemstone Spectral Imaging (GSI) dual energy CT, and monochromatic imaging with metal artifact reduction software (GSI-MARs). The Gammex RMI 467 tissue characterization phantom with several metal rods and two anthropomorphic phantoms (pelvic phantom with hip prosthesis and head phantom with dental fillings) were scanned with and without (baseline) metals. Each MAR method was evaluated based on CT number accuracy, metal size accuracy, and reduction in the severity of streak artifacts. CT number difference maps between the baseline and metal scan images were calculated, and the severity of streak artifacts was quantified using the percentage of pixels with >40 HU error (“bad pixels”). Results: Philips O-MAR generally reduced HU errors in the RMI phantom. However, increased errors and induced artifacts were observed for lung materials. GSI monochromatic 70keV images generally showed similar HU errors as 120kVp imaging, while 140keV images reduced errors. GSI-MARs systematically reduced errors compared to GSI monochromatic imaging. All imaging techniques preserved the diameter of a stainless steel rod to within ±1.6mm (2 pixels). For the hip prosthesis, O-MAR reduced the average % bad pixels from 47% to 32%. For GSI 140keV imaging, the percent of bad pixels was reduced from 37% to 29% compared to 120kVp imaging, while GSI-MARs further reduced it to 12%. For the head phantom, none of the MAR methods were particularly successful. Conclusion: The three MAR methods all improve CT images for treatment planning to some degree, but none of them are globally effective for all conditions. The MAR methods were successful for large metal implants in a homogeneous environment (hip prosthesis) but were not successful for the more complicated case of dental artifacts.
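The "bad pixel" streak-severity metric is easy to reproduce; the sketch below assumes the baseline and metal-scan images are already co-registered and uses the 40 HU threshold quoted in the abstract.

```python
import numpy as np

def percent_bad_pixels(metal_img, baseline_img, threshold_hu=40.0, body_mask=None):
    """Percentage of pixels whose CT-number error versus the metal-free baseline
    exceeds the threshold (the 'bad pixel' streak-severity metric)."""
    diff = metal_img.astype(float) - baseline_img.astype(float)
    if body_mask is None:
        body_mask = np.ones_like(diff, dtype=bool)
    bad = np.abs(diff[body_mask]) > threshold_hu
    return 100.0 * bad.mean()

# Example with synthetic 256x256 HU maps (purely illustrative numbers).
rng = np.random.default_rng(2)
baseline = rng.normal(0.0, 10.0, (256, 256))
metal = baseline + rng.normal(0.0, 60.0, (256, 256)) * (rng.random((256, 256)) < 0.3)
print(f"bad pixels: {percent_bad_pixels(metal, baseline):.1f}%")
```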
Innovation Balanced with Community Collaboration, ESIP
NASA Astrophysics Data System (ADS)
White, C. E.
2016-12-01
Representing the Federation of Earth Science Information Partners (ESIP), I'll speak to how the organization supports a diverse community of science, data and information technology practitioners to foster innovation balanced with community collaboration on the why and how. ESIP builds connections among organizations, sectors, disciplines, systems and data so participants can leverage their collective expertise and technical capacity to address common challenges. This work improves Earth science data management practices and makes Earth science data more discoverable, accessible and useful to researchers, policy makers and the public. Greater than ever is the desire for guidelines in software/code development, evaluation of technology and its artifacts, and community validation of products and practices. ESIP's mechanisms for evaluation and assessment range from informal to formal, with opportunities for all.
ERIC Educational Resources Information Center
Spektor-Precel, Karen; Mioduser, David
2015-01-01
Nowadays, we are surrounded by artifacts that are capable of adaptive behavior, such as electric pots, boiler timers, automatic doors, and robots. The literature concerning human beings' conceptions of "traditional" artifacts is vast, however, little is known about our conceptions of behaving artifacts, nor of the influence of the…
Mangold, Stefanie; Gatidis, Sergios; Luz, Oliver; König, Benjamin; Schabel, Christoph; Bongers, Malte N; Flohr, Thomas G; Claussen, Claus D; Thomas, Christoph
2014-12-01
The objective of this study was to retrospectively determine the potential of virtual monoenergetic (ME) reconstructions for a reduction of metal artifacts using a new-generation single-source computed tomographic (CT) scanner. The ethics committee of our institution approved this retrospective study with a waiver of the need for informed consent. A total of 50 consecutive patients (29 men and 21 women; mean [SD] age, 51.3 [16.7] years) with metal implants after osteosynthetic fracture treatment who had been examined using a single-source CT scanner (SOMATOM Definition Edge; Siemens Healthcare, Forchheim, Germany; consecutive dual-energy mode with 140 kV/80 kV) were selected. Using commercially available postprocessing software (syngo Dual Energy; Siemens AG), virtual ME data sets with extrapolated energy of 130 keV were generated (medium smooth convolution kernel D30) and compared with standard polyenergetic images reconstructed with a B30 (medium smooth) and a B70 (sharp) kernel. For quantification of the beam hardening artifacts, CT values were measured on circular lines surrounding bone and the osteosynthetic device, and frequency analyses of these values were performed using discrete Fourier transform. A high proportion of low frequencies to the spectrum indicates a high level of metal artifacts. The measurements in all data sets were compared using the Wilcoxon signed rank test. The virtual ME images with extrapolated energy of 130 keV showed significantly lower contribution of low frequencies after the Fourier transform compared with any polyenergetic data set reconstructed with D30, B70, and B30 kernels (P < 0.001). Sequential single-source dual-energy CT allows an efficient reduction of metal artifacts using high-energy ME extrapolation after osteosynthetic fracture treatment.
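The circular-sampling frequency analysis can be sketched as follows: CT values are sampled on a circle surrounding the implant, the discrete Fourier transform is taken, and the fraction of spectral power in the lowest harmonics serves as the artifact measure. The interpolation, the number of samples, and the definition of the low-frequency band are assumptions.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def low_frequency_fraction(image, center, radius, n_samples=720, low_k=12):
    """Sample CT values on a circle around the implant, take the discrete Fourier
    transform, and return the fraction of non-DC spectral power in the lowest
    harmonics, a proxy for beam-hardening/metal artifact severity."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    rows = center[0] + radius * np.sin(theta)
    cols = center[1] + radius * np.cos(theta)
    profile = map_coordinates(image.astype(float), [rows, cols], order=1)
    spectrum = np.abs(np.fft.rfft(profile - profile.mean())) ** 2
    return spectrum[1:low_k].sum() / spectrum[1:].sum()

# Compare a polyenergetic and a 130 keV monoenergetic reconstruction (placeholders).
img_poly = np.random.randn(512, 512) * 30.0
img_mono = np.random.randn(512, 512) * 15.0
for name, img in [("polyenergetic", img_poly), ("monoenergetic 130 keV", img_mono)]:
    print(name, low_frequency_fraction(img, center=(256, 256), radius=60))
```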
Naturalistic Experience and the Early Use of Symbolic Artifacts
ERIC Educational Resources Information Center
Troseth, Georgene L.; Casey, Amy M.; Lawver, Kelly A.; Walker, Joan M. T.; Cole, David A.
2007-01-01
Experience with a variety of symbolic artifacts has been proposed as a mechanism underlying symbolic development. In this study, the parents of 120 2-year-old children who participated in symbolic object retrieval tasks completed a questionnaire regarding their children's naturalistic experience with symbolic artifacts and activities. In separate…
Young Children's Rapid Learning about Artifacts
ERIC Educational Resources Information Center
Casler, Krista; Kelemen, Deborah
2005-01-01
Tool use is central to interdisciplinary debates about the evolution and distinctiveness of human intelligence, yet little is actually known about how human conceptions of artifacts develop. Results across these two studies show that even 2-year-olds approach artifacts in ways distinct from captive tool-using monkeys. Contrary to adult intuition,…
BioSig: The Free and Open Source Software Library for Biomedical Signal Processing
Vidaurre, Carmen; Sander, Tilmann H.; Schlögl, Alois
2011-01-01
BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals.
Fuller, Clifton D; Diaz, Irma; Cavanaugh, Sean X; Eng, Tony Y
2004-07-01
A patient with base of tongue squamous cell carcinoma, with significant CT artifact-inducing metallic alloy, non-removable dental restorations in both the mandible and maxilla was identified. Simultaneous with IMRT treatment, thermoluminescent dosimeters (TLDs) were placed in the oral cavity. After a series of three treatments, the data from the TLDs and software calculations were analyzed. Analysis of mean in vivo TLD dosimetry reveals differentials from software-predicted dose calculation that fall within acceptable dose variation limits. IMRT dose calculation software is a relatively accurate predictor of dose attenuation and augmentation due to dental alloys within the treatment volume, as measured by intra-oral thermoluminescent dosimetry. IMRT represents a safe and effective methodology to treat patients with non-removable metallic dental work who have head and neck cancer.
A Novel Stimulus Artifact Removal Technique for High-Rate Electrical Stimulation
Heffer, Leon F; Fallon, James B
2008-01-01
Electrical stimulus artifacts corrupting electrophysiological recordings often make the subsequent analysis of the underlying neural response difficult. This is particularly evident when investigating short-latency neural activity in response to high-rate electrical stimulation. We developed and evaluated an off-line technique for the removal of stimulus artifact from electrophysiological recordings. Pulsatile electrical stimulation was presented at rates of up to 5000 pulses/s during extracellular recordings of guinea pig auditory nerve fibers. Stimulus artifact was removed by replacing the sample points at each stimulus artifact event with values interpolated along a straight line, computed from neighbouring sample points. This technique required only that artifact events be identifiable and that the artifact duration remained less than both the inter-stimulus interval and the time course of the action potential. We have demonstrated that this computationally efficient sample-and-interpolate technique removes the stimulus artifact with minimal distortion of the action potential waveform. We suggest that this technique may have potential applications in a range of electrophysiological recording systems.
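A minimal sketch of the sample-and-interpolate technique is given below; it assumes the stimulus times are known from the stimulator and that a fixed artifact window length applies to every pulse.

```python
import numpy as np

def remove_stimulus_artifact(recording, stim_samples, artifact_len):
    """Sample-and-interpolate artifact removal: samples inside each artifact window
    are replaced by values on a straight line between the neighbouring clean samples."""
    cleaned = recording.astype(float).copy()
    for s in stim_samples:
        start, stop = s, min(s + artifact_len, len(cleaned) - 1)
        x0, x1 = start - 1, stop                      # clean neighbours
        cleaned[start:stop] = np.interp(np.arange(start, stop),
                                        [x0, x1], [cleaned[x0], cleaned[x1]])
    return cleaned

# Example: 5000 pulses/s stimulation sampled at 100 kHz gives one pulse every 20 samples;
# assume the artifact spans 6 samples (shorter than the inter-stimulus interval).
fs = 100_000
rec = np.random.randn(fs // 10) * 5.0                 # 100 ms of recording (placeholder)
stim_samples = np.arange(100, len(rec) - 50, 20)
clean = remove_stimulus_artifact(rec, stim_samples, artifact_len=6)
```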
A generic EEG artifact removal algorithm based on the multi-channel Wiener filter
NASA Astrophysics Data System (ADS)
Somers, Ben; Francart, Tom; Bertrand, Alexander
2018-06-01
Objective. The electroencephalogram (EEG) is an essential neuro-monitoring tool for both clinical and research purposes, but is susceptible to a wide variety of undesired artifacts. Removal of these artifacts is often done using blind source separation techniques, relying on a purely data-driven transformation, which may sometimes fail to sufficiently isolate artifacts in only one or a few components. Furthermore, some algorithms perform well for specific artifacts, but not for others. In this paper, we aim to develop a generic EEG artifact removal algorithm, which allows the user to annotate a few artifact segments in the EEG recordings to inform the algorithm. Approach. We propose an algorithm based on the multi-channel Wiener filter (MWF), in which the artifact covariance matrix is replaced by a low-rank approximation based on the generalized eigenvalue decomposition. The algorithm is validated using both hybrid and real EEG data, and is compared to other algorithms frequently used for artifact removal. Main results. The MWF-based algorithm successfully removes a wide variety of artifacts with better performance than current state-of-the-art methods. Significance. Current EEG artifact removal techniques often have limited applicability due to their specificity to one kind of artifact, their complexity, or simply because they are too ‘blind’. This paper demonstrates a fast, robust and generic algorithm for removal of EEG artifacts of various types, i.e. those that were annotated as unwanted by the user.
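The sketch below follows the description in the abstract at a high level: covariance matrices are estimated from annotated artifact segments and clean segments, a generalized eigenvalue decomposition provides a low-rank artifact covariance, and the resulting multi-channel Wiener filter estimate of the artifact is subtracted. The rank, the segment bookkeeping, and the placeholder data are simplifying assumptions.

```python
import numpy as np
from scipy.linalg import eigh

def mwf_artifact_removal(eeg, artifact_mask, rank=3):
    """Remove annotated artifacts with a multi-channel Wiener filter whose artifact
    covariance is a low-rank approximation from a generalized eigenvalue decomposition.

    eeg: (channels, samples); artifact_mask: boolean (samples,) marking annotated segments."""
    Ryy = np.cov(eeg[:, artifact_mask])        # covariance during artifact segments
    Rvv = np.cov(eeg[:, ~artifact_mask])       # covariance of clean EEG
    w, V = eigh(Ryy, Rvv)                      # generalized EVD: Ryy V = Rvv V diag(w)
    order = np.argsort(w)[::-1]
    w, V = w[order], V[:, order]
    delta = np.clip(w - 1.0, 0.0, None)        # eigenvalues of the artifact covariance
    delta[rank:] = 0.0                         # low-rank approximation
    # MWF: W = Ryy^{-1} Rdd = V diag(delta / w) V^{-1}
    W = V @ np.diag(delta / w) @ np.linalg.inv(V)
    artifact_est = W.T @ eeg
    return eeg - artifact_est

# Placeholder 32-channel recording with the first 2000 samples annotated as artifact.
rng = np.random.default_rng(3)
eeg = rng.standard_normal((32, 20000))
mask = np.zeros(20000, dtype=bool)
mask[:2000] = True
cleaned = mwf_artifact_removal(eeg, mask)
```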
Optical measurement of blood flow in exercising skeletal muscle: a pilot study
NASA Astrophysics Data System (ADS)
Wang, Detian; Baker, Wesley B.; Parthasarathy, Ashwin B.; Zhu, Liguo; Li, Zeren; Yodh, Arjun G.
2017-07-01
Blood flow monitoring during rhythmic exercise is very important for sports medicine and muscle diseases. Diffuse correlation spectroscopy (DCS) is a relatively new noninvasive way to monitor blood flow, but it suffers from muscle fiber motion. In this study we focus on how to remove exercise-driven artifacts and obtain accurate estimates of the increase in blood flow from exercise. Using a novel fast software correlator, we measured blood flow in the forearm flexor muscles of N=2 healthy adults during handgrip exercise, at a sampling rate of 20 Hz. Combining the blood flow and acceleration data, we resolved the motion artifact in the DCS signal induced by muscle fiber motion and isolated the blood flow component of the signal from the motion artifact. The results show that muscle fiber motion strongly affects the DCS signal and, if not accounted for, will result in an overestimate of blood flow of more than 1000%. Our measurements indicate rapid dilation of arterioles following exercise onset, which enabled blood flow to increase to a plateau of 200% within 10 s. The blood flow also rapidly recovered to baseline within 10 s after exercise. Finally, preliminary results on the dependence of blood flow on changes in exercise intensity will be discussed.
Clinical introduction of image lag correction for a cone beam CT system.
Stankovic, Uros; Ploeger, Lennert S; Sonke, Jan-Jakob; van Herk, Marcel
2016-03-01
Image lag in the flat-panel detector used for Linac integrated cone beam computed tomography (CBCT) has a degrading effect on CBCT image quality. The most prominent visible artifact is the presence of bright semicircular structure in the transverse view of the scans, known also as radar artifact. Several correction strategies have been proposed, but until now the clinical introduction of such corrections remains unreported. In November 2013, the authors have clinically implemented a previously proposed image lag correction on all of their machines at their main site in Amsterdam. The purpose of this study was to retrospectively evaluate the effect of the correction on the quality of CBCT images and evaluate the required calibration frequency. Image lag was measured in five clinical CBCT systems (Elekta Synergy 4.6) using an in-house developed beam interrupting device that stops the x-ray beam midway through the data acquisition of an unattenuated beam for calibration. A triple exponential falling edge response was fitted to the measured data and used to correct image lag from projection images with an infinite response. This filter, including an extrapolation for saturated pixels, was incorporated in the authors' in-house developed clinical cbct reconstruction software. To investigate the short-term stability of the lag and associated parameters, a series of five image lag measurement over a period of three months was performed. For quantitative analysis, the authors have retrospectively selected ten patients treated in the pelvic region. The apparent contrast was quantified in polar coordinates for scans reconstructed using the parameters obtained from different dates with and without saturation handling. Visually, the radar artifact was minimal in scans reconstructed using image lag correction especially when saturation handling was used. In patient imaging, there was a significant reduction of the apparent contrast from 43 ± 16.7 to 15.5 ± 11.9 HU without the saturation handling and to 9.6 ± 12.1 HU with the saturation handling, depending on the date of the calibration. The image lag correction parameters were stable over a period of 3 months. The computational load was increased by approximately 10%, not endangering the fast in-line reconstruction. The lag correction was successfully implemented clinically and removed most image lag artifacts thus improving the image quality. Image lag correction parameters were stable for 3 months indicating low frequency of calibration requirements.
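As an illustration of the correction principle (not the clinical implementation), the sketch below fits a triple exponential to the measured falling-edge response and then deconvolves a pixel's projection time series with a lag kernel built from that fit; the kernel construction and all numerical values are assumptions, and saturation handling is omitted.

```python
import numpy as np
from scipy.optimize import curve_fit

def triple_exp(n, a1, t1, a2, t2, a3, t3):
    """Triple exponential falling-edge lag response as a function of frame index n."""
    return a1 * np.exp(-n / t1) + a2 * np.exp(-n / t2) + a3 * np.exp(-n / t3)

# Fit the residual detector signal measured after the beam interrupter stops the beam.
frames = np.arange(200, dtype=float)
true_resp = triple_exp(frames, 0.02, 3.0, 0.005, 20.0, 0.001, 120.0)
measured = true_resp + 5e-4 * np.random.randn(frames.size)      # placeholder measurement
params, _ = curve_fit(triple_exp, frames, measured,
                      p0=(0.02, 2.0, 0.005, 30.0, 0.001, 100.0), maxfev=20000)

def correct_lag(pixel_series, params, n_kernel=512, eps=1e-6):
    """Deconvolve one pixel's projection time series with a lag kernel built from the fit
    (an assumed construction: prompt fraction plus exponentially released residual signal)."""
    tail = triple_exp(np.arange(n_kernel, dtype=float), *params)
    kernel = np.empty(n_kernel)
    kernel[0] = 1.0 - tail.sum()          # prompt fraction of the signal
    kernel[1:] = tail[:-1]                # signal carried over into later frames
    n = len(pixel_series)
    Y = np.fft.rfft(pixel_series, n)
    K = np.fft.rfft(kernel, n)
    return np.fft.irfft(Y / (K + eps), n)

corrected_series = correct_lag(np.random.rand(600), params)     # placeholder projections
```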
An evolution of image source camera attribution approaches.
Jahanirad, Mehdi; Wahab, Ainuddin Wahid Abdul; Anuar, Nor Badrul
2016-05-01
Camera attribution plays an important role in digital image forensics by providing the evidence and distinguishing characteristics of the origin of the digital image. It allows the forensic analyser to find the possible source camera which captured the image under investigation. However, in real-world applications, these approaches have faced many challenges due to the large set of multimedia data publicly available through photo sharing and social network sites, captured with uncontrolled conditions and undergone variety of hardware and software post-processing operations. Moreover, the legal system only accepts the forensic analysis of the digital image evidence if the applied camera attribution techniques are unbiased, reliable, nondestructive and widely accepted by the experts in the field. The aim of this paper is to investigate the evolutionary trend of image source camera attribution approaches from fundamental to practice, in particular, with the application of image processing and data mining techniques. Extracting implicit knowledge from images using intrinsic image artifacts for source camera attribution requires a structured image mining process. In this paper, we attempt to provide an introductory tutorial on the image processing pipeline, to determine the general classification of the features corresponding to different components for source camera attribution. The article also reviews techniques of the source camera attribution more comprehensively in the domain of the image forensics in conjunction with the presentation of classifying ongoing developments within the specified area. The classification of the existing source camera attribution approaches is presented based on the specific parameters, such as colour image processing pipeline, hardware- and software-related artifacts and the methods to extract such artifacts. The more recent source camera attribution approaches, which have not yet gained sufficient attention among image forensics researchers, are also critically analysed and further categorised into four different classes, namely, optical aberrations based, sensor camera fingerprints based, processing statistics based and processing regularities based, to present a classification. Furthermore, this paper aims to investigate the challenging problems, and the proposed strategies of such schemes based on the suggested taxonomy to plot an evolution of the source camera attribution approaches with respect to the subjective optimisation criteria over the last decade. The optimisation criteria were determined based on the strategies proposed to increase the detection accuracy, robustness and computational efficiency of source camera brand, model or device attribution. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Holcomb, C L; Rastrou, M; Williams, T C; Goodridge, D; Lazaro, A M; Tilanus, M; Erlich, H A
2014-01-01
The high-resolution human leukocyte antigen (HLA) genotyping assay that we developed using 454 sequencing and Conexio software uses generic polymerase chain reaction (PCR) primers for DRB exon 2. Occasionally, we observed low abundance DRB amplicon sequences that resulted from in vitro PCR 'crossing over' between DRB1 and DRB3/4/5. These hybrid sequences, revealed by the clonal sequencing property of the 454 system, were generally observed at a read depth of 5%-10% of the true alleles. They usually contained at least one mismatch with the IMGT/HLA database, and consequently, were easily recognizable and did not cause a problem for HLA genotyping. Sometimes, however, these artifactual sequences matched a rare allele and the automatic genotype assignment was incorrect. These observations raised two issues: (1) could PCR conditions be modified to reduce such artifacts? and (2) could some of the rare alleles listed in the IMGT/HLA database be artifacts rather than true alleles? Because PCR crossing over occurs during late cycles of PCR, we compared DRB genotypes resulting from 28 and (our standard) 35 cycles of PCR. For all 21 cell line DNAs amplified for 35 cycles, crossover products were detected. In 33% of the cases, these hybrid sequences corresponded to named alleles. With amplification for only 28 cycles, these artifactual sequences were not detectable. To investigate whether some rare alleles in the IMGT/HLA database might be due to PCR artifacts, we analyzed four samples obtained from the investigators who submitted the sequences. In three cases, the sequences were generated from true alleles. In one case, our 454 sequencing revealed an error in the previously submitted sequence. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Children's Classifications of Nature and Artifact Pictures into Female and Male Categories.
ERIC Educational Resources Information Center
Mullen, Mary K.
1990-01-01
Twenty-two second grade students in Cambridge (Massachusetts) were asked to rate pictures of nature items and artifacts on a seven-point scale ranging from "maleness" to "femaleness." Responses overall showed a female-nature and male-artifact association. Discusses the role of such associations in the development of gender…
NASA Astrophysics Data System (ADS)
Kim, Juhye; Nam, Haewon; Lee, Rena
2015-07-01
In CT (computed tomography) images, metal materials such as tooth supplements or surgical clips can cause metal artifacts and degrade image quality. In severe cases, this may lead to misdiagnosis. In this research, we developed a new MAR (metal artifact reduction) algorithm by using an edge-preserving filter and the MATLAB program (Mathworks, version R2012a). The proposed algorithm consists of 6 steps: image reconstruction from projection data, metal segmentation, forward projection, interpolation, application of an edge-preserving smoothing filter, and new image reconstruction. For an evaluation of the proposed algorithm, we obtained both numerical simulation data and data for a Rando phantom. In the numerical simulation data, four metal regions were added into the Shepp-Logan phantom to generate metal artifacts. The projection data of the metal-inserted Rando phantom were obtained by using a prototype CBCT scanner manufactured by the medical engineering and medical physics (MEMP) laboratory research group in medical science at Ewha Womans University. After these data had been obtained, the proposed algorithm was applied, and the results were compared with the original image (with metal artifacts, without correction) and with a corrected image based on linear interpolation. Both visual and quantitative evaluations were done. Compared with the original image with metal artifacts and with the image corrected by using linear interpolation, both the numerical and the experimental phantom data demonstrated that the proposed algorithm reduced the metal artifacts. In conclusion, the evaluation in this research showed that the proposed algorithm outperformed the interpolation-based MAR algorithm. If an optimization and a stability evaluation of the proposed algorithm can be performed, the developed algorithm is expected to be an effective tool for eliminating metal artifacts even in commercial CT systems.
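A simplified version of the six-step pipeline can be sketched with scikit-image, using parallel-beam radon/iradon in place of the cone-beam geometry and a bilateral filter as the edge-preserving smoother; the thresholds and parameters are assumptions.

```python
import numpy as np
from skimage.transform import radon, iradon
from skimage.restoration import denoise_bilateral

def reduce_metal_artifacts(sinogram, theta, metal_threshold=2000.0):
    """Six-step MAR sketch: reconstruct, segment metal, forward-project the metal mask,
    interpolate across the metal trace, edge-preserving smoothing, reconstruct again."""
    recon = iradon(sinogram, theta=theta)                 # 1) initial reconstruction
    metal_mask = recon > metal_threshold                  # 2) metal segmentation

    trace = radon(metal_mask.astype(float), theta=theta) > 1e-3   # 3) corrupted sinogram bins

    corrected = sinogram.copy()                           # 4) linear interpolation, view by view
    det = np.arange(sinogram.shape[0])
    for j in range(sinogram.shape[1]):
        bad = trace[:, j]
        if bad.any() and (~bad).any():
            corrected[bad, j] = np.interp(det[bad], det[~bad], sinogram[~bad, j])

    corrected = denoise_bilateral(corrected, sigma_spatial=2.0)   # 5) edge-preserving smoothing

    out = iradon(corrected, theta=theta)                  # 6) final reconstruction
    out[metal_mask] = recon[metal_mask]                   # re-insert the segmented metal
    return out

# Toy example: a soft-tissue block with one small metal insert.
phantom = np.zeros((128, 128))
phantom[40:90, 40:90] = 40.0
phantom[60:66, 60:66] = 4000.0
theta = np.linspace(0.0, 180.0, 180, endpoint=False)
corrected_img = reduce_metal_artifacts(radon(phantom, theta=theta), theta)
```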
Adapting Rational Unified Process (RUP) approach in designing a secure e-Tendering model
NASA Astrophysics Data System (ADS)
Mohd, Haslina; Robie, Muhammad Afdhal Muhammad; Baharom, Fauziah; Darus, Norida Muhd; Saip, Mohamed Ali; Yasin, Azman
2016-08-01
e-Tendering is the electronic processing of tender documents via the internet; it allows tenderers to publish, communicate, access, receive and submit all tender-related information and documentation online. This study aims to design an e-Tendering system using the Rational Unified Process (RUP) approach. RUP provides a disciplined approach to assigning tasks and responsibilities within the software development process. RUP has four phases that can assist researchers in adjusting the requirements of projects of differing scope, problem and size. RUP is characterized as a use-case-driven, architecture-centered, iterative and incremental process model. However, the scope of this study covers only the Inception and Elaboration phases as steps to develop the model, and performs only three of the nine workflows (business modeling, requirements, and analysis and design). RUP has a strong focus on documents, and the activities in the Inception and Elaboration phases mainly concern the creation of diagrams and the writing of textual descriptions. The UML notation and the software program StarUML are used to support the design of e-Tendering. The e-Tendering design based on the RUP approach can benefit e-Tendering developers and researchers in the e-Tendering domain. In addition, this study shows that RUP is one of the better system development methodologies and can serve as a research methodology in the Software Engineering domain for the secure design of an observed application. The methodology has been tested in various studies in domains such as simulation-based decision support, security requirement engineering, business modeling and secure system requirements. In conclusion, these studies showed that RUP is a suitable research methodology that can be adapted in any Software Engineering (SE) research domain that requires a few artifacts to be generated, such as use case models, misuse case models, activity diagrams, and an initial class diagram derived from a list of requirements identified earlier by the SE researchers.
A BPMN solution for chaining OGC services to quality assure location-based crowdsourced data
NASA Astrophysics Data System (ADS)
Meek, Sam; Jackson, Mike; Leibovici, Didier G.
2016-02-01
The Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard enables access to a centralized repository of processes and services from compliant clients. A crucial part of the standard includes the provision to chain disparate processes and services to form a reusable workflow. To date, this has been realized by methods such as embedding XML requests, using Business Process Execution Language (BPEL) engines and other external orchestration engines. Although these allow the user to define tasks and data artifacts as web services, they are often considered inflexible and complicated, often due to vendor-specific solutions and inaccessible documentation. This paper introduces a new method of flexible service chaining using the standard Business Process Model and Notation (BPMN). A prototype system has been developed on top of an existing open source BPMN suite to illustrate the advantages of the approach. The motivation for the software design is qualification of crowdsourced data for use in policy-making. The software is tested as part of a project that seeks to qualify, assure, and add value to crowdsourced data in a biological monitoring use case.
MorphoGraphX: A platform for quantifying morphogenesis in 4D
Barbier de Reuille, Pierre; Routier-Kierzkowska, Anne-Lise; Kierzkowski, Daniel; Bassel, George W; Schüpbach, Thierry; Tauriello, Gerardo; Bajpai, Namrata; Strauss, Sören; Weber, Alain; Kiss, Annamaria; Burian, Agata; Hofhuis, Hugo; Sapala, Aleksandra; Lipowczan, Marcin; Heimlicher, Maria B; Robinson, Sarah; Bayer, Emmanuelle M; Basler, Konrad; Koumoutsakos, Petros; Roeder, Adrienne HK; Aegerter-Wilmsen, Tinri; Nakayama, Naomi; Tsiantis, Miltos; Hay, Angela; Kwiatkowska, Dorota; Xenarios, Ioannis; Kuhlemeier, Cris; Smith, Richard S
2015-01-01
Morphogenesis emerges from complex multiscale interactions between genetic and mechanical processes. To understand these processes, the evolution of cell shape, proliferation and gene expression must be quantified. This quantification is usually performed either in full 3D, which is computationally expensive and technically challenging, or on 2D planar projections, which introduces geometrical artifacts on highly curved organs. Here we present MorphoGraphX (www.MorphoGraphX.org), a software that bridges this gap by working directly with curved surface images extracted from 3D data. In addition to traditional 3D image analysis, we have developed algorithms to operate on curved surfaces, such as cell segmentation, lineage tracking and fluorescence signal quantification. The software's modular design makes it easy to include existing libraries, or to implement new algorithms. Cell geometries extracted with MorphoGraphX can be exported and used as templates for simulation models, providing a powerful platform to investigate the interactions between shape, genes and growth.
Quality assurance in mammography: artifact analysis.
Hogge, J P; Palmer, C H; Muller, C C; Little, S T; Smith, D C; Fatouros, P P; de Paredes, E S
1999-01-01
Evaluation of mammograms for artifacts is essential for mammographic quality assurance. A variety of mammographic artifacts (i.e., variations in mammographic density not caused by true attenuation differences) can occur and can create pseudolesions or mask true abnormalities. Many artifacts are readily identified, whereas others present a true diagnostic challenge. Factors that create artifacts may be related to the processor (eg, static, dirt or excessive developer buildup on the rollers, excessive roller pressure, damp film, scrapes and scratches, incomplete fixing, power failure, contaminated developer), the technologist (eg, improper film handling and loading, improper use of the mammography unit and related equipment, positioning and darkroom errors), the mammography unit (eg, failure of the collimation mirror to rotate, grid inhomogeneity, failure of the reciprocating grid to move, material in the tube housing, compression failure, improper alignment of the compression paddle with the Bucky tray, defective compression paddle), or the patient (e.g., motion, superimposed objects or substances [jewelry, body parts, clothing, hair, implanted medical devices, foreign bodies, substances on the skin]). Familiarity with the broad range of artifacts and the measures required to eliminate them is vital. Careful attention to darkroom cleanliness, care in film handling, regularly scheduled processor maintenance and chemical replenishment, daily quality assurance activities, and careful attention to detail during patient positioning and mammography can reduce or eliminate most mammographic artifacts.
Stankovic, Uros; van Herk, Marcel; Ploeger, Lennert S; Sonke, Jan-Jakob
2014-06-01
Medical linear accelerator mounted cone beam CT (CBCT) scanner provides useful soft tissue contrast for purposes of image guidance in radiotherapy. The presence of extensive scattered radiation has a negative effect on soft tissue visibility and uniformity of CBCT scans. Antiscatter grids (ASG) are used in the field of diagnostic radiography to mitigate the scatter. They usually do increase the contrast of the scan, but simultaneously increase the noise. Therefore, and considering other scatter mitigation mechanisms present in a CBCT scanner, the applicability of ASGs with aluminum interspacing for a wide range of imaging conditions has been inconclusive in previous studies. In recent years, grids using fiber interspacers have appeared, providing grids with higher scatter rejection while maintaining reasonable transmission of primary radiation. The purpose of this study was to evaluate the impact of one such grid on CBCT image quality. The grid used (Philips Medical Systems) had ratio of 21:1, frequency 36 lp/cm, and nominal selectivity of 11.9. It was mounted on the kV flat panel detector of an Elekta Synergy linear accelerator and tested in a phantom and a clinical study. Due to the flex of the linac and presence of gridline artifacts an angle dependent gain correction algorithm was devised to mitigate resulting artifacts. Scan reconstruction was performed using XVI4.5 augmented with inhouse developed image lag correction and Hounsfield unit calibration. To determine the necessary parameters for Hounsfield unit calibration and software scatter correction parameters, the Catphan 600 (The Phantom Laboratory) phantom was used. Image quality parameters were evaluated using CIRS CBCT Image Quality and Electron Density Phantom (CIRS) in two different geometries: one modeling head and neck and other pelvic region. Phantoms were acquired with and without the grid and reconstructed with and without software correction which was adapted for the different acquisition scenarios. Parameters used in the phantom study were t(cup) for nonuniformity and contrast-to-noise ratio (CNR) for soft tissue visibility. Clinical scans were evaluated in an observer study in which four experienced radiotherapy technologists rated soft tissue visibility and uniformity of scans with and without the grid. The proposed angle dependent gain correction algorithm suppressed the visible ring artifacts. Grid had a beneficial impact on nonuniformity, contrast to noise ratio, and Hounsfield unit accuracy for both scanning geometries. The nonuniformity reduced by 90% for head sized object and 91% for pelvic-sized object. CNR improved compared to no corrections on average by a factor 2.8 for the head sized object, and 2.2 for the pelvic sized phantom. Grid outperformed software correction alone, but adding additional software correction to the grid was overall the best strategy. In the observer study, a significant improvement was found in both soft tissue visibility and nonuniformity of scans when grid is used. The evaluated fiber-interspaced grid improved the image quality of the CBCT system for broad range of imaging conditions. Clinical scans show significant improvement in soft tissue visibility and uniformity without the need to increase the imaging dose.
An image analysis system for near-infrared (NIR) fluorescence lymph imaging
NASA Astrophysics Data System (ADS)
Zhang, Jingdan; Zhou, Shaohua Kevin; Xiang, Xiaoyan; Rasmussen, John C.; Sevick-Muraca, Eva M.
2011-03-01
Quantitative analysis of lymphatic function is crucial for understanding the lymphatic system and diagnosing the associated diseases. Recently, a near-infrared (NIR) fluorescence imaging system was developed for real-time imaging of lymphatic propulsion by intradermal injection of a microdose of a NIR fluorophore distal to the lymphatics of interest. However, the previous analysis software [3, 4] is underdeveloped, requiring extensive time and effort to analyze a NIR image sequence. In this paper, we develop a number of image processing techniques to automate the data analysis workflow, including an object tracking algorithm to stabilize the subject and remove the motion artifacts, an image representation named flow map to characterize lymphatic flow more reliably, and an automatic algorithm to compute lymph velocity and frequency of propulsion. By integrating all these techniques into a system, the analysis workflow significantly reduces the amount of required user interaction and improves the reliability of the measurement.
Voting strategy for artifact reduction in digital breast tomosynthesis.
Wu, Tao; Moore, Richard H; Kopans, Daniel B
2006-07-01
Artifacts are observed in digital breast tomosynthesis (DBT) reconstructions due to the small number of projections and the narrow angular range that are typically employed in tomosynthesis imaging. In this work, we investigate the reconstruction artifacts that are caused by high-attenuation features in breast and develop several artifact reduction methods based on a "voting strategy." The voting strategy identifies the projection(s) that would introduce artifacts to a voxel and rejects the projection(s) when reconstructing the voxel. Four approaches to the voting strategy were compared, including projection segmentation, maximum contribution deduction, one-step classification, and iterative classification. The projection segmentation method, based on segmentation of high-attenuation features from the projections, effectively reduces artifacts caused by metal and large calcifications that can be reliably detected and segmented from projections. The other three methods are based on the observation that contributions from artifact-inducing projections have higher value than those from normal projections. These methods attempt to identify the projection(s) that would cause artifacts by comparing contributions from different projections. Among the three methods, the iterative classification method provides the best artifact reduction; however, it can generate many false positive classifications that degrade the image quality. The maximum contribution deduction method and one-step classification method both reduce artifacts well from small calcifications, although the performance of artifact reduction is slightly better with the one-step classification. The combination of one-step classification and projection segmentation removes artifacts from both large and small calcifications.
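The "maximum contribution deduction" variant is the simplest to illustrate: given the per-projection backprojected contributions to each voxel, the single largest contribution (the one most likely to carry a high-attenuation artifact) is deducted before averaging. The sketch below assumes those contributions have already been computed for a shift-and-add style reconstruction.

```python
import numpy as np

def backproject_with_max_deduction(contributions):
    """Tomosynthesis voting sketch. 'contributions' holds, for each projection, its
    backprojected value at every voxel (shape: n_projections x nx x ny x nz).
    For each voxel, the largest contribution is deducted before averaging."""
    total = contributions.sum(axis=0)
    largest = contributions.max(axis=0)
    n = contributions.shape[0]
    return (total - largest) / (n - 1)

# Toy example: 15 projections contributing to a 64x64x32 volume, with one projection
# corrupted by a simulated high-attenuation (metal/calcification) shadow.
rng = np.random.default_rng(4)
contrib = rng.normal(1.0, 0.05, (15, 64, 64, 32))
contrib[7, 30:34, 30:34, :] += 5.0            # artifact-inducing projection
volume = backproject_with_max_deduction(contrib)
```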
Holdsworth, Samantha J; Aksoy, Murat; Newbould, Rexford D; Yeom, Kristen; Van, Anh T; Ooi, Melvyn B; Barnes, Patrick D; Bammer, Roland; Skare, Stefan
2012-10-01
To develop and implement a clinical DTI technique suitable for the pediatric setting that retrospectively corrects for large motion without the need for rescanning and/or reacquisition strategies, and to deliver high-quality DTI images (both in the presence and absence of large motion) using procedures that reduce image noise and artifacts. We implemented an in-house built generalized autocalibrating partially parallel acquisitions (GRAPPA)-accelerated diffusion tensor (DT) echo-planar imaging (EPI) sequence at 1.5T and 3T on 1600 patients between 1 month and 18 years old. To reconstruct the data, we developed fully automated, tailored reconstruction software that selects the best GRAPPA and ghost calibration weights, performs 3D rigid-body realignment with importance weighting, and employs phase correction and complex averaging to lower Rician noise and reduce phase artifacts. For select cases we investigated the use of an additional volume rejection criterion and b-matrix correction for large motion. The DTI image reconstruction procedures developed here were extremely robust in correcting for motion, failing on only three subjects, while providing the radiologists with high-quality data for routine evaluation. This work suggests that, apart from the rare instance of continuous motion throughout the scan, high-quality DTI brain data can be acquired using our proposed integrated sequence and reconstruction that uses a retrospective approach to motion correction. In addition, we demonstrate a substantial improvement in overall image quality by combining phase correction with complex averaging, which reduces the Rician noise that biases noisy data. Copyright © 2012 Wiley Periodicals, Inc.
Reference-Free Removal of EEG-fMRI Ballistocardiogram Artifacts with Harmonic Regression
Krishnaswamy, Pavitra; Bonmassar, Giorgio; Poulsen, Catherine; Pierce, Eric T; Purdon, Patrick L.; Brown, Emery N.
2016-01-01
Combining electroencephalogram (EEG) recording and functional magnetic resonance imaging (fMRI) offers the potential for imaging brain activity with high spatial and temporal resolution. This potential remains limited by the significant ballistocardiogram (BCG) artifacts induced in the EEG by cardiac pulsation-related head movement within the magnetic field. We model the BCG artifact using a harmonic basis, pose the artifact removal problem as a local harmonic regression analysis, and develop an efficient maximum likelihood algorithm to estimate and remove BCG artifacts. Our analysis paradigm accounts for time-frequency overlap between the BCG artifacts and neurophysiologic EEG signals, and tracks the spatiotemporal variations in both the artifact and the signal. We evaluate performance on: simulated oscillatory and evoked responses constructed with realistic artifacts; actual anesthesia-induced oscillatory recordings; and actual visual evoked potential recordings. In each case, the local harmonic regression analysis effectively removes the BCG artifacts, and recovers the neurophysiologic EEG signals. We further show that our algorithm outperforms commonly used reference-based and component analysis techniques, particularly in low SNR conditions, in the presence of significant time-frequency overlap between the artifact and the signal, and/or with large spatiotemporal variations in the BCG. Because our algorithm does not require reference signals and has low computational complexity, it offers a practical tool for removing BCG artifacts from EEG data recorded in combination with fMRI. PMID:26151100
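A simplified least-squares version of the local harmonic regression idea can be sketched in a few lines of Python: within each short window, sines and cosines at harmonics of the cardiac frequency are fit to the EEG and subtracted. The window length, harmonic count, and known cardiac frequency are illustrative assumptions; the published method uses a maximum likelihood formulation and tracks spatiotemporal variation.

```python
import numpy as np

def remove_bcg_harmonic(eeg, fs, cardiac_hz, n_harmonics=5, win_sec=4.0):
    """Subtract a local harmonic regression fit of the BCG artifact.

    eeg: 1-D EEG channel, fs: sampling rate (Hz),
    cardiac_hz: fundamental cardiac frequency assumed for each window.
    Simplified least-squares stand-in, not the maximum-likelihood algorithm.
    """
    cleaned = eeg.copy()
    win = int(win_sec * fs)
    for start in range(0, len(eeg) - win + 1, win):
        seg = eeg[start:start + win]
        t = np.arange(win) / fs
        # Design matrix: intercept plus sin/cos pairs at cardiac harmonics.
        cols = [np.ones_like(t)]
        for h in range(1, n_harmonics + 1):
            cols += [np.sin(2 * np.pi * h * cardiac_hz * t),
                     np.cos(2 * np.pi * h * cardiac_hz * t)]
        X = np.column_stack(cols)
        beta, *_ = np.linalg.lstsq(X, seg, rcond=None)
        cleaned[start:start + win] = seg - X @ beta + beta[0]  # keep the DC level
    return cleaned

# Synthetic demo: 10 Hz alpha plus a 1.2 Hz BCG-like artifact with harmonics.
fs = 250.0
t = np.arange(0, 20, 1 / fs)
alpha = 5e-6 * np.sin(2 * np.pi * 10 * t)
bcg = sum(20e-6 / h * np.sin(2 * np.pi * h * 1.2 * t) for h in (1, 2, 3))
clean = remove_bcg_harmonic(alpha + bcg, fs, cardiac_hz=1.2)
print("residual artifact RMS (uV):", round(float(1e6 * np.std(clean - alpha)), 3))
```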
Chu, Mei-Lan; Chang, Hing-Chiu; Chung, Hsiao-Wen; Truong, Trong-Kha; Bashir, Mustafa R.; Chen, Nan-kuei
2014-01-01
Purpose: A projection onto convex sets reconstruction of multiplexed sensitivity encoded MRI (POCSMUSE) is developed to reduce motion-related artifacts, including respiration artifacts in abdominal imaging and aliasing artifacts in interleaved diffusion weighted imaging (DWI). Theory: Images with reduced artifacts are reconstructed with an iterative POCS procedure that uses the coil sensitivity profile as a constraint. This method can be applied to data obtained with different pulse sequences and k-space trajectories. In addition, various constraints can be incorporated to stabilize the reconstruction of ill-conditioned matrices. Methods: The POCSMUSE technique was applied to abdominal fast spin-echo imaging data, and its effectiveness in respiratory-triggered scans was evaluated. The POCSMUSE method was also applied to reduce aliasing artifacts due to shot-to-shot phase variations in interleaved DWI data corresponding to different k-space trajectories and matrix condition numbers. Results: Experimental results show that the POCSMUSE technique can effectively reduce motion-related artifacts in data obtained with different pulse sequences, k-space trajectories and contrasts. Conclusion: POCSMUSE is a general post-processing algorithm for reduction of motion-related artifacts. It is compatible with different pulse sequences, and can also be used to further reduce residual artifacts in data produced by existing motion artifact reduction methods. PMID:25394325
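The iterative projection-onto-convex-sets idea can be illustrated with a small numpy sketch that alternates between (a) consistency with the acquired multi-coil k-space samples and (b) the constraint that all coil images arise from one image modulated by known sensitivity maps. This is a generic Cartesian SENSE-style POCS loop written for illustration; it is not the published POCSMUSE reconstruction, which additionally handles shot-to-shot phase variations and arbitrary trajectories.

```python
import numpy as np

def fft2c(x):
    """Centered 2-D FFT over the last two axes."""
    return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(x, axes=(-2, -1))), axes=(-2, -1))

def ifft2c(x):
    """Centered 2-D inverse FFT over the last two axes."""
    return np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(x, axes=(-2, -1))), axes=(-2, -1))

def pocs_recon(kspace, mask, sens, n_iter=100):
    """POCS reconstruction of undersampled multi-coil Cartesian k-space.
    kspace: (n_coils, ny, nx) acquired data (zeros where unsampled),
    mask: (ny, nx) boolean sampling mask, sens: (n_coils, ny, nx) sensitivities."""
    img = np.zeros(kspace.shape[1:], dtype=complex)
    for _ in range(n_iter):
        coil_k = fft2c(sens * img)               # project onto the coil-consistency set
        coil_k = np.where(mask, kspace, coil_k)  # project onto the data-consistency set
        coil_imgs = ifft2c(coil_k)
        img = (np.conj(sens) * coil_imgs).sum(axis=0) / ((np.abs(sens) ** 2).sum(axis=0) + 1e-9)
    return img

# Tiny demo: two coils, a square phantom, 2x undersampling along ky plus a dense center.
ny = nx = 64
phantom = np.zeros((ny, nx)); phantom[20:44, 20:44] = 1.0
y_prof = np.linspace(-1, 1, ny)[:, None] * np.ones((1, nx))
sens = np.stack([1.0 + 0.5 * y_prof, 1.0 - 0.5 * y_prof]).astype(complex)
mask = np.zeros((ny, nx), dtype=bool)
mask[::2, :] = True
mask[ny // 2 - 4:ny // 2 + 4, :] = True
kspace = fft2c(sens * phantom) * mask
recon = pocs_recon(kspace, mask, sens)
print("RMS error vs. phantom:", round(float(np.sqrt(np.mean(np.abs(recon - phantom) ** 2))), 4))
```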
Adjustable shunt valve-induced magnetic resonance imaging artifact: a comparative study.
Toma, Ahmed K; Tarnaris, Andrew; Grieve, Joan P; Watkins, Laurence D; Kitchen, Neil D
2010-07-01
In this paper, the authors' goal was to compare the artifact induced by implanted (in vivo) adjustable shunt valves in spin echo, diffusion weighted (DW), and gradient echo MR imaging pulse sequences. The MR images obtained in 8 patients with proGAV and 6 patients with Strata II adjustable shunt valves were assessed for artifact areas in different planes as well as the total artifact volume for the different pulse sequences. Artifacts induced by the Strata II valve were significantly larger than those induced by the proGAV valve on the spin echo MR imaging pulse sequence (29,761 vs 2450 mm³ on T2-weighted fast spin echo, p = 0.003) and on DW images (100,138 vs 38,955 mm³, p = 0.025). Artifacts were more marked on DW MR images than on the spin echo pulse sequence for both valve types. Adjustable valve-induced artifacts can conceal brain pathology on MR images. This should influence the choice of valve implantation site and the type of valve used. The effect of artifacts on DW images should be highlighted pending the development of adjustable shunt valves that induce fewer MR imaging artifacts.
Using Classroom Artifacts to Measure the Efficacy of a Professional Development. CRESST Report 761
ERIC Educational Resources Information Center
Silk, Yael; Silver, David; Amerian, Stephanie; Nishimura, Claire; Boscardin, Christy Kim
2009-01-01
This report describes a classroom artifact measure and presents early findings from an efficacy study of WestEd's Reading Apprenticeship (RA) professional development program. The professional development is designed to teach high school teachers how to integrate subject-specific literacy instruction into their regular curricula. The current RA…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cao, N; Young, L; Parvathaneni, U
Purpose: The presence of high density dental amalgam in patient CT image data sets causes dose calculation errors for head and neck (HN) treatment planning. This study assesses and compares dosimetric variations in IMRT and VMAT treatment plans due to dental artifacts. Methods: Sixteen HN patients with similar treatment sites (oropharynx), tumor volume, and extensive dental artifacts were divided into two groups: IMRT (n=8, 6 to 9 beams) and VMAT (n=8, 2 arcs with 352° rotation). All cases were planned with the Pinnacle 9.2 treatment planning software using the collapsed cone convolution superposition algorithm and a range of prescription doses from 60 to 72 Gy. Two different treatment plans were produced, each based on one of two image sets: (a) uncorrected; (b) dental artifact density overridden (set to 1.0 g/cm³). Differences between the two treatment plans for each of the IMRT and VMAT techniques were quantified by the following dosimetric parameters: maximum point dose, maximum spinal cord and brainstem dose, mean left and right parotid dose, and PTV coverage (V95%Rx). Average differences generated for these dosimetric parameters were compared between IMRT and VMAT plans. Results: The average absolute dose differences (plan a minus plan b) for the VMAT and IMRT techniques, respectively, caused by dental artifacts were: 2.2±3.3 cGy vs. 37.6±57.5 cGy (maximum point dose, P=0.15); 1.2±0.9 cGy vs. 7.9±6.7 cGy (maximum spinal cord dose, P=0.026); 2.2±2.4 cGy vs. 12.1±13.0 cGy (maximum brainstem dose, P=0.077); 0.9±1.1 cGy vs. 4.1±3.5 cGy (mean left parotid dose, P=0.038); 0.9±0.8 cGy vs. 7.8±11.9 cGy (mean right parotid dose, P=0.136); 0.021%±0.014% vs. 0.803%±1.44% (PTV coverage, P=0.17). Conclusion: For the HN plans studied, dental artifacts produced a greater dose calculation error for IMRT plans than for VMAT plans. Rotational arcs appear on average to compensate for dose calculation errors induced by dental artifacts. Thus, compared to VMAT, density overrides for dental artifacts are more important when planning IMRT of the HN.
Psycho-physiological effects of visual artifacts by stereoscopic display systems
NASA Astrophysics Data System (ADS)
Kim, Sanghyun; Yoshitake, Junki; Morikawa, Hiroyuki; Kawai, Takashi; Yamada, Osamu; Iguchi, Akihiko
2011-03-01
The methods available for delivering stereoscopic (3D) display using glasses can be classified as time-multiplexing and spatial-multiplexing. With both methods, intrinsic visual artifacts result from the generation of the 3D image pair on a flat panel display device. In the case of the time-multiplexing method, an observer perceives three artifacts: flicker, the Mach-Dvorak effect, and a phantom array. Flicker appears under all conditions, whereas the Mach-Dvorak effect occurs only during smooth pursuit eye movements (SPM) and a phantom array only during saccadic eye movements (saccades). With spatial-multiplexing, the artifacts are temporal parallax (due to the interlaced video signal), binocular rivalry, and reduced spatial resolution. These artifacts are considered one of the major impediments to the safety and comfort of 3D display users. In this study, the implications of the artifacts for safety and comfort are evaluated by examining the psychological changes they cause, through subjective symptoms of fatigue and the depth sensation. Physiological changes are also measured as objective responses, based on analysis of heart and brain activation induced by the visual artifacts. Further, to understand the characteristics of each artifact and the combined effects of the artifacts, four experimental conditions are developed and tested. The results show that the perception of artifacts differs according to the visual environment and the display method, and that visual fatigue and the depth sensation are influenced by the individual characteristics of each artifact. Similarly, heart rate variability and regional cerebral oxygenation change with the perception of artifacts under the different conditions.
Educational software usability: Artifact or Design?
Van Nuland, Sonya E; Eagleson, Roy; Rogers, Kem A
2017-03-01
Online educational technologies and e-learning tools are providing new opportunities for students to learn worldwide, and they continue to play an important role in anatomical sciences education. Yet, as we shift to teaching online, particularly within the anatomical sciences, it has become apparent that e-learning tool success is based on more than just user satisfaction and preliminary learning outcomes; rather, it is a multidimensional construct that should be addressed from an integrated perspective. The efficiency, effectiveness, and satisfaction with which a user can navigate an e-learning tool is known as usability, and represents a construct which we propose can be used to quantitatively evaluate e-learning tool success. To assess the usability of an e-learning tool, usability testing should be employed during the design and development phases (i.e., prior to its release to users) as well as during its delivery (i.e., following its release to users). However, both the commercial educational software industry and individual academic developers in the anatomical sciences have overlooked the added value of additional usability testing. Reducing learner frustration and anxiety during e-learning tool use is essential in ensuring e-learning tool success, and will require a commitment on the part of the developers to engage in usability testing during all stages of an e-learning tool's life cycle. Anat Sci Educ 10: 190-199. © 2016 American Association of Anatomists.
Open Source Patient-Controlled Analgesic Pump Requirements Documentation
Larson, Brian R.; Hatcliff, John; Chalin, Patrice
2014-01-01
The dynamic nature of the medical domain is driving a need for continuous innovation and improvement in techniques for developing and assuring medical devices. Unfortunately, research in academia and communication between academics, industrial engineers, and regulatory authorities are hampered by the lack of realistic non-proprietary development artifacts for medical devices. In this paper, we give an overview of a detailed requirements document for a Patient-Controlled Analgesic (PCA) pump developed under the US NSF's Food and Drug Administration (FDA) Scholar-in-Residence (SIR) program. This 60+ page document follows the methodology outlined in the US Federal Aviation Administration's (FAA) Requirements Engineering Management Handbook (REMH) and includes a domain overview, use cases, statements of safety & security requirements, and a formal top-level system architectural description. Based on previous experience with the release of a requirements document for a cardiac pacemaker that spawned a number of research and pedagogical activities, we believe that the described PCA requirements document can be an important research enabler within the formal methods and software engineering communities. PMID:24931440
Designing a Web-Based Learning Portal for Geographic Visualization and Analysis in Public Health
Robinson, Anthony C.; Roth, Robert E.; MacEachren, Alan M.
2011-01-01
Interactive mapping and spatial analysis tools are underutilized by health researchers and decision-makers due to scarce training materials, few examples demonstrating the successful use of geographic visualization, and poor mechanisms for sharing results generated by geovisualization. We report here on the development of the Geovisual EXplication (G-EX) Portal, a web-based application designed to connect researchers in geovisualization and related mapping sciences to users who are working in public health and epidemiology. This paper focuses on the design and development of the G-EX Portal Learn module, a set of tools intended to disseminate learning artifacts. Initial design and development of the G-EX Portal has been guided by our past research on the use and usability of geovisualization in public health. As part of the iterative design and development process, we conducted a needs assessment survey with targeted end-users that we report on here. The survey focused on users' current learning habits, their preferred kinds of learning artifacts, and issues they may have with contributing learning artifacts to web portals. Survey results showed that users desire a diverse set of learning artifacts in terms of both the formats and the topics covered. Results also revealed a willingness of users to contribute both learning artifacts and personal information that would help other users to evaluate the credibility of the learning artifact source. We include a detailed description of the G-EX Portal Learn module and focus on modifications to the design of the Learn module made as a result of feedback we received from our survey. PMID:21937462
NASA Astrophysics Data System (ADS)
Deprez, Hanne; Gransier, Robin; Hofmann, Michael; van Wieringen, Astrid; Wouters, Jan; Moonen, Marc
2018-02-01
Objective. Electrically evoked auditory steady-state responses (EASSRs) are potentially useful for objective cochlear implant (CI) fitting and follow-up of the auditory maturation in infants and children with a CI. EASSRs are recorded in the electro-encephalogram (EEG) in response to electrical stimulation with continuous pulse trains, and are distorted by significant CI artifacts related to this electrical stimulation. The aim of this study is to evaluate a CI artifacts attenuation method based on independent component analysis (ICA) for three EASSR datasets. Approach. ICA has often been used to remove CI artifacts from the EEG to record transient auditory responses, such as cortical evoked auditory potentials. Independent components (ICs) corresponding to CI artifacts are then often manually identified. In this study, an ICA based CI artifacts attenuation method was developed and evaluated for EASSR measurements with varying CI artifacts and EASSR characteristics. Artifactual ICs were automatically identified based on their spectrum. Main results. For 40 Hz amplitude modulation (AM) stimulation at comfort level, in high SNR recordings, ICA succeeded in removing CI artifacts from all recording channels, without distorting the EASSR. For lower SNR recordings, with 40 Hz AM stimulation at lower levels, or 90 Hz AM stimulation, ICA either distorted the EASSR or could not remove all CI artifacts in most subjects, except for two of the seven subjects tested with low level 40 Hz AM stimulation. Noise levels were reduced after ICA was applied, and up to 29 ICs were rejected, suggesting poor ICA separation quality. Significance. We hypothesize that ICA is capable of separating CI artifacts and EASSR in case the contralateral hemisphere is EASSR dominated. For small EASSRs or large CI artifact amplitudes, ICA separation quality is insufficient to ensure complete CI artifacts attenuation without EASSR distortion.
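A toy version of the ICA-plus-spectral-criterion strategy described above, decompose the multichannel EEG, flag components whose spectra are dominated by a narrow band around the stimulation-related frequency, and reconstruct without them, is sketched below with scikit-learn's FastICA. The band width, rejection threshold, and synthetic 40 Hz artifact are illustrative assumptions, not the study's actual criterion or EASSR analysis.

```python
import numpy as np
from sklearn.decomposition import FastICA

def remove_artifact_ics(eeg, fs, artifact_hz, ratio_thresh=0.6):
    """Remove ICs whose power spectrum is dominated by a narrow band
    around artifact_hz. eeg has shape (n_samples, n_channels)."""
    ica = FastICA(random_state=0, max_iter=1000)
    sources = ica.fit_transform(eeg)          # (n_samples, n_ics)
    freqs = np.fft.rfftfreq(sources.shape[0], d=1 / fs)
    band = np.abs(freqs - artifact_hz) < 1.0  # 2 Hz-wide band around the artifact
    keep = []
    for k in range(sources.shape[1]):
        psd = np.abs(np.fft.rfft(sources[:, k])) ** 2
        ratio = psd[band].sum() / (psd.sum() + 1e-12)
        keep.append(ratio < ratio_thresh)     # reject artifact-dominated ICs
    sources[:, ~np.array(keep)] = 0.0
    return ica.inverse_transform(sources)

# Synthetic demo: 8 channels of noise plus a strong 40 Hz artifact on all channels.
rng = np.random.default_rng(1)
fs, n = 1000.0, 10000
t = np.arange(n) / fs
artifact = 50.0 * np.sin(2 * np.pi * 40 * t)
mixing = rng.normal(size=(8,))
eeg = rng.normal(size=(n, 8)) + np.outer(artifact, mixing)
cleaned = remove_artifact_ics(eeg, fs, artifact_hz=40.0)

def band_amp(x, idx=400):  # index 400 corresponds to 40 Hz here (0.1 Hz bins)
    return float(np.mean(np.abs(np.fft.rfft(x, axis=0))[idx, :]))
print("mean 40 Hz amplitude before/after:", round(band_amp(eeg), 1), round(band_amp(cleaned), 1))
```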
Metal artifact reduction in MRI-based cervical cancer intracavitary brachytherapy
NASA Astrophysics Data System (ADS)
Rao, Yuan James; Zoberi, Jacqueline E.; Kadbi, Mo; Grigsby, Perry W.; Cammin, Jochen; Mackey, Stacie L.; Garcia-Ramirez, Jose; Goddu, S. Murty; Schwarz, Julie K.; Gach, H. Michael
2017-04-01
Magnetic resonance imaging (MRI) plays an increasingly important role in brachytherapy planning for cervical cancer. Yet, metal tandem, ovoid intracavitary applicators, and fiducial markers used in brachytherapy cause magnetic susceptibility artifacts in standard MRI. These artifacts may impact the accuracy of brachytherapy treatment and the evaluation of tumor response by misrepresenting the size and location of the metal implant, and distorting the surrounding anatomy and tissue. Metal artifact reduction sequences (MARS) with high bandwidth RF selective excitations and turbo spin-echo readouts were developed for MRI of orthopedic implants. In this study, metal artifact reduction was applied to brachytherapy of cervical cancer using the orthopedic metal artifact reduction (O-MAR) sequence. O-MAR combined MARS features with view angle tilting and slice encoding for metal artifact correction (SEMAC) to minimize in-plane and through-plane susceptibility artifacts. O-MAR improved visualization of the tandem tip on T2 and proton density weighted (PDW) imaging in phantoms and accurately represented the diameter of the tandem. In a pilot group of cervical cancer patients (N = 7), O-MAR significantly minimized the blooming artifact at the tip of the tandem in PDW MRI. There was no significant difference observed in artifact reduction between the weak (5 kHz, 7 z-phase encodes) and medium (10 kHz, 13 z-phase encodes) SEMAC settings. However, the weak setting allowed a significantly shorter acquisition time than the medium setting. O-MAR also reduced susceptibility artifacts associated with metal fiducial markers so that they appeared on MRI at their true dimensions.
Application of Neutron Tomography in Culture Heritage research.
Mongy, T
2014-02-01
Neutron tomography (NT) investigation of Cultural Heritage (CH) objects is an efficient tool for understanding the culture of ancient civilizations. Neutron imaging (NI) is a state-of-the-art non-destructive tool in the area of CH and plays an important role in modern archeology. The NI technology can be widely utilized in the field of elemental analysis. At the Egypt Second Research Reactor (ETRR-2), a collimated neutron radiography (NR) beam is employed for neutron imaging purposes. A digital CCD camera is utilized for recording the beam attenuation in the sample. This helps in detecting hidden objects and characterizing material properties. Research activity can be extended to the use of computer software for quantitative neutron measurement. Development of image processing algorithms can be used to obtain high quality images. In this work, a full description of ETRR-2 is given, together with its up-to-date neutron imaging system. A tomographic investigation of a forged clay artifact representing a CH object was carried out by neutron imaging methods in order to obtain hidden information and highlight some attractive quantitative measurements. Computer software was used for image processing and enhancement, and the Astra Image 3.0 Pro software was employed for high precision measurements and image enhancement using advanced algorithms. This work increased the effective utilization of the ETRR-2 Neutron Radiography/Tomography (NR/T) technique in Cultural Heritage activities. © 2013 Elsevier Ltd. All rights reserved.
A wavelet method for modeling and despiking motion artifacts from resting-state fMRI time series.
Patel, Ameera X; Kundu, Prantik; Rubinov, Mikail; Jones, P Simon; Vértes, Petra E; Ersche, Karen D; Suckling, John; Bullmore, Edward T
2014-07-15
The impact of in-scanner head movement on functional magnetic resonance imaging (fMRI) signals has long been established as undesirable. These effects have been traditionally corrected by methods such as linear regression of head movement parameters. However, a number of recent independent studies have demonstrated that these techniques are insufficient to remove motion confounds, and that even small movements can spuriously bias estimates of functional connectivity. Here we propose a new data-driven, spatially-adaptive, wavelet-based method for identifying, modeling, and removing non-stationary events in fMRI time series, caused by head movement, without the need for data scrubbing. This method involves the addition of just one extra step, the Wavelet Despike, in standard pre-processing pipelines. With this method, we demonstrate robust removal of a range of different motion artifacts and motion-related biases including distance-dependent connectivity artifacts, at a group and single-subject level, using a range of previously published and new diagnostic measures. The Wavelet Despike is able to accommodate the substantial spatial and temporal heterogeneity of motion artifacts and can consequently remove a range of high and low frequency artifacts from fMRI time series, that may be linearly or non-linearly related to physical movements. Our methods are demonstrated by the analysis of three cohorts of resting-state fMRI data, including two high-motion datasets: a previously published dataset on children (N=22) and a new dataset on adults with stimulant drug dependence (N=40). We conclude that there is a real risk of motion-related bias in connectivity analysis of fMRI data, but that this risk is generally manageable, by effective time series denoising strategies designed to attenuate synchronized signal transients induced by abrupt head movements. The Wavelet Despiking software described in this article is freely available for download at www.brainwavelet.org. Copyright © 2014. Published by Elsevier Inc.
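The general despiking idea, identify wavelet coefficients that are temporal outliers at their scale and shrink them before reconstruction, can be caricatured with PyWavelets' stationary wavelet transform. This is a deliberately simplified single-time-series sketch, not the BrainWavelet Toolbox algorithm; the wavelet choice, decomposition level, and threshold are assumptions.

```python
import numpy as np
import pywt

def wavelet_despike(ts, wavelet="db4", level=4, thresh=3.5):
    """Simplified wavelet despiking of one time series: detail coefficients that
    are extreme outliers at their scale (robust z-score above `thresh`) are
    shrunk to the threshold before reconstruction."""
    n = len(ts)
    pad = (-n) % (2 ** level)              # SWT needs a multiple of 2**level
    x = np.pad(ts, (0, pad), mode="edge")
    coeffs = pywt.swt(x, wavelet, level=level)
    cleaned = []
    for approx, detail in coeffs:
        med = np.median(detail)
        mad = np.median(np.abs(detail - med)) + 1e-12
        z = (detail - med) / (1.4826 * mad)
        clipped = np.where(np.abs(z) > thresh,
                           med + np.sign(z) * thresh * 1.4826 * mad, detail)
        cleaned.append((approx, clipped))
    return pywt.iswt(cleaned, wavelet)[:n]

# Demo: smooth signal plus two abrupt motion-like spikes.
t = np.linspace(0, 300, 300)
signal = np.sin(2 * np.pi * t / 60.0)
spiky = signal.copy()
spiky[100:103] += 8.0
spiky[220] -= 6.0
print("max abs error before:", round(float(np.max(np.abs(spiky - signal))), 2))
print("max abs error after: ", round(float(np.max(np.abs(wavelet_despike(spiky) - signal))), 2))
```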
SU-F-T-443: Quantification of Dosimetric Effects of Dental Metallic Implant On VMAT Plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, C; Jiang, W; Feng, Y
Purpose: To evaluate the dosimetric impact of metallic implants, as it correlates with the size of targets and metallic implants and the distance between them, on volumetric-modulated arc therapy (VMAT) plans for head and neck (H&N) cancer patients with dental metallic implants. Methods: CT images of H&N cancer patients with dental metallic implants were used. Target volumes with different sizes and locations were contoured. Metal artifact regions, excluding surrounding critical organs, were outlined and assigned CT numbers close to water (0 HU). VMAT plans with a half arc, one full arc, and two full arcs were constructed, and the same plans were applied to structure sets with and without CT number assignment of the metal artifact regions and compared. D95% was utilized to investigate PTV dose coverage, and the SNC Patient software was used for the analysis of dose distribution differences slice by slice. Results: For different target sizes, the variation of PTV dose coverage (Delta-D95%) with and without CT number replacement decreased with larger target volume for all half-arc, one-arc, and two-arc VMAT plans, even though there were no clinically significant differences. Additionally, there were no significant variations of the maximum percent difference (max. %diff) of the dose distribution. With regard to the target location, Delta-D95% and max. %diff dropped with increasing distance between the target and the metallic implant. Furthermore, half-arc plans showed a greater impact than one-arc plans, and two-arc plans had the smallest influence on PTV dose coverage and dose distribution. Conclusion: The target size has less influence on the dosimetric impact than the target location relative to the metallic implant. Plans with more arcs alleviate the dosimetric effect of metal artifacts because less of the target dose is contributed by beams passing through regions with metallic artifacts. Incorrect CT numbers cause inaccurate dose distributions; therefore, appropriately overwriting metallic artifact regions with reasonable CT numbers is recommended. More patient data are being collected and are under further analysis.
Avitabile, Catherine M; Harris, Matthew A; Doddasomayajula, Ravi S; Chopski, Steven G; Gillespie, Matthew J; Dori, Yoav; Glatz, Andrew C; Fogel, Mark A; Whitehead, Kevin K
2018-06-15
Little data are available on the accuracy of phase-contrast magnetic resonance imaging (PC-MRI) velocity mapping in the vicinity of intravascular metal stents other than nitinol stents. Therefore, we sought to determine this accuracy using in vitro experiments. An in vitro flow phantom was used with 3 stent types: (1) 316L stainless steel, (2) nitinol self-expanding, and (3) platinum-iridium. Steady and pulsatile flow was delivered with a magnetic resonance imaging-compatible pump (CardioFlow 5000, Shelley Medical, London, Ontario, Canada). Flows were measured using a transit time flow meter (ME13PXN, Transonic, Inc, Ithaca, New York). Mean flows ranged from 0.5 to 7 L/min. For each condition, 5 PC-MRI acquisitions were made: within the stent, immediately adjacent to both edges of the stent artifact, and 1 cm upstream and downstream of the artifact. Mean PC-MRI flows were calculated by segmenting the tube lumen using clinical software (ARGUS, Siemens, Inc, Erlangen, Germany). PC-MRI and flow meter flows were compared by location and stent type using linear regression, Bland-Altman, and intraclass correlation (ICC). PC-MRI flows within the stent artifact were inaccurate for all stents studied, generally underestimating flow meter-measured flow. Agreement between PC-MRI and flow meter-measured flows was excellent for all stent types, both immediately adjacent to and 1 cm away from the edge of the stent artifact. Agreement was highest for the platinum-iridium stent (R = 0.999, ICC = 0.999) and lowest for the nitinol stent (R = 0.993, ICC = 0.987). In conclusion, PC-MRI flows are highly accurate just upstream and downstream of a variety of clinically used stents, supporting its use to directly measure flows in stented vessels. Copyright © 2018 Elsevier Inc. All rights reserved.
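The agreement statistics reported above (linear regression, Bland-Altman bias with limits of agreement, and an intraclass correlation) can be computed along the following lines; the flow values below are made up for illustration and are not the study's data.

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two measurement methods."""
    diff = np.asarray(a) - np.asarray(b)
    bias, sd = diff.mean(), diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

def icc_absolute_agreement(a, b):
    """Two-way, single-measure ICC(2,1) for absolute agreement."""
    x = np.column_stack([a, b]).astype(float)
    n, k = x.shape
    grand = x.mean()
    ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)
    resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0, keepdims=True) + grand
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                 + k * (ms_cols - ms_err) / n)

# Made-up example: flow meter vs. PC-MRI flows (L/min) adjacent to a stent.
flow_meter = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
pc_mri     = np.array([0.52, 0.98, 2.05, 2.96, 4.1, 4.95, 6.1, 6.9])
slope, intercept = np.polyfit(flow_meter, pc_mri, 1)
bias, loa = bland_altman(pc_mri, flow_meter)
print(f"regression: slope={slope:.3f}, intercept={intercept:.3f}")
print(f"Bland-Altman bias={bias:.3f} L/min, LoA={loa[0]:.3f} to {loa[1]:.3f}")
print(f"ICC(2,1)={icc_absolute_agreement(flow_meter, pc_mri):.3f}")
```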
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2014-01-01
A system is safety-critical if its failure can endanger human life or cause significant damage to property or the environment. State-of-the-art computer systems on commercial aircraft are highly complex, software-intensive, functionally integrated, and network-centric systems of systems. Ensuring that such systems are safe and comply with existing safety regulations is costly and time-consuming as the level of rigor in the development process, especially the validation and verification activities, is determined by considerations of system complexity and safety criticality. A significant degree of care and deep insight into the operational principles of these systems is required to ensure adequate coverage of all design implications relevant to system safety. Model-based development methodologies, methods, tools, and techniques facilitate collaboration and enable the use of common design artifacts among groups dealing with different aspects of the development of a system. This paper examines the application of model-based development to complex and safety-critical aircraft computer systems. Benefits and detriments are identified and an overall assessment of the approach is given.
Mannan, Malik M Naeem; Kim, Shinjung; Jeong, Myung Yung; Kamran, M Ahmad
2016-02-19
Contamination by eye movement and blink artifacts in electroencephalogram (EEG) recordings makes the analysis of EEG data more difficult and could result in misleading findings. Efficient removal of these artifacts from EEG data is an essential step in improving classification accuracy for development of a brain-computer interface (BCI). In this paper, we propose an automatic framework based on independent component analysis (ICA) and system identification to identify and remove ocular artifacts from EEG data by using a hybrid EEG and eye tracker system. The performance of the proposed algorithm is illustrated using experimental and standard EEG datasets. The proposed algorithm not only removes the ocular artifacts from the artifactual zone but also preserves the neuronal activity related EEG signals in the non-artifactual zone. The comparison with two state-of-the-art techniques, namely ADJUST-based ICA and REGICA, reveals the significantly improved performance of the proposed algorithm for removing eye movement and blink artifacts from EEG data. Additionally, results demonstrate that the proposed algorithm can achieve lower relative error and higher mutual information values between the corrected EEG and artifact-free EEG data.
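One building block of such a hybrid approach, regressing eye-tracker-derived reference signals out of each EEG channel, can be sketched as follows. This is a generic least-squares regression illustration on synthetic data, not the ICA-plus-system-identification algorithm described above; the blink reference and channel weights are made up.

```python
import numpy as np

def regress_out_ocular(eeg, references):
    """Remove the part of each EEG channel linearly explained by ocular
    reference signals (e.g., gaze x/y and a blink indicator from an eye tracker).

    eeg:        (n_samples, n_channels)
    references: (n_samples, n_refs)
    """
    X = np.column_stack([np.ones(len(references)), references])
    beta, *_ = np.linalg.lstsq(X, eeg, rcond=None)   # one fit per EEG channel
    return eeg - X @ beta + beta[0]                  # keep each channel's offset

# Synthetic demo: 4 EEG channels contaminated by a blink-like reference.
rng = np.random.default_rng(2)
n = 5000
blink = np.zeros(n)
for onset in (800, 2400, 4100):
    blink[onset:onset + 150] = np.hanning(150)
brain = rng.normal(0, 1.0, size=(n, 4))
weights = np.array([8.0, 5.0, 2.0, 0.5])             # frontal channels hit hardest
eeg = brain + np.outer(blink, weights)
cleaned = regress_out_ocular(eeg, blink[:, None])
print("residual blink correlation per channel:",
      np.round([np.corrcoef(blink, cleaned[:, c])[0, 1] for c in range(4)], 3))
```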
A gradient-boosting approach for filtering de novo mutations in parent-offspring trios.
Liu, Yongzhuang; Li, Bingshan; Tan, Renjie; Zhu, Xiaolin; Wang, Yadong
2014-07-01
Whole-genome and -exome sequencing on parent-offspring trios is a powerful approach to identifying disease-associated genes by detecting de novo mutations in patients. Accurate detection of de novo mutations from sequencing data is a critical step in trio-based genetic studies. Existing bioinformatic approaches usually yield high error rates due to sequencing artifacts and alignment issues, which may either miss true de novo mutations or call too many false ones, making downstream validation and analysis difficult. In particular, current approaches have much worse specificity than sensitivity, and developing effective filters to discriminate genuine from spurious de novo mutations remains an unsolved challenge. In this article, we curated 59 sequence features in the whole-genome and exome alignment context that are considered relevant to discriminating true de novo mutations from artifacts, and then employed a machine-learning approach to classify candidates as true or false de novo mutations. Specifically, we built a classifier, named De Novo Mutation Filter (DNMFilter), using gradient boosting as the classification algorithm. We built the training set using experimentally validated true and false de novo mutations as well as false de novo mutations collected from an in-house large-scale exome-sequencing project. We evaluated DNMFilter's theoretical performance and investigated the relative importance of different sequence features for classification accuracy. Finally, we applied DNMFilter to our in-house whole-exome trios and one CEU trio from the 1000 Genomes Project and found that DNMFilter could be coupled with commonly used de novo mutation detection approaches as an effective filtering step to significantly reduce the false discovery rate without sacrificing sensitivity. The software DNMFilter, implemented using a combination of Java and R, is freely available from the website at http://humangenome.duke.edu/software. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
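The classification step can be illustrated with scikit-learn's gradient boosting implementation on made-up features; DNMFilter itself is implemented in Java and R with a much richer, curated feature set, so treat the feature names and numbers below purely as placeholders.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for alignment-context features of candidate de novo
# mutations (e.g., allele balance, mapping quality, a strand-bias score).
rng = np.random.default_rng(3)
n_true, n_false = 400, 1600
true_feats = rng.normal(loc=[0.5, 60, 0.0], scale=[0.08, 5, 0.5], size=(n_true, 3))
false_feats = rng.normal(loc=[0.25, 45, 1.5], scale=[0.15, 10, 1.0], size=(n_false, 3))
X = np.vstack([true_feats, false_feats])
y = np.concatenate([np.ones(n_true), np.zeros(n_false)])   # 1 = validated de novo

clf = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05, max_depth=3)
print("cross-validated AUC:",
      cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean().round(3))

# A probability threshold then acts as the filter on new candidates.
clf.fit(X, y)
candidate = np.array([[0.48, 58, 0.2]])
print("P(genuine de novo):", clf.predict_proba(candidate)[0, 1].round(3))
```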
Mediating Artifact in Teacher Professional Development
NASA Astrophysics Data System (ADS)
Svendsen, Bodil
2015-07-01
This article focuses on teacher professional development (TPD) in natural science through the 5E model as a mediating artifact. The study was conducted in an upper secondary school, grounded in a school-based intervention research project. My contribution to the field of research on TPD is founded on the hypothesis that teachers would be best facilitated to make their practice more inquiry based if they are provided with a mediating artifact. In this study the artifact is the 5E model, a conceptual way of thinking that helps teachers reflect on their practice. The aim is to encourage teachers to make changes themselves, by applying extended use of inquiry in their practice. This mediating artifact could thus be used across different national contexts. The main research question is: how can the 5E model as a mediating artifact enhance TPD? The article addresses the processes of the use of the 5E model and its influence on teachers' perception of the model. This is in order for teachers to conceptualize their goals related to inquiry and scientific thinking, and to solve the problems involved in achieving those goals in their own contexts. The study concludes that, after the intervention, the teachers' approaches and strategies demonstrate greater emphasis on learning.
Ye-Lin, Yiyao; Garcia-Casado, Javier; Prats-Boluda, Gema; Alberola-Rubio, José; Perales, Alfredo
2014-01-01
Electrohysterography (EHG) is a noninvasive technique for monitoring uterine electrical activity. However, the presence of artifacts in the EHG signal may give rise to erroneous interpretations and make it difficult to extract useful information from these recordings. The aim of this work was to develop an automatic system of segmenting EHG recordings that distinguishes between uterine contractions and artifacts. Firstly, the segmentation is performed using an algorithm that generates the TOCO-like signal derived from the EHG and detects windows with significant changes in amplitude. After that, these segments are classified in two groups: artifacted and nonartifacted signals. To develop a classifier, a total of eleven spectral, temporal, and nonlinear features were calculated from EHG signal windows from 12 women in the first stage of labor that had previously been classified by experts. The combination of characteristics that led to the highest degree of accuracy in detecting artifacts was then determined. The results showed that it is possible to obtain automatic detection of motion artifacts in segmented EHG recordings with a precision of 92.2% using only seven features. The proposed algorithm and classifier together compose a useful tool for analyzing EHG signals and would help to promote clinical applications of this technique.
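A rough sketch of the first stage of such a system, deriving a TOCO-like envelope from the raw EHG and flagging windows with a significant amplitude increase for later classification, is given below. The band limits, smoothing window, and detection threshold are illustrative choices, not the published algorithm or its eleven-feature classifier.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def toco_like(ehg, fs, band=(0.1, 4.0), smooth_sec=15.0):
    """TOCO-like signal: band-pass the EHG, rectify, and smooth the envelope."""
    sos = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band", output="sos")
    envelope = np.abs(sosfiltfilt(sos, ehg))
    win = int(smooth_sec * fs)
    return np.convolve(envelope, np.ones(win) / win, mode="same")

def detect_windows(toco, fs, k=5.0, min_sec=30.0):
    """Flag segments where the envelope rises well above a robust baseline."""
    baseline = np.median(toco)
    mad = np.median(np.abs(toco - baseline)) + 1e-12
    above = toco > baseline + k * 1.4826 * mad
    runs, start = [], None
    for i, flag in enumerate(np.append(above, False)):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if (i - start) / fs >= min_sec:             # keep only long enough runs
                runs.append((round(start / fs, 1), round(i / fs, 1)))
            start = None
    return runs

# Demo: 20 min of noise with two contraction-like bursts around 5 and 14 minutes.
fs = 20.0
n = int(20 * 60 * fs)
t = np.arange(n) / fs
ehg = 0.02 * np.random.default_rng(4).normal(size=n)
for center in (5 * 60, 14 * 60):
    burst = np.exp(-((t - center) ** 2) / (2 * 30 ** 2))
    ehg += 0.3 * burst * np.sin(2 * np.pi * 0.5 * t)
print("detected windows (s):", detect_windows(toco_like(ehg, fs), fs))
```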
Lee, Hyun-Soo; Choi, Seung Hong; Park, Sung-Hong
2017-07-01
To develop single and double acquisition methods to compensate for artifacts from eddy currents and transient oscillations in balanced steady-state free precession (bSSFP) with centric phase-encoding (PE) order for magnetization-prepared bSSFP imaging. A single and four different double acquisition methods were developed and evaluated with Bloch equation simulations, phantom/in vivo experiments, and quantitative analyses. For the single acquisition method, multiple PE groups, each of which was composed of N linearly changing PE lines, were ordered in a pseudocentric manner for optimal contrast and minimal signal fluctuations. Double acquisition methods used complex averaging of two images that had opposite artifact patterns, obtained from different acquisition orders or from different numbers of dummy scans. Simulation results showed high sensitivity of eddy-current and transient-oscillation artifacts to off-resonance frequency and PE schemes. The artifacts were reduced with the PE grouping with N values from 3 to 8, similar to or better than the conventional pairing scheme of N = 2. The proposed double acquisition methods removed the remaining artifacts significantly. The proposed methods conserved detailed structures in magnetization transfer imaging well, compared with the conventional methods. The proposed single and double acquisition methods can be useful for artifact-free magnetization-prepared bSSFP imaging with the desired contrast and minimized dummy scans. Magn Reson Med 78:254-263, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
Funaki, Ayumu; Ohkubo, Masaki; Wada, Shinichi; Murao, Kohei; Matsumoto, Toru; Niizuma, Shinji
2012-07-01
With the wide dissemination of computed tomography (CT) screening for lung cancer, measuring nodule volume accurately with computer-aided volumetry software is increasingly important. Many studies of the accuracy of volumetry software have been performed using phantoms with artificial nodules. These phantom studies are limited, however, in their ability to reproduce nodules both accurately and in the variety of sizes and densities required. Therefore, we propose a new approach that uses computer-simulated nodules based on the point spread function measured in a CT system. The validity of the proposed method was confirmed by the excellent agreement obtained between computer-simulated nodules and phantom nodules in the volume measurements. A practical clinical evaluation of the accuracy of volumetry software was achieved by adding simulated nodules onto clinical lung images, including noise and artifacts. The tested volumetry software was revealed to be accurate to within an error of 20% for nodules >5 mm with a nodule-to-background (lung) CT value difference of 400-600 HU. Such a detailed analysis can provide clinically useful information on the use of volumetry software in CT screening for lung cancer. We conclude that the proposed method is effective for evaluating the performance of computer-aided volumetry software.
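A toy version of the simulation idea, generating a sphere of known volume, blurring it with a Gaussian stand-in for the measured point spread function, inserting it onto a uniform lung background, and re-measuring its volume by thresholding, might look like this in Python. All sizes, HU values, and the PSF model are illustrative assumptions, not the paper's measured PSF or evaluation protocol.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def simulated_nodule(shape, center, radius_mm, voxel_mm, contrast_hu, psf_sigma_mm):
    """Ideal sphere of given contrast blurred by a Gaussian stand-in for the
    measured 3-D point spread function of the CT system."""
    zz, yy, xx = np.indices(shape).astype(float)
    r = np.sqrt(((zz - center[0]) * voxel_mm[0]) ** 2 +
                ((yy - center[1]) * voxel_mm[1]) ** 2 +
                ((xx - center[2]) * voxel_mm[2]) ** 2)
    sphere = np.where(r <= radius_mm, float(contrast_hu), 0.0)
    sigma_vox = [psf_sigma_mm / v for v in voxel_mm]
    return gaussian_filter(sphere, sigma=sigma_vox)

def measured_volume(image, background_hu, voxel_mm, frac=0.5):
    """Volume from half-maximum thresholding above the background level."""
    thresh = background_hu + frac * (image.max() - background_hu)
    return (image > thresh).sum() * np.prod(voxel_mm)

# Demo: 6 mm-diameter nodule with 500 HU contrast on a -800 HU lung background.
voxel = (1.0, 0.7, 0.7)                       # mm
lung = np.full((40, 64, 64), -800.0)
lung += simulated_nodule(lung.shape, (20, 32, 32), radius_mm=3.0,
                         voxel_mm=voxel, contrast_hu=500.0, psf_sigma_mm=0.8)
true_vol = 4.0 / 3.0 * np.pi * 3.0 ** 3
print("true volume (mm^3):", round(true_vol, 1))
print("measured volume (mm^3):", round(measured_volume(lung, -800.0, voxel), 1))
```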
Striping artifact reduction in lunar orbiter mosaic images
Mlsna, P.A.; Becker, T.
2006-01-01
Photographic images of the moon from the 1960s Lunar Orbiter missions are being processed into maps for visual use. The analog nature of the images has produced numerous artifacts, the chief of which causes a vertical striping pattern in mosaic images formed from a series of filmstrips. Previous methods of stripe removal tended to introduce ringing and aliasing problems in the image data. This paper describes a recently developed alternative approach that succeeds at greatly reducing the striping artifacts while avoiding the creation of ringing and aliasing artifacts. The algorithm uses a one dimensional frequency domain step to deal with the periodic component of the striping artifact and a spatial domain step to handle the aperiodic residue. Several variations of the algorithm have been explored. Results, strengths, and remaining challenges are presented. © 2006 IEEE.
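The two-step idea, estimating the periodic part of the column striping in the frequency domain and then removing the slowly varying aperiodic residue in the spatial domain, can be sketched roughly as follows. The stripe period, notch width, and synthetic scene are assumptions for illustration only, not the algorithm actually used for the Lunar Orbiter mosaics.

```python
import numpy as np

def destripe(image, period=16, smooth_win=None):
    """Two-step stripe reduction: (1) estimate the periodic stripe component
    from the column-mean profile at harmonics of the known stripe period,
    (2) estimate the aperiodic residue in the spatial domain as the deviation
    of the corrected profile from its local trend."""
    nx = image.shape[1]
    cols = image.mean(axis=0)
    spec = np.fft.rfft(cols - cols.mean())
    freqs = np.fft.rfftfreq(nx, d=1.0)
    stripe_freq = 1.0 / period
    harmonic = np.zeros_like(spec)
    for h in range(1, int(0.5 / stripe_freq) + 1):
        sel = np.abs(freqs - h * stripe_freq) < 0.002
        harmonic[sel] = spec[sel]                     # keep only stripe harmonics
    periodic = np.fft.irfft(harmonic, n=nx)
    # Spatial-domain step: slowly varying residue of the corrected profile.
    win = smooth_win or 4 * period
    kernel = np.ones(win) / win
    corrected_profile = cols - periodic
    trend = (np.convolve(corrected_profile, kernel, "same")
             / np.convolve(np.ones(nx), kernel, "same"))
    aperiodic = corrected_profile - trend
    return image - (periodic + aperiodic)[np.newaxis, :]

# Demo: smooth scene plus a 16-pixel periodic stripe pattern and a slower offset drift.
ny, nx = 256, 512
scene = np.outer(np.linspace(0, 1, ny), np.linspace(0.2, 1, nx)) * 100
stripes = (10 * np.sin(2 * np.pi * np.arange(nx) / 16)
           + 2.5 * np.sin(2 * np.pi * np.arange(nx) / 50))
noisy = scene + stripes[np.newaxis, :]
print("column-profile error std before:", round(float(np.std(noisy.mean(0) - scene.mean(0))), 2))
print("column-profile error std after: ", round(float(np.std(destripe(noisy).mean(0) - scene.mean(0))), 2))
```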
Individual and Joint Expert Judgments as Reference Standards in Artifact Detection
Verduijn, Marion; Peek, Niels; de Keizer, Nicolette F.; van Lieshout, Erik-Jan; de Pont, Anne-Cornelie J.M.; Schultz, Marcus J.; de Jonge, Evert; de Mol, Bas A.J.M.
2008-01-01
Objective To investigate the agreement among clinical experts in their judgments of monitoring data with respect to artifacts, and to examine the effect of reference standards that consist of individual and joint expert judgments on the performance of artifact filters. Design Individual judgments of four physicians, a majority vote judgment, and a consensus judgment were obtained for 30 time series of three monitoring variables: mean arterial blood pressure (ABPm), central venous pressure (CVP), and heart rate (HR). The individual and joint judgments were used to tune three existing automated filtering methods and to evaluate the performance of the resulting filters. Measurements The interrater agreement was calculated in terms of positive specific agreement (PSA). The performance of the artifact filters was quantified in terms of sensitivity and positive predictive value (PPV). Results PSA values between 0.33 and 0.85 were observed among clinical experts in their selection of artifacts, with relatively high values for CVP data. Artifact filters developed using judgments of individual experts were found to moderately generalize to new time series and other experts; sensitivity values ranged from 0.40 to 0.60 for ABPm and HR filters (PPV: 0.57–0.84), and from 0.63 to 0.80 for CVP filters (PPV: 0.71–0.86). A higher performance value for the filters was found for the three variable types when joint judgments were used for tuning the filtering methods. Conclusion Given the disagreement among experts in their individual judgment of monitoring data with respect to artifacts, the use of joint reference standards obtained from multiple experts is recommended for development of automatic artifact filters. PMID:18096912
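For reference, the positive specific agreement (PSA) statistic used above can be computed from the 2x2 agreement table as PSA = 2a / (2a + b + c), where a counts samples both raters marked as artifact and b, c count the one-sided positives. A minimal sketch with made-up ratings:

```python
import numpy as np

def positive_specific_agreement(r1, r2):
    """Positive specific agreement between two binary ratings:
    PSA = 2a / (2a + b + c), with a = both positive, b/c = one-sided positives."""
    r1, r2 = np.asarray(r1, bool), np.asarray(r2, bool)
    a = np.sum(r1 & r2)
    b = np.sum(r1 & ~r2)
    c = np.sum(~r1 & r2)
    return 2 * a / (2 * a + b + c) if (2 * a + b + c) else np.nan

# Toy example: two experts labeling 12 monitoring samples as artifact (1) or not (0).
expert_1 = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
expert_2 = [1, 0, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0]
print("PSA:", round(positive_specific_agreement(expert_1, expert_2), 2))
```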
Mediating Artifact in Teacher Professional Development
ERIC Educational Resources Information Center
Svendsen, Bodil
2015-01-01
This article focuses on teacher professional development (TPD) in natural science through the 5E model as mediating artifact. The study was conducted in an upper secondary school, grounded in a school-based intervention research project. My contribution to the field of research on TPD is founded on the hypothesis that teachers would be best…
Online EEG artifact removal for BCI applications by adaptive spatial filtering.
Guarnieri, Roberto; Marino, Marco; Barban, Federico; Ganzetti, Marco; Mantini, Dante
2018-06-28
The performance of brain computer interfaces (BCIs) based on electroencephalography (EEG) data strongly depends on the effective attenuation of artifacts that are mixed in the recordings. To address this problem, we have developed a novel online EEG artifact removal method for BCI applications, which combines blind source separation (BSS) and regression (REG) analysis. The BSS-REG method relies on the availability of a calibration dataset of limited duration for the initialization of a spatial filter using BSS. Online artifact removal is implemented by dynamically adjusting the spatial filter in the actual experiment, based on a linear regression technique. Our results showed that the BSS-REG method is capable of attenuating different kinds of artifacts, including ocular and muscular, while preserving true neural activity. Thanks to its low computational requirements, BSS-REG can be applied to low-density as well as high-density EEG data. We argue that BSS-REG may enable the development of novel BCI applications requiring high-density recordings, such as source-based neurofeedback and closed-loop neuromodulation. © 2018 IOP Publishing Ltd.
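A schematic of the two-stage BSS-plus-regression idea, ICA on a calibration recording to obtain artifact topographies, then online regression of that artifact subspace out of each incoming block, is sketched below. The component-selection rule (highest kurtosis), the synthetic blink, and the block handling are simplifying assumptions and do not reproduce the published BSS-REG method.

```python
import numpy as np
from sklearn.decomposition import FastICA

def calibrate_artifact_subspace(calib_eeg, n_artifact_ics=1):
    """Offline stage: ICA on a calibration recording; the components with the
    most spiky (highest-kurtosis) time courses are taken as artifacts here."""
    ica = FastICA(random_state=0, max_iter=1000)
    sources = ica.fit_transform(calib_eeg)           # (n_samples, n_ics)
    z = (sources - sources.mean(0)) / sources.std(0)
    kurt = (z ** 4).mean(axis=0) - 3.0
    idx = np.argsort(kurt)[::-1][:n_artifact_ics]
    return ica.mixing_[:, idx]                       # (n_channels, n_artifact_ics)

def clean_block(block, artifact_topos):
    """Online stage: regress the artifact subspace out of one EEG block."""
    act, *_ = np.linalg.lstsq(artifact_topos, block.T, rcond=None)
    return block - (artifact_topos @ act).T

# Demo: 8-channel EEG with a blink-like artifact of fixed spatial topography.
rng = np.random.default_rng(5)
topo = rng.normal(size=8)
def make_eeg(n):
    blink = np.zeros(n)
    for onset in range(300, n - 200, 700):           # regular blink-like events
        blink[onset:onset + 100] += np.hanning(100)
    return rng.normal(size=(n, 8)) + 40 * np.outer(blink, topo), blink

calib, _ = make_eeg(20000)
topos = calibrate_artifact_subspace(calib)
online_block, blink = make_eeg(1000)
cleaned = clean_block(online_block, topos)
before = max(abs(np.corrcoef(blink, online_block[:, c])[0, 1]) for c in range(8))
after = max(abs(np.corrcoef(blink, cleaned[:, c])[0, 1]) for c in range(8))
print("max |blink correlation| before/after:", round(before, 2), round(after, 2))
```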
Improving the strength of additively manufactured objects via modified interior structure
NASA Astrophysics Data System (ADS)
Al, Can Mert; Yaman, Ulas
2017-10-01
Additive manufacturing (AM), in other words 3D printing, is becoming more common because of its crucial advantages, such as geometric complexity and functional interior structures, over traditional manufacturing methods. Fused Filament Fabrication (FFF) 3D printing technology in particular is frequently used because desktop variants of these printers are well suited to many fields and are improving rapidly. Despite the significant advantages of AM, the strength of parts fabricated with AM is still a major problem, especially when plastic materials such as acrylonitrile butadiene styrene (ABS), polylactic acid (PLA), nylon, etc., are utilized. In this study, an alternative method is proposed in which the strength of AM-fabricated parts is improved by employing a direct slicing approach. Traditional Computer Aided Manufacturing (CAM) software for 3D printers takes only the geometry as an input, in triangular mesh form (a stereolithography, STL, file) generated by Computer Aided Design software. This file format includes data only about the outer boundaries of the geometry. The interior of the artifacts is manufactured with homogeneous infill patterns, such as diagonal, honeycomb, linear, etc., according to the paths generated by the CAM software. The method developed in this study provides a way to fabricate parts with heterogeneous infill patterns by utilizing the stress field data obtained from Finite Element Analysis software such as ABAQUS. According to the tensile tests performed, the strength of the test specimen is improved by about 45% compared to the conventional way of 3D printing.
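The core idea, deriving a spatially varying infill density from an exported stress field so that the slicer places denser infill where stresses concentrate, can be caricatured in a few lines of Python. The stress values, density bounds, and spacing formula below are illustrative assumptions, not the authors' slicing pipeline or ABAQUS interface.

```python
import numpy as np

def infill_density_map(von_mises, min_density=0.15, max_density=0.8):
    """Map a per-cell von Mises stress field (e.g. exported from an FEA run)
    to a per-cell infill density between min_density and max_density."""
    s = np.asarray(von_mises, dtype=float)
    norm = (s - s.min()) / (np.ptp(s) + 1e-12)
    return min_density + norm * (max_density - min_density)

def line_spacing(density, extrusion_width=0.45):
    """Approximate spacing (mm) between rectilinear infill lines for a density."""
    return extrusion_width / np.clip(density, 1e-3, 1.0)

# Toy stress field for a tensile bar with a stress concentration near a notch.
yy, xx = np.mgrid[0:20, 0:60]
stress = 40.0 + 120.0 * np.exp(-((xx - 30) ** 2 + (yy - 10) ** 2) / 30.0)  # MPa
density = infill_density_map(stress)
print("density range:", density.min().round(2), "to", density.max().round(2))
print("line spacing at the notch (mm):", line_spacing(density[10, 30]).round(2))
```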
Clinical introduction of image lag correction for a cone beam CT system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stankovic, Uros; Ploeger, Lennert S.; Sonke, Jan-Jakob, E-mail: j.sonke@nki.nl
Purpose: Image lag in the flat-panel detector used for linac-integrated cone beam computed tomography (CBCT) has a degrading effect on CBCT image quality. The most prominent visible artifact is the presence of a bright semicircular structure in the transverse view of the scans, also known as the radar artifact. Several correction strategies have been proposed, but until now the clinical introduction of such corrections has remained unreported. In November 2013, the authors clinically implemented a previously proposed image lag correction on all of their machines at their main site in Amsterdam. The purpose of this study was to retrospectively evaluate the effect of the correction on the quality of CBCT images and to evaluate the required calibration frequency. Methods: Image lag was measured in five clinical CBCT systems (Elekta Synergy 4.6) using an in-house developed beam interrupting device that stops the x-ray beam midway through the data acquisition of an unattenuated beam for calibration. A triple exponential falling edge response was fitted to the measured data and used to correct image lag from projection images with an infinite impulse response filter. This filter, including an extrapolation for saturated pixels, was incorporated in the authors' in-house developed clinical CBCT reconstruction software. To investigate the short-term stability of the lag and associated parameters, a series of five image lag measurements over a period of three months was performed. For quantitative analysis, the authors retrospectively selected ten patients treated in the pelvic region. The apparent contrast was quantified in polar coordinates for scans reconstructed using the parameters obtained from different dates, with and without saturation handling. Results: Visually, the radar artifact was minimal in scans reconstructed using the image lag correction, especially when saturation handling was used. In patient imaging, there was a significant reduction of the apparent contrast from 43 ± 16.7 to 15.5 ± 11.9 HU without saturation handling and to 9.6 ± 12.1 HU with saturation handling, depending on the date of the calibration. The image lag correction parameters were stable over a period of 3 months. The computational load was increased by approximately 10%, not endangering the fast in-line reconstruction. Conclusions: The lag correction was successfully implemented clinically and removed most image lag artifacts, thus improving the image quality. Image lag correction parameters were stable for 3 months, indicating a low required calibration frequency.
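Two ingredients of such a correction, fitting a triple-exponential falling edge to the post-interruption signal and applying a recursive correction that subtracts the predicted lag from each frame, can be illustrated on synthetic data. The toy lag model, its coefficients, and the exact inversion below are illustrative assumptions, not the clinical implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

def add_lag(x, gains, decays):
    """Toy forward model of detector lag: each frame feeds exponentially
    decaying 'trap' states whose release contaminates later frames."""
    states = np.zeros_like(gains, dtype=float)
    y = np.empty_like(x, dtype=float)
    for n, xn in enumerate(x):
        y[n] = xn + decays @ states
        states = decays * states + gains * xn
    return y

def correct_lag(y, gains, decays):
    """Recursive correction that exactly inverts add_lag for known parameters."""
    states = np.zeros_like(gains, dtype=float)
    x = np.empty_like(y, dtype=float)
    for n, yn in enumerate(y):
        x[n] = yn - decays @ states
        states = decays * states + gains * x[n]
    return x

def triple_exp(t, a1, t1, a2, t2, a3, t3):
    """Triple-exponential model of the residual signal after the beam is cut."""
    return a1 * np.exp(-t / t1) + a2 * np.exp(-t / t2) + a3 * np.exp(-t / t3)

# Synthetic calibration: unattenuated beam for 200 frames, then beam interrupted.
gains = np.array([0.02, 0.01, 0.005])          # illustrative lag fractions
decays = np.array([0.5, 0.9, 0.99])            # per-frame decay factors
signal = np.concatenate([np.full(200, 1000.0), np.zeros(200)])
measured = add_lag(signal, gains, decays)

# Fit a triple exponential to the falling edge (frames after beam-off).
tail = measured[200:]
t = np.arange(len(tail))
p0 = (tail[0] * 0.5, 2.0, tail[0] * 0.3, 10.0, tail[0] * 0.2, 80.0)
popt, _ = curve_fit(triple_exp, t, tail, p0=p0, bounds=(0, np.inf), maxfev=20000)
print("fitted time constants (frames):", np.round(sorted(popt[1::2]), 1))

# Apply the recursive correction (known parameters here) to the step object.
corrected = correct_lag(measured, gains, decays)
print("residual 5 frames after beam-off:",
      round(measured[205], 2), "->", round(corrected[205], 2))
```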
NASA UAS Traffic Management National Campaign Operations across Six UAS Test Sites
NASA Technical Reports Server (NTRS)
Rios, Joseph; Mulfinger, Daniel; Homola, Jeff; Venkatesan, Priya
2016-01-01
NASA's Unmanned Aircraft Systems Traffic Management research aims to develop policies, procedures, requirements, and other artifacts to inform the implementation of a future system that enables small drones to access the low altitude airspace. In this endeavor, NASA conducted a geographically diverse flight test in conjunction with the FAA's six unmanned aircraft systems Test Sites. A control center at NASA Ames Research Center autonomously managed the airspace for all participants in eight states as they flew operations (both real and simulated). The system allowed for common situational awareness across all stakeholders, kept traffic procedurally separated, and offered messages to inform the participants of activity relevant to their operations. Over the 3-hour test, 102 flight operations connected to the central research platform with 17 different vehicle types and 8 distinct software client implementations while seamlessly interacting with simulated traffic.
NASA Astrophysics Data System (ADS)
Orngreen, Rikke; Clemmensen, Torkil; Pejtersen, Annelise Mark
The boundaries and work processes of virtual team interaction are undergoing changes, from a tool and stand-alone application orientation to the use of multiple generic platforms chosen and redesigned for the specific context. These are often designed at the same time both by professional software developers and by the individual members of the virtual teams, rather than determined at a single organizational level. There may be no impact of the technology per se on individuals, groups, or organizations, as the technology for virtual teams rather enhances situational ambiguity and disrupts existing task-artifact cycles. This ambiguous situation calls for new methods for empirical work analysis and interaction design that can help us understand how organizations, teams, and individuals learn to organize, design, and work in virtual teams in various networked contexts.
Sinha, Sumedha P; Goodsitt, Mitchell M; Roubidoux, Marilyn A; Booi, Rebecca C; LeCarpentier, Gerald L; Lashbrook, Christine R; Thomenius, Kai E; Chalek, Carl L; Carson, Paul L
2007-05-01
We are developing an automated ultrasound imaging-mammography system wherein a digital mammography unit has been augmented with a motorized ultrasound transducer carriage above a special compression paddle. Challenges of this system are acquiring complete coverage of the breast and minimizing motion. We assessed these problems and investigated methods to increase coverage and stabilize the compressed breast. Visual tracings of the breast-to-paddle contact area and breast periphery were made for 10 patients to estimate coverage area. Various motion artifacts were evaluated in 6 patients. Nine materials were tested for coupling the paddle to the breast. Fourteen substances were tested for coupling the transducer to the paddle in lateral-to-medial and medial-to-lateral views and filling the gap between the peripheral breast and paddle. In-house image registration software was used to register adjacent ultrasound sweeps. The average breast contact area was 56%. The average percentage of the peripheral air gap filled with ultrasound gel was 61%. Shallow patient breathing proved equivalent to breath holding, whereas speech and sudden breathing caused unacceptable artifacts. An adhesive spray that preserves image quality was found to be best for coupling the breast to the paddle and minimizing motion. A highly viscous ultrasound gel proved most effective for coupling the transducer to the paddle for lateral-to-medial and medial-to-lateral views and for edge fill-in. The challenges of automated ultrasound scanning in a multimodality breast imaging system have been addressed by developing methods to fill in peripheral gaps, minimize patient motion, and register and reconstruct multisweep ultrasound image volumes.
NASA Technical Reports Server (NTRS)
King, Ellis; Hart, Jeremy; Odegard, Ryan
2010-01-01
The Orion Crew Exploration Vehicle (CEV) is being designed to include significantly more automation capability than either the Space Shuttle or the International Space Station (ISS). In particular, the vehicle flight software has requirements to accommodate increasingly automated missions throughout all phases of flight. A data-driven flight software architecture will provide an evolvable automation capability to sequence through Guidance, Navigation & Control (GN&C) flight software modes and configurations while maintaining the required flexibility and human control over the automation. This flexibility is a key aspect needed to address the maturation of operational concepts, to permit ground and crew operators to gain trust in the system, and to mitigate unpredictability in human spaceflight. To allow for mission flexibility and reconfigurability, a data-driven approach is taken to load the mission event plan as well as the flight software artifacts associated with the GN&C subsystem. A database of GN&C-level sequencing data is presented which manages and tracks the mission-specific and algorithm parameters to provide a capability to schedule GN&C events within mission segments. The flight software data schema for performing automated mission sequencing is presented with a concept of operations for interactions with ground and onboard crew members. A prototype architecture for fault identification, isolation, and recovery interactions with the automation software is presented and discussed as a forward work item.
NASA Astrophysics Data System (ADS)
Allman, Derek; Reiter, Austin; Bell, Muyinatu
2018-02-01
We previously proposed a method of removing reflection artifacts in photoacoustic images that uses deep learning. Our approach generally relies on using simulated photoacoustic channel data to train a convolutional neural network (CNN) that is capable of distinguishing sources from artifacts based on unique differences in their spatial impulse responses (manifested as depth-based differences in wavefront shapes). In this paper, we directly compare a CNN trained with our previous continuous transducer model to a CNN trained with an updated discrete acoustic receiver model that more closely matches an experimental ultrasound transducer. These two CNNs were trained with simulated data and tested on experimental data. The CNN trained using the continuous receiver model correctly classified 100% of sources and 70.3% of artifacts in the experimental data. In contrast, the CNN trained using the discrete receiver model correctly classified 100% of sources and 89.7% of artifacts in the experimental images. The 19.4% increase in artifact classification accuracy indicates that an acoustic receiver model that closely mimics the experimental transducer plays an important role in improving the classification of artifacts in experimental photoacoustic data. These results are promising for developing a method to display CNN-based images in which artifacts are removed, in addition to displaying only the network-identified sources as previously proposed.
Reduction of metal artifacts in x-ray CT images using a convolutional neural network
NASA Astrophysics Data System (ADS)
Zhang, Yanbo; Chu, Ying; Yu, Hengyong
2017-09-01
Patients often have various metallic implants (e.g. dental fillings, prostheses), which cause severe artifacts in x-ray CT images. Although a large number of metal artifact reduction (MAR) methods have been proposed in the past four decades, MAR is still one of the major problems in clinical x-ray CT. In this work, we develop a convolutional neural network (CNN) based MAR framework, which combines information from the original and corrected images to suppress artifacts. Before the MAR stage, we generate a group of data and train a CNN. First, we numerically simulate various metal artifact cases and build a dataset that includes metal-free images (used as references), metal-inserted images and images corrected by various MAR methods. Then, ten thousand patches are extracted from the dataset to train the metal artifact reduction CNN. In the MAR stage, the original image and two corrected images are stacked as a three-channel input image for the CNN, and a CNN image with fewer artifacts is generated. The water-equivalent regions in the CNN image are set to a uniform value to yield a CNN prior, whose forward projections are used to replace the metal-affected projections, followed by FBP reconstruction. Experimental results demonstrate the superior metal artifact reduction capability of the proposed method over its competitors.
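As a rough illustration of the prior-based projection replacement described above, the following Python sketch assumes a trained cnn callable and hypothetical forward_project and fbp_reconstruct operators (none of which come from the paper); the water-equivalent HU window is likewise an assumption.

import numpy as np

def mar_with_cnn_prior(original_img, corrected_img1, corrected_img2,
                       sinogram, metal_trace, cnn, forward_project,
                       fbp_reconstruct, water_value=0.0):
    # Stack the original and two pre-corrected images as a three-channel input.
    x = np.stack([original_img, corrected_img1, corrected_img2], axis=0)
    cnn_img = cnn(x)  # CNN output with reduced artifacts

    # Build the CNN prior: force water-equivalent voxels to a uniform value.
    prior = cnn_img.copy()
    water_mask = np.abs(prior - water_value) < 100.0  # HU window (assumed)
    prior[water_mask] = water_value

    # Replace metal-affected projections with forward projections of the prior.
    prior_sino = forward_project(prior)
    completed = np.where(metal_trace, prior_sino, sinogram)

    # Final FBP reconstruction from the completed sinogram.
    return fbp_reconstruct(completed)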
Digital data collection in paleoanthropology.
Reed, Denné; Barr, W Andrew; Mcpherron, Shannon P; Bobe, René; Geraads, Denis; Wynn, Jonathan G; Alemseged, Zeresenay
2015-01-01
Understanding patterns of human evolution across space and time requires synthesizing data collected by independent research teams, and this effort is part of a larger trend to develop cyber infrastructure and e-science initiatives. At present, paleoanthropology cannot easily answer basic questions about the total number of fossils and artifacts that have been discovered, or exactly how those items were collected. In this paper, we examine the methodological challenges to data integration, with the hope that mitigating the technical obstacles will further promote data sharing. At a minimum, data integration efforts must document what data exist and how the data were collected (discovery), after which we can begin standardizing data collection practices with the aim of achieving combined analyses (synthesis). This paper outlines a digital data collection system for paleoanthropology. We review the relevant data management principles for a general audience and supplement this with technical details drawn from over 15 years of paleontological and archeological field experience in Africa and Europe. The system outlined here emphasizes free open-source software (FOSS) solutions that work on multiple computer platforms; it builds on recent advances in open-source geospatial software and mobile computing. © 2015 Wiley Periodicals, Inc.
Analysis of Phoenix Anomalies and IV & V Findings Applied to the GRAIL Mission
NASA Technical Reports Server (NTRS)
Larson, Steve
2012-01-01
NASA IV&V was established in 1993 to improve safety and cost-effectiveness of mission critical software. Since its inception the tools and strategies employed by IV&V have evolved. This paper examines how lessons learned from the Phoenix project were developed and applied to the GRAIL project. Shortly after selection, the GRAIL project initiated a review of the issues documented by IV&V for Phoenix. The motivation was twofold: to learn as much as possible about the types of issues that arose from the flight software product line slated for use on GRAIL, and to identify opportunities for improving the effectiveness of IV&V on GRAIL. The IV&V Facility provided a database dump containing 893 issues. These were categorized into 16 bins, and then analyzed according to whether the project responded by changing the affected artifacts or using them as-is. The results of this analysis were compared to a similar assessment of post-launch anomalies documented by the project. Results of the analysis were discussed with the IV&V team assigned to GRAIL. These discussions led to changes in the way both the project and IV&V approached the IV&V task, and improved the efficiency of the activity.
CrossTalk. The Journal of Defense Software Engineering. Volume 14, Number 5, May 2001
2001-05-01
Biobeam—Multiplexed wave-optical simulations of light-sheet microscopy
Weigert, Martin; Bundschuh, Sebastian T.
2018-01-01
Sample-induced image degradation remains an intricate wave-optical problem in light-sheet microscopy. Here we present biobeam, an open-source software package that enables simulation of operational light-sheet microscopes by combining data from 10^5-10^6 multiplexed and GPU-accelerated point-spread-function calculations. The wave-optical nature of these simulations leads to the faithful reproduction of spatially varying aberrations, diffraction artifacts, geometric image distortions, adaptive optics, and emergent wave-optical phenomena, and renders image-formation in light-sheet microscopy computationally tractable. PMID:29652879
Approaches to reducing photon dose calculation errors near metal implants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Jessie Y.; Followill, David S.; Howell, Reb
Purpose: Dose calculation errors near metal implants are caused by limitations of the dose calculation algorithm in modeling tissue/metal interface effects as well as density assignment errors caused by imaging artifacts. The purpose of this study was to investigate two strategies for reducing dose calculation errors near metal implants: implementation of metal-based energy deposition kernels in the convolution/superposition (C/S) dose calculation method and use of metal artifact reduction methods for computed tomography (CT) imaging. Methods: Both error reduction strategies were investigated using a simple geometric slab phantom with a rectangular metal insert (composed of titanium or Cerrobend), as well as two anthropomorphic phantoms (one with spinal hardware and one with dental fillings), designed to mimic relevant clinical scenarios. To assess the dosimetric impact of metal kernels, the authors implemented titanium and silver kernels in a commercial collapsed cone C/S algorithm. To assess the impact of CT metal artifact reduction methods, the authors performed dose calculations using baseline imaging techniques (uncorrected 120 kVp imaging) and three commercial metal artifact reduction methods: Philips Healthcare’s O-MAR, GE Healthcare’s monochromatic gemstone spectral imaging (GSI) using dual-energy CT, and GSI with metal artifact reduction software (MARS) applied. For the simple geometric phantom, radiochromic film was used to measure dose upstream and downstream of metal inserts. For the anthropomorphic phantoms, ion chambers and radiochromic film were used to quantify the benefit of the error reduction strategies. Results: Metal kernels did not universally improve accuracy but rather resulted in better accuracy upstream of metal implants and decreased accuracy directly downstream. For the clinical cases (spinal hardware and dental fillings), metal kernels had very little impact on the dose calculation accuracy (<1.0%). Of the commercial CT artifact reduction methods investigated, the authors found that O-MAR was the most consistent method, resulting in either improved dose calculation accuracy (dental case) or little impact on calculation accuracy (spine case). GSI was unsuccessful at reducing the severe artifacts caused by dental fillings and had very little impact on calculation accuracy. GSI with MARS on the other hand gave mixed results, sometimes introducing metal distortion and increasing calculation errors (titanium rectangular implant and titanium spinal hardware) but other times very successfully reducing artifacts (Cerrobend rectangular implant and dental fillings). Conclusions: Though successful at improving dose calculation accuracy upstream of metal implants, metal kernels were not found to substantially improve accuracy for clinical cases. Of the commercial artifact reduction methods investigated, O-MAR was found to be the most consistent candidate for all-purpose CT simulation imaging. The MARS algorithm for GSI should be used with caution for titanium implants, larger implants, and implants located near heterogeneities as it can distort the size and shape of implants and increase calculation errors.
An extension to artifact-free projection overlaps
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Jianyu, E-mail: jianyulin@hotmail.com
2015-05-15
Purpose: In multipinhole single photon emission computed tomography, the overlapping of projections has been used to increase sensitivity. Avoiding artifacts in the reconstructed image associated with projection overlaps (multiplexing) is a critical issue. In our previous report, two types of artifact-free projection overlaps, i.e., projection overlaps that do not lead to artifacts in the reconstructed image, were formally defined and proved, and were validated via simulations. In this work, a new proposition is introduced to extend the previously defined type-II artifact-free projection overlaps so that a broader range of artifact-free overlaps is accommodated. One practical purpose of the new extension is to design a baffle window multipinhole system with artifact-free projection overlaps. Methods: First, the extended type-II artifact-free overlap was theoretically defined and proved. The new proposition accommodates the situation where the extended type-II artifact-free projection overlaps can be produced with incorrectly reconstructed portions in the reconstructed image. Next, to validate the theory, the extended type-II artifact-free overlaps were employed in designing the multiplexing multipinhole spiral orbit imaging systems with a baffle window. Numerical validations were performed via simulations, where the corresponding 1-pinhole nonmultiplexing reconstruction results were used as the benchmark for artifact-free reconstructions. The mean square error (MSE) was the metric used for comparisons of noise-free reconstructed images. Noisy reconstructions were also performed as part of the validations. Results: Simulation results show that for noise-free reconstructions, the MSEs of the reconstructed images of the artifact-free multiplexing systems are very similar to those of the corresponding 1-pinhole systems. No artifacts were observed in the reconstructed images. Therefore, the testing results for artifact-free multiplexing systems designed using the extended type-II artifact-free overlaps numerically validated the developed theory. Conclusions: First, the extension itself is of theoretical importance because it broadens the selection range for optimizing multiplexing multipinhole designs. Second, the extension has an immediate application: using a baffle window to design a special spiral orbit multipinhole imaging system with projection overlaps in the orbit axial direction. Such an artifact-free baffle window design makes it possible for us to image any axial portion of interest of a long object with projection overlaps to increase sensitivity.
Yildiz, Yesna O; Eckersley, Robert J; Senior, Roxy; Lim, Adrian K P; Cosgrove, David; Tang, Meng-Xing
2015-07-01
Non-linear propagation of ultrasound creates artifacts in contrast-enhanced ultrasound images that significantly affect both qualitative and quantitative assessments of tissue perfusion. This article describes the development and evaluation of a new algorithm to correct for this artifact. The correction is a post-processing method that estimates and removes non-linear artifact in the contrast-specific image using the simultaneously acquired B-mode image data. The method is evaluated on carotid artery flow phantoms with large and small vessels containing microbubbles of various concentrations at different acoustic pressures. The algorithm significantly reduces non-linear artifacts while maintaining the contrast signal from bubbles to increase the contrast-to-tissue ratio by up to 11 dB. Contrast signal from a small vessel 600 μm in diameter buried in tissue artifacts before correction was recovered after the correction. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Di Tuccio, Maria Concetta; De Grandi, Sandro; Vivarelli, Arianna; Becherini, Francesca; Pockelé, Luc; Bernardi, Adriana
2015-04-01
Preventive conservation of works of art (paintings, sculptures, etc.) requires careful monitoring of the environment around these artifacts as well as of their surface temperature. The latter is the only physical variable that can be measured non-invasively while directly following the thermal conditions and variations of the artwork driven by the dynamics of the microclimate. Because works of art often cannot be touched, automated and accurate remote sensing can be very useful for preventing dangerous deterioration processes. For these reasons a new sensor has been developed by a spin-off of ISAC-CNR. This sensor checks the surface temperature of the artifacts in real time, both over time and at different predefined points. The automated sensor is a radiometer sensitive to wavelengths ranging from 7.5 µm to 13.4 µm. A pan-and-tilt rotation system allows multiple measurements on a grid of points previously defined on the surface of the artwork. The accuracy, obtained through a careful calibration process, is 0.5 °C, more precise than typical remote sensing instruments (thermal cameras and commercial radiometers), which are characterized by an accuracy of about 2 °C. To obtain accurate surface temperature measurements for a real body, the correct emissivity values must be included in the calculation. Hence, an easy-to-use management software has been developed that allows the emissivity value to be set at each point of the grid. For reachable points on the surface, the exact emissivity value can be determined by comparing the measurements recorded by the new infrared sensor with those obtained by a very sensitive contact sensor (0.02-0.03 °C) placed manually on the surface for a short time. For works of art located far from the sensor, the emissivity values must be determined beforehand. Emissivity depends on many variables, one of which is surface roughness. Since artifacts are often characterized by high surface roughness, this dependence was studied in order to obtain accurate temperature measurements. The results indicate that emissivity increases with increasing surface roughness. In conclusion, this study has led to a reliable, accurate, and automatic monitoring system, as well as a low-cost sensor that, unlike a thermal camera, can also be used by less experienced operators. In addition, to support museum managers in the preventive conservation of artifacts, an alarm is automatically triggered when dangerously large thermal variations on the surface are detected.
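The emissivity correction mentioned above can be illustrated with a minimal Python sketch; the graybody model with a reflected-ambient term and the total-radiation (Stefan-Boltzmann) approximation of the 7.5-13.4 µm band are assumptions for illustration, not the management software described in the abstract.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def surface_temperature(t_brightness_K, emissivity, t_ambient_K):
    # Measured radiance = emitted + reflected-ambient components (graybody assumption).
    measured = SIGMA * t_brightness_K ** 4
    emitted = measured - (1.0 - emissivity) * SIGMA * t_ambient_K ** 4
    return (emitted / (emissivity * SIGMA)) ** 0.25

# Example: one grid point, with the emissivity set per point as described above.
print(surface_temperature(293.15, emissivity=0.92, t_ambient_K=292.15))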
Jamzad, Amoon; Setarehdan, Seyed Kamaledin
2014-04-01
The twinkling artifact is an undesired phenomenon within color Doppler sonograms that usually appears at the site of internal calcifications. Since the appearance of the twinkling artifact is correlated with the roughness of the calculi, noninvasive roughness estimation of the internal stones may be considered as a potential twinkling artifact application. This article proposes a novel quantitative approach for measurement and analysis of twinkling artifact data for roughness estimation. A phantom was developed with 7 quantified levels of roughness. The Doppler system was initially calibrated by the proposed procedure to facilitate the analysis. A total of 1050 twinkling artifact images were acquired from the phantom, and 32 novel numerical measures were introduced and computed for each image. The measures were then ranked on the basis of roughness quantification ability using different methods. The performance of the proposed twinkling artifact-based surface roughness quantification method was finally investigated for different combinations of features and classifiers. Eleven features were shown to be the most efficient numerical twinkling artifact measures in roughness characterization. The linear classifier outperformed other methods for twinkling artifact classification. The pixel count measures produced better results among the other categories. The sequential selection method showed higher accuracy than other individual rankings. The best roughness recognition average accuracy of 98.33% was obtained by the first 5 principal components and the linear classifier. The proposed twinkling artifact analysis method could recognize the phantom surface roughness with average accuracy of 98.33%. This method may also be applicable for noninvasive calculi characterization in treatment management.
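A hedged Python sketch of the feature-plus-linear-classifier evaluation reported above (the first five principal components feeding a linear discriminant, with cross-validation); the feature files and preprocessing are placeholders, not the authors' pipeline.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: (n_images, 32) matrix of numerical twinkling-artifact measures (placeholder file),
# y: roughness level labels 0..6 for the 7 phantom surfaces (placeholder file).
X = np.load("ta_features.npy")   # hypothetical file
y = np.load("ta_labels.npy")     # hypothetical file

# First 5 principal components followed by a linear classifier, mirroring the
# best-performing combination reported in the abstract.
clf = make_pipeline(StandardScaler(), PCA(n_components=5),
                    LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=10)
print(f"mean cross-validated accuracy: {scores.mean():.3f}")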
Analyzing Cultural Artifacts for the Introduction, Perpetuation, or Reinforcement of Moral Ideals
ERIC Educational Resources Information Center
Williams, Jennifer
2013-01-01
The development and socialization of morals is a complex concept for students studying ethics. To help students understand the role social learning theory plays in the development of morality, an activity was created focusing on cultural artifacts and their introduction, perpetuation, and/or reinforcement of morality. The aim of this assignment is…
Giraudo, Chiara; Motyka, Stanislav; Weber, Michael; Resinger, Christoph; Thorsten, Feiweier; Traxler, Hannes; Trattnig, Siegfried; Bogner, Wolfgang
2017-08-01
The aim of this study was to investigate the origin of random image artifacts in stimulated echo acquisition mode diffusion tensor imaging (STEAM-DTI), assess the role of averaging, develop an automated artifact postprocessing correction method using weighted mean of signal intensities (WMSIs), and compare it with other correction techniques. Institutional review board approval and written informed consent were obtained. The right calf and thigh of 10 volunteers were scanned on a 3 T magnetic resonance imaging scanner using a STEAM-DTI sequence. Artifacts (ie, signal loss) in STEAM-based DTI, presumably caused by involuntary muscle contractions, were investigated in volunteers and ex vivo (ie, human cadaver calf and turkey leg using the same DTI parameters as for the volunteers). An automated postprocessing artifact correction method based on the WMSI was developed and compared with previous approaches (ie, iteratively reweighted linear least squares and informed robust estimation of tensors by outlier rejection [iRESTORE]). Diffusion tensor imaging and fiber tracking metrics, using different averages and artifact corrections, were compared for region of interest- and mask-based analyses. One-way repeated measures analysis of variance with Greenhouse-Geisser correction and Bonferroni post hoc tests were used to evaluate differences among all tested conditions. Qualitative assessment (ie, image quality) for native and corrected images was performed using the paired t test. Randomly localized and shaped artifacts affected all volunteer data sets. Artifact burden during voluntary muscle contractions increased on average from 23.1% to 77.5%, whereas artifacts were absent ex vivo. Diffusion tensor imaging metrics (mean diffusivity, fractional anisotropy, radial diffusivity, and axial diffusivity) showed heterogeneous behavior, but remained within the range reported in the literature. Fiber track metrics (number, length, and volume) significantly improved in both calves and thighs after artifact correction in region of interest- and mask-based analyses (P < 0.05 each). Iteratively reweighted linear least squares and iRESTORE showed equivalent results, but WMSI was faster than iRESTORE. Muscle delineation and artifact load significantly improved after correction (P < 0.05 each). Weighted mean of signal intensity correction significantly improved STEAM-based quantitative DTI analyses and fiber tracking of lower-limb muscles, providing a robust tool for musculoskeletal applications.
A Robust Post-Processing Workflow for Datasets with Motion Artifacts in Diffusion Kurtosis Imaging
Li, Xianjun; Yang, Jian; Gao, Jie; Luo, Xue; Zhou, Zhenyu; Hu, Yajie; Wu, Ed X.; Wan, Mingxi
2014-01-01
Purpose: The aim of this study was to develop a robust post-processing workflow for motion-corrupted datasets in diffusion kurtosis imaging (DKI). Materials and methods: The proposed workflow consisted of brain extraction, rigid registration, distortion correction, artifacts rejection, spatial smoothing and tensor estimation. Rigid registration was utilized to correct misalignments. Motion artifacts were rejected by using local Pearson correlation coefficient (LPCC). The performance of LPCC in characterizing relative differences between artifacts and artifact-free images was compared with that of the conventional correlation coefficient in 10 randomly selected DKI datasets. The influence of rejected artifacts with information of gradient directions and b values for the parameter estimation was investigated by using mean square error (MSE). The variance of noise was used as the criterion for MSEs. The clinical practicality of the proposed workflow was evaluated by the image quality and measurements in regions of interest on 36 DKI datasets, including 18 artifact-free (18 pediatric subjects) and 18 motion-corrupted datasets (15 pediatric subjects and 3 essential tremor patients). Results: The relative difference between artifacts and artifact-free images calculated by LPCC was larger than that of the conventional correlation coefficient (p<0.05). It indicated that LPCC was more sensitive in detecting motion artifacts. MSEs of all derived parameters from the reserved data after the artifacts rejection were smaller than the variance of the noise. It suggested that influence of rejected artifacts was less than influence of noise on the precision of derived parameters. The proposed workflow improved the image quality and reduced the measurement biases significantly on motion-corrupted datasets (p<0.05). Conclusion: The proposed post-processing workflow was reliable to improve the image quality and the measurement precision of the derived parameters on motion-corrupted DKI datasets. The workflow provided an effective post-processing method for clinical applications of DKI in subjects with involuntary movements. PMID:24727862
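A minimal Python sketch of the LPCC-based rejection step, assuming a block-wise local Pearson correlation against an artifact-free reference slice; the window size, threshold, and the exact LPCC definition used by the authors are assumptions.

import numpy as np

def local_pearson_cc(vol, ref, win=9):
    """Mean Pearson correlation between two slices computed over local windows."""
    coeffs = []
    for i in range(0, vol.shape[0] - win, win):
        for j in range(0, vol.shape[1] - win, win):
            a = vol[i:i + win, j:j + win].ravel()
            b = ref[i:i + win, j:j + win].ravel()
            if a.std() > 0 and b.std() > 0:
                coeffs.append(np.corrcoef(a, b)[0, 1])
    return float(np.mean(coeffs))

def reject_artifact_volumes(dwis, reference, threshold=0.6):
    """Keep diffusion-weighted slices whose LPCC to the reference is high enough."""
    return [k for k, vol in enumerate(dwis)
            if local_pearson_cc(vol, reference) >= threshold]

# dwis: list of 2-D slices across gradient directions for one slice location;
# reference: an artifact-free slice (e.g., the b=0 image) -- both placeholders.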
Huang, Chih-Sheng; Yang, Wen-Yu; Chuang, Chun-Hsiang; Wang, Yu-Kai
2018-01-01
Electroencephalogram (EEG) signals are usually contaminated with various artifacts, such as signal associated with muscle activity, eye movement, and body motion, which have a noncerebral origin. The amplitude of such artifacts is larger than that of the electrical activity of the brain, so they mask the cortical signals of interest, resulting in biased analysis and interpretation. Several blind source separation methods have been developed to remove artifacts from the EEG recordings. However, the iterative process for measuring separation within multichannel recordings is computationally intractable. Moreover, manually excluding the artifact components requires a time-consuming offline process. This work proposes a real-time artifact removal algorithm that is based on canonical correlation analysis (CCA), feature extraction, and the Gaussian mixture model (GMM) to improve the quality of EEG signals. The CCA was used to decompose EEG signals into components followed by feature extraction to extract representative features and GMM to cluster these features into groups to recognize and remove artifacts. The feasibility of the proposed algorithm was demonstrated by effectively removing artifacts caused by blinks, head/body movement, and chewing from EEG recordings while preserving the temporal and spectral characteristics of the signals that are important to cognitive research. PMID:29599950
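A sketch of the CCA-decomposition, feature-extraction and GMM-clustering pipeline is given below, under several assumptions: CCA is computed between the signal and a one-sample-delayed copy, the per-component features are limited to lag-1 autocorrelation and log-variance, and the cluster with the lower autocorrelation is treated as artifactual. None of these specifics are claimed to match the published algorithm.

import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.mixture import GaussianMixture

def cca_gmm_artifact_removal(eeg, n_clusters=2):
    """eeg: (samples, channels). Returns a cleaned copy of the channel data."""
    X, Y = eeg[1:], eeg[:-1]                    # signal and one-sample delayed copy
    n = eeg.shape[1]
    cca = CCA(n_components=n, scale=False).fit(X, Y)
    W = cca.x_rotations_                        # (channels, components)
    S = (X - X.mean(0)) @ W                     # component time courses

    # Simple per-component features (assumed): lag-1 autocorrelation and log-variance.
    ac1 = np.array([np.corrcoef(s[:-1], s[1:])[0, 1] for s in S.T])
    feats = np.column_stack([ac1, np.log(S.var(0) + 1e-12)])

    labels = GaussianMixture(n_clusters, random_state=0).fit_predict(feats)
    # Assume the cluster with the lower mean autocorrelation is artifactual.
    artifact_cluster = np.argmin([ac1[labels == k].mean() for k in range(n_clusters)])
    S[:, labels == artifact_cluster] = 0.0      # remove artifact components

    return S @ np.linalg.pinv(W) + X.mean(0)    # back-project to channel space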
Mannan, Malik M. Naeem; Kim, Shinjung; Jeong, Myung Yung; Kamran, M. Ahmad
2016-01-01
Contamination of eye movement and blink artifacts in electroencephalogram (EEG) recordings makes the analysis of EEG data more difficult and could result in misleading findings. Efficient removal of these artifacts from EEG data is an essential step in improving classification accuracy to develop the brain-computer interface (BCI). In this paper, we propose an automatic framework based on independent component analysis (ICA) and system identification to identify and remove ocular artifacts from EEG data using a hybrid EEG and eye tracker system. The performance of the proposed algorithm is illustrated using experimental and standard EEG datasets. The proposed algorithm not only removes the ocular artifacts from the artifactual zone but also preserves the neuronal-activity-related EEG signals in the non-artifactual zone. The comparison with two state-of-the-art techniques, namely ADJUST-based ICA and REGICA, reveals the significantly improved performance of the proposed algorithm for removing eye movement and blink artifacts from EEG data. Additionally, results demonstrate that the proposed algorithm can achieve lower relative error and higher mutual information values between corrected EEG and artifact-free EEG data. PMID:26907276
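The sketch below illustrates the general ICA-plus-reference idea in Python; flagging independent components by their correlation with an EOG/eye-tracker trace stands in for the paper's system-identification step and is an assumption, as is the correlation threshold.

import numpy as np
from sklearn.decomposition import FastICA

def remove_ocular_ics(eeg, eye_ref, corr_thresh=0.7):
    """eeg: (samples, channels); eye_ref: (samples,) eye-tracker/EOG trace."""
    ica = FastICA(n_components=eeg.shape[1], whiten="unit-variance", random_state=0)
    sources = ica.fit_transform(eeg)            # (samples, components)

    # Flag components strongly correlated with the ocular reference (assumed rule).
    r = np.array([abs(np.corrcoef(s, eye_ref)[0, 1]) for s in sources.T])
    sources[:, r > corr_thresh] = 0.0

    # Back-project the remaining components to the channel space.
    return sources @ ica.mixing_.T + ica.mean_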
Assessing Requirements Quality through Requirements Coverage
NASA Technical Reports Server (NTRS)
Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt
2008-01-01
In model-based development, the development effort is centered around a formal description of the proposed software system -- the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem -- determining that the model accurately captures the customer's high-level requirements -- has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
ER2OWL: Generating OWL Ontology from ER Diagram
NASA Astrophysics Data System (ADS)
Fahad, Muhammad
Ontology is a fundamental part of the Semantic Web. The goal of the W3C is to bring the Web to its full potential as a Semantic Web while reusing previous systems and artifacts. Most legacy systems have been documented with structured analysis and structured design (SASD), especially with simple or Extended ER Diagrams (ERD). Such systems need upgrading to become part of the Semantic Web. In this paper, we present ERD to OWL-DL ontology transformation rules at a concrete level. These rules facilitate an easy and understandable transformation from ERD to OWL. The set of transformation rules is tested on a structured analysis and design example. The framework provides OWL ontologies as a Semantic Web foundation and helps software engineers upgrade the structured analysis and design artifact, the ERD, into components of the Semantic Web. Moreover, our transformation tool, ER2OWL, reduces the cost and time of building OWL ontologies through the reuse of existing entity relationship models.
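As a hedged illustration of the kind of rule the paper describes (entity to owl:Class, attribute to owl:DatatypeProperty), the following rdflib snippet maps one ER entity; the namespace, datatype choices and the single rule shown are assumptions and fall well short of the full ER2OWL mapping.

from rdflib import Graph, Namespace, RDF, RDFS
from rdflib.namespace import OWL, XSD

EX = Namespace("http://example.org/erd#")   # assumed base namespace

def entity_to_class(g, entity, attributes):
    """Map an ER entity to an owl:Class and its attributes to datatype properties."""
    cls = EX[entity]
    g.add((cls, RDF.type, OWL.Class))
    for attr in attributes:
        prop = EX[f"{entity}_{attr}"]
        g.add((prop, RDF.type, OWL.DatatypeProperty))
        g.add((prop, RDFS.domain, cls))
        g.add((prop, RDFS.range, XSD.string))  # string range assumed for illustration
    return cls

g = Graph()
g.bind("owl", OWL)
entity_to_class(g, "Student", ["name", "matricNo"])
print(g.serialize(format="turtle"))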
Automating Traceability for Generated Software Artifacts
NASA Technical Reports Server (NTRS)
Richardson, Julian; Green, Jeffrey
2004-01-01
Program synthesis automatically derives programs from specifications of their behavior. One advantage of program synthesis, as opposed to manual coding, is that there is a direct link between the specification and the derived program. This link is, however, not very fine-grained: it can be best characterized as Program is-derived-from Specification. When the generated program needs to be understood or modified, more fine-grained linking is useful. In this paper, we present a novel technique for automatically deriving traceability relations between parts of a specification and parts of the synthesized program. The technique is very lightweight and works -- with varying degrees of success -- for any process in which one artifact is automatically derived from another. We illustrate the generality of the technique by applying it to two kinds of automatic generation: synthesis of Kalman Filter programs from specifications using the AutoFilter program synthesis system, and generation of assembly language programs from C source code using the GCC C compiler. We evaluate the effectiveness of the technique in the latter application.
V&V of Fault Management: Challenges and Successes
NASA Technical Reports Server (NTRS)
Fesq, Lorraine M.; Costello, Ken; Ohi, Don; Lu, Tiffany; Newhouse, Marilyn
2013-01-01
This paper describes the results of a special breakout session of the NASA Independent Verification and Validation (IV&V) Workshop held in the fall of 2012 entitled "V&V of Fault Management: Challenges and Successes." The NASA IV&V Program is in a unique position to interact with projects across all of the NASA development domains. Using this unique opportunity, the IV&V program convened a breakout session to enable IV&V teams to share their challenges and successes with respect to the V&V of Fault Management (FM) architectures and software. The presentations and discussions provided practical examples of pitfalls encountered while performing V&V of FM including the lack of consistent designs for implementing fault monitors and the fact that FM information is not centralized but scattered among many diverse project artifacts. The discussions also solidified the need for an early commitment to developing FM in parallel with the spacecraft systems as well as clearly defining FM terminology within a project.
Real-time pulse oximetry artifact annotation on computerized anaesthetic records.
Gostt, Richard Karl; Rathbone, Graeme Dennis; Tucker, Adam Paul
2002-01-01
Adoption of computerised anaesthesia record keeping systems has been limited by the concern that they record artifactual data and accurate data indiscriminately. Data resulting from artifacts does not reflect the patient's true condition and presents a problem in later analysis of the record, with associated medico-legal implications. This study developed an algorithm to automatically annotate pulse oximetry artifacts and sought to evaluate the algorithm's accuracy in routine surgical procedures. MacAnaesthetist is a semi-automatic anaesthetic record keeping system developed for the Apple Macintosh computer, which incorporated an algorithm designed to automatically detect pulse oximetry artifacts. The algorithm labeled artifactual oxygen saturation values < 90%. This was done in real-time by analyzing physiological data captured from a Datex AS/3 Anaesthesia Monitor. An observational study was conducted to evaluate the accuracy of the algorithm during routine surgical procedures (n = 20). An anaesthetic record was made by an anaesthetist using the Datex AS/3 record keeper, while a second anaesthetic record was produced in parallel using MacAnaesthetist. A copy of the Datex AS/3 record was kept for later review by a group of anaesthetists (n = 20), who judged oxygen saturation values < 90% to be either genuine or artifact. MacAnaesthetist correctly labeled 12 out of 13 oxygen saturations < 90% (92.3% accuracy). A post-operative review of the Datex AS/3 anaesthetic records (n = 8) by twenty anaesthetists resulted in 127 correct responses out of a total of 200 (63.5% accuracy). The remaining Datex AS/3 records (n = 12) were not reviewed, as they did not contain any oxygen saturations < 90%. The real-time artifact detection algorithm developed in this study was more accurate than anaesthetists who post-operatively reviewed records produced by an existing computerised anaesthesia record keeping system. Algorithms have the potential to more accurately identify and annotate artifacts on computerised anaesthetic records, assisting clinicians to more correctly interpret abnormal data.
Manufacture and calibration of optical supersmooth roughness artifacts for intercomparisons
NASA Astrophysics Data System (ADS)
Ringel, Gabriele A.; Kratz, Frank; Schmitt, Dirk-Roger; Mangelsdorf, Juergen; Creuzet, Francois; Garratt, John D.
1995-09-01
Intercomparison roughness measurements have been carried out on supersmooth artifacts fabricated from BK7, fused silica, and Zerodur. The surface parameters were determined using the optical heterodyne profiler Z5500 (Zygo), a special prototype of the mechanical profiler Nanostep (Rank Taylor Hobson), and an Atomic Force Microscope (Park Scientific Instruments) with an improved acquisition technique. The intercomparison was performed after the range of collected spatial wavelengths for each instrument was adjusted using digital filtering techniques. It is demonstrated for different roughness ranges that the applied superpolishing techniques yield supersmooth artifacts which can be used for more intercomparisons. More than 100 samples were investigated. Criteria were developed to select artifacts from the sample stock.
Rivera-Rivera, Carlos J.; Montoya-Burgos, Juan I.
2016-01-01
Phylogenetic inference artifacts can occur when sequence evolution deviates from assumptions made by the models used to analyze them. The combination of strong model assumption violations and highly heterogeneous lineage evolutionary rates can become problematic in phylogenetic inference, and lead to the well-described long-branch attraction (LBA) artifact. Here, we define an objective criterion for assessing lineage evolutionary rate heterogeneity among predefined lineages: the result of a likelihood ratio test between a model in which the lineages evolve at the same rate (homogeneous model) and a model in which different lineage rates are allowed (heterogeneous model). We implement this criterion in the algorithm Locus Specific Sequence Subsampling (LS³), aimed at reducing the effects of LBA in multi-gene datasets. For each gene, LS³ sequentially removes the fastest-evolving taxon of the ingroup and tests for lineage rate homogeneity until all lineages have uniform evolutionary rates. The sequences excluded from the homogeneously evolving taxon subset are flagged as potentially problematic. The software implementation provides the user with the possibility to remove the flagged sequences for generating a new concatenated alignment. We tested LS³ with simulations and two real datasets containing LBA artifacts: a nucleotide dataset regarding the position of Glires within mammals and an amino-acid dataset concerning the position of nematodes within bilaterians. The initially incorrect phylogenies were corrected in all cases upon removing data flagged by LS³. PMID:26912812
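A schematic Python rendering of the LS³ loop is shown below; the model-fitting callables, the degrees of freedom of the likelihood ratio test, and the stopping floor are placeholders for an external phylogenetics package rather than the published implementation.

from scipy.stats import chi2

def ls3_filter(alignment, ingroup, fit_homogeneous, fit_heterogeneous,
               fastest_taxon, alpha=0.05, df=1, min_taxa=4):
    """Sequentially drop the fastest-evolving ingroup taxon until a likelihood
    ratio test no longer rejects lineage rate homogeneity. All model-fitting
    callables are hypothetical hooks into an external phylogenetics package."""
    kept = list(ingroup)
    flagged = []
    while len(kept) > min_taxa:
        lnL0 = fit_homogeneous(alignment, kept)     # equal lineage rates
        lnL1 = fit_heterogeneous(alignment, kept)   # free lineage rates
        lrt = 2.0 * (lnL1 - lnL0)
        if chi2.sf(lrt, df) >= alpha:               # homogeneity not rejected
            break
        worst = fastest_taxon(alignment, kept)      # fastest-evolving ingroup taxon
        kept.remove(worst)
        flagged.append(worst)                       # flagged as potentially problematic
    return kept, flagged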
Activity inference for Ambient Intelligence through handling artifacts in a healthcare environment.
Martínez-Pérez, Francisco E; González-Fraga, Jose Ángel; Cuevas-Tello, Juan C; Rodríguez, Marcela D
2012-01-01
Human activity inference is not a simple process due to distinct ways of performing it. Our proposal presents the SCAN framework for activity inference. SCAN is divided into three modules: (1) artifact recognition, (2) activity inference, and (3) activity representation, integrating three important elements of Ambient Intelligence (AmI) (artifact-behavior modeling, event interpretation and context extraction). The framework extends the roaming beat (RB) concept by obtaining the representation using three kinds of technologies for activity inference. The RB is based on both analysis and recognition of artifact behavior for activity inference. A practical case is shown in a nursing home where a system affording 91.35% effectiveness was implemented in situ. Three examples are shown using RB representation for activity representation. The framework description, the RB description, and the CALog system address distinct problems, such as the feasibility of implementing AmI systems, and demonstrate the feasibility of meeting the challenges related to activity recognition based on artifact recognition. We discuss how the use of RBs might positively impact the problems faced by designers and developers in recovering information more easily, so that they can develop tools focused on the user.
Morphologic analysis of artifacts in human fetal eyes confounding histopathologic investigations.
Herwig, Martina C; Müller, Annette M; Holz, Frank G; Loeffler, Karin U
2011-04-25
Human fetal eyes are an excellent source for studies of normal ocular development and for examining early ocular changes associated with various syndromes in the context of a pediatric pathologic or prenatal sonographic diagnosis. However, artifacts caused by different factors often render an exact interpretation difficult. In this study, the frequency and extent of artifacts in human fetal eyes were investigated with the aim of distinguishing these artifacts more precisely from real findings, also allowing for a more diligent forensic interpretation. The cohort included 341 fetal eyes, ranging in age from 8 to 38 weeks of gestation, that were investigated macroscopically and by light microscopy. In most specimens, artifacts such as pigment spillage and autolytic changes of the retina were noted. Nearly all specimens showed changes of the lens with remarkable similarities to cataractous lenses in adult eyes. Structural ocular changes associated with systemic syndromes were also observed and in most instances could be distinguished from artifacts. Morphologic changes in fetal eyes should be classified into artifacts caused by the mode of abortion, mechanical effects from removal of the eyes, delayed fixation with autolysis, and the fixative itself, and these should be distinguished from genuine structural abnormalities associated with ocular or systemic disease. This classification can be fairly difficult and requires experience. In addition, lens artifacts are often misleading, and the diagnosis of a fetal cataract should not be made based on histopathologic examination alone.
Effects of Filtering on Experimental Blast Overpressure Measurements.
Alphonse, Vanessa D; Kemper, Andrew R; Duma, Stefan M
2015-01-01
When access to live-fire test facilities is limited, experimental studies of blast-related injuries necessitate the use of a shock tube or Advanced Blast Simulator (ABS) to mimic free-field blast overpressure. However, modeling blast overpressure in a laboratory setting potentially introduces experimental artifacts in measured responses. Due to the high sampling rates required to capture a blast overpressure event, proximity to alternating current (AC) powered electronics and poorly strain-relieved or unshielded wires can result in artifacts in the recorded overpressure trace. Data in this study were collected for tests conducted on an empty ABS (Empty Tube) using high frequency pressure sensors specifically designed for blast loading rates (n=5). Additionally, intraocular overpressure (IOP) data were collected for porcine eyes potted inside synthetic orbits located inside the ABS using an unshielded miniature pressure sensor (n=3). All tests were conducted at a 30 psi static overpressure level. A 4th order phaseless low pass Butterworth software filter was applied to the data. Various cutoff frequencies were examined to determine if the raw shock wave parameter values could be preserved while eliminating noise and artifacts. A Fast Fourier Transform (FFT) was applied to each test to examine the frequency spectra of the raw and filtered signals. Shock wave parameters (time of arrival, peak overpressure, positive duration, and positive impulse) were quantified using a custom MATLAB® script. Lower cutoff frequencies attenuated the raw signal, effectively decreasing the peak overpressure and increasing the positive duration. Rise time was not preserved in the filtered data. A CFC 6000 filter preserved the remaining shock wave parameters within ±2.5% of the average raw values for the Empty Tube test data. A CFC 7000 filter removed experimental high-frequency artifacts and preserved the remaining shock wave parameters within ±2.5% of the average raw values for the IOP test data. Though the region of interest of the signals examined in the current study did not contain extremely high frequency content, it is possible that live-fire testing may produce shock waves with higher frequency content. While post-processing filtering can remove experimental artifacts, special care should be taken to minimize or eliminate the possibility of recording these artifacts in the first place.
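A minimal Python sketch of the filtering and shock-wave parameterization described above; the CFC-to-cutoff mapping (about 1.65 x CFC for a 4th-order phaseless Butterworth) and the arrival and positive-phase thresholds are assumptions, not taken from the paper.

import numpy as np
from scipy.signal import butter, filtfilt

def cfc_lowpass(signal, fs, cfc):
    """4th-order phaseless Butterworth low-pass; cutoff of ~1.65*CFC Hz (assumed)."""
    b, a = butter(2, 1.65 * cfc / (fs / 2), btype="low")  # 2nd order run forward+backward
    return filtfilt(b, a, signal)

def shock_parameters(t, p):
    """Time of arrival, peak overpressure, positive duration, positive impulse."""
    i0 = int(np.argmax(p > 0.05 * p.max()))      # arrival threshold: 5% of peak (assumed)
    ipk = int(np.argmax(p))
    after = np.where(p[ipk:] <= 0)[0]            # end of positive phase: first zero crossing
    iend = ipk + int(after[0]) if after.size else len(p) - 1
    impulse = np.trapz(p[i0:iend], t[i0:iend])
    return t[i0], p[ipk], t[iend] - t[i0], impulse

# Example usage at 1 MHz sampling with a filter roughly at CFC 6000, as in the abstract:
# p_filt = cfc_lowpass(p_raw, fs=1e6, cfc=6000)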
Physiological artifacts in scalp EEG and ear-EEG.
Kappel, Simon L; Looney, David; Mandic, Danilo P; Kidmose, Preben
2017-08-11
A problem inherent to recording EEG is the interference arising from noise and artifacts. While in a laboratory environment artifacts and interference can, to a large extent, be avoided or controlled, in real-life scenarios this is a challenge. Ear-EEG is a concept where EEG is acquired from electrodes in the ear. We present a characterization of physiological artifacts generated in a controlled environment for nine subjects. The influence of the artifacts was quantified in terms of the signal-to-noise ratio (SNR) deterioration of the auditory steady-state response. Alpha band modulation was also studied in an open/closed eyes paradigm. Artifacts related to jaw muscle contractions were present all over the scalp and in the ear, with the highest SNR deteriorations in the gamma band. The SNR deterioration for jaw artifacts was in general higher in the ear compared to the scalp. Whereas eye-blinking did not influence the SNR in the ear, it was significant for all groups of scalp electrodes in the delta and theta bands. Eye movements resulted in statistically significant SNR deterioration in frontal, temporal and ear electrodes. Recordings of alpha band modulation showed increased power and coherence of the EEG for ear and scalp electrodes in the closed-eyes periods. Ear-EEG is a method developed for unobtrusive and discreet recording over long periods of time and in real-life environments. This study investigated the influence of the most important types of physiological artifacts, and demonstrated that spontaneous activity, in terms of alpha band oscillations, could be recorded from the ear-EEG platform. In its present form ear-EEG was more prone to jaw related artifacts and less prone to eye-blinking artifacts compared to state-of-the-art scalp based systems.
New prospective 4D-CT for mitigating the effects of irregular respiratory motion
NASA Astrophysics Data System (ADS)
Pan, Tinsu; Martin, Rachael M.; Luo, Dershan
2017-08-01
Artifact caused by irregular respiration is a major source of error in 4D-CT imaging. We propose a new prospective 4D-CT to mitigate this source of error without new hardware, software or off-line data-processing on the GE CT scanner. We utilize the cine CT scan in the design of the new prospective 4D-CT. The cine CT scan at each position can be stopped by the operator when an irregular respiration occurs, and resumed when the respiration becomes regular. This process can be repeated at one or multiple scan positions. After the scan, a retrospective reconstruction is initiated on the CT console to reconstruct only the images corresponding to the regular respiratory cycles. The end result is a 4D-CT free of irregular respiration. To prove feasibility, we conducted a phantom and six patient studies. The artifacts associated with the irregular respiratory cycles could be removed from both the phantom and patient studies. A new prospective 4D-CT scanning and processing technique to mitigate the impact of irregular respiration in 4D-CT has been demonstrated. This technique can save radiation dose because the repeat scans are only at the scan positions where an irregular respiration occurs. Current practice is to repeat the scans at all positions. There is no cost to apply this technique because it is applicable on the GE CT scanner without new hardware, software or off-line data-processing.
DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals
2013-04-24
Vernon... datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed... As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and
Artifact removal algorithms for stroke detection using a multistatic MIST beamforming algorithm.
Ricci, E; Di Domenico, S; Cianca, E; Rossi, T
2015-01-01
Microwave imaging (MWI) has recently been proved a promising imaging modality for low-complexity, low-cost and fast brain imaging tools, which could play a fundamental role in efficiently managing emergencies related to stroke and hemorrhages. This paper focuses on the UWB radar imaging approach and in particular on the processing algorithms for the backscattered signals. Assuming the use of the multistatic version of the MIST (Microwave Imaging Space-Time) beamforming algorithm, developed by Hagness et al. for the early detection of breast cancer, the paper proposes and compares two artifact removal algorithms. Artifact removal is an essential step of any UWB radar imaging system, and currently considered artifact removal algorithms have been shown not to be effective in the specific scenario of brain imaging. First, the paper proposes modifications of a known artifact removal algorithm. These modifications are shown to be effective in achieving good localization accuracy and fewer false positives. However, the main contribution is the proposal of an artifact removal algorithm based on statistical methods, which achieves even better performance with much lower computational complexity.
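The abstract does not spell out the statistical artifact removal method, so the Python sketch below shows only the common average-subtraction baseline that such algorithms are typically compared against; it is not the proposed algorithm. Channel-wise scaling of the template (for example by a least-squares fit over an early-time window) is a typical refinement.

import numpy as np

def average_subtraction(signals):
    """signals: (n_antennas, n_samples) backscattered traces.

    Baseline artifact removal: the strong, nearly identical early-time
    reflection (skin/skull interface) is estimated as the across-channel
    average and subtracted from every trace."""
    artifact_estimate = signals.mean(axis=0, keepdims=True)
    return signals - artifact_estimate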
Artifact detection in electrodermal activity using sparse recovery
NASA Astrophysics Data System (ADS)
Kelsey, Malia; Palumbo, Richard Vincent; Urbaneja, Alberto; Akcakaya, Murat; Huang, Jeannie; Kleckner, Ian R.; Barrett, Lisa Feldman; Quigley, Karen S.; Sejdic, Ervin; Goodwin, Matthew S.
2017-05-01
Electrodermal Activity (EDA) - a peripheral index of sympathetic nervous system activity - is a primary measure used in psychophysiology. EDA is widely accepted as an indicator of physiological arousal, and it has been shown to reveal when psychologically novel events occur. Traditionally, EDA data is collected in controlled laboratory experiments. However, recent developments in wireless biosensing have led to an increase in out-of-lab studies. This transition to ambulatory data collection has introduced challenges. In particular, artifacts such as wearer motion, changes in temperature, and electrical interference can be misidentified as true EDA responses. The inability to distinguish artifact from signal hinders analyses of ambulatory EDA data. Though manual procedures for identifying and removing EDA artifacts exist, they are time consuming - which is problematic for the types of longitudinal data sets represented in modern ambulatory studies. This manuscript presents a novel technique to automatically identify and remove artifacts in EDA data using curve fitting and sparse recovery methods. Our method was evaluated using labeled data to determine the accuracy of artifact identification. Procedures, results, conclusions, and future directions are presented.
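One way to realize the curve-fitting and sparse-recovery idea is sketched below in Python with orthogonal matching pursuit over a dictionary of canonical skin-conductance-response shapes; the atom shapes, window length, sparsity level, and residual threshold are all assumptions for illustration, not the authors' method.

import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

def scr_dictionary(win_len, fs, rise_taus=(0.7, 1.0), decay_taus=(2.0, 4.0)):
    """Dictionary of shifted bi-exponential SCR-like atoms (assumed shapes)."""
    t = np.arange(win_len) / fs
    atoms = []
    for tr in rise_taus:
        for td in decay_taus:
            shape = np.exp(-t / td) - np.exp(-t / tr)
            shape /= np.linalg.norm(shape)
            for shift in range(0, win_len, max(1, win_len // 8)):
                atoms.append(np.roll(shape, shift))
    return np.array(atoms).T                      # (win_len, n_atoms)

def flag_artifact_windows(eda, fs, win_len=None, n_nonzero=3, resid_thresh=0.35):
    """Flag non-overlapping windows poorly explained by a sparse SCR fit."""
    win_len = win_len or int(8 * fs)
    D = scr_dictionary(win_len, fs)
    flags = []
    for start in range(0, len(eda) - win_len + 1, win_len):
        y = eda[start:start + win_len]
        y = y - y.mean()
        omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero).fit(D, y)
        resid = np.linalg.norm(y - omp.predict(D)) / (np.linalg.norm(y) + 1e-12)
        flags.append(resid > resid_thresh)        # poorly-fit window -> likely artifact
    return np.array(flags)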
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, A; Paysan, P; Brehm, M
2016-06-15
Purpose: To improve CBCT image quality for image-guided radiotherapy by applying advanced reconstruction algorithms to overcome scatter, noise, and artifact limitations. Methods: CBCT is used extensively for patient setup in radiotherapy. However, image quality generally falls short of diagnostic CT, limiting soft-tissue based positioning and potential applications such as adaptive radiotherapy. The conventional TrueBeam CBCT reconstructor uses a basic scatter correction and FDK reconstruction, resulting in residual scatter artifacts, suboptimal image noise characteristics, and other artifacts like cone-beam artifacts. We have developed an advanced scatter correction that uses a finite-element solver (AcurosCTS) to model the behavior of photons as they pass (and scatter) through the object. Furthermore, iterative reconstruction is applied to the scatter-corrected projections, enforcing data consistency with statistical weighting and applying an edge-preserving image regularizer to reduce image noise. The combined algorithms have been implemented on a GPU. CBCT projections from clinically operating TrueBeam systems have been used to compare image quality between the conventional and improved reconstruction methods. Planning CT images of the same patients have also been compared. Results: The advanced scatter correction removes shading and inhomogeneity artifacts, reducing the scatter artifact from 99.5 HU to 13.7 HU in a typical pelvis case. Iterative reconstruction provides further benefit by reducing image noise and eliminating streak artifacts, thereby improving soft-tissue visualization. In a clinical head and pelvis CBCT, the noise was reduced by 43% and 48%, respectively, with no change in spatial resolution (assessed visually). Additional benefits include reduction of cone-beam artifacts and reduction of metal artifacts due to intrinsic downweighting of corrupted rays. Conclusion: The combination of an advanced scatter correction with iterative reconstruction substantially improves CBCT image quality. It is anticipated that clinically acceptable reconstruction times will result from a multi-GPU implementation (the algorithms are under active development and not yet commercially available). All authors are employees of and (may) own stock of Varian Medical Systems.
Reduction of metal artifacts: beam hardening and photon starvation effects
NASA Astrophysics Data System (ADS)
Yadava, Girijesh K.; Pal, Debashish; Hsieh, Jiang
2014-03-01
The presence of metal artifacts in CT imaging can obscure relevant anatomy and interfere with disease diagnosis. Metal artifacts are primarily caused by beam hardening, scatter, partial volume and photon starvation; however, the contribution from each of these depends on the type of hardware. A comparison of CT images obtained with different metallic hardware in various applications, along with acquisition and reconstruction parameters, helps in understanding methods for reducing or overcoming such artifacts. In this work, metal beam hardening correction (BHC) and projection-completion based metal artifact reduction (MAR) algorithms were developed and applied to phantom and clinical CT scans with various metallic implants. Stainless steel and titanium were used to model and correct for the metal beam hardening effect. In the MAR algorithm, the corrupted projection samples are replaced by a combination of the original projections and in-painted data obtained by forward projecting a prior image. The data included spine fixation screws, hip implants, dental fillings, and body extremity fixations, covering a range of clinically used metal implants. Comparison of BHC and MAR on different metallic implants was used to characterize the dominant source of the artifacts and conceivable methods to overcome them. Results of the study indicate that beam hardening could be a dominant source of artifacts in many spine and extremity fixations, whereas photon starvation could be the dominant source for dental and hip implants. The BHC algorithm could significantly improve image quality in CT scans with metallic screws, whereas the MAR algorithm could alleviate artifacts in hip implants and dental fillings.
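The in-painting and blending details are not given in the abstract; a minimal sketch of the projection-completion step, using scikit-image's parallel-beam radon transform as a stand-in forward projector and a single global blending weight (both assumptions, not the authors' implementation):

```python
import numpy as np
from skimage.transform import radon

def complete_metal_trace(sinogram, prior_image, metal_mask_sino, theta, alpha=0.7):
    """Replace corrupted sinogram samples with a blend of measured data and
    the forward projection of a prior image (simplified projection completion).

    sinogram        : measured sinogram, shape (n_det, n_views)
    prior_image     : prior reconstruction with metal regions filled, square array
                      sized so that radon(..., circle=False) matches n_det
    metal_mask_sino : boolean mask of metal-corrupted sinogram samples
    theta           : projection angles in degrees (len == n_views)
    alpha           : weight given to the in-painted (prior) data inside the trace
    """
    prior_sino = radon(prior_image, theta=theta, circle=False)
    # scale the prior projections to match measured data outside the metal trace
    keep = ~metal_mask_sino
    scale = np.sum(sinogram[keep] * prior_sino[keep]) / max(np.sum(prior_sino[keep] ** 2), 1e-12)
    prior_sino *= scale
    completed = sinogram.copy()
    completed[metal_mask_sino] = (alpha * prior_sino[metal_mask_sino]
                                  + (1.0 - alpha) * sinogram[metal_mask_sino])
    return completed
```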
Putney, Joy; Hilbert, Douglas; Paskaranandavadivel, Niranchan; Cheng, Leo K.; O'Grady, Greg; Angeli, Timothy R.
2016-01-01
Objective: The aim of this study was to develop, validate, and apply a fully automated method for reducing large temporally synchronous artifacts present in electrical recordings made from the gastrointestinal (GI) serosa, which are problematic for properly assessing slow wave dynamics. Such artifacts routinely arise in experimental and clinical settings from motion, switching behavior of medical instruments, or electrode array manipulation. Methods: A novel iterative COvariance-Based Reduction of Artifacts (COBRA) algorithm sequentially reduced artifact waveforms using an updating across-channel median as a noise template, scaled and subtracted from each channel based on their covariance. Results: Application of COBRA substantially increased the signal-to-artifact ratio (12.8±2.5 dB), while minimally attenuating the energy of the underlying source signal by 7.9% on average (-11.1±3.9 dB). Conclusion: COBRA was shown to be highly effective for aiding recovery and accurate marking of slow wave events (sensitivity = 0.90±0.04; positive-predictive value = 0.74±0.08) from large segments of in vivo porcine GI electrical mapping data that would otherwise be lost due to a broad range of contaminating artifact waveforms. Significance: Strongly reducing artifacts with COBRA ultimately allowed for rapid production of accurate isochronal activation maps detailing the dynamics of slow wave propagation in the porcine intestine. Such mapping studies can help characterize differences between normal and dysrhythmic events, which have been associated with GI abnormalities, such as intestinal ischemia and gastroparesis. The COBRA method may be generally applicable for removing temporally synchronous artifacts in other biosignal processing domains. PMID:26829772
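The published COBRA algorithm is more elaborate than can be reconstructed from the abstract; a minimal sketch of the stated core step, an updating across-channel median template scaled per channel by covariance and subtracted, under the assumption of a fixed small number of passes:

```python
import numpy as np

def cobra_like_artifact_reduction(x, n_iter=3):
    """Reduce temporally synchronous artifacts in multichannel recordings.

    x : array of shape (n_channels, n_samples).
    Each pass uses the across-channel median as a noise template and subtracts
    it from every channel, scaled by that channel's covariance with the template.
    """
    x = np.asarray(x, dtype=float).copy()
    for _ in range(n_iter):
        template = np.median(x, axis=0)        # updating across-channel noise template
        for ch in range(x.shape[0]):
            c = np.cov(x[ch], template)
            if c[1, 1] < 1e-12:
                continue
            beta = c[0, 1] / c[1, 1]           # covariance-based per-channel scale
            x[ch] -= beta * template
    return x
```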
WE-AB-207A-12: HLCC Based Quantitative Evaluation Method of Image Artifact in Dental CBCT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Y; Wu, S; Qi, H
Purpose: Image artifacts are usually evaluated qualitatively via visual observation of the reconstructed images, which is susceptible to subjective factors due to the lack of an objective evaluation criterion. In this work, we propose a Helgason-Ludwig consistency condition (HLCC) based evaluation method to quantify the severity level of different image artifacts in dental CBCT. Methods: Our evaluation method consists of four steps: 1) Acquire cone-beam CT (CBCT) projections; 2) Convert the 3D CBCT projection to a fan-beam projection by extracting its central plane projection; 3) Convert the fan-beam projection to a parallel-beam projection utilizing a sinogram-based or detail-based rebinning algorithm; 4) Obtain the HLCC profile by integrating the parallel-beam projection per view and calculate the wave percentage and variance of the HLCC profile, which can be used to describe the severity level of image artifacts. Results: Several sets of dental CBCT projections containing only one type of artifact (i.e. geometry, scatter, beam hardening, lag and noise artifacts) were simulated using gDRR, a GPU tool developed for efficient, accurate, and realistic simulation of CBCT projections. These simulated CBCT projections were used to test our proposed method. The HLCC profile wave percentage and variance induced by geometry distortion are about 3∼21 times and 16∼393 times as large as those of the artifact-free projection, respectively. The increase factors of wave percentage and variance are 6 and 133 times for beam hardening, 19 and 1184 times for scatter, and 4 and 16 times for lag artifacts, respectively. In contrast, for the noisy projection the wave percentage, variance and inconsistency level are almost the same as those of the noise-free one. Conclusion: We have proposed a quantitative evaluation method of image artifacts based on HLCC theory. According to our simulation results, the severity of different artifact types in dental CBCT is found to be in the following order: Scatter > Geometry > Beam hardening > Lag > Noise > Artifact-free.
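Step 4 can be illustrated with a short sketch: for ideal parallel-beam data the zeroth Helgason-Ludwig moment (the per-view integral of the projection) is constant across views, so its variation quantifies inconsistency. The exact definition of "wave percentage" is not given in the abstract; peak-to-peak variation relative to the mean is used below as an assumption.

```python
import numpy as np

def hlcc_profile_metrics(parallel_sino):
    """Quantify projection inconsistency from the zeroth HLCC moment.

    parallel_sino : parallel-beam sinogram, shape (n_detectors, n_views)
    Returns (wave_percentage, variance) of the per-view projection integral,
    which is constant for consistent, artifact-free parallel-beam data.
    """
    profile = parallel_sino.sum(axis=0)              # integral over detector per view
    mean = profile.mean()
    wave_percentage = 100.0 * (profile.max() - profile.min()) / max(abs(mean), 1e-12)
    variance = profile.var()
    return wave_percentage, variance
```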
Hoorweg, Anne-Lee J; Pasma, Wietze; van Wolfswinkel, Leo; de Graaff, Jurgen C
2018-02-01
Vital parameter data collected in anesthesia information management systems are often used for clinical research. The validity of this type of research is dependent on the number of artifacts. In this prospective observational cohort study, the incidence of artifacts in anesthesia information management system data was investigated in children undergoing anesthesia for noncardiac procedures. Secondary outcomes included the incidence of artifacts among deviating and nondeviating values, among the anesthesia phases, and among different anesthetic techniques. We included 136 anesthetics representing 10,236 min of anesthesia time. The incidence of artifacts was 0.5% for heart rate (95% CI: 0.4 to 0.7%), 1.3% for oxygen saturation (1.1 to 1.5%), 7.5% for end-tidal carbon dioxide (6.9 to 8.0%), 5.0% for noninvasive blood pressure (4.0 to 6.0%), and 7.3% for invasive blood pressure (5.9 to 8.8%). The incidence of artifacts among deviating values was 3.1% for heart rate (2.1 to 4.4%), 10.8% for oxygen saturation (7.6 to 14.8%), 14.1% for end-tidal carbon dioxide (13.0 to 15.2%), 14.4% for noninvasive blood pressure (10.3 to 19.4%), and 38.4% for invasive blood pressure (30.3 to 47.1%). Not all values in anesthesia information management systems are valid. The incidence of artifacts stored in the present pediatric anesthesia practice was low for heart rate and oxygen saturation, whereas noninvasive and invasive blood pressure and end-tidal carbon dioxide had higher artifact incidences. Deviating values are more often artifacts than values in a normal range, and artifacts are associated with the phase of anesthesia and anesthetic technique. Development of (automatic) data validation systems or solutions to deal with artifacts in data is warranted.
Bellino, Jason C.
2011-01-01
A digital dataset for the Floridan aquifer system in Florida and in parts of Georgia, Alabama, and South Carolina was developed from selected reports published as part of the Regional Aquifer-System Analysis (RASA) Program of the U.S. Geological Survey (USGS) in the 1980s. These reports contain maps and data depicting the extent and elevation of both time-stratigraphic and hydrogeologic units of which the aquifer system is composed, as well as data on hydrology, meteorology, and aquifer properties. The three primary reports used for this dataset compilation were USGS Professional Paper 1403-B (Miller, 1986), Professional Paper 1403-C (Bush and Johnston, 1988), and USGS Open-File Report 88-86 (Miller, 1988). Paper maps from Professional Papers 1403-B and 1403-C were scanned and georeferenced to the North American Datum of 1927 (NAD27) using the Lambert Conformal Conic projection (standard parallels 33 and 45 degrees, central longitude -96 degrees, central latitude 39 degrees). Once georeferenced, tracing of pertinent line features contained in each image (for example, contours and faults) was facilitated by specialized software using algorithms that automated much of the process. Resulting digital line features were then processed using standard geographic information system (GIS) software to remove artifacts from the digitization process and to verify and update attribute tables. The digitization process for polygonal features (for example, outcrop areas and unit extents) was completed by hand using GIS software.
Measurement of luminance and color uniformity of displays using the large-format scanner
NASA Astrophysics Data System (ADS)
Mazikowski, Adam
2017-08-01
Uniformity of display luminance and color is important for comfort and good perception of the information presented on the display. Although display technology has developed and improved considerably over the past years, different types of displays still present a challenge in selected applications, e.g. in medical use or in multi-screen installations. A simplified 9-point method of determining uniformity does not always produce satisfactory results, so a different solution is proposed in the paper. The developed system consists of a large-format X-Y-Z ISEL scanner (isel Germany AG), a Konica Minolta high-sensitivity spot photometer-colorimeter (e.g. CS-200, Konica Minolta, Inc.), and a PC. Dedicated software in the LabVIEW environment was also prepared to control the scanner, transfer the measured data to the computer, and visualize the measurement results. Based on the developed setup, measurements of a plasma display and an LCD-LED display were performed. A heavily worn-out plasma TV unit with several visible artifacts was selected. These tests show the advantages and drawbacks of the described scanning method in comparison with the simplified 9-point uniformity method.
MARSAME Radiological Release Report for Archaeological Artifacts Excavated from Area L
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruedig, Elizabeth; Whicker, Jeffrey Jay; Gillis, Jessica Mcdonnel
In 1991 Los Alamos National Laboratory’s (LANL’s) cultural resources team excavated archaeological site LA 4618 located at Technical Area 54, within Material Disposal Area L (MDA L). MDA L received non-radioactive chemical waste from the early 1960s until 1985. Further development of the MDA required excavation of several cultural sites under National Historic Preservation Act requirements; artifacts from these sites have been subsequently stored at LANL. The LANL cultural resources group would now like to release these artifacts to the Museum of Indian Arts and Culture in Santa Fe for curation. The history of disposal at Area L suggests that the artifact pool is unlikely to be chemically contaminated and LANL staff washed each artifact at least once following excavation. Thus, it is unlikely that the artifacts present a chemical hazard. LANL’s Environmental Stewardship group (EPC-ES) has evaluated the radiological survey results for the Area L artifact pool and found that the items described in this report meet the criteria for unrestricted radiological release under Department of Energy (DOE) Order 458.1 Radiation Protection of the Public and the Environment and are candidates for release without restriction from LANL control. This conclusion is based on the known history of MDA L and on radiation survey data.
Self-control in postsecondary settings: students' perceptions of ADHD college coaching.
Parker, David R; Hoffman, Sharon Field; Sawilowsky, Shlomo; Rolands, Laura
2013-04-01
The objective of this study was to identify undergraduates' perceptions of the impact of ADHD coaching on their academic success and broader life functioning. One-on-one interviews were conducted with 19 students on 10 different U.S. campuses who comprised a purposive sample of gender, cumulative grade point average, and self-regulation skills variables as measured by the learning and study strategies inventory. Interview transcripts were coded using NVivo 8 software, and emergent themes were triangulated with students' descriptions of personal artifacts that symbolized coaching's influence on their lives. Students reported that ADHD coaching helped them become more self-regulated, which led to positive academic experiences and outcomes. Students described ADHD coaching as a unique service that helped them develop more productive beliefs, experience more positive feelings, and engage in more self-regulated behaviors. ADHD coaching helped participants enhance their self-control as they responded to the multifaceted demands of undergraduate life.
Realizing the Living Paper using the ProvONE Model for Reproducible Research
NASA Astrophysics Data System (ADS)
Jones, M. B.; Jones, C. S.; Ludäscher, B.; Missier, P.; Walker, L.; Slaughter, P.; Schildhauer, M.; Cuevas-Vicenttín, V.
2015-12-01
Science has advanced through traditional publications that codify research results as a permanent part of the scientific record. But because publications are static and atomic, researchers can only cite and reference a whole work when building on the prior work of colleagues. The open source software model has demonstrated a new approach in which strong version control in an open environment can nurture an open ecosystem of software. Developers now commonly fork and extend software, giving proper credit, with less repetition, and with confidence in the relationship to the original software. Through initiatives like 'Beyond the PDF', an analogous model has been imagined for open science, in which software, data, analyses, and derived products become first class objects within a publishing ecosystem that has evolved to be finer-grained and is realized through a web of linked open data. We have prototyped a Living Paper concept by developing the ProvONE provenance model for scientific workflows, with prototype deployments in DataONE. ProvONE promotes transparency and openness by describing the authenticity, origin, structure, and processing history of research artifacts and by detailing the steps in computational workflows that produce derived products. To realize the Living Paper, we decompose scientific papers into their constituent products and publish these as compound objects in the DataONE federation of archival repositories. Each individual finding and sub-product of a research project (such as a derived data table, a workflow or script, a figure, an image, or a finding) can be independently stored, versioned, and cited. ProvONE provenance traces link these fine-grained products within and across versions of a paper, and across related papers that extend an original analysis. This allows for open scientific publishing in which researchers extend and modify findings, creating a dynamic, evolving web of results that collectively represent the scientific enterprise. The Living Paper provides detailed metadata for properly interpreting and verifying individual research findings, for tracing the origin of ideas, for launching new lines of inquiry, and for implementing transitive credit for research and engineering.
Taxonomy development and knowledge representation of nurses' personal cognitive artifacts.
McLane, Sharon; Turley, James P
2009-11-14
Nurses prepare knowledge representations, or summaries of patient clinical data, each shift. These knowledge representations serve multiple purposes, including support of working memory, workload organization and prioritization, critical thinking, and reflection. This summary is integral to internal knowledge representations, working memory, and decision-making. Study of this nurse knowledge representation resulted in development of a taxonomy of knowledge representations necessary to nursing practice. This paper describes the methods used to elicit the knowledge representations and structures necessary for the work of clinical nurses, describes the development of a taxonomy of this knowledge representation, and discusses translation of this methodology to the cognitive artifacts of other disciplines. Understanding the development and purpose of practitioners' knowledge representations provides important direction to informaticists seeking to create information technology alternatives. The outcome of this paper is to suggest a process template for transition of cognitive artifacts to an information system.
A Community-Driven Workflow Recommendations and Reuse Infrastructure
NASA Astrophysics Data System (ADS)
Zhang, J.; Votava, P.; Lee, T. J.; Lee, C.; Xiao, S.; Nemani, R. R.; Foster, I.
2013-12-01
Aiming to connect the Earth science community to accelerate the rate of discovery, NASA Earth Exchange (NEX) has established an online repository and platform, so that researchers can publish and share their tools and models with colleagues. In recent years, workflow has become a popular technique at NEX for Earth scientists to define executable multi-step procedures for data processing and analysis. The ability to discover and reuse knowledge artifacts such as sharable workflows is critical to the future advancement of science. However, as reported in our earlier study, the reusability of scientific artifacts is currently very low. Scientists often do not feel confident in using other researchers' tools and utilities. One major reason is that researchers are often unaware of the existence of others' data preprocessing processes. Meanwhile, researchers often do not have time to fully document the processes and expose them to others in a standard way. These issues cannot be overcome by the existing workflow search technologies used in NEX and other data projects. Therefore, this project aims to develop a proactive recommendation technology based on collective NEX user behaviors. In this way, we aim to promote and encourage process and workflow reuse within NEX. Particularly, we focus on leveraging peer scientists' best practices to support the recommendation of artifacts developed by others. Our underlying theoretical foundation is rooted in social cognitive theory, which holds that people learn by watching what others do. Our fundamental hypothesis is that sharable artifacts have network properties, much like humans in social networks. More generally, reusable artifacts form various types of social relationships (ties), and may be viewed as forming what organizational sociologists who use network analysis to study human interactions call a 'knowledge network.' In particular, we will tackle two research questions: R1: What hidden knowledge may be extracted from usage history to help Earth scientists better understand existing artifacts and how to use them in a proper manner? R2: Informed by insights derived from their computing contexts, how could such hidden knowledge be used to facilitate artifact reuse by Earth scientists? Our study of the two research questions will provide answers to three technical questions aiming to assist NEX users during workflow development: 1) How to determine what topics interest the researcher? 2) How to find appropriate artifacts? and 3) How to advise the researcher in artifact reuse? In this paper, we report our ongoing efforts to leverage social networking theory and analysis techniques to provide dynamic advice on artifact reuse to NEX users based on their surrounding contexts. As a proof of concept, we have designed and developed a plug-in to the VisTrails workflow design tool. When users develop workflows using VisTrails, our plug-in will proactively recommend the most relevant sub-workflows to the users.
Autonomous system for Web-based microarray image analysis.
Bozinov, Daniel
2003-12-01
Software-based feature extraction from DNA microarray images still requires human intervention on various levels. Manual adjustment of grid and metagrid parameters, precise alignment of superimposed grid templates and gene spots, or simply identification of large-scale artifacts have to be performed beforehand to reliably analyze DNA signals and correctly quantify their expression values. Ideally, a Web-based system with input solely confined to a single microarray image and a data table as output containing measurements for all gene spots would directly transform raw image data into abstracted gene expression tables. Sophisticated algorithms with advanced procedures for iterative correction can overcome inherent challenges in image processing. Herein is introduced an integrated software system with a Java-based interface on the client side that allows for decentralized access and furthermore enables the scientist to instantly employ the most updated software version at any given time. This software tool extends PixClust, as used in Extractiff, and incorporates Java Web Start deployment technology. Ultimately, this setup is destined for high-throughput pipelines in genome-wide medical diagnostics labs or microarray core facilities aimed at providing fully automated service to its users.
Causes of Ultrasound Doppler Twinkling Artifact
NASA Astrophysics Data System (ADS)
Leonov, D. V.; Kulberg, N. S.; Gromov, A. I.; Morozov, S. P.; Kim, S. Yu.
2018-01-01
Ultrasound Doppler twinkling artifact is analyzed. It usually appears as a frequent color alteration in the region of hyperechoic objects. Its noiselike spectrum can also be seen in spectral Doppler mode. Physicians use the twinkling artifact as a clinical sign for kidney-stone and soft-tissue calculi detection. A distinctive feature of this study is that the experiments were conducted using raw signals obtained from a custom ultrasonic machine and a specially developed phantom. The phantom contained specimens with known qualities, allowing for reproducible and predictable results. The experiments revealed evidence for two physical causes of the twinkling artifact, which were associated with two unique Doppler signals. The research laid the foundation for the new reflected-signal model introduced and used throughout this paper.
A Framework to Manage Information Models
NASA Astrophysics Data System (ADS)
Hughes, J. S.; King, T.; Crichton, D.; Walker, R.; Roberts, A.; Thieman, J.
2008-05-01
The Information Model is the foundation on which an Information System is built. It defines the entities to be processed, their attributes, and the relationships that add meaning. The development and subsequent management of the Information Model is the single most significant factor for the development of a successful information system. A framework of tools has been developed that supports the management of an information model with the rigor typically afforded to software development. This framework provides for evolutionary and collaborative development independent of system implementation choices. Once captured, the modeling information can be exported to common languages for the generation of documentation, application databases, and software code that supports both traditional and semantic web applications. This framework is being successfully used for several science information modeling projects including those for the Planetary Data System (PDS), the International Planetary Data Alliance (IPDA), the National Cancer Institute's Early Detection Research Network (EDRN), and several Consultative Committee for Space Data Systems (CCSDS) projects. The objective of the Space Physics Archive Search and Exchange (SPASE) program is to promote collaboration and coordination of archiving activity for the Space Plasma Physics community and ensure the compatibility of the architectures used for a global distributed system and the individual data centers. Over the past several years, the SPASE data model working group has made great progress in developing the SPASE Data Model and supporting artifacts including a data dictionary, XML Schema, and two ontologies. The authors have captured the SPASE Information Model in this framework. This allows the generation of documentation that presents the SPASE Information Model in object-oriented notation including UML class diagrams and class hierarchies. The modeling information can also be exported to semantic web languages such as OWL and RDF and written to XML Metadata Interchange (XMI) files for import into UML tools.
A longitudinal dataset of five years of public activity in the Scratch online community.
Hill, Benjamin Mako; Monroy-Hernández, Andrés
2017-01-31
Scratch is a programming environment and an online community where young people can create, share, learn, and communicate. In collaboration with the Scratch Team at MIT, we created a longitudinal dataset of public activity in the Scratch online community during its first five years (2007-2012). The dataset comprises 32 tables with information on more than 1 million Scratch users, nearly 2 million Scratch projects, more than 10 million comments, more than 30 million visits to Scratch projects, and more. To help researchers understand this dataset, and to establish the validity of the data, we also include the source code of every version of the software that operated the website, as well as the software used to generate this dataset. We believe this is the largest and most comprehensive downloadable dataset of youth programming artifacts and communication.
A steep peripheral ring in irregular cornea topography, real or an instrument error?
Galindo-Ferreiro, Alicia; Galvez-Ruiz, Alberto; Schellini, Silvana A; Galindo-Alonso, Julio
2016-01-01
To demonstrate that the steep peripheral ring (red zone) on corneal topography after myopic laser in situ keratomileusis (LASIK) could possibly be due to instrument error and not always to a real increase in corneal curvature. A spherical model of the corneal surface and modified topography software were used to analyze the cause of an error due to instrument design. This study involved modification of the software of a commercially available topographer. A small modification of the topography image results in a red zone on the corneal topography color map. Corneal modeling indicates that the red zone could be an artifact due to an instrument-induced error. The steep curvature change after LASIK signified by the red zone could therefore also be an error introduced by the plotting algorithms of the corneal topographer, in addition to a real change in curvature.
Methods for artifact detection and removal from scalp EEG: A review.
Islam, Md Kafiul; Rastegarnia, Amir; Yang, Zhi
2016-11-01
Electroencephalography (EEG) is the most popular brain activity recording technique used in a wide range of applications. One of the commonly faced problems in EEG recordings is the presence of artifacts that come from sources other than the brain and contaminate the acquired signals significantly. Therefore, much research over the past 15 years has focused on identifying ways of handling such artifacts in the preprocessing stage. However, this is still an active area of research as no single existing artifact detection/removal method is complete or universal. This article presents an extensive review of the existing state-of-the-art artifact detection and removal methods from scalp EEG for all potential EEG-based applications and analyses the pros and cons of each method. First, a general overview of the different artifact types found in scalp EEG and their effects on particular applications is presented. In addition, the methods are compared based on their ability to remove certain types of artifacts and their suitability for relevant applications (only a functional comparison is provided, not a performance evaluation of the methods). Finally, the future directions and expected challenges of current research are discussed. This review is therefore expected to be helpful for interested researchers who will develop and/or apply artifact handling algorithms/techniques in the future for their applications, as well as for those willing to improve the existing algorithms or propose new solutions in this particular area of research. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Image compression software for the SOHO LASCO and EIT experiments
NASA Technical Reports Server (NTRS)
Grunes, Mitchell R.; Howard, Russell A.; Hoppel, Karl; Mango, Stephen A.; Wang, Dennis
1994-01-01
This paper describes the lossless and lossy image compression algorithms to be used on board the Solar Heliospheric Observatory (SOHO) in conjunction with the Large Angle Spectrometric Coronagraph and Extreme Ultraviolet Imaging Telescope experiments. It also shows preliminary results obtained using similar prior imagery and discusses the lossy compression artifacts which will result. This paper is in part intended for the use of SOHO investigators who need to understand the results of SOHO compression in order to better allocate the transmission bits they have been allocated.
Potvin, Brigitte M; Shourijeh, Mohammad S; Smale, Kenneth B; Benoit, Daniel L
2017-09-06
Musculoskeletal modeling and simulations have vast potential in clinical and research fields, but face various challenges in representing the complexities of the human body. Soft tissue artifact from skin-mounted markers may lead to non-physiological representation of joint motions being used as inputs to models in simulations. To address this, we have developed adaptive joint constraints on five of the six degrees of freedom of the knee joint based on in vivo tibiofemoral joint motions recorded during walking, hopping and cutting motions from subjects instrumented with intra-cortical pins inserted into their tibia and femur. The constraint boundaries vary as a function of knee flexion angle and were tested on four whole-body models including four to six knee degrees of freedom. A musculoskeletal model developed in OpenSim simulation software was constrained to these in vivo boundaries during level gait, and inverse kinematics and dynamics were then resolved. Statistical parametric mapping indicated significant differences (p<0.05) in kinematics between bone pin constrained and unconstrained model conditions, notably in knee translations, while hip and ankle flexion/extension angles were also affected, indicating that error at the knee propagates to the surrounding joints. These changes to hip, knee, and ankle kinematics led to measurable changes in hip and knee transverse plane moments, and knee frontal plane moments and forces. Since knee flexion angle can be validly represented using skin-mounted markers, our tool uses this reliable measure to guide the five other degrees of freedom at the knee and provide a more valid representation of the kinematics for these degrees of freedom. Copyright © 2017 Elsevier Ltd. All rights reserved.
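The published constraint boundaries come from the bone-pin data and are not reproduced here; a minimal sketch of the mechanism only, with placeholder boundary tables for one secondary degree of freedom (anterior-posterior translation) interpolated against knee flexion angle and used to clamp marker-based estimates frame by frame.

```python
import numpy as np

# Placeholder boundary tables (flexion angle in degrees -> allowed translation in mm).
# The real boundaries come from the in vivo bone-pin kinematics; these values are illustrative.
FLEXION_GRID = np.array([0.0, 15.0, 30.0, 45.0, 60.0, 90.0])
AP_LOWER     = np.array([-2.0, -1.5, -1.0, -0.5,  0.0,  0.5])
AP_UPPER     = np.array([ 3.0,  4.0,  5.0,  6.0,  7.0,  8.0])

def constrain_ap_translation(flexion_deg, ap_mm):
    """Clamp anterior-posterior knee translation to flexion-dependent bounds."""
    lo = np.interp(flexion_deg, FLEXION_GRID, AP_LOWER)
    hi = np.interp(flexion_deg, FLEXION_GRID, AP_UPPER)
    return np.clip(ap_mm, lo, hi)

# per-frame usage on soft-tissue-artifact-prone marker estimates
flexion = np.array([5.0, 20.0, 40.0, 70.0])
ap_raw  = np.array([6.0, -3.0,  2.0, 12.0])
ap_constrained = constrain_ap_translation(flexion, ap_raw)
```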
CP-CHARM: segmentation-free image classification made accessible.
Uhlmann, Virginie; Singh, Shantanu; Carpenter, Anne E
2016-01-27
Automated classification using machine learning often relies on features derived from segmenting individual objects, which can be difficult to automate. WND-CHARM is a previously developed classification algorithm in which features are computed on the whole image, thereby avoiding the need for segmentation. The algorithm obtained encouraging results but requires considerable computational expertise to execute. Furthermore, some benchmark sets have been shown to be subject to confounding artifacts that overestimate classification accuracy. We developed CP-CHARM, a user-friendly image-based classification algorithm inspired by WND-CHARM in (i) its ability to capture a wide variety of morphological aspects of the image, and (ii) the absence of requirement for segmentation. In order to make such an image-based classification method easily accessible to the biological research community, CP-CHARM relies on the widely-used open-source image analysis software CellProfiler for feature extraction. To validate our method, we reproduced WND-CHARM's results and ensured that CP-CHARM obtained comparable performance. We then successfully applied our approach on cell-based assay data and on tissue images. We designed these new training and test sets to reduce the effect of batch-related artifacts. The proposed method preserves the strengths of WND-CHARM - it extracts a wide variety of morphological features directly on whole images thereby avoiding the need for cell segmentation, but additionally, it makes the methods easily accessible for researchers without computational expertise by implementing them as a CellProfiler pipeline. It has been demonstrated to perform well on a wide range of bioimage classification problems, including on new datasets that have been carefully selected and annotated to minimize batch effects. This provides for the first time a realistic and reliable assessment of the whole image classification strategy.
NASA Astrophysics Data System (ADS)
Schramm, G.; Maus, J.; Hofheinz, F.; Petr, J.; Lougovski, A.; Beuthien-Baumann, B.; Platzek, I.; van den Hoff, J.
2014-06-01
The aim of this paper is to describe a new automatic method for compensation of metal-implant-induced segmentation errors in MR-based attenuation maps (MRMaps) and to evaluate the quantitative influence of those artifacts on the reconstructed PET activity concentration. The developed method uses a PET-based delineation of the patient contour to compensate metal-implant-caused signal voids in the MR scan that is segmented for PET attenuation correction. PET emission data of 13 patients with metal implants examined in a Philips Ingenuity PET/MR were reconstructed with the vendor-provided method for attenuation correction (MRMap_orig, PET_orig) and additionally with a method for attenuation correction (MRMap_cor, PET_cor) developed by our group. MRMaps produced by both methods were visually inspected for segmentation errors. The segmentation errors in MRMap_orig were classified into four classes (L1 and L2 artifacts inside the lung and B1 and B2 artifacts inside the remaining body, depending on the assigned attenuation coefficients). The average relative SUV differences (ε_rel^av) between PET_orig and PET_cor of all regions showing wrong attenuation coefficients in MRMap_orig were calculated. Additionally, relative SUV_mean differences (ε_rel) of tracer accumulations in hot focal structures inside or in the vicinity of these regions were evaluated. MRMap_orig showed erroneous attenuation coefficients inside the regions affected by metal artifacts and inside the patients' lung in all 13 cases. In MRMap_cor, all regions with metal artifacts, except for the sternum, were filled with the soft-tissue attenuation coefficient and the lung was correctly segmented in all patients. MRMap_cor only showed small residual segmentation errors in eight patients. ε_rel^av (mean ± standard deviation) were: (-56 ± 3)% for B1, (-43 ± 4)% for B2, (21 ± 18)% for L1, (120 ± 47)% for L2 regions. ε_rel (mean ± standard deviation) of hot focal structures were: (-52 ± 12)% in B1, (-45 ± 13)% in B2, (19 ± 19)% in L1, (51 ± 31)% in L2 regions. Consequently, metal-implant-induced artifacts severely disturb MR-based attenuation correction and SUV quantification in PET/MR. The developed algorithm is able to compensate for these artifacts and improves SUV quantification accuracy distinctly.
GPU-Based Simulation of Ultrasound Imaging Artifacts for Cryosurgery Training.
Keelan, Robert; Shimada, Kenji; Rabin, Yoed
2017-02-01
This study presents an efficient computational technique for the simulation of ultrasound imaging artifacts associated with cryosurgery based on nonlinear ray tracing. This study is part of an ongoing effort to develop computerized training tools for cryosurgery, with prostate cryosurgery as a development model. The capability of performing virtual cryosurgical procedures on a variety of test cases is essential for effective surgical training. Simulated ultrasound imaging artifacts include reverberation and reflection of the cryoprobes in the unfrozen tissue, reflections caused by the freezing front, shadowing caused by the frozen region, and tissue property changes in repeated freeze-thaw cycles procedures. The simulated artifacts appear to preserve the key features observed in a clinical setting. This study displays an example of how training may benefit from toggling between the undisturbed ultrasound image, the simulated temperature field, the simulated imaging artifacts, and an augmented hybrid presentation of the temperature field superimposed on the ultrasound image. The proposed method is demonstrated on a graphic processing unit at 100 frames per second, on a mid-range personal workstation, at two orders of magnitude faster than a typical cryoprocedure. This performance is based on computation with C++ accelerated massive parallelism and its interoperability with the DirectX-rendering application programming interface.
Detection of blur artifacts in histopathological whole-slide images of endomyocardial biopsies.
Hang Wu; Phan, John H; Bhatia, Ajay K; Cundiff, Caitlin A; Shehata, Bahig M; Wang, May D
2015-01-01
Histopathological whole-slide images (WSIs) have emerged as an objective and quantitative means for image-based disease diagnosis. However, WSIs may contain acquisition artifacts that affect downstream image feature extraction and quantitative disease diagnosis. We develop a method for detecting blur artifacts in WSIs using distributions of local blur metrics. As features, these distributions enable accurate classification of WSI regions as sharp or blurry. We evaluate our method using over 1000 portions of an endomyocardial biopsy (EMB) WSI. Results indicate that local blur metrics accurately detect blurry image regions.
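The specific local blur metrics are not listed in the abstract; a minimal sketch under the assumption that a per-tile variance-of-Laplacian metric is an acceptable stand-in: its distribution over a region becomes the feature vector that a sharp-versus-blurry classifier would consume. Tile size, intensity scaling, and histogram bins are illustrative.

```python
import numpy as np
from scipy.ndimage import laplace

def local_blur_features(gray, tile=64, bins=np.linspace(0.0, 50.0, 21)):
    """Histogram of per-tile variance-of-Laplacian values for one image region.

    gray : 2D grayscale array for a WSI region; low per-tile values suggest blur.
    """
    h, w = gray.shape
    metrics = []
    for i in range(0, h - tile + 1, tile):
        for j in range(0, w - tile + 1, tile):
            patch = gray[i:i + tile, j:j + tile].astype(float)
            metrics.append(laplace(patch).var())
    hist, _ = np.histogram(metrics, bins=bins, density=True)
    return hist

# These per-region distributions would then be fed to any standard classifier
# (e.g., logistic regression) trained on regions labeled sharp vs. blurry.
```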
EEG Artifact Removal Using a Wavelet Neural Network
NASA Technical Reports Server (NTRS)
Nguyen, Hoang-Anh T.; Musson, John; Li, Jiang; McKenzie, Frederick; Zhang, Guangfan; Xu, Roger; Richey, Carl; Schnell, Tom
2011-01-01
In this paper we developed a wavelet neural network (WNN) algorithm for electroencephalogram (EEG) artifact removal without electrooculographic (EOG) recordings. The algorithm combines the universal approximation characteristics of neural networks with the time-frequency properties of wavelets. We compared the WNN algorithm with the ICA technique and with a wavelet thresholding method, which was realized by using Stein's unbiased risk estimate (SURE) with an adaptive gradient-based optimal threshold. Experimental results on a driving test data set show that WNN can remove EEG artifacts effectively without diminishing useful EEG information, even for very noisy data.
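The WNN itself cannot be reconstructed from the abstract; the sketch below instead illustrates the wavelet-thresholding comparator, using PyWavelets with the universal threshold rather than the SURE-optimized threshold used in the paper. Wavelet family and decomposition level are illustrative choices.

```python
import numpy as np
import pywt

def wavelet_denoise(eeg, wavelet="db4", level=5):
    """Soft-threshold the wavelet detail coefficients of a 1D EEG channel."""
    coeffs = pywt.wavedec(eeg, wavelet, level=level)
    # noise scale estimated from the finest detail level (median absolute deviation)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(eeg)))      # universal threshold
    denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(eeg)]
```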
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Peter C.; Schreibmann, Eduard; Roper, Justin
2015-03-15
Purpose: Computed tomography (CT) artifacts can severely degrade dose calculation accuracy in proton therapy. Prompted by the recently increased popularity of magnetic resonance imaging (MRI) in the radiation therapy clinic, we developed an MRI-based CT artifact correction method for improving the accuracy of proton range calculations. Methods and Materials: The proposed method replaces corrupted CT data by mapping CT Hounsfield units (HU number) from a nearby artifact-free slice, using a coregistered MRI. MRI and CT volumetric images were registered with use of 3-dimensional (3D) deformable image registration (DIR). The registration was fine-tuned on a slice-by-slice basis by using 2D DIR. Based on the intensity of paired MRI pixel values and HU from an artifact-free slice, we performed a comprehensive analysis to predict the correct HU for the corrupted region. For a proof-of-concept validation, metal artifacts were simulated on a reference data set. Proton range was calculated using reference, artifactual, and corrected images to quantify the reduction in proton range error. The correction method was applied to 4 unique clinical cases. Results: The correction method resulted in substantial artifact reduction, both quantitatively and qualitatively. On respective simulated brain and head and neck CT images, the mean error was reduced from 495 and 370 HU to 108 and 92 HU after correction. Correspondingly, the absolute mean proton range errors of 2.4 cm and 1.7 cm were reduced to less than 2 mm in both cases. Conclusions: Our MRI-based CT artifact correction method can improve CT image quality and proton range calculation accuracy for patients with severe CT artifacts.
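The comprehensive intensity-pairing analysis is not detailed in the abstract; a minimal sketch of the core idea, assuming registration and artifact segmentation are already done upstream: build a binned-median lookup from MRI intensity to HU on a nearby artifact-free slice, then apply it to the corrupted region of the artifactual slice. Bin count and interpolation are illustrative.

```python
import numpy as np

def build_mri_to_hu_map(mri_clean, ct_clean, n_bins=64):
    """Binned-median lookup from MRI intensity to CT HU on an artifact-free slice."""
    edges = np.linspace(mri_clean.min(), mri_clean.max(), n_bins + 1)
    idx = np.clip(np.digitize(mri_clean.ravel(), edges) - 1, 0, n_bins - 1)
    hu_per_bin = np.full(n_bins, np.nan)
    for b in range(n_bins):
        vals = ct_clean.ravel()[idx == b]
        if vals.size:
            hu_per_bin[b] = np.median(vals)
    centers = 0.5 * (edges[:-1] + edges[1:])
    ok = ~np.isnan(hu_per_bin)
    return centers[ok], hu_per_bin[ok]

def correct_corrupted_slice(ct_bad, mri_bad, corrupt_mask, centers, hu_per_bin):
    """Replace HU inside the corrupted region using the MRI-to-HU lookup."""
    ct_fixed = ct_bad.astype(float).copy()
    ct_fixed[corrupt_mask] = np.interp(mri_bad[corrupt_mask], centers, hu_per_bin)
    return ct_fixed
```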
NASA Astrophysics Data System (ADS)
Yang, Fanlin; Zhao, Chunxia; Zhang, Kai; Feng, Chengkai; Ma, Yue
2017-07-01
Acoustic seafloor classification with multibeam backscatter measurements is an attractive approach for mapping seafloor properties over a large area. However, artifacts in the multibeam backscatter measurements prevent accurate characterization of the seafloor. In particular, the backscatter level is extremely strong and highly variable in the near-nadir region due to the specular echo phenomenon. Consequently, striped artifacts emerge in the backscatter image, which can degrade the classification accuracy. This study focuses on the striped artifacts in multibeam backscatter images. To this end, a calibration algorithm based on equal mean-variance fitting is developed. By fitting the local shape of the angular response curve, the striped artifacts are compressed and removed according to the relations between the mean and variance in the near-nadir and off-nadir regions. The algorithm utilizes the measured data of the near-nadir region and retains the basic shape of the angular response curve. The experimental results verify the high performance of the proposed method.
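The paper's fitting of the local angular-response shape is not reproduced here; the sketch below only illustrates the underlying mean-variance idea, rescaling near-nadir backscatter samples so that their mean and variance match those of the off-nadir reference region. The angle threshold and per-ping processing are assumptions.

```python
import numpy as np

def match_near_nadir(backscatter_db, beam_angles_deg, nadir_half_width=15.0):
    """Rescale near-nadir backscatter so its mean/variance match the off-nadir region.

    backscatter_db  : per-beam backscatter levels (dB) for one ping, shape (n_beams,)
    beam_angles_deg : beam incidence angles, same shape
    """
    near = np.abs(beam_angles_deg) < nadir_half_width
    far = ~near
    mu_n, sd_n = backscatter_db[near].mean(), backscatter_db[near].std()
    mu_f, sd_f = backscatter_db[far].mean(), backscatter_db[far].std()
    out = backscatter_db.astype(float).copy()
    if sd_n > 1e-9:
        out[near] = (backscatter_db[near] - mu_n) * (sd_f / sd_n) + mu_f
    return out
```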
Artifact Reduction in X-Ray CT Images of Al-Steel-Perspex Specimens Mimicking a Hip Prosthesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madhogarhia, Manish; Munshi, P.; Lukose, Sijo
2008-09-26
X-ray computed tomography (CT) is a relatively new technique, developed in the late 1970s, which enables nondestructive visualization of the internal structure of objects. Beam hardening caused by the polychromatic spectrum is an important problem in X-ray computed tomography (X-CT). It leads to various artifacts in reconstructed images and reduces image quality. In the present work we consider artifact reduction in total hip prosthesis CT scans, a problem in medical imaging. We aim to reduce the cupping artifact induced by beam hardening as well as the metal artifact, as they exist in the CT scan of a human hip after the femur is replaced by a metal implant. The correction method for beam hardening used here is based on a previous work. The simulation study for the present problem includes a phantom consisting of mild steel, aluminium and perspex, mimicking the photon attenuation properties of a human hip cross section with a metal implant.
NASA Astrophysics Data System (ADS)
Rotenberg, David J.
Artifacts caused by head motion are a substantial source of error in fMRI that limits its use in neuroscience research and clinical settings. Real-time scan-plane correction by optical tracking has been shown to correct slice misalignment and non-linear spin-history artifacts; however, residual artifacts due to dynamic magnetic field non-uniformity may remain in the data. A recently developed correction technique, PLACE, can correct for absolute geometric distortion using the complex image data from two EPI images with slightly shifted k-space trajectories. We present a correction approach that integrates PLACE into a real-time scan-plane update system by optical tracking, applied to a tissue-equivalent phantom undergoing complex motion and an fMRI finger tapping experiment with overt head motion to induce dynamic field non-uniformity. Experiments suggest that including volume-by-volume geometric distortion correction by PLACE can suppress dynamic geometric distortion artifacts in a phantom and in vivo and provide more robust activation maps.
Correction of data truncation artifacts in differential phase contrast (DPC) tomosynthesis imaging
NASA Astrophysics Data System (ADS)
Garrett, John; Ge, Yongshuai; Li, Ke; Chen, Guang-Hong
2015-10-01
The use of grating based Talbot-Lau interferometry permits the acquisition of differential phase contrast (DPC) imaging with a conventional medical x-ray source and detector. However, due to the limited area of the gratings, limited area of the detector, or both, data truncation image artifacts are often observed in tomographic DPC acquisitions and reconstructions, such as tomosynthesis (limited-angle tomography). When data are truncated in the conventional x-ray absorption tomosynthesis imaging, a variety of methods have been developed to mitigate the truncation artifacts. However, the same strategies used to mitigate absorption truncation artifacts do not yield satisfactory reconstruction results in DPC tomosynthesis reconstruction. In this work, several new methods have been proposed to mitigate data truncation artifacts in a DPC tomosynthesis system. The proposed methods have been validated using experimental data of a mammography accreditation phantom, a bovine udder, as well as several human cadaver breast specimens using a bench-top DPC imaging system at our facility.
NASA Astrophysics Data System (ADS)
Hu, Xiao-Su; Arredondo, Maria M.; Gomba, Megan; Confer, Nicole; DaSilva, Alexandre F.; Johnson, Timothy D.; Shalinsky, Mark; Kovelman, Ioulia
2015-12-01
Motion artifacts are the most significant sources of noise in the context of pediatric brain imaging designs and data analyses, especially in applications of functional near-infrared spectroscopy (fNIRS), in which they can severely degrade the quality of the acquired data. Different methods have been developed to correct motion artifacts in fNIRS data, but the relative effectiveness of these methods for data from child and infant subjects (which are often found to be significantly noisier than adult data) remains largely unexplored. The issue is further complicated by the heterogeneity of fNIRS data artifacts. We compared the efficacy of the six most prevalent motion artifact correction techniques with fNIRS data acquired from children participating in a language acquisition task, including wavelet, spline interpolation, principal component analysis, moving average (MA), correlation-based signal improvement, and a combination of wavelet and MA. The evaluation of five predefined metrics suggests that the MA and wavelet methods yield the best outcomes. These findings elucidate the varied nature of fNIRS data artifacts and the efficacy of artifact correction methods with pediatric populations, as well as help inform both the theory and practice of optical brain imaging analysis.
Motion artifacts in MRI: A complex problem with many partial solutions.
Zaitsev, Maxim; Maclaren, Julian; Herbst, Michael
2015-10-01
Subject motion during magnetic resonance imaging (MRI) has been problematic since its introduction as a clinical imaging modality. While sensitivity to particle motion or blood flow can be used to provide useful image contrast, bulk motion presents a considerable problem in the majority of clinical applications. It is one of the most frequent sources of artifacts. Over 30 years of research have produced numerous methods to mitigate or correct for motion artifacts, but no single method can be applied in all imaging situations. Instead, a "toolbox" of methods exists, where each tool is suitable for some tasks, but not for others. This article reviews the origins of motion artifacts and presents current mitigation and correction methods. In some imaging situations, the currently available motion correction tools are highly effective; in other cases, appropriate tools still need to be developed. It seems likely that this multifaceted approach will be what eventually solves the motion sensitivity problem in MRI, rather than a single solution that is effective in all situations. This review places a strong emphasis on explaining the physics behind the occurrence of such artifacts, with the aim of aiding artifact detection and mitigation in particular clinical situations. © 2015 Wiley Periodicals, Inc.
Achieving Consistent Doppler Measurements from SDO/HMI Vector Field Inversions
NASA Technical Reports Server (NTRS)
Schuck, Peter W.; Antiochos, S. K.; Leka, K. D.; Barnes, Graham
2016-01-01
NASA's Solar Dynamics Observatory is delivering vector magnetic field observations of the full solar disk with unprecedented temporal and spatial resolution; however, the satellite is in a highly inclined geosynchronous orbit. The relative spacecraft-Sun velocity varies by ±3 km/s over a day, which introduces major orbital artifacts in the Helioseismic Magnetic Imager (HMI) data. We demonstrate that the orbital artifacts contaminate all spatial and temporal scales in the data. We describe a newly developed three-stage procedure for mitigating these artifacts in the Doppler data obtained from the Milne-Eddington inversions in the HMI pipeline. The procedure ultimately uses 32 velocity-dependent coefficients to adjust 10 million pixels - a remarkably sparse correction model given the complexity of the orbital artifacts. This procedure was applied to full-disk images of AR 11084 to produce consistent Dopplergrams. The data adjustments reduce the power in the orbital artifacts by 31 dB. Furthermore, we analyze in detail the corrected images and show that our procedure greatly improves the temporal and spectral properties of the data without adding any new artifacts. We conclude that this new procedure makes a dramatic improvement in the consistency of the HMI data and in its usefulness for precision scientific studies.
TU-H-206-01: An Automated Approach for Identifying Geometric Distortions in Gamma Cameras
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mann, S; Nelson, J; Samei, E
2016-06-15
Purpose: To develop a clinically-deployable, automated process for detecting artifacts in routine nuclear medicine (NM) quality assurance (QA) bar phantom images. Methods: An artifact detection algorithm was created to analyze bar phantom images as part of an ongoing QA program. A low noise, high resolution reference image was acquired from an x-ray of the bar phantom with a Philips Digital Diagnost system utilizing image stitching. NM bar images, acquired for 5 million counts over a 512×512 matrix, were registered to the template image by maximizing mutual information (MI). The MI index was used as an initial test for artifacts; low values indicate an overall presence of distortions regardless of their spatial location. Images with low MI scores were further analyzed for bar linearity, periodicity, alignment, and compression to locate differences with respect to the template. Findings from each test were spatially correlated and locations failing multiple tests were flagged as potential artifacts requiring additional visual analysis. The algorithm was initially deployed for GE Discovery 670 and Infinia Hawkeye gamma cameras. Results: The algorithm successfully identified clinically relevant artifacts from both systems previously unnoticed by technologists performing the QA. Average MI indices for artifact-free images are 0.55. Images with MI indices < 0.50 have shown 100% sensitivity and specificity for artifact detection when compared with a thorough visual analysis. Correlation of geometric tests confirms the ability to spatially locate the most likely image regions containing an artifact regardless of initial phantom orientation. Conclusion: The algorithm shows the potential to detect gamma camera artifacts that may be missed by routine technologist inspections. Detection and subsequent correction of artifacts ensures maximum image quality and may help to identify failing hardware before it impacts clinical workflow. Going forward, the algorithm is being deployed to monitor data from all gamma cameras within our health system.
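The screening step can be sketched minimally: compute mutual information between the registered NM image and the x-ray template from their joint intensity histogram and flag low-scoring images for the detailed geometric tests. Registration is assumed done upstream; the bin count is illustrative, and the 0.50 cutoff reported in the abstract may correspond to a different MI normalization than the plain estimate below.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=64):
    """Mutual information (in nats) estimated from the joint intensity histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def flag_for_review(nm_image, template, mi_threshold=0.50):
    """True if the registered QA image warrants the further geometric analysis."""
    return mutual_information(nm_image, template) < mi_threshold
```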
SU-F-I-08: CT Image Ring Artifact Reduction Based On Prior Image
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuan, C; Qi, H; Chen, Z
Purpose: In computed tomography (CT) systems, CT images with ring artifacts are reconstructed when some adjacent bins of the detector do not work. The ring artifacts severely degrade CT image quality. We present a CT ring artifact reduction method based on projection data correction, aiming to estimate the missing projection data accurately and thus remove the ring artifacts from CT images. Methods: The method consists of ten steps: 1) Identification of the abnormal pixel lines in the projection sinogram; 2) Linear interpolation within the pixel lines of the projection sinogram; 3) FBP reconstruction using the interpolated projection data; 4) Filtering of the FBP image using a mean filter; 5) Forward projection of the filtered FBP image; 6) Subtraction of the forward projection from the original projection; 7) Linear interpolation of the abnormal pixel line area in the subtraction projection; 8) Addition of the interpolated subtraction projection to the forward projection; 9) FBP reconstruction using the corrected projection data; 10) Return to step 4 until the pre-set iteration number is reached. The method is validated on simulated and real data to restore missing projection data and reconstruct ring artifact-free CT images. Results: We have studied the impact of the number of dead bins of the CT detector on the accuracy of missing data estimation in the projection sinogram. For the simulated case with a 256 by 256 Shepp-Logan phantom, three iterations are sufficient to restore the projection data and reconstruct ring artifact-free images when the fraction of dead bins is under 30%. The dead-bin-induced artifacts are substantially reduced. More iterations are needed to reconstruct satisfactory images as the fraction of dead bins increases. Similar results were found for a real head phantom case. Conclusion: A practical CT image ring artifact correction scheme based on projection data is developed. This method can produce ring artifact-free CT images feasibly and effectively.
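A minimal sketch of the ten-step loop, using scikit-image's parallel-beam radon/iradon as stand-ins for the scanner's forward projector and FBP (the abstract does not specify the implementation); the mean-filter size, iteration count, and circle-geometry assumption for a square reconstruction are illustrative.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from skimage.transform import radon, iradon

def ring_artifact_correction(sinogram, dead_rows, theta, n_iter=3):
    """Iteratively restore dead detector bins of a parallel-beam sinogram.

    sinogram  : shape (n_det, n_views), circle-geometry parallel projections
    dead_rows : indices of non-working detector bins
    theta     : projection angles in degrees (len == n_views)
    """
    n_det = sinogram.shape[0]
    good = np.setdiff1d(np.arange(n_det), np.asarray(dead_rows))

    def fill_rows(s):
        out = s.copy()
        for v in range(s.shape[1]):
            out[dead_rows, v] = np.interp(dead_rows, good, s[good, v])
        return out

    sino = fill_rows(sinogram)                               # steps 1-2
    for _ in range(n_iter):                                  # steps 3-10
        recon = iradon(sino, theta=theta, circle=True)       # FBP
        smoothed = uniform_filter(recon, size=3)             # mean filter
        fwd = radon(smoothed, theta=theta, circle=True)      # forward projection
        diff = fill_rows(sinogram - fwd)                     # subtract + interpolate
        sino = fwd + diff                                    # corrected projections
    return iradon(sino, theta=theta, circle=True)
```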
Motion artifact removal in FNIR spectroscopy for real-world applications
NASA Astrophysics Data System (ADS)
Devaraj, Ajit; Izzetoglu, Meltem; Izzetoglu, Kurtulus; Bunce, Scott C.; Li, Connie Y.; Onaral, Banu
2004-12-01
Near infrared spectroscopy as a neuroimaging modality is a recent development. Near infrared neuroimagers are typically safe, portable, relatively affordable and non-invasive. The ease of sensor setup and non-intrusiveness make functional near infrared (fNIR) imaging an ideal candidate for monitoring human cortical function in a wide range of real-world situations. However, optical signals are susceptible to motion artifacts, hindering the application of fNIR in studies where subject mobility cannot be controlled. In this paper, we present a filtering framework for motion-artifact cancellation to facilitate the deployment of fNIR imaging in real-world scenarios. We simulate a generic field environment by having subjects walk on a treadmill while performing a cognitive task and demonstrate that measurements can be effectively cleaned of motion artifacts.
Effects of Spatio-Temporal Aliasing on Out-the-Window Visual Systems
NASA Technical Reports Server (NTRS)
Sweet, Barbara T.; Stone, Leland S.; Liston, Dorion B.; Hebert, Tim M.
2014-01-01
Designers of out-the-window visual systems face a challenge when attempting to simulate the outside world as viewed from a cockpit. Many methodologies have been developed and adopted to aid in the depiction of particular scene features, or levels of static image detail. However, because aircraft move, it is necessary to also consider the quality of the motion in the simulated visual scene. When motion is introduced in the simulated visual scene, perceptual artifacts can become apparent. A particular artifact related to image motion, spatio-temporal aliasing, will be addressed. The causes of spatio-temporal aliasing will be discussed, and current knowledge regarding the impact of these artifacts on both motion perception and simulator task performance will be reviewed. Methods of reducing the impact of this artifact are also addressed.
Ryali, S; Glover, GH; Chang, C; Menon, V
2009-01-01
EEG data acquired in an MRI scanner are heavily contaminated by gradient artifacts that can significantly compromise signal quality. We developed two new methods based on Independent Component Analysis (ICA) for reducing gradient artifacts from spiral in-out and echo-planar pulse sequences at 3T, and compared our algorithms with four other commonly used methods: average artifact subtraction (Allen et al. 2000), principal component analysis (Niazy et al. 2005), Taylor series (Wan et al. 2006) and a conventional temporal ICA algorithm. Models of gradient artifacts were derived from simulations as well as a water phantom and performance of each method was evaluated on datasets constructed using visual event-related potentials (ERPs) as well as resting EEG. Our new methods recovered ERPs and resting EEG below the beta band (< 12.5 Hz) with high signal-to-noise ratio (SNR > 4). Our algorithms outperformed all of these methods on resting EEG in the theta- and alpha-bands (SNR > 4); however, for all methods, signal recovery was modest (SNR ~ 1) in the beta-band and poor (SNR < 0.3) in the gamma-band and above. We found that the conventional ICA algorithm performed poorly with uniformly low SNR (< 0.1). Taken together, our new ICA-based methods offer a more robust technique for gradient artifact reduction when scanning at 3T using spiral in-out and echo-planar pulse sequences. We provide new insights into the strengths and weaknesses of each method using a unified subspace framework. PMID:19580873
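As a point of reference for the conventional temporal ICA baseline mentioned above (not the authors' new ICA-based methods), the sketch below decomposes multichannel EEG with scikit-learn's FastICA and zeroes components dominated by high-frequency power. The sampling rate, 20 Hz cutoff, and power-fraction selection rule are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

def remove_artifact_components(eeg, fs=500.0, cutoff_hz=20.0, threshold=0.9):
    """Temporal ICA baseline: decompose channels x samples EEG, drop components
    whose spectral power is concentrated above `cutoff_hz` (a crude proxy for
    gradient-artifact dominance), then reconstruct the channels."""
    ica = FastICA(n_components=eeg.shape[0], random_state=0)
    sources = ica.fit_transform(eeg.T)                 # samples x components
    freqs = np.fft.rfftfreq(sources.shape[0], d=1.0 / fs)
    spectra = np.abs(np.fft.rfft(sources, axis=0)) ** 2
    high_fraction = spectra[freqs > cutoff_hz].sum(axis=0) / spectra.sum(axis=0)
    sources[:, high_fraction > threshold] = 0.0        # zero artifact-dominated components
    return ica.inverse_transform(sources).T            # back to channels x samples
```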
Rivera-Rivera, Carlos J; Montoya-Burgos, Juan I
2016-06-01
Phylogenetic inference artifacts can occur when sequence evolution deviates from assumptions made by the models used to analyze them. The combination of strong model assumption violations and highly heterogeneous lineage evolutionary rates can become problematic in phylogenetic inference, and lead to the well-described long-branch attraction (LBA) artifact. Here, we define an objective criterion for assessing lineage evolutionary rate heterogeneity among predefined lineages: the result of a likelihood ratio test between a model in which the lineages evolve at the same rate (homogeneous model) and a model in which different lineage rates are allowed (heterogeneous model). We implement this criterion in the algorithm Locus Specific Sequence Subsampling (LS³), aimed at reducing the effects of LBA in multi-gene datasets. For each gene, LS³ sequentially removes the fastest-evolving taxon of the ingroup and tests for lineage rate homogeneity until all lineages have uniform evolutionary rates. The sequences excluded from the homogeneously evolving taxon subset are flagged as potentially problematic. The software implementation provides the user with the possibility to remove the flagged sequences for generating a new concatenated alignment. We tested LS³ with simulations and two real datasets containing LBA artifacts: a nucleotide dataset regarding the position of Glires within mammals and an amino-acid dataset concerning the position of nematodes within bilaterians. The initially incorrect phylogenies were corrected in all cases upon removing data flagged by LS³. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
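The per-gene LS³ loop can be summarized in pseudocode-level Python, as sketched below. The model-fitting and fastest-taxon helpers are hypothetical placeholders for the authors' software; only the likelihood ratio test against a chi-square distribution and the sequential-removal structure follow the description above.

```python
from scipy.stats import chi2

def ls3_flag_sequences(alignment, ingroup, fit_homogeneous, fit_heterogeneous,
                       fastest_taxon, alpha=0.05):
    """Sketch of the LS3 per-gene loop.
    fit_homogeneous  -> log-likelihood of the equal-rates (homogeneous) model
    fit_heterogeneous -> (log-likelihood, extra degrees of freedom) of the free-rate model
    fastest_taxon    -> name of the fastest-evolving ingroup taxon
    All three are hypothetical placeholders, not functions from the LS3 software."""
    kept, flagged = list(ingroup), []
    while len(kept) > 3:
        lnl_hom = fit_homogeneous(alignment, kept)
        lnl_het, extra_df = fit_heterogeneous(alignment, kept)
        lrt = 2.0 * (lnl_het - lnl_hom)
        if chi2.sf(lrt, extra_df) > alpha:   # rates already homogeneous: stop
            break
        worst = fastest_taxon(alignment, kept)
        kept.remove(worst)                    # remove fastest-evolving ingroup taxon
        flagged.append(worst)                 # flag its sequence as potentially problematic
    return kept, flagged
```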
Evaluation of motion artifact metrics for coronary CT angiography.
Ma, Hongfeng; Gros, Eric; Szabo, Aniko; Baginski, Scott G; Laste, Zachary R; Kulkarni, Naveen M; Okerlund, Darin; Schmidt, Taly G
2018-02-01
This study quantified the performance of coronary artery motion artifact metrics relative to human observer ratings. Motion artifact metrics have been used as part of motion correction and best-phase selection algorithms for Coronary Computed Tomography Angiography (CCTA). However, the lack of ground truth makes it difficult to validate how well the metrics quantify the level of motion artifact. This study investigated five motion artifact metrics, including two novel metrics, using a dynamic phantom, clinical CCTA images, and an observer study that provided ground-truth motion artifact scores from a series of pairwise comparisons. Five motion artifact metrics were calculated for the coronary artery regions on both phantom and clinical CCTA images: positivity, entropy, normalized circularity, Fold Overlap Ratio (FOR), and Low-Intensity Region Score (LIRS). CT images were acquired of a dynamic cardiac phantom that simulated cardiac motion and contained six iodine-filled vessels of varying diameter and with regions of soft plaque and calcifications. Scans were repeated with different gantry start angles. Images were reconstructed at five phases of the motion cycle. Clinical images were acquired from 14 CCTA exams with patient heart rates ranging from 52 to 82 bpm. The vessel and shading artifacts were manually segmented by three readers and combined to create ground-truth artifact regions. Motion artifact levels were also assessed by readers using a pairwise comparison method to establish a ground-truth reader score. The Kendall's Tau coefficients were calculated to evaluate the statistical agreement in ranking between the motion artifacts metrics and reader scores. Linear regression between the reader scores and the metrics was also performed. On phantom images, the Kendall's Tau coefficients of the five motion artifact metrics were 0.50 (normalized circularity), 0.35 (entropy), 0.82 (positivity), 0.77 (FOR), 0.77(LIRS), where higher Kendall's Tau signifies higher agreement. The FOR, LIRS, and transformed positivity (the fourth root of the positivity) were further evaluated in the study of clinical images. The Kendall's Tau coefficients of the selected metrics were 0.59 (FOR), 0.53 (LIRS), and 0.21 (Transformed positivity). In the study of clinical data, a Motion Artifact Score, defined as the product of FOR and LIRS metrics, further improved agreement with reader scores, with a Kendall's Tau coefficient of 0.65. The metrics of FOR, LIRS, and the product of the two metrics provided the highest agreement in motion artifact ranking when compared to the readers, and the highest linear correlation to the reader scores. The validated motion artifact metrics may be useful for developing and evaluating methods to reduce motion in Coronary Computed Tomography Angiography (CCTA) images. © 2017 American Association of Physicists in Medicine.
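The agreement statistic reported above (Kendall's Tau between a metric and the reader scores) is straightforward to compute; a short illustration with made-up per-vessel values follows. The arrays and the combined FOR times LIRS score are stand-ins, not data from the study.

```python
import numpy as np
from scipy.stats import kendalltau

# Hypothetical per-vessel values: reader ground-truth scores and two metrics.
reader_scores = np.array([0.9, 0.3, 0.7, 0.1, 0.5, 0.8])
for_values = np.array([0.85, 0.40, 0.65, 0.20, 0.45, 0.90])   # Fold Overlap Ratio
lirs_values = np.array([0.80, 0.35, 0.70, 0.25, 0.50, 0.95])  # Low-Intensity Region Score

tau_for, p_for = kendalltau(reader_scores, for_values)
tau_combined, _ = kendalltau(reader_scores, for_values * lirs_values)  # product score
print(f"tau(FOR) = {tau_for:.2f}, tau(FOR*LIRS) = {tau_combined:.2f}")
```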
Stimulation artifact correction method for estimation of early cortico-cortical evoked potentials.
Trebaul, Lena; Rudrauf, David; Job, Anne-Sophie; Mălîia, Mihai Dragos; Popa, Irina; Barborica, Andrei; Minotti, Lorella; Mîndruţă, Ioana; Kahane, Philippe; David, Olivier
2016-05-01
Effective connectivity can be explored using direct electrical stimulations in patients suffering from drug-resistant focal epilepsies and investigated with intracranial electrodes. Responses to brief electrical pulses mimic the physiological propagation of signals and manifest as cortico-cortical evoked potentials (CCEP). The first CCEP component is believed to reflect direct connectivity with the stimulated region but the stimulation artifact, a sharp deflection occurring during a few milliseconds, frequently contaminates it. In order to recover the characteristics of early CCEP responses, we developed an artifact correction method based on electrical modeling of the electrode-tissue interface. The biophysically motivated artifact templates are then regressed out of the recorded data as in any classical template-matching removal artifact methods. Our approach is able to make the distinction between the physiological responses time-locked to the stimulation pulses and the non-physiological component. We tested the correction on simulated CCEP data in order to quantify its efficiency for different stimulation and recording parameters. We demonstrated the efficiency of the new correction method on simulations of single trial recordings for early responses contaminated with the stimulation artifact. The results highlight the importance of sampling frequency for an accurate analysis of CCEP. We then applied the approach to experimental data. The model-based template removal was compared to a correction based on the subtraction of the averaged artifact. This new correction method of stimulation artifact will enable investigators to better analyze early CCEP components and infer direct effective connectivity in future CCEP studies. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
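The final regression step described above (removing a fitted artifact template from each single trial) can be sketched with ordinary least squares. The decaying-exponential template in the toy usage is an assumption; in the paper the template comes from the electrode-tissue interface model, which is not reproduced here.

```python
import numpy as np

def regress_out_template(trial, template):
    """Fit the artifact template (plus a constant offset) to one single-trial CCEP
    epoch by least squares and subtract the fitted part."""
    design = np.column_stack([template, np.ones_like(template)])
    coeffs, *_ = np.linalg.lstsq(design, trial, rcond=None)
    return trial - design @ coeffs

# Toy usage with a hypothetical template and a synthetic early response.
fs = 5000.0
t = np.arange(0, 0.05, 1 / fs)
template = np.exp(-t / 0.002)                                  # stand-in artifact shape
trial = 3.0 * template + 20e-6 * np.sin(2 * np.pi * 40 * t)    # artifact + response
cleaned = regress_out_template(trial, template)
```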
Glaser, Johann; Beisteiner, Roland; Bauer, Herbert; Fischmeister, Florian Ph S
2013-11-09
In concurrent EEG/fMRI recordings, EEG data are impaired by the fMRI gradient artifacts which exceed the EEG signal by several orders of magnitude. While several algorithms exist to correct the EEG data, these algorithms lack the flexibility to either leave out or add new steps. The open-source MATLAB toolbox FACET presented here is a modular toolbox for the fast and flexible correction and evaluation of imaging artifacts from concurrently recorded EEG datasets. It consists of an Analysis, a Correction and an Evaluation framework allowing the user to choose from different artifact correction methods with various pre- and post-processing steps to form flexible combinations. The quality of the chosen correction approach can then be evaluated and compared to different settings. FACET was evaluated on a dataset provided with the FMRIB plugin for EEGLAB using two different correction approaches: Averaged Artifact Subtraction (AAS, Allen et al., NeuroImage 12(2):230-239, 2000) and the FMRI Artifact Slice Template Removal (FASTR, Niazy et al., NeuroImage 28(3):720-737, 2005). The obtained results were compared to the FASTR algorithm implemented in the EEGLAB plugin FMRIB. No differences were found between the FACET implementation of FASTR and the original algorithm across all gradient artifact relevant performance indices. The FACET toolbox not only provides facilities for all three tasks (data analysis, artifact correction, and evaluation and documentation of the results), but also offers an easily extendable framework for the development and evaluation of new approaches.
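For orientation, the core idea of Averaged Artifact Subtraction (one of the two approaches FACET wraps) is sketched below in Python. Real implementations such as FACET and FASTR add epoch alignment, weighted averaging, and residual removal that are omitted here, and the gradient-onset markers and epoch length are assumed inputs.

```python
import numpy as np

def averaged_artifact_subtraction(channel, onsets, epoch_len):
    """Basic AAS on one EEG channel: epoch the signal at gradient onsets, average
    the epochs to form an artifact template, then subtract the template from
    each epoch. Assumes every onset leaves room for a full epoch."""
    cleaned = channel.astype(float).copy()
    epochs = np.stack([channel[o:o + epoch_len] for o in onsets])
    template = epochs.mean(axis=0)          # average gradient artifact
    for o in onsets:
        cleaned[o:o + epoch_len] -= template
    return cleaned
```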
Bolstad, Kirsten; Flatabø, Silje; Aadnevik, Daniel; Dalehaug, Ingvild; Vetti, Nils
2018-01-01
Background Metal implants may introduce severe artifacts in computed tomography (CT) images. Over the last few years dedicated algorithms have been developed in order to reduce metal artifacts in CT images. Purpose To investigate and compare metal artifact reduction algorithms (MARs) from four different CT vendors when imaging three different orthopedic metal implants. Material and Methods Three clinical metal implants were attached to the leg of an anthropomorphic phantom: cobalt-chrome; stainless steel; and titanium. Four commercial MARs were investigated: SmartMAR (GE); O-MAR (Philips); iMAR (Siemens); and SEMAR (Toshiba). The images were evaluated subjectively by three observers and analyzed objectively by calculating the fraction of pixels with CT number above 500 HU in a region of interest around the metal. The average CT number and image noise were also measured. Results Both subjective evaluation and objective analysis showed that MARs reduced metal artifacts and improved the image quality for CT images containing metal implants of steel and cobalt-chrome. When using MARs on titanium, all MARs introduced new visible artifacts. Conclusion The effect of MARs varied between CT vendors and different metal implants used in orthopedic surgery. Both in subjective evaluation and objective analysis the effect of applying MARs was most obvious on steel and cobalt-chrome implants when using SEMAR from Toshiba followed by SmartMAR from GE. However, MARs may also introduce new image artifacts especially when used on titanium implants. Therefore, it is important to reconstruct all CT images containing metal with and without MARs.
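The objective analysis described above (fraction of pixels above 500 HU around the metal, plus mean CT number and noise) is simple to express in code; a hedged sketch follows. The ROI mask is assumed to be supplied by the user, and the function is an illustration rather than the study's analysis script.

```python
import numpy as np

def artifact_metrics(ct_slice, roi_mask, threshold_hu=500):
    """Objective measures used in the comparison: fraction of ROI pixels above a
    HU threshold, mean CT number, and noise (standard deviation) in the ROI."""
    roi = ct_slice[roi_mask]
    return {
        "fraction_above_threshold": float(np.mean(roi > threshold_hu)),
        "mean_hu": float(roi.mean()),
        "noise_hu": float(roi.std(ddof=1)),
    }
```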
Development of tools and techniques for monitoring underwater artifacts
NASA Astrophysics Data System (ADS)
Lazar, Iulian; Ghilezan, Alin; Hnatiuc, Mihaela
2016-12-01
The different assessments provide information on the best methods to approach an artifact. The presence and extent of potential threats to archaeology must also be determined. In this paper we present an underwater robot, built in the laboratory, able to identify the artifact and to bring it to the surface. It is a remotely operated vehicle (ROV) which can be controlled from the shore, a boat or a control station, and communication is possible through an Ethernet cable with a maximum length of 100 m. The robot is equipped with an IP camera which sends real-time images that can be accessed anywhere from within the network. The camera also has a microSD card to store the video. The methods developed for data communication between the robot and the user are presented. A communication protocol between the client and server is developed to control the ROV.
Developing a denoising filter for electron microscopy and tomography data in the cloud.
Starosolski, Zbigniew; Szczepanski, Marek; Wahle, Manuel; Rusu, Mirabela; Wriggers, Willy
2012-09-01
The low radiation conditions and the predominantly phase-object image formation of cryo-electron microscopy (cryo-EM) result in extremely high noise levels and low contrast in the recorded micrographs. The process of single particle or tomographic 3D reconstruction does not completely eliminate this noise and is even capable of introducing new sources of noise during alignment or when correcting for instrument parameters. The recently developed Digital Paths Supervised Variance (DPSV) denoising filter uses local variance information to control regional noise in a robust and adaptive manner. The performance of the DPSV filter was evaluated in this review qualitatively and quantitatively using simulated and experimental data from cryo-EM and tomography in two and three dimensions. We also assessed the benefit of filtering experimental reconstructions for visualization purposes and for enhancing the accuracy of feature detection. The DPSV filter eliminates high-frequency noise artifacts (density gaps), which would normally preclude the accurate segmentation of tomography reconstructions or the detection of alpha-helices in single-particle reconstructions. This collaborative software development project was carried out entirely by virtual interactions among the authors using publicly available development and file sharing tools.
BRIDG: a domain information model for translational and clinical protocol-driven research.
Becnel, Lauren B; Hastak, Smita; Ver Hoef, Wendy; Milius, Robert P; Slack, MaryAnn; Wold, Diane; Glickman, Michael L; Brodsky, Boris; Jaffe, Charles; Kush, Rebecca; Helton, Edward
2017-09-01
It is critical to integrate and analyze data from biological, translational, and clinical studies with data from health systems; however, electronic artifacts are stored in thousands of disparate systems that are often unable to readily exchange data. To facilitate meaningful data exchange, a model that presents a common understanding of biomedical research concepts and their relationships with health care semantics is required. The Biomedical Research Integrated Domain Group (BRIDG) domain information model fulfills this need. Software systems created from BRIDG have shared meaning "baked in," enabling interoperability among disparate systems. For nearly 10 years, the Clinical Data Interchange Standards Consortium, the National Cancer Institute, the US Food and Drug Administration, and Health Level 7 International have been key stakeholders in developing BRIDG. BRIDG is an open-source Unified Modeling Language-class model developed through use cases and harmonization with other models. With its 4+ releases, BRIDG includes clinical and now translational research concepts in its Common, Protocol Representation, Study Conduct, Adverse Events, Regulatory, Statistical Analysis, Experiment, Biospecimen, and Molecular Biology subdomains. The model is a Clinical Data Interchange Standards Consortium, Health Level 7 International, and International Organization for Standardization standard that has been utilized in national and international standards-based software development projects. It will continue to mature and evolve in the areas of clinical imaging, pathology, ontology, and vocabulary support. BRIDG 4.1.1 and prior releases are freely available at https://bridgmodel.nci.nih.gov . © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
Method to control artifacts of microstructural fabrication
Shul, Randy J.; Willison, Christi G.; Schubert, W. Kent; Manginell, Ronald P.; Mitchell, Mary-Anne; Galambos, Paul C.
2006-09-12
New methods for fabrication of silicon microstructures have been developed. In these methods, an etching delay layer is deposited and patterned so as to provide differential control on the depth of features being etched into a substrate material. Compensation for etching-related structural artifacts can be accomplished by proper use of such an etching delay layer.
ERIC Educational Resources Information Center
Braswell, Gregory S.
2015-01-01
This exploratory study examined children's experiences with producing and comprehending external representations in a preschool classroom. Data collection and analyses focused on how artifacts, spaces, adult-guided routines, and social conventions shape young children's representational development. Participants included 4- and…
Reasserting the Fundamentals of Systems Analysis and Design through the Rudiments of Artifacts
ERIC Educational Resources Information Center
Jafar, Musa; Babb, Jeffry
2012-01-01
In this paper we present an artifacts-based approach to teaching a senior level Object-Oriented Analysis and Design course. Regardless of the systems development methodology and process model, and in order to facilitate communication across the business modeling, analysis, design, construction and deployment disciplines, we focus on (1) the…
A robust adaptive denoising framework for real-time artifact removal in scalp EEG measurements
NASA Astrophysics Data System (ADS)
Kilicarslan, Atilla; Grossman, Robert G.; Contreras-Vidal, Jose Luis
2016-04-01
Objective. Non-invasive measurement of human neural activity based on the scalp electroencephalogram (EEG) allows for the development of biomedical devices that interface with the nervous system for scientific, diagnostic, therapeutic, or restorative purposes. However, EEG recordings are prone to physiological and non-physiological artifacts of different types and frequency characteristics. Among them, ocular artifacts and signal drifts represent major sources of EEG contamination, particularly in real-time closed-loop brain-machine interface (BMI) applications, which require effective handling of these artifacts across sessions and in natural settings. Approach. We extend the usage of a robust adaptive noise cancelling (ANC) scheme (H∞ filtering) for removal of eye blinks, eye motions, amplitude drifts and recording biases simultaneously. We also characterize volume conduction by estimating the signal propagation levels across all EEG scalp recording areas due to ocular artifact generators. We find that the amplitude and spatial distribution of ocular artifacts vary greatly depending on the electrode location. Therefore, fixed filtering parameters for all recording areas would naturally hinder the true overall performance of an ANC scheme for artifact removal. We treat each electrode as a separate sub-system to be filtered, and without loss of generality, they are assumed to be uncorrelated and uncoupled. Main results. Our results show over 95-99.9% correlation between the raw and processed signals at non-ocular artifact regions, and depending on the contamination profile, 40-70% correlation when ocular artifacts are dominant. We also compare our results with the offline independent component analysis and artifact subspace reconstruction methods, and show that some local quantities are handled better by our sample-adaptive real-time framework. Decoding performance is also compared with multi-day experimental data from 2 subjects, totaling 19 sessions, with and without H∞ filtering of the raw data. Significance. The proposed method allows real-time adaptive artifact removal for EEG-based closed-loop BMI applications and mobile EEG studies in general, thereby increasing the range of tasks that can be studied in action and context while reducing the need for discarding data due to artifacts. Significant increases in decoding performance also justify the effectiveness of the method to be used in real-time closed-loop BMI applications.
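To give a concrete feel for per-electrode adaptive noise cancelling, the sketch below uses a normalized LMS canceller driven by a reference channel (for example EOG). This is a deliberately simpler stand-in for the robust H-infinity formulation in the paper, not the authors' filter; the filter order, step size, and the availability of a reference channel are assumptions.

```python
import numpy as np

def nlms_cancel(primary, reference, order=5, mu=0.5, eps=1e-8):
    """Normalized LMS adaptive noise canceller: predict the artifact in the primary
    EEG channel from a reference (e.g. EOG) channel and subtract the prediction.
    A simplified stand-in for the H-infinity filter described in the abstract."""
    w = np.zeros(order)
    cleaned = np.zeros(len(primary), dtype=float)
    for n in range(len(primary)):
        x = reference[max(0, n - order + 1):n + 1][::-1]   # most recent reference samples
        x = np.pad(x, (0, order - len(x)))                 # zero-pad at the start of the record
        estimate = w @ x                                   # predicted artifact sample
        error = primary[n] - estimate                      # cleaned sample
        w += mu * error * x / (x @ x + eps)                # NLMS weight update
        cleaned[n] = error
    return cleaned
```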
Cömert, Alper; Hyttinen, Jari
2015-05-15
With advances in technology and increasing demand, wearable biosignal monitoring is developing and new applications are emerging. One of the main challenges facing the widespread use of wearable monitoring systems is the motion artifact. The sources of the motion artifact lie in the skin-electrode interface. Reducing the motion and deformation at this interface should have positive effects on signal quality. In this study, we aim to investigate whether the structure supporting the electrode can be designed to reduce the motion artifact with the hypothesis that this can be achieved by stabilizing the skin deformations around the electrode. We compare four textile electrodes with different support structure designs: a soft padding larger than the electrode area, a soft padding larger than the electrode area with a novel skin deformation restricting design, a soft padding the same size as the electrode area, and a rigid support the same size as the electrode. With five subjects and two electrode locations placed over different kinds of tissue at various mounting forces, we simultaneously measured the motion artifact, a motion-affected ECG, and the real-time skin-electrode impedance during the application of controlled motion to the electrodes. The design of the electrode support structure has an effect on the generated motion artifact; good design with a skin stabilizing structure makes the electrodes physically more motion artifact resilient, directly affecting signal quality. Increasing the applied mounting force shows a positive effect up to 1,000 g of applied force. The properties of tissue under the electrode are an important factor in the generation of the motion artifact and the functioning of the electrodes. The relationship of motion artifact amplitude to the electrode movement magnitude is seen to be linear for smaller movements. For larger movements, the increase of motion generated a disproportionally larger artifact. The motion artifact and the induced impedance change were caused by the electrode motion and contained the same frequency components as the applied electrode motion pattern. We found that stabilizing the skin around the electrode using an electrode structure that manages to successfully distribute the force and movement to an area beyond the borders of the electrical contact area reduces the motion artifact when compared to structures that are the same size as the electrode area.
Revisiting the radio interferometer measurement equation. I. A full-sky Jones formalism
NASA Astrophysics Data System (ADS)
Smirnov, O. M.
2011-03-01
Context. Since its formulation by Hamaker et al., the radio interferometer measurement equation (RIME) has provided a rigorous mathematical basis for the development of novel calibration methods and techniques, including various approaches to the problem of direction-dependent effects (DDEs). However, acceptance of the RIME in the radio astronomical community at large has been slow, which is partially due to the limited availability of software to exploit its power, and the sparsity of practical results. This needs to change urgently. Aims: This series of papers aims to place recent developments in the treatment of DDEs into one RIME-based mathematical framework, and to demonstrate the ease with which the various effects can be described and understood. It also aims to show the benefits of a RIME-based approach to calibration. Methods: Paper I re-derives the RIME from first principles, extends the formalism to the full-sky case, and incorporates DDEs. Paper II then uses the formalism to describe self-calibration, both with a full RIME, and with the approximate equations of older software packages, and shows how this is affected by DDEs. It also gives an overview of real-life DDEs and proposed methods of dealing with them. Finally, in Paper III some of these methods are exercised to achieve an extremely high-dynamic range calibration of WSRT observations of 3C 147 at 21 cm, with full treatment of DDEs. Results: The RIME formalism is extended to the full-sky case (Paper I), and is shown to be an elegant way of describing calibration and DDEs (Paper II). Applying this to WSRT data (Paper III) results in a noise-limited image of the field around 3C 147 with a very high dynamic range (1.6 million), and none of the off-axis artifacts that plague regular selfcal. The resulting differential gain solutions contain significant information on DDEs and errors in the sky model. Conclusions: The RIME is a powerful formalism for describing radio interferometry, and underpins the development of novel calibration methods, in particular those dealing with DDEs. One of these is the differential gains approach used for the 3C 147 reduction. Differential gains can eliminate DDE-related artifacts, and provide information for iterative improvements of sky models. Perhaps most importantly, sources as faint as 2 mJy have been shown to yield meaningful differential gain solutions, and thus can be used as potential calibration beacons in other DDE-related schemes.
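For readers unfamiliar with the formalism, the basic structure of the RIME is summarized below in LaTeX. This is a schematic paraphrase following the usual Jones notation (per-antenna Jones terms split into direction-independent gains G and direction-dependent terms E), not a quotation of the paper's own equations or normalization.

```latex
% Single discrete source: the 2x2 visibility matrix seen by antennas p and q is the
% source brightness matrix B sandwiched between the per-antenna Jones chains.
V_{pq} = J_p \, B \, J_q^{H}

% Schematic full-sky form with direction-dependent effects:
V_{pq} = G_p \left( \iint_{lm} \frac{1}{n}\, E_p(l,m)\, B(l,m)\, E_q^{H}(l,m)\,
         e^{-2\pi i \left[ u_{pq} l + v_{pq} m + w_{pq}(n-1) \right]} \, dl\, dm \right) G_q^{H},
\qquad n = \sqrt{1 - l^2 - m^2}
```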
A novel statistical method for quantitative comparison of multiple ChIP-seq datasets.
Chen, Li; Wang, Chi; Qin, Zhaohui S; Wu, Hao
2015-06-15
ChIP-seq is a powerful technology to measure the protein binding or histone modification strength in the whole genome scale. Although there are a number of methods available for single ChIP-seq data analysis (e.g. 'peak detection'), rigorous statistical method for quantitative comparison of multiple ChIP-seq datasets with the considerations of data from control experiment, signal to noise ratios, biological variations and multiple-factor experimental designs is under-developed. In this work, we develop a statistical method to perform quantitative comparison of multiple ChIP-seq datasets and detect genomic regions showing differential protein binding or histone modification. We first detect peaks from all datasets and then union them to form a single set of candidate regions. The read counts from IP experiment at the candidate regions are assumed to follow Poisson distribution. The underlying Poisson rates are modeled as an experiment-specific function of artifacts and biological signals. We then obtain the estimated biological signals and compare them through the hypothesis testing procedure in a linear model framework. Simulations and real data analyses demonstrate that the proposed method provides more accurate and robust results compared with existing ones. An R software package ChIPComp is freely available at http://web1.sph.emory.edu/users/hwu30/software/ChIPComp.html. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Qualitative Evaluation of Fiducial Markers for Radiotherapy Imaging
Chan, Maria F.; Cohen, Gil’ad N.; Deasy, Joseph O.
2016-01-01
Purpose To evaluate visibility, artifacts, and distortions of various commercial markers in magnetic resonance imaging (MRI), computed tomography (CT), and ultrasound imaging used for radiotherapy planning and treatment guidance. Methods We compare 2 solid gold markers, 4 gold coils, and 1 polymer marker from 3 vendors. Imaging modalities used were 3-T and 1.5-T GE MRIs, Siemens Sequoia 512 Ultrasound, Philips Big Bore CT, Varian Trilogy linear accelerator (cone-beam CT [CBCT], on-board imager kilovoltage [OBI-kV], electronic portal imaging device megavoltage [EPID-MV]), and Medtronic O-ARM CBCT. Markers were imaged in a 30 × 30 × 10 cm³ custom bolus phantom. In one experiment, Surgilube was used around the markers to reduce air gaps. Images were saved in Digital Imaging and Communications in Medicine (DICOM) format and analyzed using in-house software. Profiles across the markers were used for objective comparison of the markers’ signals. The visibility and artifacts/distortions produced by each marker were assessed qualitatively and quantitatively. Results All markers are visible in CT, CBCT, OBI-kV, and ultrasound. Gold markers below 0.75 mm in diameter are not visible in EPID-MV images. The larger the markers, the more CT and CBCT image artifacts there are, yet the degree of the artifact depends on scan parameters and the scanner itself. Visibility of gold coils of 0.75 mm diameter or larger is comparable across all imaging modalities studied. The polymer marker causes minimal artifacts in CT and CBCT but has poor visibility in EPID-MV. Gold coils of 0.5 mm exhibit poor visibility in MRI and EPID-MV due to their small size. Gold markers are more visible in 3-T T1 gradient-recalled echo than in 1.5-T T1 fast spin-echo, depending on the scan sequence. In this study, all markers are clearly visible on ultrasound. Conclusion All gold markers are visible in CT, CBCT, kV, and ultrasound; however, only the large diameter markers are visible in MV. When MR and EPID-MV imagers are used, the selection of fiducial markers is not straightforward. For hybrid kV/MV image-guided radiotherapy imaging, larger diameter markers are suggested. If using kV imaging alone, smaller sized markers may be used in smaller sized patients in order to reduce artifacts. Only larger diameter gold markers are visible across all imaging modalities. PMID:25230715
An Approach to Building a Traceability Tool for Software Development
NASA Technical Reports Server (NTRS)
Delgado, Nelly; Watson, Tom
1997-01-01
It is difficult in a large, complex computer program to ensure that it meets the specified requirements. As the program evolves over time, all program constraints originally elicited during the requirements phase must be maintained. In addition, during the life cycle of the program, requirements typically change and the program must consistently reflect those changes. Imagine the following scenario. Company X wants to develop a system to automate its assembly line. With such a large system, there are many different stakeholders, e.g., managers, experts such as industrial and mechanical engineers, and end-users. Requirements would be elicited from all of the stakeholders involved in the system, with each stakeholder contributing their point of view to the requirements. For example, some of the requirements provided by an industrial engineer may concern the movement of parts through the assembly line. A point of view provided by the electrical engineer may be reflected in constraints concerning maximum power usage. End-users may be concerned with comfort and safety issues, whereas managers are concerned with the efficiency of the operation. With so many points of view affecting the requirements, it is difficult to manage them and communicate information to relevant stakeholders, and it is likely that conflicts in the requirements will arise. In the coding process, the implementors will make additional assumptions and interpretations on the design and the requirements of the system. During any stage of development, stakeholders may request that a requirement be added or changed. In such a dynamic environment, it is difficult to guarantee that the system will preserve the current set of requirements. Tracing, the mapping between objects in the artifacts of the system being developed, addresses this issue. Artifacts encompass documents such as the system definition, interview transcripts, memoranda, the software requirements specification, user's manuals, the functional specifications, design reports, and system code. Tracing helps 1) validate system features against the requirements specification, 2) identify error sources and, most importantly, 3) manage change. With so many people involved in the development of the system, it becomes necessary to identify the reasons behind the design requirements or the implementation decisions. This paper is concerned with an approach that maps documents to constraints that capture properties of and relationships between the objects being modeled by the program. Section 2 provides the reader with a background on traceability tools. Section 3 gives a brief description of the context monitoring system on which the approach suggested in this paper is based. Section 4 presents an overview of our approach to providing traceability. The last section presents our future direction of research.
What is technology? A study of fifth and eighth grade student ideas about the Nature of Technology
NASA Astrophysics Data System (ADS)
Digironimo, Nicole
Most, if not all, standards for science and technology education and curriculum indicate that knowledge of the Nature of Technology is an educational goal, yet the literature lacks an established definition for the Nature of Technology. Additionally, the research on student ideas about the Nature of Technology is insufficient. After reviewing the literature on science and technology education, the philosophy of technology, and the history of technology, this study presents an internally consistent definition for the Nature of Technology. This definition illustrates how the Nature of Technology includes five dimensions: Technology as Artifacts; Technology as a Creation Process; Technology as a Human Practice; The History of Technology; and The Current Role of Technology. Using an interview protocol developed for this study, a small group of 5th and 8th grade students were interviewed to ascertain their ideas about the Nature of Technology. The results indicate that there are a variety of ideas present in the thinking of young people. All of the participants expressed one of two ideas about technological artifacts: technological artifacts are electronic or technological artifacts are invented. All of the participants identified particular skills needed to invent technological artifacts; some of the participants included mobility and concluded that disabled people cannot be inventors. Despite their experiences with technological artifacts (including educational technology tools), a few of the participants were uncertain whether they would identify themselves as technological. More than half of the participants did not believe older artifacts can still be considered technology. Most of the participants were apprehensive about our technological future; the main issue expressed by the participants was the environment. Other than environmental concerns, most of the participants were unable to identify global issues regarding technological use and development. Overall, these findings increase our knowledge of the ideas young people have about the Nature of Technology, which can inform future research on teaching and learning about science and technology.
NASA Astrophysics Data System (ADS)
Martin, E. R.; Dou, S.; Lindsey, N.; Chang, J. P.; Biondi, B. C.; Ajo Franklin, J. B.; Wagner, A. M.; Bjella, K.; Daley, T. M.; Freifeld, B. M.; Robertson, M.; Ulrich, C.; Williams, E. F.
2016-12-01
Localized strong sources of noise in an array have been shown to cause artifacts in Green's function estimates obtained via cross-correlation. Their effect is often reduced through the use of cross-coherence. Beyond independent localized sources, temporally or spatially correlated sources of noise frequently occur in practice but violate basic assumptions of much of the theory behind ambient noise Green's function retrieval. These correlated noise sources can occur in urban environments due to transportation infrastructure, or in areas around industrial operations like pumps running at CO2 sequestration sites or oil and gas drilling sites. Better understanding of these artifacts should help us develop and justify methods for their automatic removal from Green's function estimates. We derive expected artifacts in cross-correlations from several distributions of correlated noise sources including point sources that are exact time-lagged repeats of each other and Gaussian-distributed in space and time with covariance that exponentially decays. Assuming the noise distribution stays stationary over time, the artifacts become more coherent as more ambient noise is included in the Green's function estimates. We support our results with simple computational models. We observed these artifacts in Green's function estimates from a 2015 ambient noise study in Fairbanks, AK where a trenched distributed acoustic sensing (DAS) array was deployed to collect ambient noise alongside a road with the goal of developing a permafrost thaw monitoring system. We found that joints in the road repeatedly being hit by cars travelling at roughly the speed limit led to artifacts similar to those expected when several points are time-lagged copies of each other. We also show test results of attenuating the effects of these sources during time-lapse monitoring of an active thaw test in the same location with noise detected by a 2D trenched DAS array.
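To make the contrast between the two estimators discussed above concrete, the sketch below computes frequency-domain cross-correlation and cross-coherence for a pair of equal-length noise records; the spectral-amplitude whitening in cross-coherence is what suppresses the imprint of strong localized or correlated sources. The small stabilizer `eps` is an implementation assumption.

```python
import numpy as np

def xcorr_and_xcoherence(a, b, eps=1e-12):
    """Cross-correlation and cross-coherence of two equal-length ambient-noise records,
    computed in the frequency domain. Cross-coherence normalizes by the amplitude
    spectra of both traces before transforming back to the time domain."""
    A, B = np.fft.rfft(a), np.fft.rfft(b)
    cross = A * np.conj(B)
    xcorr = np.fft.irfft(cross, n=len(a))
    xcoh = np.fft.irfft(cross / (np.abs(A) * np.abs(B) + eps), n=len(a))
    return xcorr, xcoh
```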
Miksys, N; Xu, C; Beaulieu, L; Thomson, R M
2015-08-07
This work investigates and compares CT image metallic artifact reduction (MAR) methods and tissue assignment schemes (TAS) for the development of virtual patient models for permanent implant brachytherapy Monte Carlo (MC) dose calculations. Four MAR techniques are investigated to mitigate seed artifacts from post-implant CT images of a homogeneous phantom and eight prostate patients: a raw sinogram approach using the original CT scanner data and three methods (simple threshold replacement (STR), 3D median filter, and virtual sinogram) requiring only the reconstructed CT image. Virtual patient models are developed using six TAS ranging from the AAPM-ESTRO-ABG TG-186 basic approach of assigning uniform density tissues (resulting in a model not dependent on MAR) to more complex models assigning prostate, calcification, and mixtures of prostate and calcification using CT-derived densities. The EGSnrc user-code BrachyDose is employed to calculate dose distributions. All four MAR methods eliminate bright seed spot artifacts, and the image-based methods provide comparable mitigation of artifacts compared with the raw sinogram approach. However, each MAR technique has limitations: STR is unable to mitigate low CT number artifacts, the median filter blurs the image which challenges the preservation of tissue heterogeneities, and both sinogram approaches introduce new streaks. Large local dose differences are generally due to differences in voxel tissue-type rather than mass density. The largest differences in target dose metrics (D90, V100, V150), over 50% lower compared to the other models, are when uncorrected CT images are used with TAS that consider calcifications. Metrics found using models which include calcifications are generally a few percent lower than prostate-only models. Generally, metrics from any MAR method and any TAS which considers calcifications agree within 6%. Overall, the studied MAR methods and TAS show promise for further retrospective MC dose calculation studies for various permanent implant brachytherapy treatments.
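Of the image-based MAR techniques compared above, simple threshold replacement (STR) is the most direct to sketch: bright seed-artifact voxels in the reconstructed image are replaced with a tissue-like CT number. The threshold and replacement values below are assumptions, and, as the study notes, this does nothing for low-CT-number streaks.

```python
import numpy as np

def simple_threshold_replacement(ct_volume, bright_hu=800, replacement_hu=40):
    """Replace voxels above a brightness threshold (bright seed artifacts) with a
    soft-tissue-like CT number. Values are illustrative, not those of the study;
    dark (low CT number) artifacts are left untouched, as in the paper."""
    corrected = ct_volume.copy()
    corrected[corrected > bright_hu] = replacement_hu
    return corrected
```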
Fischer, Christoph; Domer, Benno; Wibmer, Thomas; Penzel, Thomas
2017-03-01
Photoplethysmography has been used in a wide range of medical devices for measuring oxygen saturation, cardiac output, assessing autonomic function, and detecting peripheral vascular disease. Artifacts can render the photoplethysmogram (PPG) useless. Thus, algorithms capable of identifying artifacts are critically important. However, the published PPG algorithms are limited in algorithm and study design. Therefore, the authors developed a novel embedded algorithm for real-time pulse waveform (PWF) segmentation and artifact detection based on a contour analysis in the time domain. This paper provides an overview about PWF and artifact classifications, presents the developed PWF analysis, and demonstrates the implementation on a 32-bit ARM core microcontroller. The PWF analysis was validated with data records from 63 subjects acquired in a sleep laboratory, ergometry laboratory, and intensive care unit in equal parts. The output of the algorithm was compared with harmonized experts' annotations of the PPG with a total duration of 31.5 h. The algorithm achieved a beat-to-beat comparison sensitivity of 99.6%, specificity of 90.5%, precision of 98.5%, and accuracy of 98.3%. The interrater agreement expressed as Cohen's kappa coefficient was 0.927 and as F-measure was 0.990. In conclusion, the PWF analysis seems to be a suitable method for PPG signal quality determination, real-time annotation, data compression, and calculation of additional pulse wave metrics such as amplitude, duration, and rise time.
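The validation statistics reported above (sensitivity, specificity, precision, accuracy, and Cohen's kappa) follow directly from a beat-by-beat confusion matrix; a short illustration with made-up labels is given below using scikit-learn.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, confusion_matrix

# Hypothetical beat-by-beat labels: 1 = valid pulse waveform, 0 = artifact.
expert = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1])
algorithm = np.array([1, 1, 0, 1, 1, 1, 1, 0, 1, 1])

tn, fp, fn, tp = confusion_matrix(expert, algorithm).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
precision = tp / (tp + fp)
accuracy = (tp + tn) / (tp + tn + fp + fn)
kappa = cohen_kappa_score(expert, algorithm)   # interrater agreement
```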
Zhang, Jie; Fan, Xinghua; Graham, Lisa; Chan, Tak W; Brook, Jeffrey R
2013-01-01
Sampling of particle-phase organic carbon (OC) from diesel engines is complicated by adsorption and evaporation of semivolatile organic carbon (SVOC), defined as positive and negative artifacts, respectively. In order to explore these artifacts, an integrated organic gas and particle sampler (IOGAPS) was applied, in which an XAD-coated multichannel annular denuder was placed upstream to remove the gas-phase SVOC and two downstream sorbent-impregnated filters (SIFs) were employed to capture the evaporated SVOC. Positive artifacts can be reduced by using a denuder but particle loss also occurs. This paper investigates the IOGAPS with respect to particle loss, denuder efficiency, and particle-phase OC artifacts by comparing OC, elemental carbon (EC), SVOC, and selected organic species, as well as particle size distributions. Compared to the filterpack methods typically used, the IOGAPS approach results in estimation of both positive and negative artifacts, especially the negative artifact. The positive and negative artifacts were 190 microg/m3 and 67 microg/m3, representing 122% and 43% of the total particle OC measured by the IOGAPS, respectively. However, particle loss and denuder breakthrough were also found to exist. Monitoring particle mass loss by particle number or EC concentration yielded similar results, ranging from 10% to 24% depending upon flow rate. Using the measurements of selected particle-phase organic species to infer particle loss resulted in larger estimates, on the order of 32%. The denuder collection efficiency for SVOCs at 74 L/min was found to be less than 100%, with an average of 84%. In addition to these uncertainties, the IOGAPS method requires a considerable amount of extra effort to apply. These disadvantages must be weighed against the benefits of being able to estimate positive artifacts and correct, with some uncertainty, for the negative artifacts when selecting a method for sampling diesel emissions. Measurements of diesel emissions are necessary to understand their adverse impacts. Much of the emission is organic carbon covering a range of volatilities, complicating determination of the particle fraction because of sampling artifacts. In this paper an approach to quantify artifacts is evaluated for a diesel engine. This showed that 63% of the particle organic carbon typically measured could be the positive artifact, while the negative artifact is about one-third of this value. However, this approach adds time and expense and leads to other uncertainties, implying that effort is needed to develop methods to accurately measure diesel emissions.
Accounting Artifacts in High-Throughput Toxicity Assays.
Hsieh, Jui-Hua
2016-01-01
Compound activity identification is the primary goal in high-throughput screening (HTS) assays. However, assay artifacts including both systematic (e.g., compound auto-fluorescence) and nonsystematic (e.g., noise) complicate activity interpretation. In addition, other than the traditional potency parameter, half-maximal effect concentration (EC50), additional activity parameters (e.g., point-of-departure, POD) could be derived from HTS data for activity profiling. A data analysis pipeline has been developed to handle the artifacts and to provide compound activity characterization with either binary or continuous metrics. This chapter outlines the steps in the pipeline using Tox21 glucocorticoid receptor (GR) β-lactamase assays, including the formats to identify either agonists or antagonists, as well as the counter-screen assays for identifying artifacts as examples. The steps can be applied to other lower-throughput assays with concentration-response data.
Bosy-Westphal, Anja; Danielzik, Sandra; Becker, Christine; Geisler, Corinna; Onur, Simone; Korth, Oliver; Bührens, Frederike; Müller, Manfred J
2005-09-01
Air-displacement plethysmography (ADP) is now widely used for body composition measurement in pediatric populations. However, the manufacturer's software developed for adults leaves a potential bias for application in children and adolescents, and recent publications do not consistently use child-specific corrections. Therefore we analyzed child-specific ADP corrections with respect to quantity and etiology of bias compared with adult formulas. An optimal correction protocol is provided giving step-by-step instructions for calculations. In this study, 258 children and adolescents (143 girls and 115 boys ranging from 5 to 18 y) with a high prevalence of overweight or obesity (28.0% in girls and 22.6% in boys) were examined by ADP applying the manufacturer's software as well as published equations for child-specific corrections for surface area artifact (SAA), thoracic gas volume (TGV), and density of fat-free mass (FFM). Compared with child-specific equations for SAA, TGV, and density of FFM, the mean overestimation of the percentage of fat mass using the manufacturer's software was 10% in children and adolescents. Half of the bias derived from the use of Siri's equation not corrected for age-dependent differences in FFM density. An additional 3 and 2% of bias resulted from the application of adult equations for prediction of SAA and TGV, respectively. Different child-specific equations used to predict TGV did not differ in the percentage of fat mass. We conclude that there is a need for child-specific equations in ADP raw data analysis considering SAA, TGV, and density of FFM.
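Half of the bias reported above comes from using Siri's adult two-compartment equation, which assumes an FFM density of 1.100 g/cm3. The sketch below contrasts Siri's formula with the general two-compartment solution into which an age-specific FFM density can be inserted; the example FFM density of 1.085 g/cm3 is illustrative, not a value from the child-specific equations cited in the study.

```python
def percent_fat_siri(body_density):
    """Siri's adult two-compartment equation (assumes FFM density 1.100 g/cm3)."""
    return (4.95 / body_density - 4.50) * 100.0

def percent_fat_two_compartment(body_density, ffm_density, fat_density=0.900):
    """General two-compartment model: solve 1/Db = f/d_fat + (1 - f)/d_ffm for the
    fat fraction f. Passing an age- and sex-specific FFM density reproduces the
    child-specific correction discussed in the abstract."""
    f = (1.0 / body_density - 1.0 / ffm_density) / (1.0 / fat_density - 1.0 / ffm_density)
    return 100.0 * f

# The same measured density reads as noticeably fatter under the adult assumption.
print(percent_fat_siri(1.045))                                   # ~23.7 % fat
print(percent_fat_two_compartment(1.045, ffm_density=1.085))     # ~18.6 % fat
```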
Design of Scalable and Effective Earth Science Collaboration Tool
NASA Astrophysics Data System (ADS)
Maskey, M.; Ramachandran, R.; Kuo, K. S.; Lynnes, C.; Niamsuwan, N.; Chidambaram, C.
2014-12-01
Collaborative research is growing rapidly. Many tools including IDEs are now beginning to incorporate new collaborative features. Software engineering research has shown the effectiveness of collaborative programming and analysis. In particular, drastic reduction in software development time resulting in reduced cost has been highlighted. Recently, we have witnessed the rise of applications that allow users to share their content. Most of these applications scale such collaboration using cloud technologies. Earth science research needs to adopt collaboration technologies to reduce redundancy, cut cost, expand the knowledge base, and scale research experiments. To address these needs, we developed the Earth science collaboration workbench (CWB). CWB provides researchers with various collaboration features by augmenting their existing analysis tools to minimize the learning curve. During the development of the CWB, we understood that Earth science collaboration tasks are varied and we concluded that it is not possible to design a tool that serves all collaboration purposes. We adopted a mix of synchronous and asynchronous sharing methods that can be used to perform collaboration across time and location dimensions. We have used cloud technology for scaling the collaboration. The cloud has been a highly utilized and valuable tool for Earth science researchers. Among other usages, the cloud is used for sharing research results, Earth science data, and virtual machine images, allowing CWB to create and maintain research environments and networks to enhance collaboration between researchers. Furthermore, the collaborative versioning tool Git is integrated into CWB for versioning of science artifacts. In this paper, we present our experience in designing and implementing the CWB. We will also discuss the integration of collaborative code development use cases for data search and discovery using NASA DAAC and simulation of satellite observations using the NASA Earth Observing System Simulation Suite (NEOS3).
ACHIEVING CONSISTENT DOPPLER MEASUREMENTS FROM SDO/HMI VECTOR FIELD INVERSIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schuck, Peter W.; Antiochos, S. K.; Leka, K. D.
NASA’s Solar Dynamics Observatory is delivering vector magnetic field observations of the full solar disk with unprecedented temporal and spatial resolution; however, the satellite is in a highly inclined geosynchronous orbit. The relative spacecraft–Sun velocity varies by ±3 km s⁻¹ over a day, which introduces major orbital artifacts in the Helioseismic and Magnetic Imager (HMI) data. We demonstrate that the orbital artifacts contaminate all spatial and temporal scales in the data. We describe a newly developed three-stage procedure for mitigating these artifacts in the Doppler data obtained from the Milne–Eddington inversions in the HMI pipeline. The procedure ultimately uses 32 velocity-dependent coefficients to adjust 10 million pixels—a remarkably sparse correction model given the complexity of the orbital artifacts. This procedure was applied to full-disk images of AR 11084 to produce consistent Dopplergrams. The data adjustments reduce the power in the orbital artifacts by 31 dB. Furthermore, we analyze in detail the corrected images and show that our procedure greatly improves the temporal and spectral properties of the data without adding any new artifacts. We conclude that this new procedure makes a dramatic improvement in the consistency of the HMI data and in its usefulness for precision scientific studies.
Preparation of an Au-Pt alloy free from artifacts in magnetic resonance imaging.
Kodama, Tomonobu; Nakai, Ryusuke; Goto, Kenji; Shima, Kunihiro; Iwata, Hiroo
2017-12-01
When magnetic resonance imaging (MRI) is performed on patients carrying metallic implants, artifacts can disturb the images around the implants, often making it difficult to interpret them appropriately. However, metallic materials are and will be indispensable as raw materials for medical devices because of their electric conductivity, visibility under X-ray fluoroscopy, and other favorable features. What is now desired is to develop a metallic material which causes no artifacts during MRI. In the present study, we prepared single-phase, homogeneous Au-Pt alloys (Au, a diamagnetic metal, and Pt, a paramagnetic metal) by thermal treatment. Volume magnetic susceptibility was measured with a SQUID flux meter and MRI artifacts were evaluated using a 1.5-T scanner. After the final thermal treatment, an entirely recrystallized, homogeneous microstructure was observed. The Au-35Pt alloy was shown to have a volume magnetic susceptibility of -8.8 ppm, making it almost free from artifacts during MRI. We thus prepared an Au-35Pt alloy which had a magnetic susceptibility very close to that of living tissue and caused far fewer artifacts during MRI. It is promising as a material for spinal cages, intracranial electrodes, cerebral aneurysm embolization coils, markers for MRI and so on. Copyright © 2017 Elsevier Inc. All rights reserved.
Rashid, Shams; Rapacchi, Stanislas; Shivkumar, Kalyanam; Plotnik, Adam; Finn, J. Paul; Hu, Peng
2015-01-01
Purpose To study the effects of cardiac devices on three-dimensional (3D) late gadolinium enhancement (LGE) MRI and to develop a 3D LGE protocol for implantable cardioverter defibrillator (ICD) patients with reduced image artifacts. Theory and Methods The 3D LGE sequence was modified by implementing a wideband inversion pulse, which reduces hyperintensity artifacts, and by increasing the bandwidth of the excitation pulse. The modified wideband 3D LGE sequence was tested in phantoms and evaluated in six volunteers and five patients with ICDs. Results Phantom and in vivo results demonstrated extended signal void and ripple artifacts in 3D LGE that were associated with ICDs. The reason for these artifacts was slab profile distortion and the subsequent aliasing in the slice-encoding direction. The modified wideband 3D LGE provided significantly greater reduction of ripple artifacts than 3D LGE with wideband inversion only. Comparison of 3D and 2D LGE images demonstrated improved spatial resolution of the heart using 3D LGE. Conclusion Increased bandwidth of the inversion and excitation pulses can significantly reduce image artifacts associated with ICDs. Our modified wideband 3D LGE protocol can be readily used for imaging patients with ICDs provided appropriate safety guidelines are followed. PMID:25772155
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Hanming; Wang, Linyuan; Li, Lei
2016-06-15
Purpose: Metal artifact reduction (MAR) is a major problem and a challenging issue in x-ray computed tomography (CT) examinations. Iterative reconstruction from sinograms unaffected by metals shows promising potential in detail recovery. This reconstruction has been the subject of much research in recent years. However, conventional iterative reconstruction methods easily introduce new artifacts around metal implants because of incomplete data reconstruction and inconsistencies in practical data acquisition. Hence, this work aims at developing a method to suppress newly introduced artifacts and improve the image quality around metal implants for the iterative MAR scheme. Methods: The proposed method consists of two steps based on the general iterative MAR framework. An uncorrected image is initially reconstructed, and the corresponding metal trace is obtained. The iterative reconstruction method is then used to reconstruct images from the unaffected sinogram. In the reconstruction step of this work, an iterative strategy utilizing unmatched projector/backprojector pairs is used. A ramp filter is introduced into the back-projection procedure to restrain the inconsistency components in low frequencies and generate more reliable images of the regions around metals. Furthermore, a constrained total variation (TV) minimization model is also incorporated to enhance efficiency. The proposed strategy is implemented based on an iterative FBP and an alternating direction minimization (ADM) scheme, respectively. The developed algorithms are referred to as “iFBP-TV” and “TV-FADM,” respectively. Two projection-completion-based MAR methods and three iterative MAR methods are performed simultaneously for comparison. Results: The proposed method performs reasonably on both simulation and real CT-scanned datasets. This approach could reduce streak metal artifacts effectively and avoid the mentioned effects in the vicinity of the metals. The improvements are evaluated by inspecting regions of interest and by comparing the root-mean-square errors, normalized mean absolute distance, and universal quality index metrics of the images. Both iFBP-TV and TV-FADM methods outperform other counterparts in all cases. Unlike the conventional iterative methods, the proposed strategy utilizing unmatched projector/backprojector pairs shows excellent performance in detail preservation and prevention of the introduction of new artifacts. Conclusions: Qualitative and quantitative evaluations of experimental results indicate that the developed method outperforms classical MAR algorithms in suppressing streak artifacts and preserving the edge structural information of the object. In particular, structures lying close to metals can be gradually recovered because of the reduction of artifacts caused by inconsistency effects.
Determination of Orbital Parameters for Visual Binary Stars Using a Fourier-Series Approach
NASA Astrophysics Data System (ADS)
Brown, D. E.; Prager, J. R.; DeLeo, G. G.; McCluskey, G. E., Jr.
2001-12-01
We expand on the Fourier transform method of Monet (ApJ 234, 275, 1979) to infer the orbital parameters of visual binary stars, and we present results for several systems, both simulated and real. Although originally developed to address binary systems observed through at least one complete period, we have extended the method to deal explicitly with cases where the orbital data are less complete. This is especially useful in cases where the period is so long that only a fragment of the orbit has been recorded. We utilize Fourier-series fitting methods appropriate to data sets covering less than one period and containing random measurement errors. In so doing, we address issues of over-determination in fitting the data and the reduction of other deleterious Fourier-series artifacts. We developed our algorithm using the MAPLE mathematical software and tested it on numerous "synthetic" systems as well as several real binaries, including Xi Boo, 24 Aqr, and Bu 738. This work was supported at Lehigh University by the Delaware Valley Space Grant Consortium and by NSF-REU grant PHY-9820301.
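The core fitting step can be illustrated with a short linear least-squares fit of a truncated Fourier series to a partial-orbit data fragment; the trial period, harmonic order, and synthetic data below are illustrative only and do not reproduce the authors' MAPLE implementation or the subsequent inference of orbital elements.

```python
import numpy as np

def fit_fourier_series(t, x, period, n_harmonics=2):
    """Least-squares fit of x(t) ~ a0 + sum_k [a_k cos(k*w*t) + b_k sin(k*w*t)], w = 2*pi/period.

    Keeping n_harmonics small guards against over-determination when the
    observations cover only a fragment of the period.
    """
    omega = 2.0 * np.pi / period
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.cos(k * omega * t))
        cols.append(np.sin(k * omega * t))
    design = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(design, x, rcond=None)
    return coeffs, design @ coeffs          # Fourier coefficients and fitted curve

# Synthetic fragment of a long-period orbit (separation along one axis) with noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 40.0, 60)                           # years of coverage
x = 2.0 + 1.5 * np.cos(2 * np.pi * t / 150.0 + 0.4)      # "true" signal, P = 150 yr
x += rng.normal(0.0, 0.05, t.size)                       # random measurement errors
coeffs, x_fit = fit_fourier_series(t, x, period=150.0)
```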
McCarthy, Davis J; Campbell, Kieran R; Lun, Aaron T L; Wills, Quin F
2017-04-15
Single-cell RNA sequencing (scRNA-seq) is increasingly used to study gene expression at the level of individual cells. However, preparing raw sequence data for further analysis is not a straightforward process. Biases, artifacts and other sources of unwanted variation are present in the data, requiring substantial time and effort to be spent on pre-processing, quality control (QC) and normalization. We have developed the R/Bioconductor package scater to facilitate rigorous pre-processing, quality control, normalization and visualization of scRNA-seq data. The package provides a convenient, flexible workflow to process raw sequencing reads into a high-quality expression dataset ready for downstream analysis. scater provides a rich suite of plotting tools for single-cell data and a flexible data structure that is compatible with existing tools and can be used as infrastructure for future software development. The open-source code, along with installation instructions, vignettes and case studies, is available through Bioconductor at http://bioconductor.org/packages/scater . davis@ebi.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
Lee, Jinseok; McManus, David D; Merchant, Sneh; Chon, Ki H
2012-06-01
We present a real-time method for the detection of motion and noise (MN) artifacts, which frequently interfere with accurate rhythm assessment when ECG signals are collected from Holter monitors. Our MN artifact detection approach involves two stages. The first stage involves the use of the first-order intrinsic mode function (F-IMF) from the empirical mode decomposition to isolate the artifacts' dynamics, as they are largely concentrated in the higher frequencies. The second stage of our approach uses three statistical measures on the F-IMF time series to look for characteristics of randomness and variability, which are hallmark signatures of MN artifacts: the Shannon entropy, mean, and variance. We then use the receiver operating characteristic curve on Holter data from 15 healthy subjects to derive threshold values associated with these statistical measures to separate clean data segments from those containing MN artifacts. With threshold values derived from 15 training data sets, we tested our algorithms on 30 additional healthy subjects. Our results show that our algorithms are able to detect the presence of MN artifacts with sensitivity and specificity of 96.63% and 94.73%, respectively. In addition, when we applied our previously developed algorithm for atrial fibrillation (AF) detection on those segments that had been labeled as free from MN artifacts, the specificity increased from 73.66% to 85.04% without loss of sensitivity (74.48%-74.62%) on six subjects diagnosed with AF. Finally, the computation time was less than 0.2 s using a MATLAB code, indicating that real-time application of the algorithms is possible for Holter monitoring.
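A rough sketch of the two-stage idea is shown below, assuming the PyEMD package for the empirical mode decomposition; the way the three statistics are combined and the threshold values are placeholders, whereas the paper derives its thresholds from ROC analysis on training data.

```python
import numpy as np
from PyEMD import EMD          # assumption: the PyEMD (EMD-signal) package is available

def shannon_entropy(x, n_bins=16):
    """Shannon entropy of the amplitude histogram of a segment."""
    hist, _ = np.histogram(x, bins=n_bins)
    p = hist / max(hist.sum(), 1)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def is_mn_artifact(segment, thr_entropy=3.5, thr_variance=0.02, thr_mean=0.05):
    """Two-stage check on one ECG segment (placeholder thresholds).

    Stage 1: first intrinsic mode function (F-IMF), which concentrates the
             high-frequency artifact dynamics.
    Stage 2: flag the segment if the F-IMF statistics show the randomness and
             variability signature of motion-and-noise artifact; the simple
             AND rule below is an assumption, not the paper's decision logic.
    """
    imfs = EMD().emd(np.asarray(segment, dtype=float))
    f_imf = imfs[0]
    return (shannon_entropy(f_imf) > thr_entropy
            and np.var(f_imf) > thr_variance
            and abs(np.mean(f_imf)) < thr_mean)
```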
Dagalakis, Nicholas G.; Yoo, Jae Myung; Oeste, Thomas
2017-01-01
The Dynamic Impact Testing and Calibration Instrument (DITCI) is a simple instrument with a significant data collection and analysis capability that is used for the testing and calibration of biosimulant human tissue artifacts. These artifacts may be used to measure the severity of injuries caused in the case of a robot impact with a human. In this paper we describe the DITCI adjustable impact and flexible foundation mechanism, which allows the selection of a variety of impact force levels and foundation stiffness. The instrument can accommodate arrays of a variety of sensors and impact tools, simulating both real manufacturing tools and the testing requirements of standards setting organizations. A computer data acquisition system may collect a variety of impact motion, force, and torque data, which are used to develop a variety of mathematical model representations of the artifacts. Finally, we describe the fabrication and testing of human abdomen soft tissue artifacts, used to display the magnitude of impact tissue deformation. Impact tests were performed at various maximum impact force and average pressure levels. PMID:28579658
Symeonidou, Evangelia-Regkina; Nordin, Andrew D; Hairston, W David; Ferris, Daniel P
2018-04-03
More neuroscience researchers are using scalp electroencephalography (EEG) to measure electrocortical dynamics during human locomotion and other types of movement. Motion artifacts corrupt the EEG and mask underlying neural signals of interest. The cause of motion artifacts in EEG is often attributed to electrode motion relative to the skin, but few studies have examined EEG signals under head motion. In the current study, we tested how motion artifacts are affected by the overall mass and surface area of commercially available electrodes, as well as how cable sway contributes to motion artifacts. To provide a ground-truth signal, we used a gelatin head phantom with embedded antennas broadcasting electrical signals, and recorded EEG with a commercially available electrode system. A robotic platform moved the phantom head through sinusoidal displacements at different frequencies (0-2 Hz). Results showed that a larger electrode surface area can have a small but significant effect on improving EEG signal quality during motion and that cable sway is a major contributor to motion artifacts. These results have implications in the development of future hardware for mobile brain imaging with EEG.
Sinkiewicz, Daniel; Friesen, Lendra; Ghoraani, Behnaz
2017-02-01
Cortical auditory evoked potentials (CAEP) are used to evaluate the auditory pathways of cochlear implant (CI) patients, but the CI device produces an electrical artifact that obscures the relevant information in the neural response. Currently there are multiple methods that attempt to recover the neural response from the contaminated CAEP, but no gold standard that can quantitatively confirm their effectiveness. To address this crucial shortcoming, we develop a wavelet-based method to quantify the amount of artifact energy in the neural response. In addition, a novel technique for extracting the neural response from single-channel CAEPs is proposed. The new method uses matching pursuit (MP) based feature extraction to represent the contaminated CAEP in a feature space, and support vector machines (SVM) to classify the components as normal hearing (NH) or artifact. The NH components are combined to recover the neural response without artifact energy, as verified using the evaluation tool. Although it needs some further evaluation, this approach is a promising method of electrical artifact removal from CAEPs. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
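As an illustration of the wavelet-based evaluation idea (quantifying how much energy a recovered response retains in the region where the CI artifact is expected), the sketch below computes a crude artifact-energy ratio with PyWavelets. The wavelet, decomposition level, and artifact window are assumptions; the paper's actual quantification and its MP/SVM extraction pipeline are more elaborate.

```python
import numpy as np
import pywt                      # assumption: PyWavelets is available

def artifact_energy_ratio(caep, fs, artifact_window=(0.0, 0.005), wavelet="db4", level=5):
    """Crude wavelet-domain estimate of the fraction of CAEP energy that lies
    in the early post-stimulus window where the CI artifact is expected.
    The window, wavelet, and decomposition level are illustrative choices.
    """
    caep = np.asarray(caep, dtype=float)
    coeffs = pywt.wavedec(caep, wavelet, level=level)
    total_energy = sum(np.sum(c ** 2) for c in coeffs)

    # Keep only the presumed artifact window and measure its wavelet energy.
    t = np.arange(caep.size) / fs
    mask = (t >= artifact_window[0]) & (t < artifact_window[1])
    coeffs_w = pywt.wavedec(np.where(mask, caep, 0.0), wavelet, level=level)
    artifact_energy = sum(np.sum(c ** 2) for c in coeffs_w)

    return artifact_energy / max(total_energy, 1e-12)
```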
B-mode Ultrasound Versus Color Doppler Twinkling Artifact in Detecting Kidney Stones
Harper, Jonathan D.; Hsi, Ryan S.; Shah, Anup R.; Dighe, Manjiri K.; Carter, Stephen J.; Moshiri, Mariam; Paun, Marla; Lu, Wei; Bailey, Michael R.
2013-01-01
Purpose To compare color Doppler twinkling artifact and B-mode ultrasonography in detecting kidney stones. Patients and Methods Nine patients with recent CT scans prospectively underwent B-mode and twinkling artifact color Doppler ultrasonography on a commercial ultrasound machine. Video segments of the upper pole, interpolar area, and lower pole were created, randomized, and independently reviewed by three radiologists. Receiver operating characteristics were determined. Results There were 32 stones in 18 kidneys with a mean stone size of 8.9±7.5 mm. B-mode ultrasonography had 71% sensitivity, 48% specificity, 52% positive predictive value, and 68% negative predictive value, while twinkling artifact Doppler ultrasonography had 56% sensitivity, 74% specificity, 62% positive predictive value, and 68% negative predictive value. Conclusions When used alone, B-mode is more sensitive, but twinkling artifact is more specific in detecting kidney stones. This information may help users employ twinkling and B-mode to identify stones and developers to improve signal processing to harness the fundamental acoustic differences to ultimately improve stone detection. PMID:23067207
Dong, Jian; Hayakawa, Yoshihiko; Kannenberg, Sven; Kober, Cornelia
2013-02-01
The objective of this study was to reduce metal-induced streak artifact on oral and maxillofacial x-ray computed tomography (CT) images by developing a fast statistical image reconstruction system using iterative reconstruction algorithms. Adjacent CT images often depict similar anatomical structures in thin slices. So, first, images were reconstructed using the same projection data of an artifact-free image. Second, images were processed by the successive iterative restoration method, where projection data were generated from the reconstructed image in sequence. Besides the maximum likelihood-expectation maximization algorithm, the ordered subset-expectation maximization algorithm (OS-EM) was examined. Also, small region of interest (ROI) setting and reverse processing were applied to improve performance. Both algorithms reduced artifacts instead of slightly decreasing gray levels. The OS-EM and small ROI reduced the processing duration without apparent detriments. Sequential and reverse processing did not show apparent effects. Two alternatives in iterative reconstruction methods were effective for artifact reduction. The OS-EM algorithm and small ROI setting improved the performance. Copyright © 2012 Elsevier Inc. All rights reserved.
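For readers unfamiliar with the update rule, the sketch below shows a generic OS-EM reconstruction loop built on scikit-image's radon/iradon (unfiltered backprojection as an approximate adjoint). It is a textbook-style illustration, not the authors' implementation, and the subset count, iteration count, and initialization are arbitrary.

```python
import numpy as np
from skimage.transform import radon, iradon

def os_em(sinogram, theta, n_subsets=4, n_iters=10, eps=1e-8):
    """Generic ordered-subsets EM reconstruction sketch.

    sinogram : measured projections, shape (n_detectors, n_angles)
    theta    : projection angles in degrees (one per sinogram column)
    Uses radon as the forward projector and unfiltered iradon
    (filter_name=None) as an approximate backprojector/adjoint.
    """
    theta = np.asarray(theta, dtype=float)
    n_det = sinogram.shape[0]
    image = np.ones((n_det, n_det))                              # flat start
    subsets = [np.arange(s, theta.size, n_subsets) for s in range(n_subsets)]

    for _ in range(n_iters):
        for idx in subsets:
            sub_theta = theta[idx]
            fp = radon(image, theta=sub_theta) + eps             # A x
            ratio = sinogram[:, idx] / fp                        # y / (A x)
            backproj = iradon(ratio, theta=sub_theta, filter_name=None)
            norm = iradon(np.ones_like(ratio), theta=sub_theta, filter_name=None) + eps
            image = np.clip(image * backproj / norm, 0, None)    # multiplicative EM update
    return image
```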
3D OCT imaging in clinical settings: toward quantitative measurements of retinal structures
NASA Astrophysics Data System (ADS)
Zawadzki, Robert J.; Fuller, Alfred R.; Zhao, Mingtao; Wiley, David F.; Choi, Stacey S.; Bower, Bradley A.; Hamann, Bernd; Izatt, Joseph A.; Werner, John S.
2006-02-01
The acquisition speed of current FD-OCT (Fourier Domain - Optical Coherence Tomography) instruments allows rapid screening of three-dimensional (3D) volumes of human retinas in clinical settings. To take advantage of this ability requires software used by physicians to be capable of displaying and accessing volumetric data as well as supporting post processing in order to access important quantitative information such as thickness maps and segmented volumes. We describe our clinical FD-OCT system used to acquire 3D data from the human retina over the macula and optic nerve head. B-scans are registered to remove motion artifacts and post-processed with customized 3D visualization and analysis software. Our analysis software includes standard 3D visualization techniques along with a machine learning support vector machine (SVM) algorithm that allows a user to semi-automatically segment different retinal structures and layers. Our program makes possible measurements of the retinal layer thickness as well as volumes of structures of interest, despite the presence of noise and structural deformations associated with retinal pathology. Our software has been tested successfully in clinical settings for its efficacy in assessing 3D retinal structures in healthy as well as diseased cases. Our tool facilitates diagnosis and treatment monitoring of retinal diseases.
Incorporating a Soil Science Artifact into a University ePortfolio Assessment Tool
ERIC Educational Resources Information Center
Mikhailova, Elena; Werts, Joshua; Post, Christopher; Ring, Gail
2014-01-01
The ePortfolio is a useful educational tool that is utilized in many educational institutions to showcase student accomplishments and provide students with an opportunity to reflect on their educational progress. The objective of this study was to develop and test an artifact from an introductory soil science course to be included in the…
A Historical Analysis of the Curriculum of Organic Chemistry Using ACS Exams as Artifacts
ERIC Educational Resources Information Center
Raker, Jeffrey R.; Holme, Thomas A.
2013-01-01
Standardized examinations, such as those developed and disseminated by the ACS Examinations Institute, are artifacts of the teaching of a course and over time may provide a historical perspective on how curricula have changed and evolved. This study investigated changes in organic chemistry curricula across a 60-year period by evaluating 18 ACS…
ERIC Educational Resources Information Center
Firat, Mehmet
2017-01-01
Knowledge of technology is an educational goal of science education. A primary way of increasing technology literacy in a society is to develop students' conception of technology starting from their elementary school years. However, there is a lack of research on student recognition of and reasoning about technology and technological artifacts. In…
User’s Manual for ProbeCorder (Version 1.0) Data Collection Software
1997-03-27
unit or soil horizon are "inked" on to the drawing pad at the appropriate depth, and then each unit or deposit is assigned an Arabic numeral from... recommended that simple Arabic numerals be used for this purpose and that no more than three digits are used (i.e., ...). The "Recorder"... below. Customize Your Sampling: Texture List, Artifact List, Number of Artifacts, Features List. Step 2: Choose pick list from
NASA Astrophysics Data System (ADS)
Choi, Myoung-Hwan; Ahn, Jungryul; Park, Dae Jin; Lee, Sang Min; Kim, Kwangsoo; Cho, Dong-il Dan; Senok, Solomon S.; Koo, Kyo-in; Goo, Yong Sook
2017-02-01
Objective. Direct stimulation of retinal ganglion cells in degenerate retinas by implanting epi-retinal prostheses is a recognized strategy for restoration of visual perception in patients with retinitis pigmentosa or age-related macular degeneration. Elucidating the best stimulus-response paradigms in the laboratory using multielectrode arrays (MEA) is complicated by the fact that the short-latency spikes (within 10 ms) elicited by direct retinal ganglion cell (RGC) stimulation are obscured by the stimulus artifact which is generated by the electrical stimulator. Approach. We developed an artifact subtraction algorithm based on topographic prominence discrimination, wherein the duration of prominences within the stimulus artifact is used as a strategy for identifying the artifact for subtraction and clarifying the obfuscated spikes which are then quantified using standard thresholding. Main results. We found that the prominence discrimination based filters perform creditably in simulation conditions by successfully isolating randomly inserted spikes in the presence of simple and even complex residual artifacts. We also show that the algorithm successfully isolated short-latency spikes in an MEA-based recording from degenerate mouse retinas, where the amplitude and frequency characteristics of the stimulus artifact vary according to the distance of the recording electrode from the stimulating electrode. By ROC analysis of false positive and false negative first spike detection rates in a dataset of one hundred and eight RGCs from four retinal patches, we found that the performance of our algorithm is comparable to that of a generally-used artifact subtraction filter algorithm which uses a strategy of local polynomial approximation (SALPA). Significance. We conclude that the application of topographic prominence discrimination is a valid and useful method for subtraction of stimulation artifacts with variable amplitudes and shapes. We propose that our algorithm may be used as stand-alone or supplementary to other artifact subtraction algorithms like SALPA.
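The prominence-based discrimination can be sketched with scipy.signal's peak utilities: prominent transients whose duration falls outside a spike-like range are flagged as presumed artifact and can then be blanked or subtracted before threshold-based spike detection. The prominence threshold, the width range, and the rel_height choice are illustrative assumptions rather than the authors' parameters.

```python
import numpy as np
from scipy.signal import find_peaks, peak_widths

def flag_artifact_samples(trace, fs, min_prominence, spike_width_ms=(0.3, 1.5)):
    """Flag samples belonging to prominent transients whose duration falls
    outside a spike-like range; those are treated as presumed stimulus artifact.
    The prominence threshold and width range are illustrative assumptions.
    """
    x = np.abs(np.asarray(trace, dtype=float))
    peaks, _ = find_peaks(x, prominence=min_prominence)
    widths, _, left_ips, right_ips = peak_widths(x, peaks, rel_height=0.9)

    lo_w = spike_width_ms[0] * 1e-3 * fs
    hi_w = spike_width_ms[1] * 1e-3 * fs
    artifact = np.zeros(x.size, dtype=bool)
    for w, lo, hi in zip(widths, left_ips, right_ips):
        if not (lo_w <= w <= hi_w):                  # not spike-like -> artifact
            artifact[int(np.floor(lo)):int(np.ceil(hi)) + 1] = True
    return artifact
```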
Automatic Artifact Removal from Electroencephalogram Data Based on A Priori Artifact Information.
Zhang, Chi; Tong, Li; Zeng, Ying; Jiang, Jingfang; Bu, Haibing; Yan, Bin; Li, Jianxin
2015-01-01
Electroencephalogram (EEG) is susceptible to various nonneural physiological artifacts. Automatic artifact removal from EEG data remains a key challenge for extracting relevant information from brain activities. To adapt to variable subjects and EEG acquisition environments, this paper presents an automatic online artifact removal method based on a priori artifact information. The combination of discrete wavelet transform and independent component analysis (ICA), wavelet-ICA, was utilized to separate artifact components. The artifact components were then automatically identified using a priori artifact information, which was acquired in advance. Subsequently, signal reconstruction without artifact components was performed to obtain artifact-free signals. The results showed that, using this automatic online artifact removal method, there were statistically significant improvements in classification accuracy in both experiments, namely motor imagery and emotion recognition.
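A compact sketch of the wavelet-ICA pipeline with a-priori template matching is given below, assuming PyWavelets and scikit-learn's FastICA; the wavelet, decomposition level, correlation threshold, and the form of the a priori template are illustrative, not the parameters used in the paper.

```python
import numpy as np
import pywt                                   # assumption: PyWavelets
from sklearn.decomposition import FastICA     # assumption: scikit-learn

def wavelet_ica_clean(eeg, artifact_template, wavelet="db4", level=4, corr_thr=0.7):
    """Sketch of wavelet-ICA artifact removal guided by a priori artifact information.

    eeg               : array of shape (n_channels, n_samples)
    artifact_template : reference artifact time course of length n_samples,
                        acquired in advance (illustrative stand-in for the
                        paper's a priori information)
    """
    # 1. Per-channel discrete wavelet decomposition, flattened to one vector each.
    per_ch = [pywt.wavedec(ch, wavelet, level=level) for ch in eeg]
    lens = [len(c) for c in per_ch[0]]
    flat = np.array([np.concatenate(c) for c in per_ch])          # (n_ch, n_coeffs)

    # 2. ICA in the wavelet domain.
    ica = FastICA(n_components=eeg.shape[0], random_state=0, max_iter=1000)
    sources = ica.fit_transform(flat.T).T                         # (n_comp, n_coeffs)

    # 3. Flag components that resemble the a priori artifact template.
    tmpl = np.concatenate(pywt.wavedec(artifact_template, wavelet, level=level))
    is_artifact = np.array(
        [abs(np.corrcoef(s, tmpl)[0, 1]) >= corr_thr for s in sources])

    # 4. Zero artifact components, invert the ICA, invert the wavelet transform.
    sources[is_artifact] = 0.0
    flat_clean = ica.inverse_transform(sources.T).T
    split_points = np.cumsum(lens)[:-1]
    return np.array([
        pywt.waverec(np.split(row, split_points), wavelet)[: eeg.shape[1]]
        for row in flat_clean])
```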
Clinical implementation of intraoperative cone-beam CT in head and neck surgery
NASA Astrophysics Data System (ADS)
Daly, M. J.; Chan, H.; Nithiananthan, S.; Qiu, J.; Barker, E.; Bachar, G.; Dixon, B. J.; Irish, J. C.; Siewerdsen, J. H.
2011-03-01
A prototype mobile C-arm for cone-beam CT (CBCT) has been translated to a prospective clinical trial in head and neck surgery. The flat-panel CBCT C-arm was developed in collaboration with Siemens Healthcare, and demonstrates both sub-mm spatial resolution and soft-tissue visibility at low radiation dose (e.g., <1/5th of a typical diagnostic head CT). CBCT images are available ~15 seconds after scan completion (~1 min acquisition) and reviewed at bedside using custom 3D visualization software based on the open-source Image-Guided Surgery Toolkit (IGSTK). The CBCT C-arm has been successfully deployed in 15 head and neck cases and streamlined into the surgical environment using human factors engineering methods and expert feedback from surgeons, nurses, and anesthetists. Intraoperative imaging is implemented in a manner that maintains operating field sterility, reduces image artifacts (e.g., carbon fiber OR table) and minimizes radiation exposure. Image reviews conducted with surgical staff indicate bony detail and soft-tissue visualization sufficient for intraoperative guidance, with additional artifact management (e.g., metal, scatter) promising further improvements. Clinical trial deployment suggests a role for intraoperative CBCT in guiding complex head and neck surgical tasks, including planning mandible and maxilla resection margins, guiding subcranial and endonasal approaches to skull base tumours, and verifying maxillofacial reconstruction alignment. Ongoing translational research into complementary image-guidance subsystems includes novel methods for real-time tool tracking, fusion of endoscopic video and CBCT, and deformable registration of preoperative volumes and planning contours with intraoperative CBCT.
Head motion during MRI acquisition reduces gray matter volume and thickness estimates.
Reuter, Martin; Tisdall, M Dylan; Qureshi, Abid; Buckner, Randy L; van der Kouwe, André J W; Fischl, Bruce
2015-02-15
Imaging biomarkers derived from magnetic resonance imaging (MRI) data are used to quantify normal development, disease, and the effects of disease-modifying therapies. However, motion during image acquisition introduces image artifacts that, in turn, affect derived markers. A systematic effect can be problematic since factors of interest like age, disease, and treatment are often correlated with both a structural change and the amount of head motion in the scanner, confounding the ability to distinguish biology from artifact. Here we evaluate the effect of head motion during image acquisition on morphometric estimates of structures in the human brain using several popular image analysis software packages (FreeSurfer 5.3, VBM8 SPM, and FSL Siena 5.0.7). Within-session repeated T1-weighted MRIs were collected on 12 healthy volunteers while performing different motion tasks, including two still scans. We show that volume and thickness estimates of the cortical gray matter are biased by head motion with an average apparent volume loss of roughly 0.7%/mm/min of subject motion. Effects vary across regions and remain significant after excluding scans that fail a rigorous quality check. In view of these results, the interpretation of reported morphometric effects of movement disorders or other conditions with increased motion tendency may need to be revisited: effects may be overestimated when not controlling for head motion. Furthermore, drug studies with hypnotic, sedative, tranquilizing, or neuromuscular-blocking substances may contain spurious "effects" of reduced atrophy or brain growth simply because they affect motion distinct from true effects of the disease or therapeutic process. Copyright © 2014 Elsevier Inc. All rights reserved.
Deblurring in digital tomosynthesis by iterative self-layer subtraction
NASA Astrophysics Data System (ADS)
Youn, Hanbean; Kim, Jee Young; Jang, SunYoung; Cho, Min Kook; Cho, Seungryong; Kim, Ho Kyung
2010-04-01
Recent developments in large-area flat-panel detectors have renewed interest in tomosynthesis technology for multiplanar x-ray imaging. However, the typical shift-and-add (SAA) or backprojection reconstruction method is notably limited by a lack of sharpness in the reconstructed images because of blur artifacts caused by the superposition of out-of-plane objects. In this study, we have devised an intuitive, simple method to reduce the blur artifact based on an iterative approach. This method repeats a forward and backward projection procedure to determine the blur artifact affecting the plane-of-interest (POI), and then subtracts it from the POI. The proposed method does not include any Fourier-domain operations, thereby avoiding Fourier-domain-originated artifacts. We describe the concept of self-layer subtractive tomosynthesis and demonstrate its performance with numerical simulation and experiments. Comparative analysis with conventional methods, such as the SAA and filtered backprojection methods, is addressed.
Distinction of Fly Artifacts from Human Blood using Immunodetection.
Rivers, David B; Acca, Gillian; Fink, Marc; Brogan, Rebecca; Chen, Dorothy; Schoeffield, Andrew
2018-02-21
Insect stains produced by necrophagous flies are indistinguishable morphologically from human bloodstains. At present, no diagnostic tests exist to overcome this deficiency. As the first step toward developing a chemical test to recognize fly artifacts, polyclonal antisera were generated in rats against three distinct antigenic sequences of fly cathepsin D-like proteinase, an enzyme that is structurally distinct in cyclorrhaphous Diptera from other animals. The resulting rat antisera bound to artifacts produced by Protophormia terraenovae and synthetic peptides used to generate the polyclonal antisera, but not with any type of mammalian blood tested in immunoassays. Among the three antisera, anti-md3 serum displayed the highest reactivity for fly stains, demonstrated cross-reactivity for all synthetic peptides representing antigenic sequences of the mature fly enzyme, and bound artifacts originating from the fly digestive tract. Further work is needed to determine whether the antisera are suitable for non-laboratory conditions. © 2018 American Academy of Forensic Sciences.
Towards a Decision Support System for Space Flight Operations
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Hogle, Charles; Ruszkowski, James
2013-01-01
The Mission Operations Directorate (MOD) at the Johnson Space Center (JSC) has put in place a Model Based Systems Engineering (MBSE) technological framework for the development and execution of the Flight Production Process (FPP). This framework has provided much added value and return on investment to date. This paper describes a vision for a model based Decision Support System (DSS) for the development and execution of the FPP and its design and development process. The envisioned system extends the existing MBSE methodology and technological framework which is currently in use. The MBSE technological framework currently in place enables the systematic collection and integration of data required for building an FPP model for a diverse set of missions. This framework includes the technology, people and processes required for rapid development of architectural artifacts. It is used to build a feasible FPP model for the first flight of a spacecraft and for recurrent flights throughout the life of the program. This model greatly enhances our ability to effectively engage with a new customer. It provides a preliminary work breakdown structure, data flow information and a master schedule based on its existing knowledge base. These artifacts are then refined and iterated upon with the customer for the development of a robust end-to-end, high-level integrated master schedule and its associated dependencies. The vision is to enhance this framework to enable its application for uncertainty management, decision support and optimization of the design and execution of the FPP by the program. Furthermore, this enhanced framework will enable the agile response and redesign of the FPP based on observed system behavior. A discrepancy between the anticipated system behavior and the observed behavior may be due to the processing of tasks internally, or due to external factors such as changes in program requirements or conditions associated with other organizations that are outside of MOD. The paper provides a roadmap for the three increments of this vision. These increments include (1) hardware and software system components and interfaces with the NASA ground system, (2) uncertainty management and (3) re-planning and automated execution. Each of these increments provides value independently, but some may also enable building of a subsequent increment.
A forward bias method for lag correction of an a-Si flat panel detector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Starman, Jared; Tognina, Carlo; Partain, Larry
2012-01-15
Purpose: Digital a-Si flat panel (FP) x-ray detectors can exhibit detector lag, or residual signal, of several percent that can cause ghosting in projection images or severe shading artifacts, known as the radar artifact, in cone-beam computed tomography (CBCT) reconstructions. A major contributor to detector lag is believed to be defect states, or traps, in the a-Si layer of the FP. Software methods to characterize and correct for the detector lag exist, but they may make assumptions such as system linearity and time invariance, which may not be true. The purpose of this work is to investigate a new hardware based method to reduce lag in an a-Si FP and to evaluate its effectiveness at removing shading artifacts in CBCT reconstructions. The feasibility of a novel, partially hardware based solution is also examined. Methods: The proposed hardware solution for lag reduction requires only a minor change to the FP. For pulsed irradiation, the proposed method inserts a new operation step between the readout and data collection stages. During this new stage the photodiode is operated in a forward bias mode, which fills the defect states with charge. A Varian 4030CB panel was modified to allow for operation in the forward bias mode. The contrast of residual lag ghosts was measured for lag frames 2 and 100 after irradiation ceased for standard and forward bias modes. Detector step response, lag, SNR, modulation transfer function (MTF), and detective quantum efficiency (DQE) measurements were made with standard and forward bias firmware. CBCT data of pelvic and head phantoms were also collected. Results: Overall, the 2nd and 100th detector lag frame residual signals were reduced 70%-88% using the new method. SNR, MTF, and DQE measurements show a small decrease in collected signal and a small increase in noise. The forward bias hardware successfully reduced the radar artifact in the CBCT reconstruction of the pelvic and head phantoms by 48%-81%. Conclusions: Overall, the forward bias method has been found to greatly reduce detector lag ghosts in projection data and the radar artifact in CBCT reconstructions. The method is limited to improvements of the a-Si photodiode response only. A future hybrid mode may overcome any limitations of this method.
ERIC Educational Resources Information Center
Blasi, Laura; Alfonso, Berta
2006-01-01
Building and evaluating artifacts specifically for K-12 education, technologists committed to design sciences are needed along with an approach to evaluation increasing the systemic transfer from research and development into school settings. The authors describe THE VIRTUAL LAB scanning electronic microscope simulation, including (a) its…
Privileging and Artifacts: On the Use of Information Technology in Science Education
ERIC Educational Resources Information Center
Almqvist, Jonas; Ostman, Leif
2006-01-01
The aim of this paper is to develop an approach that can be used in addressing the issue of the use of information technology and its importance in human meaning making. By using a combination of Wittgenstein's work method, a sociocultural perspective on learning, and a sociotechnical perspective on artifacts a specific focus for analyses was…
ERIC Educational Resources Information Center
Borko, Hilda; Stecher, Brian; Kuffner, Karin
2007-01-01
This document includes the final data collection and scoring tools created by the "Scoop" project, a five-year project funded through the Center for Evaluation, Standards,and Student Testing (CRESST), to develop an alternative approach for characterizing classroom practice. The goal of the project was to use artifacts and related materials to…
Optimization of a fast optical CT scanner for nPAG gel dosimetry
NASA Astrophysics Data System (ADS)
Vandecasteele, Jan; DeDeene, Yves
2009-05-01
A fast laser scanning optical CT scanner was constructed and optimized at Ghent University. The first images acquired were contaminated with several imaging artifacts, and their origins were investigated. Performance characteristics of different components were measured, such as the laser spot size, light attenuation by the lenses, and the dynamic range of the photo-detector. The need for a differential measurement using a second photo-detector was investigated. Post-processing strategies to compensate for hardware-related errors were developed. Drift of the laser and of the detector was negligible. Incorrect refractive index matching was dealt with by developing an automated matching process. When scratches on the water bath and phantom container are present, they pose a post-processing challenge for eliminating the resulting artifacts from the reconstructed images. Secondary laser spots due to multiple reflections need to be investigated further. The time delay in the control of the galvanometer and detector was dealt with using black strips that serve as markers of the projection position. Still, some residual ringing artifacts are present. Several small volumetric test phantoms were constructed to obtain an overall picture of the accuracy.
High resolution 4-D spectroscopy with sparse concentric shell sampling and FFT-CLEAN.
Coggins, Brian E; Zhou, Pei
2008-12-01
Recent efforts to reduce the measurement time for multidimensional NMR experiments have fostered the development of a variety of new procedures for sampling and data processing. We recently described concentric ring sampling for 3-D NMR experiments, which is superior to radial sampling as input for processing by a multidimensional discrete Fourier transform. Here, we report the extension of this approach to 4-D spectroscopy as Randomized Concentric Shell Sampling (RCSS), where sampling points for the indirect dimensions are positioned on concentric shells, and where random rotations in the angular space are used to avoid coherent artifacts. With simulations, we show that RCSS produces a very low level of artifacts, even with a very limited number of sampling points. The RCSS sampling patterns can be adapted to fine rectangular grids to permit use of the Fast Fourier Transform in data processing, without an apparent increase in the artifact level. These artifacts can be further reduced to the noise level using the iterative CLEAN algorithm developed in radioastronomy. We demonstrate these methods on the high resolution 4-D HCCH-TOCSY spectrum of protein G's B1 domain, using only 1.2% of the sampling that would be needed conventionally for this resolution. The use of a multidimensional FFT instead of the slow DFT for initial data processing and for subsequent CLEAN significantly reduces the calculation time, yielding an artifact level that is on par with the level of the true spectral noise.
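The CLEAN step can be illustrated in one dimension: iteratively locate the strongest residual peak, subtract a scaled copy of the sampling point-spread function ("dirty beam") at that position, and accumulate the clean components. The gain, iteration limit, and noise threshold below are illustrative, and the paper applies the same idea to 4-D spectra with a multidimensional FFT.

```python
import numpy as np

def clean_1d(dirty_spectrum, dirty_beam, gain=0.1, n_iter=500, threshold=None):
    """Hogbom-style CLEAN sketch in 1-D.

    dirty_spectrum : spectrum containing sampling artifacts
    dirty_beam     : point-spread function of the sampling scheme, centered,
                     same length as the spectrum (wrap-around shifting is used
                     below purely for brevity)
    """
    residual = np.asarray(dirty_spectrum, dtype=float).copy()
    components = np.zeros_like(residual)
    center = int(np.argmax(np.abs(dirty_beam)))
    if threshold is None:
        threshold = 3.0 * np.std(residual)            # crude noise-level estimate

    for _ in range(n_iter):
        k = int(np.argmax(np.abs(residual)))
        amp = residual[k]
        if abs(amp) < threshold:                      # stop at the noise level
            break
        components[k] += gain * amp                   # accumulate clean component
        residual -= gain * amp * np.roll(dirty_beam, k - center)
    return components, residual                       # clean components + residual
```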
The effect of requirements prioritization on avionics system conceptual design
NASA Astrophysics Data System (ADS)
Lorentz, John
This dissertation will provide a detailed approach and analysis of a new collaborative requirements prioritization methodology that has been used successfully on four Coast Guard avionics acquisition and development programs valued at $400M+. A statistical representation of participant study results will be discussed and analyzed in detail. Many technically compliant projects fail to deliver levels of performance and capability that the customer desires. Some of these systems completely meet "threshold" levels of performance; however, the distribution of resources in the process devoted to the development and management of the requirements does not always represent the voice of the customer. This is especially true for technically complex projects such as modern avionics systems. A simplified facilitated process for prioritization of system requirements will be described. The collaborative prioritization process, and resulting artifacts, aids the systems engineer during early conceptual design. All requirements are not the same in terms of customer priority. While there is a tendency to have many thresholds inside of a system design, there is usually a subset of requirements and system performance that is of the utmost importance to the design. These critical capabilities and critical levels of performance typically represent the reason the system is being built. The systems engineer needs processes to identify these critical capabilities, the associated desired levels of performance, and the risks associated with the specific requirements that define the critical capability. The facilitated prioritization exercise is designed to collaboratively draw out these critical capabilities and levels of performance so they can be emphasized in system design. Developing the purpose, scheduling and process for prioritization events are key elements of systems engineering and modern project management. The benefits of early collaborative prioritization flow throughout the project schedule, resulting in greater success during system deployment and operational testing. This dissertation will discuss the data and findings from participant studies, present a literature review of systems engineering and design processes, and test the hypothesis that the prioritization process had no effect on stakeholder sentiment related to the conceptual design. In addition, the "Requirements Rationalization" process will be discussed in detail. Avionics, like many other systems, has transitioned from a discrete electronics engineering, hard engineering discipline to incorporate software engineering as a core process of the technology development cycle. As with other software-based systems, avionics now has significant soft system attributes that must be considered in the design process. The boundless opportunities that exist in software design demand prioritization to focus effort onto the critical functions that the software must provide. This has been a well documented and understood phenomenon in the software development community for many years. This dissertation will attempt to link the effect of software integrated avionics to the benefits of prioritization of requirements in the problem space and demonstrate the sociological and technical benefits of early prioritization practices.
Rantalainen, Timo; Chivers, Paola; Beck, Belinda R; Robertson, Sam; Hart, Nicolas H; Nimphius, Sophia; Weeks, Benjamin K; McIntyre, Fleur; Hands, Beth; Siafarikas, Aris
Most imaging methods, including peripheral quantitative computed tomography (pQCT), are susceptible to motion artifacts, particularly in fidgety pediatric populations. Methods currently used to address motion artifact include manual screening (visual inspection) and objective assessments of the scans. However, previously reported objective methods either cannot be applied to the reconstructed image or have not been tested for distal bone sites. Therefore, the purpose of the present study was to develop and validate motion artifact classifiers to quantify motion artifact in pQCT scans. We tested whether textural features could provide adequate motion artifact classification performance in two adolescent datasets with pQCT scans from tibial and radial diaphyses and epiphyses. The first dataset was split into training (66% of sample) and validation (33% of sample) datasets. Visual classification was used as the ground truth. Moderate to substantial classification performance (J48 classifier, kappa coefficients from 0.57 to 0.80) was observed in the validation dataset with the novel texture-based classifier. When the same classifier was applied to the second cross-sectional dataset, slight-to-fair (κ = 0.01-0.39) classification performance was observed. Overall, this novel textural analysis-based classifier provided a moderate-to-substantial classification of motion artifact when the classifier was specifically trained for the measurement device and population. Classification based on textural features may be used to prescreen obviously acceptable and unacceptable scans, with a subsequent human-operated visual classification of any remaining scans. Copyright © 2017 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.
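A minimal sketch of a texture-based screening classifier is shown below, using gray-level co-occurrence matrix (GLCM) features from recent scikit-image and a decision tree from scikit-learn as a stand-in for the J48 classifier; the feature set, quantization, and tree depth are illustrative and the result is not the authors' trained model.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # recent scikit-image spelling
from sklearn.tree import DecisionTreeClassifier         # stand-in for J48

def glcm_features(scan, levels=32):
    """Simple GLCM texture features from a single pQCT slice (2-D array)."""
    img = np.asarray(scan, dtype=float)
    bins = np.linspace(img.min(), img.max(), levels)
    quantized = (np.digitize(img, bins) - 1).astype(np.uint8)    # values in 0..levels-1
    glcm = graycomatrix(quantized, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.array([graycoprops(glcm, p).mean() for p in props])

def train_artifact_classifier(scans, visual_labels):
    """Fit a small decision tree on texture features, using visual (ground-truth) labels."""
    features = np.array([glcm_features(s) for s in scans])
    clf = DecisionTreeClassifier(max_depth=4, random_state=0)
    return clf.fit(features, visual_labels)
```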
Reference geometry-based detection of (4D-)CT motion artifacts: a feasibility study
NASA Astrophysics Data System (ADS)
Werner, René; Gauer, Tobias
2015-03-01
Respiration-correlated computed tomography (4D or 3D+t CT) can be considered standard of care in radiation therapy treatment planning for lung and liver lesions. The decision about an application of motion management devices and the estimation of patient-specific motion effects on the dose distribution rely on precise motion assessment in the planning 4D CT data, which is impeded in the case of CT motion artifacts. The development of image-based/post-processing approaches to reduce motion artifacts would benefit from precise detection and localization of the artifacts. Simple slice-by-slice comparison of intensity values and threshold-based analysis of related metrics suffer from high false-positive or false-negative rates, depending on the threshold. In this work, we propose exploiting prior knowledge about 'ideal' (artifact-free) reference geometries to stabilize metric-based artifact detection by transferring (multi-)atlas-based concepts to this specific task. Two variants are introduced and evaluated: (S1) analysis and comparison of warped atlas data obtained by repeated non-linear atlas-to-patient registration with different levels of regularization; (S2) direct analysis of vector field properties (divergence, curl magnitude) of the atlas-to-patient transformation. Feasibility of approaches (S1) and (S2) is evaluated on motion-phantom data and in intra-subject experiments (four patients) as well as, adopting a multi-atlas strategy, in inter-subject investigations (twelve patients). It is demonstrated that sorting/double-structure artifacts in particular can be precisely detected and localized by (S1). In contrast, (S2) suffers from high false-positive rates.
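Variant (S2) reduces to computing derivative-based properties of the atlas-to-patient displacement field; a minimal finite-difference sketch with numpy is shown below, where the component ordering, voxel spacing handling, and any downstream thresholding are assumptions.

```python
import numpy as np

def field_divergence_and_curl(disp, spacing=(1.0, 1.0, 1.0)):
    """Divergence and curl magnitude of a 3-D displacement field.

    disp    : array of shape (3, Z, Y, X) holding the (z, y, x) displacement components
    spacing : voxel spacing along (z, y, x)
    Local extrema of these maps can serve as candidates for sorting/double-structure
    artifacts; any thresholding is left to the caller.
    """
    dz, dy, dx = spacing
    uz, uy, ux = disp

    duz = np.gradient(uz, dz, dy, dx)    # [d uz/dz, d uz/dy, d uz/dx]
    duy = np.gradient(uy, dz, dy, dx)
    dux = np.gradient(ux, dz, dy, dx)

    divergence = duz[0] + duy[1] + dux[2]
    curl_x = duz[1] - duy[0]             # d uz/dy - d uy/dz
    curl_y = dux[0] - duz[2]             # d ux/dz - d uz/dx
    curl_z = duy[2] - dux[1]             # d uy/dx - d ux/dy
    curl_magnitude = np.sqrt(curl_x**2 + curl_y**2 + curl_z**2)
    return divergence, curl_magnitude
```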
NASA Astrophysics Data System (ADS)
O'Shea, Daniel J.; Shenoy, Krishna V.
2018-04-01
Objective. Electrical stimulation is a widely used and effective tool in systems neuroscience, neural prosthetics, and clinical neurostimulation. However, electrical artifacts evoked by stimulation prevent the detection of spiking activity on nearby recording electrodes, which obscures the neural population response evoked by stimulation. We sought to develop a method to clean artifact-corrupted electrode signals recorded on multielectrode arrays in order to recover the underlying neural spiking activity. Approach. We created an algorithm, which performs estimation and removal of array artifacts via sequential principal components regression (ERAASR). This approach leverages the similar structure of artifact transients, but not spiking activity, across simultaneously recorded channels on the array, across pulses within a train, and across trials. The ERAASR algorithm requires no special hardware, imposes no requirements on the shape of the artifact or the multielectrode array geometry, and comprises sequential application of straightforward linear methods with intuitive parameters. The approach should be readily applicable to most datasets where stimulation does not saturate the recording amplifier. Main results. The effectiveness of the algorithm is demonstrated in macaque dorsal premotor cortex using acute linear multielectrode array recordings and single electrode stimulation. Large electrical artifacts appeared on all channels during stimulation. After application of ERAASR, the cleaned signals were quiescent on channels with no spontaneous spiking activity, whereas spontaneously active channels exhibited evoked spikes which closely resembled spontaneously occurring spiking waveforms. Significance. We hope that enabling simultaneous electrical stimulation and multielectrode array recording will help elucidate the causal links between neural activity and cognition and facilitate naturalistic sensory protheses.
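The across-channels stage of such a cleaning scheme can be sketched as PCA followed by per-channel regression of the shared artifact subspace; the snippet below is a simplified single-stage illustration (ERAASR additionally re-estimates the principal components excluding the target channel and repeats analogous steps across pulses and trials), and the number of artifact components is an arbitrary choice.

```python
import numpy as np
from sklearn.decomposition import PCA

def remove_across_channel_artifact(data, n_artifact_pcs=4):
    """One simplified across-channels cleaning stage.

    data : array of shape (n_samples, n_channels) covering the stimulation window.
    Estimates an artifact subspace shared across channels with PCA and
    regresses it out of each channel; the residual is taken as the neural signal.
    NOTE: ERAASR re-estimates the components excluding the target channel and
    repeats analogous steps across pulses and trials; that is omitted here.
    """
    centered = data - data.mean(axis=0, keepdims=True)
    pcs = PCA(n_components=n_artifact_pcs).fit_transform(centered)   # shared artifact basis

    cleaned = np.empty_like(centered)
    for ch in range(centered.shape[1]):
        beta, *_ = np.linalg.lstsq(pcs, centered[:, ch], rcond=None)
        cleaned[:, ch] = centered[:, ch] - pcs @ beta
    return cleaned
```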
An Automated Method for Identifying Artifact in Independent Component Analysis of Resting-State fMRI
Bhaganagarapu, Kaushik; Jackson, Graeme D.; Abbott, David F.
2013-01-01
An enduring issue with data-driven analysis and filtering methods is the interpretation of results. To assist, we present an automatic method for identification of artifact in independent components (ICs) derived from functional MRI (fMRI). The method was designed with the following features: does not require temporal information about an fMRI paradigm; does not require the user to train the algorithm; requires only the fMRI images (additional acquisition of anatomical imaging not required); is able to identify a high proportion of artifact-related ICs without removing components that are likely to be of neuronal origin; can be applied to resting-state fMRI; is automated, requiring minimal or no human intervention. We applied the method to a MELODIC probabilistic ICA of resting-state functional connectivity data acquired in 50 healthy control subjects, and compared the results to a blinded expert manual classification. The method identified between 26 and 72% of the components as artifact (mean 55%). About 0.3% of components identified as artifact were discordant with the manual classification; retrospective examination of these ICs suggested the automated method had correctly identified these as artifact. We have developed an effective automated method which removes a substantial number of unwanted noisy components in ICA analyses of resting-state fMRI data. Source code of our implementation of the method is available. PMID:23847511
Opportunity Landing Spot Panorama (3-D Model)
NASA Technical Reports Server (NTRS)
2004-01-01
The rocky outcrop traversed by the Mars Exploration Rover Opportunity is visible in this three-dimensional model of the rover's landing site. Opportunity has acquired close-up images along the way, and scientists are using the rover's instruments to closely examine portions of interest. The white fragments that look crumpled near the center of the image are portions of the airbags. Distant scenery is displayed on a spherical backdrop or 'billboard' for context. Artifacts near the top rim of the crater are a result of the transition between the three-dimensional model and the billboard. Portions of the terrain model lacking sufficient data appear as blank spaces or gaps, colored reddish-brown for better viewing. This image was generated using special software from NASA's Ames Research Center and a mosaic of images taken by the rover's panoramic camera.
Artifacts in Sonography - Part 3.
Bönhof, Jörg A; McLaughlin, Glen
2018-06-01
As a continuation of parts 1 and 2, this article discusses artifacts caused by insufficient temporal resolution, artifacts in color and spectral Doppler sonography, and information regarding artifacts in sonography with contrast agents. There are artifacts that occur in B-mode sonography as well as in Doppler imaging methods and sonography with contrast agents, such as slice thickness artifacts and bow artifacts, shadows, mirroring, and artifacts due to refraction that appear, for example, as double images, because they are based on the same formation mechanisms. In addition, there are artifacts specific to Doppler sonography, such as the twinkling artifact, and method-based motion artifacts, such as aliasing, the ureteric jet, and artifacts due to tissue vibration. The artifacts specific to contrast mode include echoes from usually highly reflective structures that are not contrast bubbles ("leakage"). Contrast agent can also change the transmitting signal so that even structures not containing contrast agent are echogenic ("pseudoenhancement"). While artifacts can cause problems regarding differential diagnosis, they can also be useful for determining the diagnosis. Therefore, effective use of sonography requires both profound knowledge and skilled interpretation of artifacts. © Georg Thieme Verlag KG Stuttgart · New York.
A Hitchhiker's Guide to Functional Magnetic Resonance Imaging
Soares, José M.; Magalhães, Ricardo; Moreira, Pedro S.; Sousa, Alexandre; Ganz, Edward; Sampaio, Adriana; Alves, Victor; Marques, Paulo; Sousa, Nuno
2016-01-01
Functional Magnetic Resonance Imaging (fMRI) studies have become increasingly popular both with clinicians and researchers as they are capable of providing unique insights into brain functions. However, multiple technical considerations (ranging from specifics of paradigm design to imaging artifacts, complex protocol definition, and multitude of processing and methods of analysis, as well as intrinsic methodological limitations) must be considered and addressed in order to optimize fMRI analysis and to arrive at the most accurate and grounded interpretation of the data. In practice, the researcher/clinician must choose, from many available options, the most suitable software tool for each stage of the fMRI analysis pipeline. Herein we provide a straightforward guide designed to address, for each of the major stages, the techniques, and tools involved in the process. We have developed this guide both to help those new to the technique to overcome the most critical difficulties in its use, as well as to serve as a resource for the neuroimaging community. PMID:27891073
High speed multiphoton imaging
NASA Astrophysics Data System (ADS)
Li, Yongxiao; Brustle, Anne; Gautam, Vini; Cockburn, Ian; Gillespie, Cathy; Gaus, Katharina; Lee, Woei Ming
2016-12-01
Intravital multiphoton microscopy has emerged as a powerful technique to visualize cellular processes in-vivo. Real-time processes revealed through live imaging provide many opportunities to capture cellular activities in living animals. The typical parameters that determine the performance of multiphoton microscopy are speed, field of view, 3D imaging and imaging depth; many of these are important for acquiring data in-vivo. Here, we provide a full exposition of a flexible polygon-mirror-based high-speed laser-scanning multiphoton imaging system built around a PCI-6110 card (National Instruments) and a high-speed analog frame grabber card (Matrox Solios eA/XA), which allows rapid adjustment of frame rates from 5 Hz to 50 Hz at 512 × 512 pixels. Furthermore, a motion correction algorithm is used to mitigate motion artifacts. Customized control software called Pscan 1.0 was developed for the system. This is followed by calibration of the imaging performance of the system and a series of quantitative in-vitro and in-vivo imaging experiments in neuronal tissue and mice.
Novel SPECT Technologies and Approaches in Cardiac Imaging
Slomka, Piotr; Hung, Guang-Uei; Germano, Guido; Berman, Daniel S.
2017-01-01
Recent novel approaches in myocardial perfusion single photon emission CT (SPECT) have been facilitated by new dedicated high-efficiency hardware with solid-state detectors and optimized collimators. New protocols include very low-dose (1 mSv) stress-only, two-position imaging to mitigate attenuation artifacts, and simultaneous dual-isotope imaging. Attenuation correction can be performed by specialized low-dose systems or by previously obtained CT coronary calcium scans. Hybrid protocols using CT angiography have been proposed. Image quality improvements have been demonstrated by novel reconstructions and motion correction. Fast SPECT acquisition facilitates dynamic flow and early function measurements. Image processing algorithms have become automated with virtually unsupervised extraction of quantitative imaging variables. This automation facilitates integration with clinical variables derived by machine learning to predict patient outcome or diagnosis. In this review, we describe new imaging protocols made possible by the new hardware developments. We also discuss several novel software approaches for the quantification and interpretation of myocardial perfusion SPECT scans. PMID:29034066
Trochesset, Denise A; Serchuk, Richard B; Colosi, Dan C
2014-03-01
Identification of unknown individuals using dental comparison is well established in the forensic setting. The identification technique can be time and resource consuming if many individuals need to be identified at once. Medical CT (MDCT) for dental profiling has had limited success, mostly due to artifact from metal-containing dental restorations and implants. The authors describe a CBCT reformatting technique that creates images that closely approximate conventional dental images. Using an i-CAT Platinum CBCT unit and standard-issue i-CAT Vision software, a protocol is developed to reproducibly and reliably reformat CBCT volumes. The reformatted images are presented with conventional digital images from the same anatomic area for comparison. The authors conclude that images derived from CBCT volumes following this protocol are similar enough to conventional dental radiographs to allow for dental forensic comparison/identification and that CBCT offers a superior option over MDCT for this purpose. © 2013 American Academy of Forensic Sciences.
WE-AB-207A-07: A Planning CT-Guided Scatter Artifact Correction Method for CBCT Images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, X; Liu, T; Dong, X
Purpose: Cone beam computed tomography (CBCT) imaging is in increasing demand for high-performance image-guided radiotherapy such as online tumor delineation and dose calculation. However, current CBCT imaging suffers from severe scatter artifacts, and its clinical application is therefore limited to patient setup based mainly on bony structures. This study's purpose is to develop a CBCT artifact correction method. Methods: The proposed scatter correction method utilizes the planning CT to improve CBCT image quality. First, an image registration is used to match the planning CT with the CBCT to reduce the geometric difference between the two images. Then, the planning CT-based prior information is entered into a Bayesian deconvolution framework to iteratively perform a scatter artifact correction for the CBCT images. The technique was evaluated using Catphan phantoms with multiple inserts. Contrast-to-noise ratios (CNR), signal-to-noise ratios (SNR), and the image spatial nonuniformity (ISN) in selected volumes of interest (VOIs) were calculated to assess the proposed correction method. Results: After scatter correction, the CNR increased by factors of 1.96, 3.22, 3.20, 3.46, 3.44, 1.97 and 1.65, and the SNR increased by factors of 1.05, 2.09, 1.71, 3.95, 2.52, 1.54 and 1.84 for the Air, PMP, LDPE, Polystyrene, Acrylic, Delrin and Teflon inserts, respectively. The ISN decreased from 21.1% to 4.7% in the corrected images. All values of CNR, SNR and ISN in the corrected CBCT images were much closer to those in the planning CT images. The results demonstrated that the proposed method reduces the relevant artifacts and recovers CT numbers. Conclusion: We have developed a novel CBCT artifact correction method based on the planning CT image, and demonstrated that the proposed CT-guided correction method can significantly reduce scatter artifacts and improve image quality. This method has great potential to correct CBCT images, allowing their use in adaptive radiotherapy.
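For readers who want to reproduce the kind of VOI-based figures of merit reported here, the sketch below shows one plausible way to compute CNR, SNR and a simple spatial-nonuniformity index from an image and boolean VOI masks. The function names and the ISN definition are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def voi_stats(image, mask):
    """Mean and standard deviation of the voxels inside a boolean VOI mask."""
    vals = image[mask]
    return vals.mean(), vals.std()

def cnr(image, insert_mask, background_mask):
    """Contrast-to-noise ratio of an insert VOI against a background VOI."""
    mu_i, _ = voi_stats(image, insert_mask)
    mu_b, sd_b = voi_stats(image, background_mask)
    return abs(mu_i - mu_b) / sd_b

def snr(image, insert_mask):
    """Signal-to-noise ratio within a single VOI."""
    mu, sd = voi_stats(image, insert_mask)
    return abs(mu) / sd

def isn(image, uniform_voi_masks):
    """One simple nonuniformity index (an assumption, not the paper's definition):
    spread of the mean values of several VOIs placed in nominally uniform
    material, relative to their average."""
    means = np.array([voi_stats(image, m)[0] for m in uniform_voi_masks])
    return (means.max() - means.min()) / abs(means.mean())
```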
Mantini, D; Franciotti, R; Romani, G L; Pizzella, V
2008-03-01
The major limitation for the acquisition of high-quality magnetoencephalography (MEG) recordings is the presence of disturbances of physiological and technical origins: eye movements, cardiac signals, muscular contractions, and environmental noise are serious problems for MEG signal analysis. In recent years, multi-channel MEG systems have undergone rapid technological developments in terms of noise reduction, and many processing methods have been proposed for artifact rejection. Independent component analysis (ICA) has already been shown to be an effective and generally applicable technique for concurrently removing artifacts and noise from MEG recordings. However, no standardized automated system based on ICA has become available so far, because of the intrinsic difficulty in the reliable categorization of the source signals obtained with this technique. In this work, approximate entropy (ApEn), a measure of data regularity, is successfully used for the classification of the signals produced by ICA, allowing for automated artifact rejection. The proposed method has been tested using MEG data sets collected during somatosensory, auditory and visual stimulation. It was demonstrated to be effective in attenuating both biological artifacts and environmental noise, in order to reconstruct clear signals that can be used for improving brain source localizations.
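Approximate entropy is the quantity doing the classification work in this approach, so a textbook implementation is sketched below to make the regularity measure concrete. The embedding dimension, tolerance factor, and any threshold used to flag an ICA component as artifactual are assumptions here; the abstract does not give the authors' values.

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """Approximate entropy (ApEn) of a 1-D signal.

    m        : embedding dimension
    r_factor : tolerance as a fraction of the signal's standard deviation
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    r = r_factor * x.std()

    def phi(m):
        # Overlapping embedding vectors of length m.
        emb = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev (max-norm) distance between every pair of vectors.
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=-1)
        # Fraction of vectors within tolerance r of each template (self-matches included).
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# Regular signals (e.g., cardiac components) give low ApEn; irregular noise gives higher ApEn.
```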
Levitt, Joshua; Nitenson, Adam; Koyama, Suguru; Heijmans, Lonne; Curry, James; Ross, Jason T; Kamerling, Steven; Saab, Carl Y
2018-06-23
Electroencephalography (EEG) invariably contains extra-cranial artifacts that are commonly dealt with based on qualitative and subjective criteria. Failure to account for EEG artifacts compromises data interpretation. We have developed a quantitative and automated support vector machine (SVM)-based algorithm to accurately classify artifactual EEG epochs in awake rodent, canine, and human subjects. An embodiment of this method also enables the determination of 'eyes open/closed' states in human subjects. The levels of SVM accuracy for artifact classification in humans, Sprague Dawley rats, and beagle dogs were 94.17%, 83.68%, and 85.37%, respectively, whereas 'eyes open/closed' states in humans were labeled with 88.60% accuracy. Each of these results was significantly higher than chance. Comparison with existing methods: other approaches, such as those dependent on Independent Component Analysis, have not been tested in non-human subjects and require full EEG montages, whereas this method operates on single channels. We conclude that our EEG artifact detection algorithm provides a valid and practical solution to a common problem in the quantitative analysis and assessment of EEG in pre-clinical research settings across evolutionary spectra. Copyright © 2018. Published by Elsevier B.V.
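Below is a minimal scikit-learn sketch of the kind of per-epoch SVM classifier described here. The feature set, kernel, data shapes, and labels are placeholders of my own choosing, since the abstract does not specify them; it illustrates the classification setup, not the published algorithm.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Hypothetical per-epoch features (e.g., variance, line-noise power, kurtosis)
# and labels (1 = artifact, 0 = clean) standing in for real annotated epochs.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))        # 500 single-channel EEG epochs, 6 features each
y = rng.integers(0, 2, size=500)     # placeholder labels

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```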
Muscle and eye movement artifact removal prior to EEG source localization.
Hallez, Hans; Vergult, Anneleen; Phlypo, Ronald; Van Hese, Peter; De Clercq, Wim; D'Asseler, Yves; Van de Walle, Rik; Vanrumste, Bart; Van Paesschen, Wim; Van Huffel, Sabine; Lemahieu, Ignace
2006-01-01
Muscle and eye movement artifacts are very prominent in the ictal EEG of patients suffering from epilepsy, making the dipole localization of ictal activity very unreliable. Recently, two techniques (BSS-CCA and pSVD) were developed to remove those artifacts. The purpose of this study is to assess whether the removal of muscle and eye movement artifacts improves EEG dipole source localization. We used a total of 8 EEG fragments, each from a different patient, first unfiltered, then filtered by BSS-CCA and pSVD. In both the filtered and unfiltered EEG fragments we estimated multiple dipoles using RAP-MUSIC. The resulting dipoles were subjected to a K-means clustering algorithm to extract the most prominent cluster. We found that the removal of muscle and eye artifacts results in tighter and clearer dipole clusters. Furthermore, we found that localization of the filtered EEG corresponded with the localization derived from the ictal SPECT in 7 of the 8 patients. Therefore, we can conclude that BSS-CCA and pSVD improve localization of ictal activity, making the localization more reliable for the presurgical evaluation of the patient.
Symeonidou, Evangelia-Regkina; Nordin, Andrew D.; Hairston, W. David
2018-01-01
More neuroscience researchers are using scalp electroencephalography (EEG) to measure electrocortical dynamics during human locomotion and other types of movement. Motion artifacts corrupt the EEG and mask underlying neural signals of interest. The cause of motion artifacts in EEG is often attributed to electrode motion relative to the skin, but few studies have examined EEG signals under head motion. In the current study, we tested how motion artifacts are affected by the overall mass and surface area of commercially available electrodes, as well as how cable sway contributes to motion artifacts. To provide a ground-truth signal, we used a gelatin head phantom with embedded antennas broadcasting electrical signals, and recorded EEG with a commercially available electrode system. A robotic platform moved the phantom head through sinusoidal displacements at different frequencies (0–2 Hz). Results showed that a larger electrode surface area can have a small but significant effect on improving EEG signal quality during motion and that cable sway is a major contributor to motion artifacts. These results have implications in the development of future hardware for mobile brain imaging with EEG. PMID:29614020
An image-based approach to understanding the physics of MR artifacts.
Morelli, John N; Runge, Val M; Ai, Fei; Attenberger, Ulrike; Vu, Lan; Schmeets, Stuart H; Nitz, Wolfgang R; Kirsch, John E
2011-01-01
As clinical magnetic resonance (MR) imaging becomes more versatile and more complex, it is increasingly difficult to develop and maintain a thorough understanding of the physical principles that govern the changing technology. This is particularly true for practicing radiologists, whose primary obligation is to interpret clinical images and not necessarily to understand complex equations describing the underlying physics. Nevertheless, the physics of MR imaging plays an important role in clinical practice because it determines image quality, and suboptimal image quality may hinder accurate diagnosis. This article provides an image-based explanation of the physics underlying common MR imaging artifacts, offering simple solutions for remedying each type of artifact. Solutions that have emerged from recent technologic advances with which radiologists may not yet be familiar are described in detail. Types of artifacts discussed include those resulting from voluntary and involuntary patient motion, magnetic susceptibility, magnetic field inhomogeneities, gradient nonlinearity, standing waves, aliasing, chemical shift, and signal truncation. With an improved awareness and understanding of these artifacts, radiologists will be better able to modify MR imaging protocols so as to optimize clinical image quality, allowing greater confidence in diagnosis. Copyright © RSNA, 2011.
Wavelet approach to artifact noise removal from Capacitive coupled Electrocardiograph.
Lee, Seung Min; Kim, Ko Keun; Park, Kwang Suk
2008-01-01
Capacitively coupled electrocardiography (ECG) has been introduced as a non-invasive measurement technology for ubiquitous health care, and its applications are spreading widely. Despite its many merits, capacitively coupled ECG is highly susceptible to motion artifacts because of its non-skin-contact property. Many studies address the artifact problem by treating all artifact signals below 0.8 Hz. In our capacitively coupled ECG measurement system, however, artifacts exist not only below 0.8 Hz but also above 10 Hz. Therefore, an artifact removal algorithm based on the wavelet method was tested to reject artifact-corrupted content from the measured signals. By computing the signal power at each decimation step, both low-frequency and high-frequency artifacts are removed. Although some of the original ECG signal is removed along with the artifact signal, the approach maintains a signal quality suitable for long-term measurement and yields the best-quality ECG signals attainable.
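The following is a deliberately simplified stand-in for the wavelet-domain idea, not the authors' exact algorithm: a multilevel DWT with PyWavelets, where the coarse approximation (baseline and motion drift) is discarded and detail bands whose power is anomalously high are soft-thresholded, mimicking a per-decimation-step power test. The wavelet, decomposition level, and threshold values are assumptions.

```python
import numpy as np
import pywt

def wavelet_artifact_suppress(ecg, wavelet="db4", level=6, power_thresh=4.0):
    """Suppress low- and high-frequency artifacts in a capacitive-ECG trace.

    A sketch only: the coarse approximation is zeroed to remove slow drift, and
    detail bands whose mean power exceeds `power_thresh` times the median band
    power are soft-thresholded as likely motion-corrupted.
    """
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    coeffs[0] = np.zeros_like(coeffs[0])                 # drop low-frequency wander
    band_power = [np.mean(c ** 2) for c in coeffs[1:]]
    ref = np.median(band_power)
    for i, p in enumerate(band_power, start=1):
        if p > power_thresh * ref:                       # likely artifact-dominated band
            coeffs[i] = pywt.threshold(coeffs[i], np.std(coeffs[i]), mode="soft")
    return pywt.waverec(coeffs, wavelet)[: len(ecg)]
```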
VarDetect: a nucleotide sequence variation exploratory tool
Ngamphiw, Chumpol; Kulawonganunchai, Supasak; Assawamakin, Anunchai; Jenwitheesuk, Ekachai; Tongsima, Sissades
2008-01-01
Background Single nucleotide polymorphisms (SNPs) are the most commonly studied units of genetic variation. The discovery of such variation may help to identify causative gene mutations in monogenic diseases and SNPs associated with predisposing genes in complex diseases. Accurate detection of SNPs requires software that can correctly interpret chromatogram signals to nucleotides. Results We present VarDetect, a stand-alone nucleotide variation exploratory tool that automatically detects nucleotide variation from fluorescence based chromatogram traces. Accurate SNP base-calling is achieved using pre-calculated peak content ratios, and is enhanced by rules which account for common sequence reading artifacts. The proposed software tool is benchmarked against four other well-known SNP discovery software tools (PolyPhred, novoSNP, Genalys and Mutation Surveyor) using fluorescence based chromatograms from 15 human genes. These chromatograms were obtained from sequencing 16 two-pooled DNA samples; a total of 32 individual DNA samples. In this comparison of automatic SNP detection tools, VarDetect achieved the highest detection efficiency. Availability VarDetect is compatible with most major operating systems such as Microsoft Windows, Linux, and Mac OSX. The current version of VarDetect is freely available at . PMID:19091032
NASA Astrophysics Data System (ADS)
Shirai, Tomohiro; Friberg, Ari T.
2018-04-01
Dispersion-canceled optical coherence tomography (OCT) based on spectral intensity interferometry was devised as a classical counterpart of quantum OCT to enhance the basic performance of conventional OCT. In this paper, we demonstrate experimentally that an alternative method of realizing this kind of OCT by means of two optical fiber couplers and a single spectrometer is a more practical and reliable option than the existing methods proposed previously. Furthermore, we develop a recipe for reducing multiple artifacts simultaneously on the basis of simple averaging and verify experimentally that it works successfully in the sense that all the artifacts are mitigated effectively and only the true signals carrying structural information about the sample survive.
Automatic Synthesis of UML Designs from Requirements in an Iterative Process
NASA Technical Reports Server (NTRS)
Schumann, Johann; Whittle, Jon; Clancy, Daniel (Technical Monitor)
2001-01-01
The Unified Modeling Language (UML) is gaining wide popularity for the design of object-oriented systems. UML combines various object-oriented graphical design notations under one common framework. A major factor for the broad acceptance of UML is that it can be conveniently used in a highly iterative, Use Case (or scenario-based) process (although the process is not a part of UML). Here, the (pre-) requirements for the software are specified rather informally as Use Cases and a set of scenarios. A scenario can be seen as an individual trace of a software artifact. Besides first sketches of a class diagram to illustrate the static system breakdown, scenarios are a favorite way of communication with the customer, because scenarios describe concrete interactions between entities and are thus easy to understand. Scenarios with a high level of detail are often expressed as sequence diagrams. Later in the design and implementation stage (elaboration and implementation phases), a design of the system's behavior is often developed as a set of statecharts. From there (and the full-fledged class diagram), actual code development is started. Current commercial UML tools support this phase by providing code generators for class diagrams and statecharts. In practice, it can be observed that the transition from requirements to design to code is a highly iterative process. In this talk, a set of algorithms is presented which perform reasonable synthesis and transformations between different UML notations (sequence diagrams, Object Constraint Language (OCL) constraints, statecharts). More specifically, we will discuss the following transformations: Statechart synthesis, introduction of hierarchy, consistency of modifications, and "design-debugging".
Establishing imaging sensor specifications for digital still cameras
NASA Astrophysics Data System (ADS)
Kriss, Michael A.
2007-02-01
Digital Still Cameras, DSCs, have now displaced conventional still cameras in most markets. The heart of a DSC is thought to be the imaging sensor, be it a full-frame CCD, an interline CCD, a CMOS sensor or the newer Foveon buried-photodiode sensor. There is a strong tendency by consumers to consider only the number of mega-pixels in a camera and not to consider the overall performance of the imaging system, including sharpness, artifact control, noise, color reproduction, exposure latitude and dynamic range. This paper will provide a systematic method to characterize the physical requirements of an imaging sensor and supporting system components based on the desired usage. The analysis is based on two software programs that determine the "sharpness", potential for artifacts, sensor "photographic speed", dynamic range and exposure latitude based on the physical nature of the imaging optics, sensor characteristics (including size of pixels, sensor architecture, noise characteristics, surface states that cause dark current, quantum efficiency, effective MTF, and the intrinsic full-well capacity in terms of electrons per square centimeter). Examples will be given for consumer, pro-consumer, and professional camera systems. Where possible, these results will be compared to imaging systems currently on the market.
Methodological aspects of EEG and body dynamics measurements during motion
Reis, Pedro M. R.; Hebenstreit, Felix; Gabsteiger, Florian; von Tscharner, Vinzenz; Lochmann, Matthias
2014-01-01
EEG involves the recording, analysis, and interpretation of voltages recorded on the human scalp which originate from brain gray matter. EEG is one of the most popular methods of studying and understanding the processes that underlie behavior, because EEG is relatively cheap, easy to wear, lightweight, and has high temporal resolution. In terms of behavior, this encompasses actions, such as movements, that are performed in response to the environment. However, there are methodological difficulties which can occur when recording EEG during movement, such as movement artifacts. Thus, most studies of the human brain have examined activations during static conditions. This article attempts to compile and describe relevant methodological solutions that emerged in order to measure body and brain dynamics during motion. These descriptions cover suggestions on how to avoid and reduce motion artifacts, hardware, software, and techniques for synchronously recording EEG, EMG, kinematics, kinetics, and eye movements during motion. Additionally, we present various recording systems, EEG electrodes, caps, and methods for determining real/custom electrode positions. In the end we conclude that it is possible to record and analyze synchronized brain and body dynamics related to movement or exercise tasks. PMID:24715858
Olivieri, Laura J; Cross, Russell R; O'Brien, Kendall E; Ratnayaka, Kanishka; Hansen, Michael S
2015-09-01
Cardiac magnetic resonance (MR) imaging is a valuable tool in congenital heart disease; however patients frequently have metal devices in the chest from the treatment of their disease that complicate imaging. Methods are needed to improve imaging around metal implants near the heart. Basic sequence parameter manipulations have the potential to minimize artifact while limiting effects on image resolution and quality. Our objective was to design cine and static cardiac imaging sequences to minimize metal artifact while maintaining image quality. Using systematic variation of standard imaging parameters on a fluid-filled phantom containing commonly used metal cardiac devices, we developed optimized sequences for steady-state free precession (SSFP), gradient recalled echo (GRE) cine imaging, and turbo spin-echo (TSE) black-blood imaging. We imaged 17 consecutive patients undergoing routine cardiac MR with 25 metal implants of various origins using both standard and optimized imaging protocols for a given slice position. We rated images for quality and metal artifact size by measuring metal artifact in two orthogonal planes within the image. All metal artifacts were reduced with optimized imaging. The average metal artifact reduction for the optimized SSFP cine was 1.5+/-1.8 mm, and for the optimized GRE cine the reduction was 4.6+/-4.5 mm (P < 0.05). Quality ratings favored the optimized GRE cine. Similarly, the average metal artifact reduction for the optimized TSE images was 1.6+/-1.7 mm (P < 0.05), and quality ratings favored the optimized TSE imaging. Imaging sequences tailored to minimize metal artifact are easily created by modifying basic sequence parameters, and images are superior to standard imaging sequences in both quality and artifact size. Specifically, for optimized cine imaging a GRE sequence should be used with settings that favor short echo time, i.e. flow compensation off, weak asymmetrical echo and a relatively high receiver bandwidth. For static black-blood imaging, a TSE sequence should be used with fat saturation turned off and high receiver bandwidth.
Veldkamp, Wouter J H; Joemai, Raoul M S; van der Molen, Aart J; Geleijns, Jacob
2010-02-01
Metal prostheses cause artifacts in computed tomography (CT) images. The purpose of this work was to design an efficient and accurate metal segmentation in raw data to achieve artifact suppression and to improve CT image quality for patients with metal hip or shoulder prostheses. The artifact suppression technique incorporates two steps: metal object segmentation in raw data and replacement of the segmented region by new values using an interpolation scheme, followed by addition of the scaled metal signal intensity. Segmentation of metal is performed directly in sinograms, making it efficient and different from current methods that perform segmentation in reconstructed images in combination with Radon transformations. Metal signal segmentation is achieved by using a Markov random field model (MRF). Three interpolation methods are applied and investigated. To provide a proof of concept, CT data of five patients with metal implants were included in the study, as well as CT data of a PMMA phantom with Teflon, PVC, and titanium inserts. Accuracy was determined quantitatively by comparing mean Hounsfield (HU) values and standard deviation (SD) as a measure of distortion in phantom images with titanium (original and suppressed) and without titanium insert. Qualitative improvement was assessed by comparing uncorrected clinical images with artifact suppressed images. Artifacts in CT data of a phantom and five patients were automatically suppressed. The general visibility of structures clearly improved. In phantom images, the technique showed reduced SD close to the SD for the case where titanium was not inserted, indicating improved image quality. HU values in corrected images were different from expected values for all interpolation methods. Subtle differences between interpolation methods were found. The new artifact suppression design is efficient, for instance, in terms of preserving spatial resolution, as it is applied directly to original raw data. It successfully reduced artifacts in CT images of five patients and in phantom images. Sophisticated interpolation methods are needed to obtain reliable HU values close to the prosthesis.
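To make the "replacement of the segmented region by new values" step concrete, here is a minimal sketch of one of the simplest interpolation schemes: row-wise linear interpolation across the segmented metal trace in the sinogram, with optional re-insertion of a scaled copy of the removed metal signal. The MRF segmentation itself is not reproduced, and the function and parameter names are illustrative, not the authors'.

```python
import numpy as np

def interpolate_metal_trace(sinogram, metal_mask, metal_scale=0.0):
    """Replace metal-flagged detector bins in each projection view by linear
    interpolation from neighboring bins, then optionally add back a scaled
    copy of the removed metal contribution.

    sinogram   : (n_views, n_bins) raw-data array
    metal_mask : boolean array of the same shape, True where metal was segmented
    """
    out = sinogram.astype(float).copy()
    bins = np.arange(sinogram.shape[1])
    for view in range(sinogram.shape[0]):
        m = metal_mask[view]
        if m.any() and (~m).any():
            out[view, m] = np.interp(bins[m], bins[~m], sinogram[view, ~m])
    # Optional re-insertion of the (scaled) original metal signal intensity.
    return out + metal_scale * np.where(metal_mask, sinogram - out, 0.0)
```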
Kim, Ki Hwan; Park, Sung-Hong
2017-04-01
The balanced steady-state free precession (bSSFP) MR sequence is frequently used in clinics, but is sensitive to off-resonance effects, which can cause banding artifacts. Often multiple bSSFP datasets are acquired at different phase cycling (PC) angles and then combined in a special way for banding artifact suppression. Many strategies of combining the datasets have been suggested for banding artifact suppression, but there are still limitations in their performance, especially when the number of phase-cycled bSSFP datasets is small. The purpose of this study is to develop a learning-based model to combine the multiple phase-cycled bSSFP datasets for better banding artifact suppression. A multilayer perceptron (MLP) is a feedforward artificial neural network consisting of three layers: input, hidden, and output. MLP models were trained on input bSSFP datasets acquired from human brain and knee at 3T, separately for two and four PC angles. Banding-free bSSFP images were generated by maximum-intensity projection (MIP) of 8 or 12 phase-cycled datasets and were used as targets for training the output layer. The trained MLP models were applied to other brain and knee datasets acquired with different scan parameters and also to multiple phase-cycled bSSFP functional MRI datasets acquired on rat brain at 9.4T, in comparison with the conventional MIP method. Simulations were also performed to validate the MLP approach. Both the simulations and human experiments demonstrated that MLP suppressed banding artifacts significantly, superior to MIP in both banding artifact suppression and SNR efficiency. MLP demonstrated superior performance over MIP for the 9.4T fMRI data as well, which was not used for training the models, while visually preserving the fMRI maps very well. Artificial neural networks are a promising technique for combining multiple phase-cycled bSSFP datasets for banding artifact suppression. Copyright © 2016 Elsevier Inc. All rights reserved.
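A pixel-wise sketch of the training strategy described above, using scikit-learn's MLPRegressor: the inputs are magnitudes from a small number of phase-cycled acquisitions and the regression target is the maximum-intensity projection of a larger phase-cycled set. Layer sizes, iteration counts, and array shapes are illustrative assumptions rather than the authors' settings.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def train_pc_combiner(pc_stack_small, pc_stack_full):
    """Train an MLP that maps per-pixel magnitudes from a small phase-cycled
    stack (N_pc, H, W) to the MIP of a larger stack (M_pc, H, W), M_pc > N_pc."""
    X = pc_stack_small.reshape(pc_stack_small.shape[0], -1).T   # one row per pixel
    y = pc_stack_full.max(axis=0).ravel()                       # banding-free MIP target
    mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=500)
    mlp.fit(X, y)
    return mlp

def apply_pc_combiner(mlp, pc_stack_small):
    """Combine a new phase-cycled stack into a single banding-suppressed image."""
    X = pc_stack_small.reshape(pc_stack_small.shape[0], -1).T
    h, w = pc_stack_small.shape[1:]
    return mlp.predict(X).reshape(h, w)
```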
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paudel, M; currently at University of Toronto, Sunnybrook Health Sciences Center, Toronto, ON; MacKenzie, M
Purpose: To evaluate the metal artifacts in diagnostic kVCT images of patients that are corrected using a normalized metal artifact reduction method with MVCT prior images, MVCT-NMAR. Methods: An MVCT-NMAR algorithm was developed and applied to five patients: three with bilateral hip prostheses, one with a unilateral hip prosthesis, and one with dental fillings. The corrected images were evaluated for visualization of tissue structures and their interfaces, and for radiotherapy dose calculations. They were also compared against the corresponding images corrected by a commercial metal artifact reduction technique, O-MAR, on a Philips™ CT scanner. Results: The use of MVCT images for correcting kVCT images in the MVCT-NMAR technique greatly reduces metal artifacts, avoids secondary artifacts, and makes patient images more useful for correct dose calculation in radiotherapy. These improvements are significant over the commercial correction method, provided the MVCT and kVCT images are correctly registered. The remaining and secondary artifacts (soft tissue blurring, eroded bones, false bones or air pockets, CT number cupping within the metal) present in O-MAR corrected images are removed in the MVCT-NMAR corrected images. Large dose reduction is possible outside the planning target volume (e.g., 59.2 Gy in comparison to 52.5 Gy in pubic bone) when these MVCT-NMAR corrected images are used in TomoTherapy™ treatment plans, as the corrected images no longer require directional blocks for prostate plans in order to avoid the image artifact regions. Conclusion: The use of MVCT-NMAR corrected images in radiotherapy treatment planning could improve treatment plan quality for cancer patients with metallic implants. Moti Raj Paudel is supported by the Vanier Canada Graduate Scholarship, the Endowed Graduate Scholarship in Oncology and the Dissertation Fellowship at the University of Alberta. The authors acknowledge the CIHR operating grant number MOP 53254.
Griffin, John F; Archambault, Nicholas S; Mankin, Joseph M; Wall, Corey R; Thompson, James A; Padua, Abraham; Purdy, David; Kerwin, Sharon C
2013-11-15
Laboratory investigation, ex vivo. Postoperative complications are common after spinal implantation procedures, and magnetic resonance imaging (MRI) would be the ideal modality to image these patients. Unfortunately, the implants cause artifacts that can render MRI nondiagnostic. The WARP-turbo spin echo (TSE) sequence has been developed to mitigate artifacts caused by metal. The objective of this investigation was to evaluate the performance of the WARP-TSE sequence in canine cadaver specimens after implantation with metallic vertebral implants. Magnetic field strength, implant type, and MRI acquisition technique all play a role in the severity of susceptibility artifacts. The WARP-TSE sequence uses increased bandwidth, view angle tilting, and SEMAC (slice-encoding metal artifact correction) to correct for susceptibility artifact. The WARP-TSE technique has outperformed conventional techniques in patients, after total hip arthroplasty. However, published reports of its application in subjects with vertebral column implants are lacking. Ex vivo anterior stabilization of the atlantoaxial joint was performed on 6 adult small breed (<8 kg) cadaver dogs using stainless steel screws and polymethylmethacrylate. Axial and sagittal T2-weighted and short tau inversion recovery MRI was performed using conventional pulse sequences and WARP-TSE sequences at 3 T. Images were assessed qualitatively and quantitatively. Images made with the WARP-TSE sequence had smaller susceptibility artifacts and superior spinal cord margin depiction. WARP-TSE sequences reduced the length over which susceptibility artifacts caused spinal cord margin depiction interference by 24.9% to 71.5% with scan times of approximately 12 to 16 minutes. The WARP-TSE sequence is a viable option for evaluating the vertebral column after implantation with stainless steel implants. N/A.
WE-G-18A-03: Cone Artifacts Correction in Iterative Cone Beam CT Reconstruction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, H; Folkerts, M; Jiang, S
Purpose: For iterative reconstruction (IR) in cone-beam CT (CBCT) imaging, data truncation along the superior-inferior (SI) direction causes severe cone artifacts in the reconstructed CBCT volume images. Not only does it reduce the effective SI coverage of the reconstructed volume, it also hinders the convergence of the IR algorithm. This is particularly a problem for regularization-based IR, where smoothing-type regularization operations tend to propagate the artifacts to a large area. Our purpose is to develop a practical cone artifact correction solution. Methods: We found that it is the missing data residing in the truncated cone area that leads to inconsistency between the calculated forward projections and the measured projections. We overcome this problem by using an FDK-type reconstruction to estimate the missing data and by designing weighting factors to compensate for the inconsistency caused by the missing data. We validate the proposed method in our multi-GPU low-dose CBCT reconstruction system on multiple patients' datasets. Results: Compared to FDK reconstruction with full datasets, while IR is able to reconstruct CBCT images using a subset of the projection data, the severe cone artifacts degrade overall image quality. For a head-neck case under full-fan mode, 13 out of 80 slices are contaminated. It is even more severe in a pelvis case under half-fan mode, where 36 out of 80 slices are affected, leading to inferior soft-tissue delineation. By applying the proposed method, the cone artifacts are effectively corrected, with the mean intensity difference decreased from ∼497 HU to ∼39 HU for the contaminated slices. Conclusion: A practical and effective solution for cone artifact correction is proposed and validated in a CBCT IR algorithm. This study is supported in part by NIH (1R01CA154747-01)
Improving image quality in laboratory x-ray phase-contrast imaging
NASA Astrophysics Data System (ADS)
De Marco, F.; Marschner, M.; Birnbacher, L.; Viermetz, M.; Noël, P.; Herzen, J.; Pfeiffer, F.
2017-03-01
Grating-based X-ray phase-contrast (gbPC) imaging is known to provide significant benefits for biomedical imaging. To investigate these benefits, a high-sensitivity gbPC micro-CT setup for small (≈5 cm) biological samples has been constructed. Unfortunately, high differential-phase sensitivity leads to an increased magnitude of data processing artifacts, limiting the quality of tomographic reconstructions. Most importantly, processing of phase-stepping data with incorrect stepping positions can introduce artifacts resembling Moiré fringes to the projections. Additionally, the focal spot size of the X-ray source limits the resolution of tomograms. Here we present a set of algorithms to minimize artifacts, increase resolution and improve the visual impression of projections and tomograms from the examined setup. We assessed two algorithms for artifact reduction: First, a correction algorithm exploiting correlations between the artifacts and the differential-phase data was developed and tested. Artifacts were reliably removed without compromising image data. Second, we implemented a new algorithm for flat-field selection, which was shown to exclude flat-fields with strong artifacts. Both procedures successfully improved the image quality of projections and tomograms. Deconvolution of all projections of a CT scan can minimize blurring introduced by the finite size of the X-ray source focal spot. Application of the Richardson-Lucy deconvolution algorithm to gbPC-CT projections resulted in an improved resolution of phase-contrast tomograms. Additionally, we found that nearest-neighbor interpolation of projections can improve the visual impression of very small features in phase-contrast tomograms. In conclusion, we achieved an increase in image resolution and quality for the investigated setup, which may lead to an improved detection of very small sample features, thereby maximizing the setup's utility.
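Richardson-Lucy deconvolution of projections is readily available in scikit-image; the sketch below shows the idea, with a Gaussian kernel standing in for the focal-spot point-spread function, which is not given in the abstract. Kernel size, sigma, and the use of the default iteration count are my own assumptions, not the values used for the setup described above.

```python
import numpy as np
from skimage.restoration import richardson_lucy

def gaussian_psf(size=15, sigma=2.0):
    """Isotropic Gaussian kernel as a stand-in for the source focal-spot blur."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def deblur_projection(projection, sigma=2.0):
    """Richardson-Lucy deconvolution of a single normalized, non-negative
    projection; the keyword controlling the iteration count differs between
    scikit-image versions, so the library default is used here."""
    psf = gaussian_psf(sigma=sigma)
    return richardson_lucy(projection, psf)
```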
Lommen, Arjen
2009-04-15
Hyphenated full-scan MS technology creates large amounts of data. A versatile, easy-to-handle automation tool that aids in the data analysis is very important for handling such a data stream. MetAlign software, as described in this manuscript, handles a broad range of accurate-mass and nominal-mass GC/MS and LC/MS data. It is capable of automatic format conversions, accurate mass calculations, baseline corrections, peak-picking, saturation and mass-peak artifact filtering, as well as alignment of up to 1000 data sets. A 100- to 1000-fold data reduction is achieved. MetAlign software output is compatible with most multivariate statistics programs.
A low-cost universal cumulative gating circuit for small and large animal clinical imaging
NASA Astrophysics Data System (ADS)
Gioux, Sylvain; Frangioni, John V.
2008-02-01
Image-assisted diagnosis and therapy is becoming more commonplace in medicine. However, most imaging techniques suffer from voluntary or involuntary motion artifacts, especially cardiac and respiratory motions, which degrade image quality. Current software solutions either induce computational overhead or reject out-of-focus images after acquisition. In this study we demonstrate a hardware-only gating circuit that accepts multiple, pseudo-periodic signals and produces a single TTL (0-5 V) imaging window of accurate phase and period. The electronic circuit Gerber files described in this article and the list of components are available online at www.frangionilab.org.
Zirconia-Polyurethane Aneurysm Clip.
Cho, Won-Sang; Cho, Kyung-Il; Kim, Jeong Eun; Jang, Tae-Sik; Ha, Eun Jin; Kang, Hyun-Seung; Son, Young-Je; Choi, Seung Hong; Lee, Seunghyun; Kim, Chong-Chan; Sun, Jeong-Yun; Kim, Hyoun-Ee
2018-03-27
Susceptibility artifacts from metal clips in magnetic resonance (MR) imaging present an obstacle to evaluating the status of clipped aneurysms, parent arteries and adjacent brain parenchyma. We aimed to develop MR-compatible aneurysm clips. Considering the mechanical and biological properties, and MR compatibility of candidate materials, a prototype clip with a zirconia body and a polyurethane head spring (ZC, straight, 9-mm long) was developed. The closing forces, opening width of the blades, and in vitro and in vivo artifact volumes in 3 tesla MR imaging were compared among the prototype and commercial metal clips such as a Yasargil® clip (YC, curved type, 8.3-mm long) and a Sugita® clip (SC, straight type, 10-mm long). An in vivo animal study was performed with a canine venous pouch aneurysm model. The closing forces (N) at 1 and 8 mm from the blade tip were 2.09 and 3.77 in YC, 1.85 and 3.04 in SC, and 2.05 and 4.60 in ZC. The maximum opening widths (mm) were 6.8, 9.0, and 3.0 in YC, SC, and ZC, respectively. The in vitro artifact volumes of YC, SC and ZC in time-of-flight MR imaging were 26.9, 29.7 and 1.9 times larger than the respective real volumes. The in vivo artifact volumes of YC, SC, and ZC were respectively 21.4, 29.4, and 2.6 times larger than the real volumes. ZC showed the smallest susceptibility artifacts and satisfactory closing forces. However, the narrow opening width of the blades was a weak point. Copyright © 2018. Published by Elsevier Inc.
Automating U-Pb IDTIMS data reduction and reporting: Cyberinfrastructure meets geochronology
NASA Astrophysics Data System (ADS)
Bowring, J. F.; McLean, N.; Walker, J. D.; Ash, J. M.
2009-12-01
We demonstrate the efficacy of an interdisciplinary effort between software engineers and geochemists to produce working cyberinfrastructure for geochronology. This collaboration between CIRDLES, EARTHTIME and EarthChem has produced the software programs Tripoli and U-Pb_Redux as the cyber-backbone for the ID-TIMS community. This initiative incorporates shared isotopic tracers, data-reduction algorithms and the archiving and retrieval of data and results. The resulting system facilitates detailed inter-laboratory comparison and a new generation of cooperative science. The resolving power of geochronological data in the earth sciences is dependent on the precision and accuracy of many isotopic measurements and corrections. Recent advances in U-Pb geochronology have reinvigorated its application to problems such as precise timescale calibration, processes of crustal evolution, and early solar system dynamics. This project provides a heretofore missing common data reduction protocol, thus promoting the interpretation of precise geochronology and enabling inter-laboratory comparison. U-Pb_Redux is an open-source software program that provides end-to-end support for the analysis of uranium-lead geochronological data. The system reduces raw mass spectrometer data to U-Pb dates, allows users to interpret ages from these data, and then provides for the seamless federation of the results, coming from many labs, into a community web-accessible database using standard and open techniques. This EarthChem GeoChron database also depends on keyed references to the SESAR sample database. U-Pb_Redux currently provides interactive concordia and weighted mean plots and uncertainty contribution visualizations; it produces publication-quality concordia and weighted mean plots and customizable data tables. This initiative has achieved the goal of standardizing the data elements of a complete reduction and analysis of uranium-lead data, which are expressed using extensible markup language schema definition (XSD) artifacts. U-Pb_Redux leverages the freeware program Tripoli, which imports raw mass spectrometer data files and supports interactive review and archiving of isotopic data. Tripoli facilitates the visualization of temporal trends and scatter during measurement, provides statistically rigorous filtering of data, and supports oxide and fractionation corrections. The Cyber Infrastructure Research and Development Lab for the Earth Sciences (CIRDLES) collaboratively integrates domain-specific software engineering with the efforts of EARTHTIME and EarthChem. The EARTHTIME initiative pursues consensus-based approaches to geochemical data reduction, and the EarthChem initiative pursues the creation of data repositories for all geochemical data. CIRDLES develops software and systems for geochronology. This collaboration benefits the earth sciences by enabling geochemists to focus on their specialties using robust software that produces reliable results. This collaboration benefits software engineering by providing research opportunities to improve process methodologies used in the design and implementation of domain-specific solutions.
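At the heart of reducing measured isotope ratios to dates is the standard radioactive-decay age equation; the sketch below computes a 206Pb*/238U date from a radiogenic ratio using the Jaffey et al. (1971) decay constant. It deliberately omits the tracer, blank, and common-Pb corrections that U-Pb_Redux handles, so it illustrates only the final step, not the program's algorithm.

```python
import math

LAMBDA_238 = 1.55125e-10  # decay constant of 238U in 1/yr (Jaffey et al., 1971)

def pb206_u238_date_ma(ratio_206_238):
    """206Pb*/238U date in Ma from a radiogenic-Pb/U ratio, with all
    isotope-dilution and common-Pb corrections assumed already applied."""
    return math.log(1.0 + ratio_206_238) / LAMBDA_238 / 1e6

print(round(pb206_u238_date_ma(0.0160), 1))  # ~102.3 Ma for an illustrative ratio
```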
Artifacts for Calibration of Submicron Width Measurements
NASA Technical Reports Server (NTRS)
Grunthaner, Frank; Grunthaner, Paula; Bryson, Charles, III
2003-01-01
Artifacts that are fabricated with the help of molecular-beam epitaxy (MBE) are undergoing development for use as dimensional calibration standards with submicron widths. Such standards are needed for calibrating instruments (principally, scanning electron microscopes and scanning probe microscopes) for measuring the widths of features in advanced integrated circuits. Dimensional calibration standards fabricated by an older process that involves lithography and etching of trenches in (110) surfaces of single-crystal silicon are generally reproducible to within dimensional tolerances of about 15 nm. It is anticipated that when the artifacts of the present type are fully developed, their critical dimensions will be reproducible to within 1 nm. These artifacts are expected to find increasing use in the semiconductor-device and integrated-circuit industries as the width tolerances on semiconductor devices shrink to a few nanometers during the next few years. Unlike in the older process, one does not rely on lithography and etching to define the critical dimensions. Instead, one relies on the inherent smoothness and flatness of MBE layers deposited under controlled conditions and defines the critical dimensions as the thicknesses of such layers. An artifact of the present type is fabricated in two stages (see figure): In the first stage, a multilayer epitaxial wafer is grown on a very flat substrate. In the second stage, the wafer is cleaved to expose the layers, then the exposed layers are differentially etched (taking advantage of large differences between the etch rates of the different epitaxial layer materials). The resulting structure includes narrow and well-defined trenches and a shelf with thicknesses determined by the thicknesses of the epitaxial layers from which they were etched. Eventually, it should be possible to add a third fabrication stage in which durable, electronically inert artifacts could be replicated in diamondlike carbon from a master made by MBE and etching as described above.
NASA Astrophysics Data System (ADS)
Tang, Xiangyang
2003-05-01
In multi-slice helical CT, the single-tilted-plane-based reconstruction algorithm has been proposed to combat helical and cone beam artifacts by tilting a reconstruction plane to fit a helical source trajectory optimally. Furthermore, to improve the noise characteristics or dose efficiency of the single-tilted-plane-based reconstruction algorithm, the multi-tilted-plane-based reconstruction algorithm has been proposed, in which the reconstruction plane deviates from the globally optimized pose due to an extra rotation along the third axis. As a result, the capability of suppressing helical and cone beam artifacts in the multi-tilted-plane-based reconstruction algorithm is compromised. An optimized tilted-plane-based reconstruction algorithm is proposed in this paper, in which a matched view weighting strategy is used to optimize both the capability of suppressing helical and cone beam artifacts and the noise characteristics. A helical body phantom is employed to quantitatively evaluate the imaging performance of the matched view weighting approach by tabulating artifact index and noise characteristics, showing that matched view weighting improves both helical artifact suppression and noise characteristics or dose efficiency significantly in comparison to the case in which non-matched view weighting is applied. Finally, it is believed that the matched view weighting approach is of practical importance in the development of multi-slice helical CT, because it maintains the computational structure of fan beam filtered backprojection and demands no extra computational resources.
Sun, Bo; Koh, Yee Kan
2016-06-01
Time-domain thermoreflectance (TDTR) is a pump-probe technique frequently applied to measure the thermal transport properties of bulk materials, nanostructures, and interfaces. One of the limitations of TDTR is that it can only be employed to samples with a fairly smooth surface. For rough samples, artifact signals are collected when the pump beam in TDTR measurements is diffusely scattered by the rough surface into the photodetector, rendering the TDTR measurements invalid. In this paper, we systemically studied the factors affecting the artifact signals due to the pump beam leaked into the photodetector and thus established the origin of the artifact signals. We find that signals from the leaked pump beam are modulated by the probe beam due to the phase rotation induced in the photodetector by the illumination of the probe beam. As a result of the modulation, artifact signals due to the leaked pump beam are registered in TDTR measurements as the out-of-phase signals. We then developed a simple approach to eliminate the artifact signals due to the leaked pump beam. We verify our leak-pump correction approach by measuring the thermal conductivity of a rough InN sample, when the signals from the leaked pump beam are significant. We also discuss the advantages of our new method over the two-tint approach and its limitations. Our new approach enables measurements of the thermal conductivity of rough samples using TDTR.
A personal sampler for aircraft engine cold start particles: laboratory development and testing.
Armendariz, Alfredo; Leith, David
2003-01-01
Industrial hygienists in the U.S. Air Force are concerned about exposure of their personnel to jet fuel. One potential source of exposure for flightline ground crews is the plume emitted during the start of aircraft engines in extremely cold weather. The purpose of this study was to investigate a personal sampler, a small tube-and-wire electrostatic precipitator (ESP), for assessing exposure to aircraft engine cold start particles. Tests were performed in the laboratory to characterize the sampler's collection efficiency and to determine the magnitude of adsorption and evaporation artifacts. A low-temperature chamber was developed for the artifact experiments so tests could be performed at temperatures similar to actual field conditions. The ESP collected particles from 0.5 to 20 µm in diameter with greater than 98% efficiency at particle concentrations up to 100 mg/m³. Adsorption artifacts were less than 5 µg/m³ when sampling a high-concentration vapor stream. Evaporation artifacts were significantly lower for the ESP than for PVC membrane filters across a range of sampling times and incoming vapor concentrations. These tests indicate that the ESP provides more accurate exposure assessment results than traditional filter-based particle samplers when sampling cold start particles produced by an aircraft engine.
Applications of Phase-Based Motion Processing
NASA Technical Reports Server (NTRS)
Branch, Nicholas A.; Stewart, Eric C.
2018-01-01
Image pyramids provide useful information for determining structural response at low cost using commercially available cameras. The current effort applies previous work on the complex steerable pyramid to analyze and identify imperceptible linear motions in video. Instead of implicitly computing motion spectra through phase analysis of the complex steerable pyramid and magnifying the associated motions, we present a visual technique and the necessary software to display the phase changes of high-frequency signals within video. The present technique quickly identifies regions of largest motion within a video with a single phase visualization and without the artifacts of motion magnification, but requires use of the computationally intensive Fourier transform. While Riesz pyramids present an alternative to the computationally intensive complex steerable pyramid for motion magnification, the Riesz formulation contains significant noise, and motion magnification still presents large amounts of data that cannot be quickly assessed by the human eye. Thus, user-friendly software is presented for quickly identifying structural response through optical flow and phase visualization in both Python and MATLAB.
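The block below is a deliberately simplified, tile-based stand-in for the phase-visualization idea, not the NASA software described above: each frame is divided into tiles, the 2-D FFT phase of each tile is compared between consecutive frames, and the average wrapped phase change in a mid-to-high spatial-frequency band is returned as a coarse map of where small motions occur. The tile size and frequency band are arbitrary assumptions.

```python
import numpy as np

def phase_motion_map(frames, tile=32, f_lo=0.05, f_hi=0.45):
    """Coarse map of small motions in a video stack `frames` of shape (T, H, W).

    For each tile, compute the wrapped phase differences of band-limited
    spatial frequencies between consecutive frames and average their magnitude;
    tiles with larger values indicate regions with more sub-pixel motion.
    """
    T, H, W = frames.shape
    ny, nx = H // tile, W // tile
    fy = np.fft.fftfreq(tile)[:, None]
    fx = np.fft.fftfreq(tile)[None, :]
    radius = np.hypot(fy, fx)
    band = (radius >= f_lo) & (radius <= f_hi)

    out = np.zeros((ny, nx))
    for j in range(ny):
        for i in range(nx):
            block = frames[:, j * tile:(j + 1) * tile, i * tile:(i + 1) * tile]
            phase = np.angle(np.fft.fft2(block, axes=(-2, -1)))
            dphi = np.angle(np.exp(1j * np.diff(phase, axis=0)))  # wrapped differences
            out[j, i] = np.abs(dphi[:, band]).mean()
    return out
```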
Zou, Yuan; Nathan, Viswam; Jafari, Roozbeh
2016-01-01
Electroencephalography (EEG) is the recording of electrical activity produced by the firing of neurons within the brain. These activities can be decoded by signal processing techniques. However, EEG recordings are always contaminated with artifacts which hinder the decoding process. Therefore, identifying and removing artifacts is an important step. Researchers often clean EEG recordings with assistance from independent component analysis (ICA), since it can decompose EEG recordings into a number of artifact-related and event-related potential (ERP)-related independent components. However, existing ICA-based artifact identification strategies mostly restrict themselves to a subset of artifacts, e.g., identifying eye movement artifacts only, and have not been shown to reliably identify artifacts caused by nonbiological origins like high-impedance electrodes. In this paper, we propose an automatic algorithm for the identification of general artifacts. The proposed algorithm consists of two parts: 1) an event-related feature-based clustering algorithm used to identify artifacts which have physiological origins; and 2) the electrode-scalp impedance information employed for identifying nonbiological artifacts. The results on EEG data collected from ten subjects show that our algorithm can effectively detect, separate, and remove both physiological and nonbiological artifacts. Qualitative evaluation of the reconstructed EEG signals demonstrates that our proposed method can effectively enhance the signal quality, especially the quality of ERPs, even for those that barely display ERPs in the raw EEG. The performance results also show that our proposed method can effectively identify artifacts and subsequently enhance the classification accuracies compared to four commonly used automatic artifact removal methods.
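A hedged sketch of the ICA clean-up step only, using scikit-learn's FastICA: decompose multichannel EEG, zero the components judged artifactual, and reconstruct. The paper's automatic identification of those components (ERP-feature clustering plus electrode-scalp impedance information) is assumed to have produced the index list and is not reproduced here; the function and parameter names are illustrative.

```python
import numpy as np
from sklearn.decomposition import FastICA

def remove_artifact_components(eeg, artifact_idx, n_components=None, seed=0):
    """Remove selected independent components from multichannel EEG.

    eeg          : array of shape (n_samples, n_channels)
    artifact_idx : indices of components identified as artifact-related
    """
    ica = FastICA(n_components=n_components, random_state=seed)
    sources = ica.fit_transform(eeg)          # independent components over time
    sources[:, list(artifact_idx)] = 0.0      # discard artifact-related components
    return ica.inverse_transform(sources)     # back to channel space
```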
Kiser, Patti K; Löhr, Christiane V; Meritet, Danielle; Spagnoli, Sean T; Milovancev, Milan; Russell, Duncan S
2018-05-01
Although quantitative assessment of margins is recommended for describing excision of cutaneous malignancies, there is poor understanding of limitations associated with this technique. We described and quantified histologic artifacts in inked margins and determined the association between artifacts and variance in histologic tumor-free margin (HTFM) measurements based on a novel grading scheme applied to 50 sections of normal canine skin and 56 radial margins taken from 15 different canine mast cell tumors (MCTs). Three broad categories of artifact were 1) tissue deformation at inked edges, 2) ink-associated artifacts, and 3) sectioning-associated artifacts. The most common artifacts in MCT margins were ink-associated artifacts, specifically ink absent from an edge (mean prevalence: 50%) and inappropriate ink coloring (mean: 45%). The prevalence of other artifacts in MCT skin was 4-50%. In MCT margins, frequency-adjusted kappa statistics found fair or better inter-rater reliability for 9 of 10 artifacts; intra-rater reliability was moderate or better in 9 of 10 artifacts. Digital HTFM measurements by 5 blinded pathologists had a median standard deviation (SD) of 1.9 mm (interquartile range: 0.8-3.6 mm; range: 0-6.2 mm). Intraclass correlation coefficients demonstrated good inter-pathologist reliability in HTFM measurement (κ = 0.81). Spearman rank correlation coefficients found negligible correlation between artifacts and HTFM SDs (r ≤ 0.3). These data confirm that although histologic artifacts commonly occur in inked margin specimens, artifacts are not meaningfully associated with variation in HTFM measurements. Investigators can use the grading scheme presented herein to identify artifacts associated with tissue processing.
Model-assisted development of a laminography inspection system
NASA Astrophysics Data System (ADS)
Grandin, R.; Gray, J.
2012-05-01
Traditional computed tomography (CT) is an effective method of determining the internal structure of an object through non-destructive means; however, inspection of certain objects, such as those with planar geometries or with limited access, requires an alternate approach. One alternative is laminography, which has been the focus of a number of researchers in the past decade for both medical and industrial inspections. Many research efforts rely on geometrically simple analytical models, such as the Shepp-Logan phantom, for the development of their algorithms. Recent work at the Center for Non-Destructive Evaluation makes extensive use of a forward model, XRSIM, to study artifacts arising from the reconstruction method, the effects of complex geometries, and known issues such as high-density features on the laminography reconstruction process. The use of a model provides full knowledge of all aspects of the geometry and provides a means to quantitatively evaluate the impact of methods designed to reduce artifacts generated by the reconstruction methods or that are a result of the part geometry. We will illustrate the use of forward simulations to quantitatively assess reconstruction algorithm development and artifact reduction.
Fiore, Stephen M.; Wiltshire, Travis J.
2016-01-01
In this paper we advance team theory by describing how cognition occurs across the distribution of members and the artifacts and technology that support their efforts. We draw from complementary theorizing coming out of cognitive engineering and cognitive science that views forms of cognition as external and extended and integrate this with theorizing on macrocognition in teams. Two frameworks are described that provide the groundwork for advancing theory and aid in the development of more precise measures for understanding team cognition via focus on artifacts and the technologies supporting their development and use. This includes distinctions between teamwork and taskwork and the notion of general and specific competencies from the organizational sciences along with the concepts of offloading and scaffolding from the cognitive sciences. This paper contributes to the team cognition literature along multiple lines. First, it aids theory development by synthesizing a broad set of perspectives on the varied forms of cognition emerging in complex collaborative contexts. Second, it supports research by providing diagnostic guidelines to study how artifacts are related to team cognition. Finally, it supports information systems designers by more precisely describing how to conceptualize team-supporting technology and artifacts. As such, it provides a means to more richly understand process and performance as it occurs within sociotechnical systems. Our overarching objective is to show how team cognition can both be more clearly conceptualized and more precisely measured by integrating theory from cognitive engineering and the cognitive and organizational sciences. PMID:27774074
How to evaluate the microcirculation: report of a round table conference
De Backer, Daniel; Hollenberg, Steven; Boerma, Christiaan; Goedhart, Peter; Büchele, Gustavo; Ospina-Tascon, Gustavo; Dobbe, Iwan; Ince, Can
2007-01-01
Introduction Microvascular alterations may play an important role in the development of organ failure in critically ill patients and especially in sepsis. Recent advances in technology have allowed visualization of the microcirculation, but several scoring systems have been used, so it is sometimes difficult to compare studies. This paper reports the results of a round table conference that was organized in Amsterdam in November 2006 in order to achieve consensus on image acquisition and analysis. Methods The participants convened to discuss the various aspects of image acquisition and the different scores, and a consensus statement was drafted using the Delphi methodology. Results The participants identified the following five key points for optimal image acquisition: five sites per organ, avoidance of pressure artifacts, elimination of secretions, adequate focus and contrast adjustment, and recording quality. The scores that can be used to describe numerically the microcirculatory images consist of the following: a measure of vessel density (total and perfused vessel density); two indices of perfusion of the vessels (proportion of perfused vessels and microcirculatory flow index); and a heterogeneity index. In addition, this information should be provided for all vessels and for small vessels (mostly capillaries) identified as smaller than 20 μm. Venular perfusion should be reported as a quality control index, because venules should always be perfused in the absence of pressure artifact. Although this information is currently obtained manually, it is anticipated that image analysis software will ease analysis in the future. Conclusion We proposed that scoring of the microcirculation should include an index of vascular density, assessment of capillary perfusion and a heterogeneity index. PMID:17845716
Hertanto, Agung; Zhang, Qinghui; Hu, Yu-Chi; Dzyubak, Oleksandr; Rimner, Andreas; Mageras, Gig S
2012-06-01
Respiration-correlated CT (RCCT) images produced with commonly used phase-based sorting of CT slices often exhibit discontinuity artifacts between CT slices, caused by cycle-to-cycle amplitude variations in respiration. Sorting based on the displacement of the respiratory signal yields slices at more consistent respiratory motion states and hence reduces artifacts, but missing image data (gaps) may occur. The authors report on the application of a respiratory motion model to produce an RCCT image set with reduced artifacts and without missing data. Input data consist of CT slices from a cine CT scan acquired while recording respiration by monitoring abdominal displacement. The model-based generation of RCCT images consists of four processing steps: (1) displacement-based sorting of CT slices to form volume images at 10 motion states over the cycle; (2) selection of a reference image without gaps and deformable registration between the reference image and each of the remaining images; (3) generation of the motion model by applying a principal component analysis to establish a relationship between displacement field and respiration signal at each motion state; (4) application of the motion model to deform the reference image into images at the 9 other motion states. Deformable image registration uses a modified fast free-form algorithm that excludes zero-intensity voxels, caused by missing data, from the image similarity term in the minimization function. In each iteration of the minimization, the displacement field in the gap regions is linearly interpolated from the nearest-neighbor nonzero-intensity slices. Evaluation of the model-based RCCT examines three types of image sets: cine scans of a physical phantom programmed to move according to a patient respiratory signal, a NURBS-based cardiac torso (NCAT) software phantom, and patient thoracic scans. Comparison in the physical motion phantom shows that object distortion caused by variable motion amplitude in phase-based sorting is visibly reduced with model-based RCCT. Comparison of model-based RCCT to original NCAT images as ground truth shows best agreement at motion states whose displacement-sorted images have no missing slices, with mean and maximum discrepancies in lung of 1 and 3 mm, respectively. Larger discrepancies correlate with motion states having a larger number of missing slices in the displacement-sorted images. Artifacts in patient images at different motion states are also reduced. Comparison with displacement-sorted patient images as a ground truth shows that the model-based images closely reproduce the ground truth geometry at different motion states. Results in phantom and patient images indicate that the proposed method can produce RCCT image sets with reduced artifacts relative to phase-sorted images, without the gaps inherent in displacement-sorted images. The method requires a reference image at one motion state that has no missing data. Highly irregular breathing patterns can affect the method's performance, either by introducing artifacts into the reference image (although these are reduced relative to phase-sorted images) or by decreasing the accuracy of the predicted images for motion states containing large regions of missing data. © 2012 American Association of Physicists in Medicine.
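Steps (3) and (4) of the pipeline above lend themselves to a compact sketch: a PCA-based model that maps a one-dimensional respiration amplitude to a full displacement vector field, which can then be used to warp the gap-free reference image. The displacement fields `dvfs` are assumed to come from the deformable registration of step (2); all names and shapes are illustrative, and this is not the authors' code (Python/NumPy).

```python
# Hedged sketch of a PCA motion model: principal modes of the per-state displacement
# fields plus a linear fit of the mode weights against the respiration amplitude.
import numpy as np

def fit_motion_model(dvfs, amplitudes, n_modes=2):
    """dvfs: (n_states, n_voxels*3) displacement fields; amplitudes: (n_states,)."""
    mean_dvf = dvfs.mean(axis=0)
    centered = dvfs - mean_dvf
    # Principal components of the displacement fields across motion states.
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    modes = vt[:n_modes]                      # (n_modes, n_voxels*3)
    scores = centered @ modes.T               # per-state mode weights
    # Linear relation between respiration amplitude and mode weights.
    A = np.vstack([amplitudes, np.ones_like(amplitudes)]).T
    coeffs, *_ = np.linalg.lstsq(A, scores, rcond=None)
    return mean_dvf, modes, coeffs

def predict_dvf(amplitude, mean_dvf, modes, coeffs):
    """Displacement field for an arbitrary respiratory amplitude."""
    w = np.array([amplitude, 1.0]) @ coeffs   # (n_modes,)
    # The predicted field would then warp the gap-free reference image
    # to the requested motion state.
    return mean_dvf + w @ modes
```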
Kmeans-ICA based automatic method for ocular artifacts removal in a motorimagery classification.
Bou Assi, Elie; Rihana, Sandy; Sawan, Mohamad
2014-01-01
Electroencephalogram (EEG) recordings are used as inputs of a motor imagery based BCI system. Eye blinks contaminate the spectral content of the EEG signals. Independent Component Analysis (ICA) has already been proven effective for removing these artifacts, whose frequency band overlaps with the EEG of interest. However, existing ICA-based methods use a reference lead such as the ElectroOculoGram (EOG) to identify the ocular artifact components. In this study, artifactual components were identified using adaptive thresholding by means of Kmeans clustering. The denoised EEG signals were fed into a feature extraction algorithm extracting the band power, the coherence and the phase locking value, and then inserted into a linear discriminant analysis classifier for motor imagery classification.
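A minimal sketch of the adaptive-threshold idea, assuming the ICA unmixing has already been done: per-component features that respond to blinks (kurtosis and a peak-to-standard-deviation ratio are used here as plausible stand-ins, since the paper's exact feature set is not reproduced) are clustered with K-means into two groups, and the peakier cluster is flagged as ocular artifact (Python/SciPy/scikit-learn).

```python
# Illustrative stand-in (not the published pipeline): K-means with k=2 sets the
# artifact/clean decision boundary adaptively instead of using a fixed threshold.
import numpy as np
from scipy.stats import kurtosis
from sklearn.cluster import KMeans

def flag_ocular_ics(sources):
    """sources: (n_ics, n_samples) independent components."""
    # Eye blinks are typically sparse and peaky: high kurtosis, high peak amplitude.
    feats = np.column_stack([
        kurtosis(sources, axis=1),
        np.abs(sources).max(axis=1) / (sources.std(axis=1) + 1e-12),
    ])
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(feats)
    # The cluster with the larger mean kurtosis is taken as the artifact cluster.
    artifact_cluster = np.argmax([feats[labels == k, 0].mean() for k in (0, 1)])
    return np.where(labels == artifact_cluster)[0]
```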
SU-E-I-38: Improved Metal Artifact Correction Using Adaptive Dual Energy Calibration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, X; Elder, E; Roper, J
2015-06-15
Purpose: The empirical dual energy calibration (EDEC) method corrects for beam-hardening artifacts, but shows limited performance on metal artifact correction. In this work, we propose an adaptive dual energy calibration (ADEC) method to correct for metal artifacts. Results: Highly attenuating copper rods cause severe streaking artifacts on standard CT images. EDEC improves the image quality, but cannot eliminate the streaking artifacts. Compared to EDEC, the proposed ADEC method further reduces the streaking resulting from metallic inserts and beam-hardening effects and obtains material decomposition images with significantly improved accuracy. Conclusion: We propose an adaptive dual energy calibration method to correct for metal artifacts. ADEC is evaluated with the Shepp-Logan phantom, and shows superior metal artifact correction performance. In the future, we will further evaluate the performance of the proposed method with phantom and patient data.
Takayanagi, Tomoya; Arai, Takehiro; Amanuma, Makoto; Sano, Tomonari; Ichiba, Masato; Ishizaka, Kazumasa; Sekine, Takako; Matsutani, Hideyuki; Morita, Hitomi; Takase, Shinichi
2017-01-01
Coronary computed tomography angiography (CCTA) in patients with pacemakers suffers from metallic lead-induced artifacts, which often interfere with accurate assessment of coronary luminal stenosis. The purpose of this study was to assess the frequency of lead-induced artifacts and the artifact-suppression effect of the single energy metal artifact reduction (SEMAR) technique. Forty-one patients with a dual-chamber pacemaker were evaluated using a 320 multi-detector row CT (MDCT). Among them, 22 patients with motion-free full data reconstruction images were the final candidates. Images with and without the SEMAR technique were subjectively compared, and the degree of metallic artifacts was compared. On images without SEMAR, severe metallic artifacts were often observed in the right coronary artery (#1, #2, #3) and the distal anterior descending branch (#8). These artifacts were effectively suppressed by SEMAR, and the luminal accessibility was significantly improved in #3 and #8. While pacemaker leads often cause metal-induced artifacts, the SEMAR technique reduced the artifacts and significantly improved the accessibility of the coronary lumen in #3 and #8.
Jekova, Irena; Krasteva, Vessela; Ménétré, Sarah; Stoyanov, Todor; Christov, Ivaylo; Fleischhackl, Roman; Schmid, Johann-Jakob; Didon, Jean-Philippe
2009-07-01
This paper presents a bench study on a commercial automated external defibrillator (AED). The objective was to evaluate the performance of the defibrillation advisory system and its robustness against electromagnetic interferences (EMI) with central frequencies of 16.7, 50 and 60 Hz. The shock advisory system uses two 50 and 60 Hz band-pass filters, an adaptive filter to identify and suppress 16.7 Hz interference, and a software technique for arrhythmia analysis based on morphology and frequency ECG parameters. The testing process includes noise-free ECG strips from the internationally recognized MIT-VFDB ECG database that were superimposed with simulated EMI artifacts and supplied to the shock advisory system embedded in a real AED. Measurements under special consideration of the allowed variation of EMI frequency (15.7-17.4, 47-52, 58-62 Hz) and amplitude (1 and 8 mV) were performed to optimize external validity. The accuracy was reported using the American Heart Association (AHA) recommendations for arrhythmia analysis performance. In the case of artifact-free signals, the AHA performance goals were exceeded for both sensitivity and specificity: 99% for ventricular fibrillation (VF), 98% for rapid ventricular tachycardia (VT), 90% for slow VT, 100% for normal sinus rhythm, 100% for asystole and 99% for other non-shockable rhythms. In the presence of EMI, the specificity for some non-shockable rhythms (NSR, N) may be affected in some specific cases of a low signal-to-noise ratio and extreme frequencies, leading to a drop in specificity of no more than 7 percentage points. The specificity for asystole and the sensitivity for VF and rapid VT in the presence of any kind of simulated 16.7, 50 or 60 Hz EMI artifact were shown to remain equivalent to the performance required for noise-free signals. In conclusion, we proved that the shock advisory system working in a real AED operates accurately according to the AHA recommendations without artifacts and in the presence of EMI. The results may be affected for specificity in the case of a low signal-to-noise ratio or in some extreme frequency settings.
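The EMI-suppression stage can be pictured with a simple stand-in. The actual device uses the band-pass and adaptive filters described above; the sketch below instead cascades IIR notch filters at the nominal interference frequencies, purely to illustrate the idea of removing narrow-band mains and railway interference before rhythm analysis (Python/SciPy; all parameters illustrative).

```python
# Illustrative stand-in for the EMI-suppression stage, not the AED's actual design:
# cascaded IIR notch filters at the nominal 16.7/50/60 Hz interference frequencies.
import numpy as np
from scipy.signal import iirnotch, filtfilt

def suppress_emi(ecg, fs, interference_hz=(16.7, 50.0, 60.0), q=30.0):
    """ecg: 1-D ECG samples; fs: sampling rate in Hz."""
    out = np.asarray(ecg, dtype=float)
    for f0 in interference_hz:
        if f0 < fs / 2:                      # only filter frequencies below Nyquist
            b, a = iirnotch(f0, q, fs=fs)
            out = filtfilt(b, a, out)        # zero-phase filtering keeps QRS timing
    return out
```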
Simultaneous Analysis and Quality Assurance for Diffusion Tensor Imaging
Lauzon, Carolyn B.; Asman, Andrew J.; Esparza, Michael L.; Burns, Scott S.; Fan, Qiuyun; Gao, Yurui; Anderson, Adam W.; Davis, Nicole; Cutting, Laurie E.; Landman, Bennett A.
2013-01-01
Diffusion tensor imaging (DTI) enables non-invasive, cyto-architectural mapping of in vivo tissue microarchitecture through voxel-wise mathematical modeling of multiple magnetic resonance imaging (MRI) acquisitions, each differently sensitized to water diffusion. DTI computations are fundamentally estimation processes and are sensitive to noise and artifacts. Despite widespread adoption in the neuroimaging community, maintaining consistent DTI data quality remains challenging given the propensity for patient motion, artifacts associated with fast imaging techniques, and the possibility of hardware changes/failures. Furthermore, the quantity of data acquired per voxel, the non-linear estimation process, and numerous potential use cases complicate traditional visual data inspection approaches. Currently, quality inspection of DTI data has relied on visual inspection and individual processing in DTI analysis software programs (e.g. DTIPrep, DTI-studio). However, recent advances in applied statistical methods have yielded several different metrics to assess noise level, artifact propensity, quality of tensor fit, variance of estimated measures, and bias in estimated measures. To date, these metrics have been largely studied in isolation. Herein, we select complementary metrics for integration into an automatic DTI analysis and quality assurance pipeline. The pipeline completes in 24 hours, stores statistical outputs, and produces a graphical summary quality analysis (QA) report. We assess the utility of this streamlined approach for empirical quality assessment on 608 DTI datasets from pediatric neuroimaging studies. The efficiency and accuracy of quality analysis using the proposed pipeline is compared with quality analysis based on visual inspection. The unified pipeline is found to save a statistically significant amount of time (over 70%) while improving the consistency of QA between a DTI expert and a pool of research associates. Projection of QA metrics to a low dimensional manifold reveals qualitative, but clear, QA-study associations and suggests that automated outlier/anomaly detection would be feasible. PMID:23637895
Artifacts reduction in VIR/Dawn data.
Carrozzo, F G; Raponi, A; De Sanctis, M C; Ammannito, E; Giardino, M; D'Aversa, E; Fonte, S; Tosi, F
2016-12-01
Remote sensing images are generally affected by different types of noise that degrade the quality of the spectral data (i.e., stripes and spikes). Hyperspectral images returned by the Visible and InfraRed (VIR) spectrometer onboard the NASA Dawn mission exhibit residual systematic artifacts. VIR is an imaging spectrometer coupling high spectral and spatial resolutions in the visible and infrared spectral domain (0.25-5.0 μm). VIR data present types of noise that may mask or distort real features (i.e., spikes and stripes), which may lead to misinterpretation of the surface composition. This paper presents a technique for the minimization of artifacts in VIR data that includes a new instrument response function combining ground and in-flight radiometric measurements, correction of spectral spikes, odd-even band effects, systematic vertical stripes, and high-frequency noise, and a comparison with ground telescopic spectra of Vesta and Ceres. We developed a correction of artifacts in a two-step process: creation of the artifact matrix and application of the same matrix to the VIR dataset. In the approach presented here, a polynomial function is used to fit the high-frequency variations. After applying these corrections, the resulting spectra show improvements in the quality of the data. The new calibrated data enhance the significance of results from the spectral analysis of Vesta and Ceres.
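As a rough illustration of the spike-correction idea only (the artifact-matrix construction and the revised instrument response are beyond a short example), the sketch below fits a smooth polynomial to a single spectrum, flags channels that deviate strongly from the fit, and replaces them by interpolation from neighbouring good channels. Function and variable names, the polynomial order, and the threshold are all assumptions (Python/NumPy).

```python
# Rough sketch of despiking one VIR-like spectrum; not the published procedure.
import numpy as np

def despike_spectrum(wavelengths, radiance, order=5, k_sigma=4.0):
    """Replace isolated spikes in one spectrum by interpolation over flagged channels."""
    coeffs = np.polyfit(wavelengths, radiance, order)
    smooth = np.polyval(coeffs, wavelengths)
    residual = radiance - smooth
    spikes = np.abs(residual) > k_sigma * np.std(residual)
    cleaned = radiance.copy()
    # Interpolate across flagged channels from the neighbouring good channels.
    cleaned[spikes] = np.interp(wavelengths[spikes],
                                wavelengths[~spikes], radiance[~spikes])
    return cleaned, spikes
```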
Isolating gait-related movement artifacts in electroencephalography during human walking
Kline, Julia E.; Huang, Helen J.; Snyder, Kristine L.; Ferris, Daniel P.
2016-01-01
Objective High-density electroencephalography (EEG) can provide insight into human brain function during real-world activities such as walking. Some recent studies have used EEG to characterize brain activity during walking, but the relative contributions of movement artifact and electrocortical activity have been difficult to quantify. We aimed to characterize movement artifact recorded by EEG electrodes at a range of walking speeds and to test the efficacy of artifact removal methods. We also quantified the similarity between movement artifact recorded by EEG electrodes and a head-mounted accelerometer. Approach We used a novel experimental method to isolate and record movement artifact with EEG electrodes during walking. We blocked electrophysiological signals using a nonconductive layer (silicone swim cap) and simulated an electrically conductive scalp on top of the swim cap using a wig coated with conductive gel. We recorded motion artifact EEG data from nine young human subjects walking on a treadmill at speeds from 0.4 to 1.6 m/s. We then tested artifact removal methods including moving average and wavelet-based techniques. Main Results Movement artifact recorded with EEG electrodes varied considerably, across speed, subject, and electrode location. The movement artifact measured with EEG electrodes did not correlate well with head acceleration. All of the tested artifact removal methods attenuated low-frequency noise but did not completely remove movement artifact. The spectral power fluctuations in the movement artifact data resembled data from some previously published studies of EEG during walking. Significance Our results suggest that EEG data recorded during walking likely contains substantial movement artifact that: cannot be explained by head accelerations; varies across speed, subject, and channel; and cannot be removed using traditional signal processing methods. Future studies should focus on more sophisticated methods for removal of EEG movement artifact to advance the field. PMID:26083595
3D artifact for calibrating kinematic parameters of articulated arm coordinate measuring machines
NASA Astrophysics Data System (ADS)
Zhao, Huining; Yu, Liandong; Xia, Haojie; Li, Weishi; Jiang, Yizhou; Jia, Huakun
2018-06-01
In this paper, a 3D artifact is proposed to calibrate the kinematic parameters of articulated arm coordinate measuring machines (AACMMs). The artifact is composed of 14 reference points with three different heights, which provides 91 different reference lengths, and a method is proposed to calibrate the artifact with laser tracker multi-stations. Therefore, the kinematic parameters of an AACMM can be calibrated in one setup of the proposed artifact, instead of having to adjust 1D or 2D artifacts to different positions and orientations as in existing methods. As a result, calibrating the AACMM with the proposed artifact is faster than with the traditional 1D or 2D artifacts. The performance of the AACMM calibrated with the proposed artifact is verified with a 600.003 mm gauge block. The result shows that the measurement accuracy of the AACMM is improved effectively through calibration with the proposed artifact.
Mesoscale hybrid calibration artifact
Tran, Hy D.; Claudet, Andre A.; Oliver, Andrew D.
2010-09-07
A mesoscale calibration artifact, also called a hybrid artifact, suitable for hybrid dimensional measurement, and the method for making the artifact. The hybrid artifact has structural characteristics that make it suitable for dimensional measurement in both vision-based systems and touch-probe-based systems. The hybrid artifact employs the intersection of bulk-micromachined planes to fabricate edges that are sharp to the nanometer level and intersecting planes with crystal-lattice-defined angles.
Modeling and modification of medical 3D objects. The benefit of using a haptic modeling tool.
Kling-Petersen, T; Rydmark, M
2000-01-01
The Computer Laboratory of the medical faculty in Goteborg (Mednet) has, since the end of 1998, been one of a limited number of participants in the development of a new modeling tool together with SensAble Technologies Inc. [http://www.sensable.com/]. The software, called SensAble FreeForm, was officially released at Siggraph in September 1999. Briefly, the software mimics the modeling techniques traditionally used by clay artists. An imported model or a user-defined block of "clay" can be modified using different tools such as a ball, square block, scrape, etc., via the use of a SensAble Technologies PHANToM haptic arm. The model will deform in 3D as a result of touching the "clay" with any selected tool, and the amount of deformation is proportional to the force applied. By getting instantaneous haptic as well as visual feedback, precise and intuitive changes are easily made. While SensAble FreeForm lacks several of the features normally associated with a 3D modeling program (such as text handling, application of surface and bumpmaps, high-end rendering engines, etc.), its strength lies in the ability to rapidly create non-geometric 3D models. For medical use, very few anatomically correct models are created from scratch. However, FreeForm's tools enable advanced modification of reconstructed or 3D-scanned models. One of the main problems with 3D laser scanning of medical specimens is that the technique usually leaves holes or gaps in the dataset corresponding to areas in shadow, such as orifices, deep grooves, etc. By using FreeForm's different tools, these defects are easily corrected and the gaps are filled in. Similarly, traditional 3D reconstruction (based on serial sections, etc.) often shows artifacts as a result of the triangulation and/or tessellation processes. These artifacts usually manifest as unnatural ridges or uneven areas ("the accordion effect"). FreeForm contains a smoothing algorithm that enables the user to select an area to be modified and subsequently apply any given amount of smoothing to the object. While the final objects need to be exported for further 3D graphic manipulation, FreeForm addresses one of the most time-consuming problems of 3D modeling: the modification and creation of non-geometric 3D objects.
Ripple artifact reduction using slice overlap in slice encoding for metal artifact correction.
den Harder, J Chiel; van Yperen, Gert H; Blume, Ulrike A; Bos, Clemens
2015-01-01
Multispectral imaging (MSI) significantly reduces metal artifacts. Yet, especially in techniques that use gradient selection, such as slice encoding for metal artifact correction (SEMAC), a residual ripple artifact may be prominent. Here, an analysis is presented of the ripple artifact and of slice overlap as an approach to reduce the artifact. The ripple artifact was analyzed theoretically to clarify its cause. Slice overlap, conceptually similar to spectral bin overlap in multi-acquisition with variable resonances image combination (MAVRIC), was achieved by reducing the selection gradient and, thus, increasing the slice profile width. Time domain simulations and phantom experiments were performed to validate the analyses and proposed solution. Discontinuities between slices are aggravated by signal displacement in the frequency encoding direction in areas with deviating B0. Specifically, it was demonstrated that ripple artifacts appear only where B0 varies both in-plane and through-plane. Simulations and phantom studies of metal implants confirmed the efficacy of slice overlap to reduce the artifact. The ripple artifact is an important limitation of gradient selection based MSI techniques, and can be understood using the presented simulations. At a scan-time penalty, slice overlap effectively addressed the artifact, thereby improving image quality near metal implants. © 2014 Wiley Periodicals, Inc.
Kanawati, Basem; Bader, Theresa M; Wanczek, Karl-Peter; Li, Yan; Schmitt-Kopplin, Philippe
2017-10-15
Peak picking algorithms in mass spectrometry face the challenge of picking the correct signals from a mass spectrum. In some cases signal wiggles (side lobes) are also chosen in the produced mass list as if they were real signals. Constraints which are defined in such algorithms do not always guarantee wiggle-free accurate mass list generation out of raw mass spectra. This problem intensifies with acquisitions, which are accompanied by longer transients. Thus, the problem represents a contemporary issue, which propagates with modern high-memory digitizers and exists in both MS and MS/MS spectra. A solariX FTMS mass spectrometer with an Infinity ICR cell (Bruker Daltonics, Bremen, Germany) coupled to a 12 Tesla magnet (Magnex, UK) was used for the experimental study. Time-domain transients of several different data point lengths 512k, 1M, 2M, 4M, 8M were obtained and were Fourier-transformed to obtain frequency spectra which show the effect of the transient truncation on sinc wiggle developments in FT-ICR-MS. MATLAB simulations were also performed to investigate the origin of the Fourier transform (FT)-artifacts. A new filter has been developed to identify and remove FT-artifacts (sinc side lobes) from both frequency and mass spectra. The newly developed filter is based on distinguishing between the FWHM of the correct frequency/mass signals and the FWHM of their corresponding wiggles. The filter draws a reliable confidence limit of resolution range, within which a correct frequency/mass signal is identified. The filter is applicable over a wide mass range of metabolic interest (100-1200 amu). The origin of FT-artifacts due to time-domain transient truncations was thoroughly investigated both experimentally and by simulations in this study. A new solution for this problem with automatic recognition and elimination of these FT-artifacts (side lobes/wiggles) is provided, which is independent of any intensity thresholds, magnetic field strengths and time-domain transient lengths. Copyright © 2017 John Wiley & Sons, Ltd.
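The effect of transient truncation described above is easy to reproduce, and the width criterion the filter relies on can be seen directly in a toy simulation. The snippet below is not the authors' code: it Fourier-transforms an unapodised, truncated transient, zero-pads to resolve the sinc structure, and compares the width of the genuine peak with that of its first side lobe. All parameters are illustrative (Python/NumPy).

```python
# Toy re-creation of truncation-induced sinc side lobes and the FWHM contrast the
# filter exploits; parameter values are illustrative, not instrument settings.
import numpy as np

fs, t_len = 1.0e6, 0.1                          # sampling rate (Hz), transient length (s)
t = np.arange(0.0, t_len, 1.0 / fs)
transient = np.cos(2 * np.pi * (2.0e5 + 3.7) * t)   # one ion frequency, off the FFT grid

# Zero-padding interpolates the spectrum so the sinc structure is well resolved.
n_fft = 8 * t.size
spectrum = np.abs(np.fft.rfft(transient, n=n_fft))
freqs = np.fft.rfftfreq(n_fft, 1.0 / fs)

def fwhm_at(idx):
    """Crude full width at half maximum of the local peak containing index idx."""
    half = spectrum[idx] / 2.0
    lo = idx
    while lo > 0 and spectrum[lo] > half:
        lo -= 1
    hi = idx
    while hi < spectrum.size - 1 and spectrum[hi] > half:
        hi += 1
    return freqs[hi] - freqs[lo]

main = int(np.argmax(spectrum))
# Centre of the first side lobe sits roughly 1.5/T away from the genuine peak.
side = main + int(round(1.5 / t_len / (freqs[1] - freqs[0])))
print(f"main-peak FWHM ~ {fwhm_at(main):.1f} Hz, "
      f"first side-lobe FWHM ~ {fwhm_at(side):.1f} Hz")
```

The side lobe comes out markedly narrower than the genuine peak, which is the kind of resolution-range confidence limit the proposed filter draws on.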
Howard, Jeffrey L; Olszewska, Dorota
2011-03-01
An urban soil chronosequence in downtown Detroit, MI was studied to determine the effects of time on pedogenesis and heavy metal sequestration. The soils developed in fill derived from mixed sandy and clayey diamicton parent materials on a level late Pleistocene lakebed plain under grass vegetation in a humid-temperate (mesic) climate. The chronosequence is comprised of soils in vacant lots (12 and 44 years old) and parks (96 and 120 years old), all located within 100 m of a roadway. An A-horizon 16 cm thick with 2% organic matter has developed after only 12 years of pedogenesis. The 12 year-old soil shows accelerated weathering of iron (e.g. nails) and cement artifacts attributed to corrosion by excess soluble salts of uncertain origin. Carbonate and Fe-oxide are immobilizing agents for heavy metals, hence it is recommended that drywall, plaster, cement and iron artifacts be left in soils at brownfield sites for their ameliorating effects. Copyright © 2010 Elsevier Ltd. All rights reserved.
Quality improving techniques for free-viewpoint DIBR
NASA Astrophysics Data System (ADS)
Do, Luat; Zinger, Sveta; de With, Peter H. N.
2010-02-01
Interactive free-viewpoint selection applied to a 3D multi-view signal is a possible attractive feature of the rapidly developing 3D TV media. This paper explores a new rendering algorithm that computes a free-viewpoint based on depth image warping between two reference views from existing cameras. We have developed three quality enhancing techniques that specifically aim at solving the major artifacts. First, resampling artifacts are filled in by a combination of median filtering and inverse warping. Second, contour artifacts are processed while omitting warping of edges at high discontinuities. Third, we employ a depth signal for more accurate disocclusion inpainting. We obtain an average PSNR gain of 3 dB and 4.5 dB for the 'Breakdancers' and 'Ballet' sequences, respectively, compared to recently published results. While experimenting with synthetic data, we observe that the rendering quality is highly dependent on the complexity of the scene. Moreover, experiments are performed using compressed video from surrounding cameras. The overall system quality is dominated by the rendering quality and not by coding.
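The first of the three enhancements (filling resampling holes left by the warping) can be sketched briefly. The real pipeline combines median filtering with inverse warping and depth-aware disocclusion inpainting; the stand-in below only applies a local median to pixels flagged as holes, assuming empty pixels are marked in a boolean mask (Python/SciPy; names and the mask convention are assumptions).

```python
# Sketch of crack/resampling-hole filling only; not the full rendering pipeline.
import numpy as np
from scipy.ndimage import median_filter

def fill_resampling_holes(warped, hole_mask, size=3):
    """warped: 2-D luminance image after depth-image warping; hole_mask: True where
    no source pixel was mapped. For thin, isolated cracks most neighbours are valid,
    so the local median is dominated by valid pixel values."""
    med = median_filter(warped, size=size)
    filled = warped.copy()
    filled[hole_mask] = med[hole_mask]   # only flagged pixels are replaced
    return filled
```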
An Additive Manufacturing Test Artifact
Moylan, Shawn; Slotwinski, John; Cooke, April; Jurrens, Kevin; Donmez, M Alkan
2014-01-01
A test artifact, intended for standardization, is proposed for the purpose of evaluating the performance of additive manufacturing (AM) systems. A thorough analysis of previously proposed AM test artifacts as well as experience with machining test artifacts have inspired the design of the proposed test artifact. This new artifact is designed to provide a characterization of the capabilities and limitations of an AM system, as well as to allow system improvement by linking specific errors measured in the test artifact to specific sources in the AM system. The proposed test artifact has been built in multiple materials using multiple AM technologies. The results of several of the builds are discussed, demonstrating how the measurement results can be used to characterize and improve a specific AM system. PMID:26601039
Classification and simulation of stereoscopic artifacts in mobile 3DTV content
NASA Astrophysics Data System (ADS)
Boev, Atanas; Hollosi, Danilo; Gotchev, Atanas; Egiazarian, Karen
2009-02-01
We identify, categorize and simulate artifacts which might occur during the delivery of stereoscopic video to mobile devices. We consider the stages of the 3D video delivery dataflow: content creation, conversion to the desired format (multiview or source-plus-depth), coding/decoding, transmission, and visualization on a 3D display. Human 3D vision works by assessing various depth cues - accommodation, binocular depth cues, pictorial cues and motion parallax. As a consequence, any artifact which modifies these cues impairs the quality of a 3D scene. The perceptibility of each artifact can be estimated through subjective tests. The material for such tests needs to contain various artifacts with different amounts of impairment. We present a system for simulation of these artifacts. The artifacts are organized in groups with similar origins, and each group is simulated by a block in a simulation channel. The channel introduces the following groups of artifacts: sensor limitations, geometric distortions caused by camera optics, spatial and temporal misalignments between video channels, spatial and temporal artifacts caused by coding, transmission losses, and visualization artifacts. For the case of source-plus-depth representation, artifacts caused by format conversion are added as well.
Gurney-Champion, Oliver J; Bruins Slot, Thijs; Lens, Eelco; van der Horst, Astrid; Klaassen, Remy; van Laarhoven, Hanneke W M; van Tienhoven, Geertjan; van Hooft, Jeanin E; Nederveen, Aart J; Bel, Arjan
2016-10-01
Biliary stents may cause susceptibility artifacts, gradient-induced artifacts, and radio frequency (RF) induced artifacts on magnetic resonance images, which can hinder accurate target volume delineation in radiotherapy. In this study, the authors investigated and quantified the magnitude of these artifacts for stents of different materials. Eight biliary stents made of nitinol, platinum-cored nitinol, stainless steel, or polyethylene from seven vendors, with different lengths (57-98 mm) and diameters (3.0-11.7 mm), were placed in a phantom. To quantify the susceptibility artifacts sequence-independently, ΔB0-maps and T2*-maps were acquired at 1.5 and 3 T. To study the effect of the gradient-induced artifacts at 3 T, signal decay in images obtained with maximum readout gradient-induced artifacts was compared to signal decay in reference scans. To quantify the RF-induced artifacts at 3 T, B1-maps were acquired. Finally, ΔB0-maps and T2*-maps were acquired at 3 T in two pancreatic cancer patients who had received platinum-cored nitinol biliary stents. Outside the stent, susceptibility artifacts dominated the other artifacts. The stainless steel stent produced the largest susceptibility artifacts. The other stents caused decreased T2* up to 5.1 mm (1.5 T) and 8.5 mm (3 T) from the edge of the stent. For sequences with a higher bandwidth per voxel (1.5 T: BWvox > 275 Hz/voxel; 3 T: BWvox > 500 Hz/voxel), the B0-related susceptibility artifacts were negligible (<0.2 voxels). The polyethylene stent showed no artifacts. In vivo, the changes in B0 and T2* induced by the stent were larger than typical variations in B0 and T2* induced by anatomy when the stent was at an angle of 30° with the main magnetic field. Susceptibility artifacts dominated the other artifacts. The magnitudes of the susceptibility artifacts were determined sequence-independently. This method allows additional safety margins to be included to ensure target irradiation.
Development of a LED based standard for luminous flux
NASA Astrophysics Data System (ADS)
Sardinha, André; Ázara, Ivo; Torres, Miguel; Menegotto, Thiago; Grieneisen, Hans Peter; Borghi, Giovanna; Couceiro, Iakyra; Zim, Alexandre; Muller, Filipe
2018-03-01
Incandescent lamps, simple artifacts with a radiation spectrum very similar to that of a black-body emitter, are traditional standards in photometry. Nowadays LEDs are broadly used in lighting, with a great variety of spectra, and it is convenient to use standards for photometry with a spectral distribution similar to that of the measured artifact. Research and development of such standards are under way in several National Metrology Institutes. In Brazil, Inmetro is working on a practical solution for providing an LED-based standard to be used for luminous flux measurements in the field of general lighting. This paper presents the measurements made during the development of a prototype, which will subsequently be characterized in photometric quantities.
Lu, Yao; Chan, Heang-Ping; Wei, Jun; Hadjiiski, Lubomir M
2014-01-01
Digital breast tomosynthesis (DBT) has strong promise to improve sensitivity for detecting breast cancer. DBT reconstruction estimates the breast tissue attenuation using projection views (PVs) acquired in a limited angular range. Because of the limited field of view (FOV) of the detector, the PVs may not completely cover the breast in the x-ray source motion direction at large projection angles. The voxels in the imaged volume cannot be updated when they are outside the FOV, thus causing a discontinuity in intensity across the FOV boundaries in the reconstructed slices, which we refer to as the truncated projection artifact (TPA). Most existing TPA reduction methods were developed for the filtered backprojection method in the context of computed tomography. In this study, we developed a new diffusion-based method to reduce TPAs during DBT reconstruction using the simultaneous algebraic reconstruction technique (SART). Our TPA reduction method compensates for the discontinuity in background intensity outside the FOV of the current PV after each PV updating in SART. The difference in voxel values across the FOV boundary is smoothly diffused to the region beyond the FOV of the current PV. Diffusion-based background intensity estimation is performed iteratively to avoid structured artifacts. The method is applicable to TPA in both the forward and backward directions of the PVs and for any number of iterations during reconstruction. The effectiveness of the new method was evaluated by comparing the visual quality of the reconstructed slices and the measured discontinuities across the TPA with and without artifact correction at various iterations. The results demonstrated that the diffusion-based intensity compensation method reduced the TPA while preserving the detailed tissue structures. The visibility of breast lesions obscured by the TPA was improved after artifact reduction. PMID:23318346
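The intensity-compensation idea can be pictured with a deliberately simplified one-dimensional sketch, which is not the authors' algorithm: the step across the FOV boundary is estimated from a narrow band on each side and blended smoothly into the uncovered region, with an exponential decay standing in for the iterative diffusion used in the paper. Names, band widths, and the decay constant are assumptions (Python/NumPy).

```python
# Deliberately simplified 1-D illustration of boundary-step compensation; the paper's
# method diffuses the difference iteratively in 3-D within each SART update.
import numpy as np

def compensate_tpa_1d(profile, fov_end, band=5, decay=20.0):
    """profile: 1-D reconstructed intensity profile along the source-motion
    direction; fov_end: last index covered by the current projection view."""
    inside = profile[:fov_end + 1]
    outside = profile[fov_end + 1:]
    # Intensity step across the boundary, estimated from a small band on each side.
    step = inside[-band:].mean() - outside[:band].mean()
    # Spread the step into the region beyond the FOV so the transition is smooth.
    x = np.arange(outside.size)
    return np.concatenate([inside, outside + step * np.exp(-x / decay)])
```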
NASA Astrophysics Data System (ADS)
Sauppe, Sebastian; Hahn, Andreas; Brehm, Marcus; Paysan, Pascal; Seghers, Dieter; Kachelrieß, Marc
2016-03-01
We propose an adapted method of our previously published five-dimensional (5D) motion compensation (MoCo) algorithm1, developed for micro-CT imaging of small animals, to provide for the first time motion artifact-free 5D cone-beam CT (CBCT) images from a conventional flat detector-based CBCT scan of clinical patients. The image quality of retrospectively respiratory- and cardiac-gated volumes from flat detector CBCT scans is degraded by severe sparse-projection artifacts. These artifacts further complicate motion estimation, which is required for MoCo image reconstruction. To obtain high-quality 5D CBCT images at the same x-ray dose and the same number of projections as today's 3D CBCT, we developed a double MoCo approach based on motion vector fields (MVFs) for respiratory and cardiac motion. In a first step, our already published four-dimensional (4D) artifact-specific cyclic motion-compensation (acMoCo) approach is applied to compensate for the respiratory patient motion. With this information, a cyclic phase-gated deformable heart registration algorithm is applied to the respiratory motion-compensated 4D CBCT data, thus resulting in cardiac MVFs. We apply these MVFs to double-gated images, thereby obtaining respiratory and cardiac motion-compensated 5D CBCT images. Our 5D MoCo approach was applied to patient data acquired with the TrueBeam 4D CBCT system (Varian Medical Systems). Our double MoCo approach turned out to be very efficient and removed nearly all streak artifacts by making use of 100% of the projection data for each reconstructed frame. The 5D MoCo patient data show fine details and no motion blurring, even in regions close to the heart where motion is fastest.
Use of cognitive artifacts in chemistry learning
NASA Astrophysics Data System (ADS)
Yengin, Ilker
In everyday life, we interact with cognitive artifacts to receive and/or manipulate information so as to alter our thinking processes. CHEM/TEAC 869Q is a distance course that includes extensive explicit instruction in the use of a cognitive artifact. This study investigates issues related to the design of that online artifact. In order to understand design implications and how cognitive artifacts contribute to students' thinking and learning, a qualitative research methodology was engaged that utilized think-aloud sessions. Participants described constrained and structured cognitive models while using the artifact. The study was also informed by interviews and the researcher's field notes. A purposeful sampling method led to the selection of participants, four males and two females, who had no prior history of using a course from the 869 series but who had experienced the scientific content covered by the CHEM869Q course. Analysis of the results showed both that a cognitive artifact may guide users' minds in decision making, and that problem-solving processes were affected by the cognitive artifact's design. When there was no design flaw, users generally thought that the cognitive artifact was helpful by simplifying steps, overcoming other limitations, and reducing errors in a reliable, effective, and easy-to-use way. Moreover, results showed that successful implementation of cognitive artifacts into teaching-learning practices depended on user willingness to transfer a task to the artifact. While users may like the idea of benefiting from a cognitive artifact, they may nevertheless tend to limit their usage. They sometimes think that delegating a task to a cognitive artifact makes them dependent, and that they may not learn how to perform the tasks by themselves. They appear more willing to use a cognitive artifact after they have done the task by themselves.
Improved Image Quality in Head and Neck CT Using a 3D Iterative Approach to Reduce Metal Artifact.
Wuest, W; May, M S; Brand, M; Bayerl, N; Krauss, A; Uder, M; Lell, M
2015-10-01
Metal artifacts from dental fillings and other devices degrade image quality and may compromise the detection and evaluation of lesions in the oral cavity and oropharynx by CT. The aim of this study was to evaluate the effect of iterative metal artifact reduction on CT of the oral cavity and oropharynx. Data from 50 consecutive patients with metal artifacts from dental hardware were reconstructed with standard filtered back-projection, linear interpolation metal artifact reduction (LIMAR), and iterative metal artifact reduction. The image quality of sections that contained metal was analyzed for the severity of artifacts and diagnostic value. A total of 455 sections (mean ± standard deviation, 9.1 ± 4.1 sections per patient) contained metal and were evaluated with each reconstruction method. Sections without metal were not affected by the algorithms and demonstrated image quality identical to each other. Of the sections containing metal, 38% were considered nondiagnostic with filtered back-projection, 31% with LIMAR, and only 7% with iterative metal artifact reduction. Thirty-three percent of the sections had poor image quality with filtered back-projection, 46% with LIMAR, and 10% with iterative metal artifact reduction. Thirteen percent of the sections with filtered back-projection, 17% with LIMAR, and 22% with iterative metal artifact reduction were of moderate image quality; 16% of the sections with filtered back-projection, 5% with LIMAR, and 30% with iterative metal artifact reduction were of good image quality; and 1% of the sections with LIMAR and 31% with iterative metal artifact reduction were of excellent image quality. Iterative metal artifact reduction yields the highest image quality in comparison with filtered back-projection and linear interpolation metal artifact reduction in patients with metal hardware in the head and neck area. © 2015 by American Journal of Neuroradiology.
Iterative image-domain ring artifact removal in cone-beam CT
NASA Astrophysics Data System (ADS)
Liang, Xiaokun; Zhang, Zhicheng; Niu, Tianye; Yu, Shaode; Wu, Shibin; Li, Zhicheng; Zhang, Huailing; Xie, Yaoqin
2017-07-01
Ring artifacts in cone beam computed tomography (CBCT) images are caused by pixel gain variations using flat-panel detectors, and may lead to structured non-uniformities and deterioration of image quality. The purpose of this study is to propose a method of general ring artifact removal in CBCT images. This method is based on the polar coordinate system, where the ring artifacts manifest as stripe artifacts. Using relative total variation, the CBCT images are first smoothed to generate template images with fewer image details and ring artifacts. By subtracting the template images from the CBCT images, residual images with image details and ring artifacts are generated. As the ring artifact manifests as a stripe artifact in a polar coordinate system, the artifact image can be extracted by mean value from the residual image; the image details are generated by subtracting the artifact image from the residual image. Finally, the image details are compensated to the template image to generate the corrected images. The proposed framework is iterated until the differences in the extracted ring artifacts are minimized. We use a 3D Shepp-Logan phantom, Catphan©504 phantom, uniform acrylic cylinder, and images from a head patient to evaluate the proposed method. In the experiments using simulated data, the spatial uniformity is increased by 1.68 times and the structural similarity index is increased from 87.12% to 95.50% using the proposed method. In the experiment using clinical data, our method shows high efficiency in ring artifact removal while preserving the image structure and detail. The iterative approach we propose for ring artifact removal in cone-beam CT is practical and attractive for CBCT guided radiation therapy.
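One iteration of the stripe-removal core can be reduced to a few lines, assuming the slice has already been resampled into polar coordinates (rows indexed by radius, columns by angle) and using a Gaussian smoother as a stand-in for the relative-total-variation filter; the Cartesian/polar resampling and the outer iteration loop are omitted. This is a sketch of the idea, not the published implementation (Python/SciPy).

```python
# Pared-down sketch of one stripe-removal iteration in polar coordinates; a Gaussian
# filter stands in for relative total variation, and resampling steps are omitted.
import numpy as np
from scipy.ndimage import gaussian_filter

def remove_rings_polar(polar_img, sigma=5.0):
    # Template image with few details and weakened stripe (ring) artifacts.
    template = gaussian_filter(polar_img, sigma=sigma)
    # Residual holds the image detail plus the stripes.
    residual = polar_img - template
    # Rings are constant along the angular direction, so the stripe estimate is the
    # per-radius mean of the residual over all angles.
    stripes = residual.mean(axis=1, keepdims=True)
    # Compensate the remaining detail back onto the template.
    return template + (residual - stripes)
```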
Stidd, D A; Theessen, H; Deng, Y; Li, Y; Scholz, B; Rohkohl, C; Jhaveri, M D; Moftakhar, R; Chen, M; Lopes, D K
2014-01-01
Flat panel detector CT images are degraded by streak artifacts caused by radiodense implanted materials such as coils or clips. A new metal artifacts reduction prototype algorithm has been used to minimize these artifacts. The application of this new metal artifacts reduction algorithm was evaluated for flat panel detector CT imaging performed in a routine clinical setting. Flat panel detector CT images were obtained from 59 patients immediately following cerebral endovascular procedures or as surveillance imaging for cerebral endovascular or surgical procedures previously performed. The images were independently evaluated by 7 physicians for metal artifacts reduction on a 3-point scale at 2 locations: immediately adjacent to the metallic implant and 3 cm away from it. The number of visible vessels before and after metal artifacts reduction correction was also evaluated within a 3-cm radius around the metallic implant. The metal artifacts reduction algorithm was applied to the 59 flat panel detector CT datasets without complications. The metal artifacts in the reduction-corrected flat panel detector CT images were significantly reduced in the area immediately adjacent to the implanted metal object (P = .05) and in the area 3 cm away from the metal object (P = .03). The average number of visible vessel segments increased from 4.07 to 5.29 (P = .1235) after application of the metal artifacts reduction algorithm to the flat panel detector CT images. Metal artifacts reduction is an effective method to improve flat panel detector CT images degraded by metal artifacts. Metal artifacts are significantly decreased by the metal artifacts reduction algorithm, and there was a trend toward increased vessel-segment visualization. © 2014 by American Journal of Neuroradiology.
Gaussian diffusion sinogram inpainting for X-ray CT metal artifact reduction.
Peng, Chengtao; Qiu, Bensheng; Li, Ming; Guan, Yihui; Zhang, Cheng; Wu, Zhongyi; Zheng, Jian
2017-01-05
Metal objects implanted in the bodies of patients usually generate severe streaking artifacts in reconstructed images of X-ray computed tomography, which degrade the image quality and affect the diagnosis of disease. Therefore, it is essential to reduce these artifacts to meet clinical demands. In this work, we propose a Gaussian diffusion sinogram inpainting metal artifact reduction algorithm based on prior images to reduce these artifacts for fan-beam computed tomography reconstruction. In this algorithm, prior information that originated from a tissue-classified prior image is used for the inpainting of metal-corrupted projections, and it is incorporated into a Gaussian diffusion function. The prior knowledge is particularly designed to locate the diffusion position and improve the sparsity of the subtraction sinogram, which is obtained by subtracting the prior sinogram of the metal regions from the original sinogram. The sinogram inpainting algorithm is implemented through an approach of diffusing prior energy and is then solved by gradient descent. The performance of the proposed metal artifact reduction algorithm is compared with two conventional metal artifact reduction algorithms, namely the interpolation metal artifact reduction algorithm and the normalized metal artifact reduction algorithm. The experimental datasets included both simulated and clinical data. Subjective evaluation shows that the proposed metal artifact reduction algorithm causes fewer secondary artifacts than the two conventional metal artifact reduction algorithms, which lead to severe secondary artifacts resulting from inappropriate interpolation and normalization. Additionally, the objective evaluation shows the proposed approach has the smallest normalized mean absolute deviation and the highest signal-to-noise ratio, indicating that the proposed method has produced the image with the best quality. For both the simulated and the clinical datasets, the proposed algorithm clearly reduced the metal artifacts.
Edge enhancement algorithm for low-dose X-ray fluoroscopic imaging.
Lee, Min Seok; Park, Chul Hee; Kang, Moon Gi
2017-12-01
Low-dose X-ray fluoroscopy has continually evolved to reduce radiation risk to patients during clinical diagnosis and surgery. However, the reduction in dose exposure causes quality degradation of the acquired images. In general, an X-ray device has a time-average pre-processor to remove the generated quantum noise. However, this pre-processor causes blurring and artifacts within the moving edge regions, and noise remains in the image. During high-pass filtering (HPF) to enhance edge detail, this noise in the image is amplified. In this study, a 2D edge enhancement algorithm comprising region adaptive HPF with the transient improvement (TI) method, as well as artifacts and noise reduction (ANR), was developed for degraded X-ray fluoroscopic images. The proposed method was applied in a static scene pre-processed by a low-dose X-ray fluoroscopy device. First, the sharpness of the X-ray image was improved using region adaptive HPF with the TI method, which facilitates sharpening of edge details without overshoot problems. Then, an ANR filter that uses an edge directional kernel was developed to remove the artifacts and noise that can occur during sharpening, while preserving edge details. The quantitative and qualitative results obtained by applying the developed method to low-dose X-ray fluoroscopic images and visually and numerically comparing the final images with images improved using conventional edge enhancement techniques indicate that the proposed method outperforms existing edge enhancement methods in terms of objective criteria and subjective visual perception of the actual X-ray fluoroscopic image. The developed edge enhancement algorithm performed well when applied to actual low-dose X-ray fluoroscopic images, not only by improving the sharpness, but also by removing artifacts and noise, including overshoot. Copyright © 2017 Elsevier B.V. All rights reserved.
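A heavily reduced sketch of the sharpening stage gives the flavour of region-adaptive high-pass filtering: the unsharp-mask gain is scaled by a local edge measure, so flat, noise-dominated regions are boosted less than edge regions. The transient-improvement and edge-directional artifact/noise-reduction stages of the paper are not reproduced, and all names and parameters are illustrative (Python/SciPy).

```python
# Reduced sketch of region-adaptive sharpening only; not the paper's full algorithm.
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def region_adaptive_sharpen(img, sigma=1.5, max_gain=1.5):
    """img: 2-D float image from the time-averaged fluoroscopy pre-processor."""
    low = gaussian_filter(img, sigma)
    high = img - low                                  # high-pass detail layer
    grad = np.hypot(sobel(img, axis=0), sobel(img, axis=1))
    weight = grad / (grad.max() + 1e-12)              # ~0 in flat areas, ~1 on edges
    return img + max_gain * weight * high
```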
NASA Astrophysics Data System (ADS)
Jechel, Christopher Alexander
In radiotherapy planning, computed tomography (CT) images are used to quantify the electron density of tissues and provide spatial anatomical information. Treatment planning systems use these data to calculate the expected spatial distribution of absorbed dose in a patient. CT imaging is complicated by the presence of metal implants which cause increased image noise, produce artifacts throughout the image and can exceed the available range of CT number values within the implant, perturbing electron density estimates in the image. Furthermore, current dose calculation algorithms do not accurately model radiation transport at metal-tissue interfaces. Combined, these issues adversely affect the accuracy of dose calculations in the vicinity of metal implants. As the number of patients with orthopedic and dental implants grows, so does the need to deliver safe and effective radiotherapy treatments in the presence of implants. The Medical Physics group at the Cancer Centre of Southeastern Ontario and Queen's University has developed a Cobalt-60 CT system that is relatively insensitive to metal artifacts due to the high energy, nearly monoenergetic Cobalt-60 photon beam. Kilovoltage CT (kVCT) images, including images corrected using a commercial metal artifact reduction tool, were compared to Cobalt-60 CT images throughout the treatment planning process, from initial imaging through to dose calculation. An effective metal artifact reduction algorithm was also implemented for the Cobalt-60 CT system. Electron density maps derived from the same kVCT and Cobalt-60 CT images indicated the impact of image artifacts on estimates of photon attenuation for treatment planning applications. Measurements showed that truncation of CT number data in kVCT images produced significant mischaracterization of the electron density of metals. Dose measurements downstream of metal inserts in a water phantom were compared to dose data calculated using CT images from kVCT and Cobalt-60 systems with and without artifact correction. The superior accuracy of electron density data derived from Cobalt-60 images compared to kVCT images produced calculated dose with far better agreement with measured results. These results indicated that dose calculation errors from metal image artifacts are primarily due to misrepresentation of electron density within metals rather than artifacts surrounding the implants.
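The electron density mapping step mentioned above is commonly a piecewise-linear calibration from CT number to relative electron density. The sketch below shows that generic conversion with made-up calibration points (not values from this work) and illustrates why truncated CT numbers inside metal saturate the estimate.

```python
# Generic piecewise-linear conversion from CT number to relative electron
# density; the calibration points below are illustrative, not from this work.
import numpy as np

HU_POINTS  = np.array([-1000.0, 0.0, 1000.0, 3000.0])   # CT number (HU)
RED_POINTS = np.array([0.0, 1.0, 1.6, 2.5])             # electron density rel. to water

def hu_to_relative_electron_density(ct_hu):
    # np.interp clamps outside the table, so truncated CT numbers inside metal
    # saturate the estimate, one source of the dose errors discussed above.
    return np.interp(ct_hu, HU_POINTS, RED_POINTS)
```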
Li, Jianqi; Wang, Yi; Jiang, Yu; Xie, Haibin; Li, Gengying
2009-09-01
An open permanent magnet system with vertical B(0) field and without self-shielding can be quite susceptible to perturbations from external magnetic sources. B(0) variation in such a system located close to a subway station was measured to be greater than 0.7 microT by both MRI and a fluxgate magnetometer. This B(0) variation caused image artifacts. A navigator echo approach that monitored and compensated the view-to-view variation in magnetic resonance signal phase was developed to correct for image artifacts. Human brain imaging experiments using a multislice gradient-echo sequence demonstrated that the ghosting and blurring artifacts associated with B(0) variations were effectively removed using the navigator method.
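A minimal sketch of the navigator idea, assuming a Cartesian gradient-echo acquisition in which one phase value per view has been measured from the navigator echo; names and array shapes are illustrative.

```python
# Minimal view-to-view navigator phase correction for a Cartesian gradient-echo
# acquisition; array shapes and names are assumptions.
import numpy as np

def navigator_phase_correct(kspace, nav_phase):
    """kspace: complex array (n_views, n_readout); nav_phase: phase (rad) per view
    measured from the navigator echo. Removes view-to-view B0-induced phase."""
    return kspace * np.exp(-1j * nav_phase)[:, None]

# Example reconstruction after correction:
# corrected = navigator_phase_correct(kspace, nav_phase)
# image = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(corrected)))
```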
Removing Contamination-Induced Reconstruction Artifacts from Cryo-electron Tomograms
Fernandez, Jose-Jesus; Laugks, Ulrike; Schaffer, Miroslava; Bäuerlein, Felix J.B.; Khoshouei, Maryam; Baumeister, Wolfgang; Lucic, Vladan
2016-01-01
Imaging of fully hydrated, vitrified biological samples by electron tomography yields structural information about cellular protein complexes in situ. Here we present a computational procedure that removes artifacts of three-dimensional reconstruction caused by contamination present in samples during imaging by electron microscopy. Applying the procedure to phantom data and electron tomograms of cellular samples significantly improved the resolution and the interpretability of tomograms. Artifacts caused by surface contamination associated with thinning by focused ion beam, as well as those arising from gold fiducial markers and from common, lower contrast contamination, could be removed. Our procedure is widely applicable and is especially suited for applications that strive to reach a higher resolution and involve the use of recently developed, state-of-the-art instrumentation. PMID:26743046
Recent progress and outstanding issues in motion correction in resting state fMRI
Power, Jonathan D; Schlaggar, Bradley L; Petersen, Steven E
2014-01-01
The purpose of this review is to communicate and synthesize recent findings related to motion artifact in resting state fMRI. In 2011, three groups reported that small head movements produced spurious but structured noise in brain scans, causing distance-dependent changes in signal correlations. This finding has prompted both methods development and the re-examination of prior findings with more stringent motion correction. Since 2011, over a dozen papers have been published specifically on motion artifact in resting state fMRI. We will attempt to distill these papers to their most essential content. We will point out some aspects of motion artifact that are easily or often overlooked. Throughout the review, we will highlight gaps in current knowledge and avenues for future research. PMID:25462692
Enroute flight planning: The design of cooperative planning systems
NASA Technical Reports Server (NTRS)
Smith, Philip J.; Layton, Chuck; Mccoy, Elaine
1990-01-01
Design concepts and principles to guide the building of cooperative problem solving systems are being developed and evaluated. In particular, the design of cooperative systems for enroute flight planning is being studied. The investigation involves a three-stage process: modeling human performance in existing environments, building cognitive artifacts, and studying the performance of people working in collaboration with these artifacts. The most significant design concepts and principles identified thus far are the principal focus.
How semantic category modulates preschool children's visual memory.
Giganti, Fiorenza; Viggiano, Maria Pia
2015-01-01
The dynamic interplay between perception and memory has been explored in preschool children by presenting filtered stimuli regarding animals and artifacts. The identification of filtered images was markedly influenced by both prior exposure and the semantic nature of the stimuli. The identification of animals required less physical information than artifacts did. Our results corroborate the notion that the human attention system evolves to reliably develop definite category-specific selection criteria by which living entities are monitored in different ways.
Clinical Assessment of Mirror Artifacts in Spectral-Domain Optical Coherence Tomography
Ho, Joseph; Castro, Dinorah P. E.; Castro, Leonardo C.; Chen, Yueli; Liu, Jonathan; Mattox, Cynthia; Krishnan, Chandrasekharan; Fujimoto, James G.; Schuman, Joel S.
2010-01-01
Purpose. To investigate the characteristics of a spectral-domain optical coherence tomography (SD-OCT) image phenomenon known as the mirror artifact, calculate its prevalence, analyze potential risk factors, measure severity, and correlate it to spherical equivalent and central visual acuity (VA). Methods. OCT macular cube 512 × 128 scans taken between January 2008 and February 2009 at the New England Eye Center were analyzed for the presence of mirror artifacts. Artifact severity was determined by the degree of segmentation breakdown that it caused on the macular map. A retrospective review was conducted of the medical records of patients with artifacts and of a random control group without artifacts. Results. Of 1592 patients, 9.3% (148 patients, 200 eyes) had scans that contained mirror artifacts. A significantly more myopic spherical equivalent (P < 0.001), worse VA (P < 0.001), longer axial lengths (P = 0.004), and higher proportions of moderate to high myopia (P < 0.001) were found in patients with mirror artifacts than in patients without artifacts. Worse VA was associated with increased artifact severity (P = 0.04). Conclusions. In all scans analyzed, a high prevalence of mirror artifacts was found. This image artifact was often associated with patients with moderate to high myopia. Improvements in instrumentation may be necessary to resolve this problem in moderately and highly myopic eyes. Operators should be advised to properly position the retina when scanning eyes. In cases in which peripheral abnormalities in topographic measurements of retinal thickness are found, corresponding OCT scans should be examined for the presence of mirror artifacts. PMID:20181840
Comtois, Gary; Mendelson, Yitzhak; Ramuka, Piyush
2007-01-01
Wearable physiological monitoring using a pulse oximeter would enable field medics to monitor multiple injuries simultaneously, thereby prioritizing medical intervention when resources are limited. However, a primary factor limiting the accuracy of pulse oximetry is poor signal-to-noise ratio since photoplethysmographic (PPG) signals, from which arterial oxygen saturation (SpO2) and heart rate (HR) measurements are derived, are compromised by movement artifacts. This study was undertaken to quantify SpO2 and HR errors induced by certain motion artifacts utilizing accelerometry-based adaptive noise cancellation (ANC). Since the fingers are generally more vulnerable to motion artifacts, measurements were performed using a custom forehead-mounted wearable pulse oximeter developed for real-time remote physiological monitoring and triage applications. This study revealed that processing motion-corrupted PPG signals by least mean squares (LMS) and recursive least squares (RLS) algorithms can be effective in reducing SpO2 and HR errors during jogging, but the degree of improvement depends on filter order. Although both algorithms produced similar improvements, implementing the adaptive LMS algorithm is advantageous since it requires significantly fewer operations.
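A minimal sketch of accelerometry-referenced LMS noise cancellation of the kind evaluated above; the filter order and step size are illustrative.

```python
# Accelerometry-referenced LMS adaptive noise cancellation for a PPG channel;
# filter order and step size are illustrative.
import numpy as np

def lms_cancel(ppg, accel, order=16, mu=0.01):
    """ppg, accel: 1D arrays of equal length (accelerometer = noise reference).
    Returns the artifact-reduced (error) signal."""
    w = np.zeros(order)
    cleaned = np.zeros(len(ppg), dtype=float)
    for n in range(order, len(ppg)):
        x = accel[n - order:n][::-1]      # most recent reference samples
        noise_est = np.dot(w, x)          # estimated motion component
        e = ppg[n] - noise_est            # error = cleaned PPG sample
        w += mu * e * x                   # LMS weight update
        cleaned[n] = e
    return cleaned
```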
Fun and Games: using Games and Immersive Exploration to Teach Earth and Space Science
NASA Astrophysics Data System (ADS)
Reiff, P. H.; Sumners, C.
2011-12-01
We have been using games to teach Earth and Space Science for over 15 years. Our software "TicTacToe" has been used continuously at the Houston Museum of Natural Science since 2002. It is the single piece of educational software in the "Earth Forum" suite that holds the attention of visitors the longest - averaging over 10 minutes compared to 1-2 minutes for the other software kiosks. We now have question sets covering solar system, space weather, and Earth science. In 2010 we introduced a new game technology - that of immersive interactive explorations. In our "Tikal Explorer", visitors use a game pad to navigate a three-dimensional environment of the Classic Maya city of Tikal. Teams of students climb pyramids, look for artifacts, identify plants and animals, and site astronomical alignments that predict the annual return of the rains. We also have a new 3D exploration of the International Space Station, where students can fly around and inside the ISS. These interactive explorations are very natural to the video-game generation, and promise to bring educational objectives to experiences that had previously been used strictly for gaming. If space permits, we will set up our portable Discovery Dome in the poster session for a full immersive demonstration of these game environments.
Combination of structured illumination and single molecule localization microscopy in one setup
NASA Astrophysics Data System (ADS)
Rossberger, Sabrina; Best, Gerrit; Baddeley, David; Heintzmann, Rainer; Birk, Udo; Dithmar, Stefan; Cremer, Christoph
2013-09-01
Understanding the positional and structural aspects of biological nanostructures simultaneously is as much a challenge as a desideratum. In recent years, highly accurate (20 nm) positional information of optically isolated targets down to the nanometer range has been obtained using single molecule localization microscopy (SMLM), while highly resolved (100 nm) spatial information has been achieved using structured illumination microscopy (SIM). In this paper, we present a high-resolution fluorescence microscope setup which combines the advantages of SMLM with SIM in order to provide high-precision localization and structural information in a single setup. Furthermore, the combination of the wide-field SIM image with the SMLM data allows us to identify artifacts produced during the visualization process of SMLM data, and potentially also during the reconstruction process of SIM images. We describe the SMLM-SIM combo and software, and apply the instrument in a first proof-of-principle to the same region of H3K293 cells to achieve SIM images with high structural resolution (in the 100 nm range) in overlay with the highly accurate position information of localized single fluorophores. Thus, with its robust control software, efficient switching between the SMLM and SIM modes, and fully automated and user-friendly acquisition and evaluation software, the SMLM-SIM combo is superior to existing solutions.
Yoshida, Toshihiko; Fukumoto, Takumi; Urade, Takeshi; Kido, Masahiro; Toyama, Hirochika; Asari, Sadaki; Ajiki, Tetsuo; Ikeo, Naoko; Mukai, Toshiji; Ku, Yonson
2017-06-01
Operative clips used to ligate vessels in abdominal operations are usually made of titanium. They remain in the body permanently and form metallic artifacts in computed tomography images, which impair accurate diagnosis. Although biodegradable magnesium instruments have been developed in other fields, the physical properties necessary for operative clips differ from those of other instruments. We developed a biodegradable magnesium-zinc-calcium alloy clip with good biologic compatibility and sufficient clamping capability for use as an operative clip. In this study, we verified the safety and tolerability of this clip for use in canine cholecystectomy. Nine female beagles were used. We performed cholecystectomy and ligated the cystic duct with magnesium alloy or titanium clips. The chronologic change of the clips and artifact formation were compared at 1, 4, 12, 18, and 24 weeks postoperatively by computed tomography. The animals were killed at the end of the observation period, and the clips were removed to evaluate their biodegradability. We also evaluated their effect on the living body using blood biochemistry data. The magnesium alloy clip formed far fewer artifacts than the titanium clip, and it was almost completely absorbed at 6 months postoperatively. There were no postoperative complications and no elevation of constituent elements such as magnesium, calcium, and zinc during the observation period in either group. The novel magnesium alloy clip demonstrated sufficient sealing capability for the cystic duct and proper biodegradability in canine models. The magnesium alloy clip produced far fewer metallic artifacts in CT than the conventional titanium clip. Copyright © 2016 Elsevier Inc. All rights reserved.
Automatic removal of eye-movement and blink artifacts from EEG signals.
Gao, Jun Feng; Yang, Yong; Lin, Pan; Wang, Pei; Zheng, Chong Xun
2010-03-01
Frequent occurrence of electrooculography (EOG) artifacts leads to serious problems in interpreting and analyzing the electroencephalogram (EEG). In this paper, a robust method is presented to automatically eliminate eye-movement and eye-blink artifacts from EEG signals. Independent Component Analysis (ICA) is used to decompose EEG signals into independent components. Moreover, topography and power spectral density features of those components are extracted to identify eye-movement artifact components, and a support vector machine (SVM) classifier is adopted because it has higher performance than several other classifiers. The classification results show that feature-extraction methods are unsuitable for identifying eye-blink artifact components, so a novel peak detection algorithm for independent components (PDAIC) is proposed to identify eye-blink artifact components. Finally, the artifact removal method proposed here is evaluated by comparing EEG data before and after artifact removal. The results indicate that the proposed method removes EOG artifacts effectively from EEG signals with little distortion of the underlying brain signals.
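A minimal sketch of the ICA stage only, assuming a separately recorded EOG reference channel is available to flag artifact components by correlation; the paper instead identifies components with topography/PSD features, an SVM, and the PDAIC peak detector, none of which are reproduced here.

```python
# ICA stage only: flag independent components that track a separately recorded
# EOG reference and zero them; threshold and channel layout are assumptions.
import numpy as np
from sklearn.decomposition import FastICA

def remove_eog_ica(eeg, eog, corr_threshold=0.7, random_state=0):
    """eeg: array (n_samples, n_channels); eog: EOG reference (n_samples,)."""
    ica = FastICA(n_components=eeg.shape[1], random_state=random_state, max_iter=1000)
    sources = ica.fit_transform(eeg)                 # (n_samples, n_components)
    corr = np.array([abs(np.corrcoef(sources[:, k], eog)[0, 1])
                     for k in range(sources.shape[1])])
    sources[:, corr > corr_threshold] = 0.0          # zero the ocular components
    return ica.inverse_transform(sources)            # back to channel space
```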
Functional Near-Infrared Spectroscopy Signals Measure Neuronal Activity in the Cortex
NASA Technical Reports Server (NTRS)
Harrivel, Angela; Hearn, Tristan
2013-01-01
Functional near infrared spectroscopy (fNIRS) is an emerging optical neuroimaging technology that indirectly measures neuronal activity in the cortex via neurovascular coupling. It quantifies hemoglobin concentration ([Hb]) and thus measures the same hemodynamic response as functional magnetic resonance imaging (fMRI), but is portable, non-confining, relatively inexpensive, and is appropriate for long-duration monitoring and use at the bedside. Like fMRI, it is noninvasive and safe for repeated measurements. Patterns of [Hb] changes are used to classify cognitive state. Thus, fNIRS technology offers much potential for application in operational contexts. For instance, the use of fNIRS to detect the mental state of commercial aircraft operators in near real time could allow intelligent flight decks of the future to optimally support human performance in the interest of safety by responding to hazardous mental states of the operator. However, many opportunities remain for improving robustness and reliability. It is desirable to reduce the impact of motion and poor optical coupling of probes to the skin. Such artifacts degrade signal quality and thus cognitive state classification accuracy. Field application calls for further development of algorithms and filters for the automation of bad channel detection and dynamic artifact removal. This work introduces a novel adaptive filter method for automated real-time fNIRS signal quality detection and improvement. The output signal (after filtering) will have had contributions from motion and poor coupling reduced or removed, thus leaving a signal more indicative of changes due to hemodynamic brain activations of interest. Cognitive state classifications based on these signals reflect brain activity more reliably. The filter has been tested successfully with both synthetic and real human subject data, and requires no auxiliary measurement. This method could be implemented as a real-time filtering option or bad channel rejection feature of software used with frequency domain fNIRS instruments for signal acquisition and processing. Use of this method could improve the reliability of any operational or real-world application of fNIRS in which motion is an inherent part of the functional task of interest. Other optical diagnostic techniques (e.g., for NIR medical diagnosis) also may benefit from the reduction of probe motion artifact during any use in which motion avoidance would be impractical or limit usability.
Nitzken, Matthew; Bajaj, Nihit; Aslan, Sevda; Gimel’farb, Georgy; Ovechkin, Alexander
2013-01-01
Surface Electromyography (EMG) is a standard method used in clinical practice and research to assess motor function in order to help with the diagnosis of neuromuscular pathology in human and animal models. EMG recorded from trunk muscles involved in the activity of breathing can be used as a direct measure of respiratory motor function in patients with spinal cord injury (SCI) or other disorders associated with motor control deficits. However, EMG potentials recorded from these muscles are often contaminated with heart-induced electrocardiographic (ECG) signals. Elimination of these artifacts plays a critical role in the precise measure of the respiratory muscle electrical activity. This study was undertaken to find an optimal approach to eliminate the ECG artifacts from EMG recordings. Conventional global filtering can be used to decrease the ECG-induced artifact. However, this method can alter the EMG signal and changes physiologically relevant information. We hypothesize that, unlike global filtering, localized removal of ECG artifacts will not change the original EMG signals. We develop an approach to remove the ECG artifacts without altering the amplitude and frequency components of the EMG signal by using an externally recorded ECG signal as a mask to locate areas of the ECG spikes within EMG data. These segments containing ECG spikes were decomposed into 128 sub-wavelets by a custom-scaled Morlet Wavelet Transform. The ECG-related sub-wavelets at the ECG spike location were removed and a de-noised EMG signal was reconstructed. Validity of the proposed method was proven using mathematical simulated synthetic signals and EMG obtained from SCI patients. We compare the Root-mean Square Error and the Relative Change in Variance between this method, global, notch and adaptive filters. The results show that the localized wavelet-based filtering has the benefit of not introducing error in the native EMG signal and accurately removing ECG artifacts from EMG signals. PMID:24307920
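A simplified sketch of the localization idea: R-peaks found in the external ECG define short windows, and filtering is applied only inside those windows. For brevity it uses a standard discrete-wavelet decomposition (PyWavelets) and simply zeroes the coarse approximation coefficients in each window, rather than the authors' custom 128-sub-wavelet Morlet transform; window length, wavelet, and thresholds are assumptions.

```python
# Localized removal sketch: filter only inside windows centred on R-peaks found
# in the external ECG. Standard PyWavelets DWT replaces the authors' custom
# 128-sub-wavelet Morlet transform; window length and wavelet are assumptions.
import numpy as np
import pywt
from scipy.signal import find_peaks

def remove_ecg_locally(emg, ecg, fs, win_ms=120, wavelet='db4', level=4):
    cleaned = emg.astype(float).copy()
    half = int(win_ms * 1e-3 * fs / 2)
    peaks, _ = find_peaks(ecg, distance=int(0.4 * fs),
                          height=np.percentile(ecg, 95))
    for p in peaks:
        a, b = max(0, p - half), min(len(emg), p + half)
        coeffs = pywt.wavedec(cleaned[a:b], wavelet, level=level)
        # zero the coarse approximation that carries the QRS shape, leaving the
        # higher-frequency EMG detail in the window largely untouched
        coeffs[0] = np.zeros_like(coeffs[0])
        cleaned[a:b] = pywt.waverec(coeffs, wavelet)[:b - a]
    return cleaned
```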
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, M; Kang, S; Lee, S
Purpose: Implant-supported dentures seem particularly appropriate for patients who become edentulous, and cancer patients are no exception. As the number of people with dental implants increases across age groups, careful dosimetric verification of metal artifact effects is required for more accurate head and neck radiation therapy. The purpose of this study is to verify the theoretical analysis of metal (streak and dark) artifacts and to evaluate the dosimetric effect caused by dental implants in CT images, using a humanoid phantom with patient teeth and implants inserted. Methods: The phantom comprises a cylinder shaped to simulate the anatomical structures of a human head and neck. By incorporating various clinical cases, a phantom closely resembling a human was constructed. The developed phantom can be configured in two classes: (i) closed mouth and (ii) opened mouth. RapidArc plans of 4 cases were created in the Eclipse planning system. A total dose of 2000 cGy in 10 fractions was prescribed to the whole planning target volume (PTV) using 6 MV photon beams. The Acuros XB (AXB) advanced dose calculation algorithm, the Analytical Anisotropic Algorithm (AAA), and the progressive resolution optimizer were used in dose optimization and calculation. Results: In the closed- and opened-mouth phantoms, because dark artifacts formed extensively around the metal implants, dose variation was relatively higher than that caused by streak artifacts. When the PTV was delineated on the dark regions or large streak artifact regions, a maximum dose error of 7.8% and an average difference of 3.2% were observed. The averaged minimum dose to the PTV predicted by AAA was about 5.6% higher, and OAR doses were also 5.2% higher, compared to AXB. Conclusion: The results of this study showed that AXB dose calculation involving high-density materials is more accurate than AAA calculation, and AXB was superior to AAA in dose predictions beyond the dark artifact/air cavity region when compared against measurements.
A Planetarium Inside Your Office: Virtual Reality in the Dome Production Pipeline
NASA Astrophysics Data System (ADS)
Summers, Frank
2018-01-01
Producing astronomy visualization sequences for a planetarium without ready access to a dome is a distorted geometric challenge. Fortunately, one can now use virtual reality (VR) to simulate a dome environment without ever leaving one's office chair. The VR dome experience has proven to be a more than suitable pre-visualization method that requires only modest amounts of processing beyond the standard production pipeline. It also provides a crucial testbed for identifying, testing, and fixing the visual constraints and artifacts that arise in a spherical presentation environment. Topics addressed here include rendering, geometric projection, movie encoding, software playback, and hardware setup for a virtual dome using VR headsets.
Automation of Physiologic Data Presentation and Alarms in the Post Anesthesia Care Unit
Aukburg, S.J.; Ketikidis, P.H.; Kitz, D.S.; Mavrides, T.G.; Matschinsky, B.B.
1989-01-01
The routine use of pulse oximeters, non-invasive blood pressure monitors and electrocardiogram monitors has considerably improved patient care in the post anesthesia period. Using an automated data collection system, we investigated the occurrence of several adverse events frequently revealed by these monitors. We found that the incidence of hypoxia was 35%, hypertension 12%, hypotension 8%, tachycardia 25% and bradycardia 1%. Discriminant analysis was able to correctly classify about 90% of patients into normal versus hypertensive or hypotensive groups. The system software minimizes artifact, validates data for epidemiologic studies, and is able to identify variables that predict adverse events through application of appropriate statistical and artificial intelligence techniques.
NASA Astrophysics Data System (ADS)
Garces, E. L.; Garces, M. A.; Christe, A.
2017-12-01
The RedVox infrasound recorder app uses microphones and barometers in smartphones to record infrasound, low-frequency sound below the threshold of human hearing. We study a device's metadata, which includes position, latency time, the differences between the device's internal times and the server times, and the machine time, searching for patterns and possible errors or discontinuities in these scaled parameters. We highlight metadata variability through scaled multivariate displays (histograms, distribution curves, scatter plots), all created and organized through software development in Python. This project is helpful in ascertaining variability and honing the accuracy of smartphones, aiding the emergence of portable devices as viable geophysical data collection instruments. It can also improve the app and cloud service by increasing efficiency and accuracy, making it possible to better document and anticipate dramatic natural events such as tsunamis, earthquakes, volcanic eruptions, storms, rocket launches, and meteor impacts; recorded data can later be used for studies and analysis by a variety of professions. We expect our final results to produce insight into how to counteract problematic issues in data mining and improve accuracy in smartphone data collection. By eliminating lurking variables and minimizing the effect of confounding variables, we hope to discover efficient processes to reduce superfluous precision, unnecessary errors, and data artifacts. These methods should conceivably be transferable to other areas of software development, data analytics, and statistics-based experiments, contributing a precedent of smartphone metadata studies from geophysical rather than societal data. The results should facilitate the rise of civilian-accessible, hand-held, data-gathering mobile sensor networks and yield more straightforward data mining techniques.
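A minimal sketch of the kind of Python metadata exploration described above, assuming a flat export with hypothetical column names such as device_id and latency_s.

```python
# Sketch of the metadata exploration described above; the CSV layout and the
# column names ("device_id", "latency_s") are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

meta = pd.read_csv("redvox_metadata.csv")
print(meta.groupby("device_id")["latency_s"].describe())   # per-device summary

meta["latency_s"].hist(bins=100)
plt.xlabel("device-to-server latency (s)")
plt.ylabel("count")
plt.title("Distribution of reported latencies")
plt.show()
```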
Detection of artifacts from high energy bursts in neonatal EEG.
Bhattacharyya, Sourya; Biswas, Arunava; Mukherjee, Jayanta; Majumdar, Arun Kumar; Majumdar, Bandana; Mukherjee, Suchandra; Singh, Arun Kumar
2013-11-01
Detection of non-cerebral activities or artifacts, intermixed within the background EEG, is essential to discard them from subsequent pattern analysis. The problem is much harder in neonatal EEG, where the background EEG contains spikes, waves, and rapid fluctuations in amplitude and frequency. Existing artifact detection methods are mostly limited to detect only a subset of artifacts such as ocular, muscle or power line artifacts. Few methods integrate different modules, each for detection of one specific category of artifact. Furthermore, most of the reference approaches are implemented and tested on adult EEG recordings. Direct application of those methods on neonatal EEG causes performance deterioration, due to greater pattern variation and inherent complexity. A method for detection of a wide range of artifact categories in neonatal EEG is thus required. At the same time, the method should be specific enough to preserve the background EEG information. The current study describes a feature based classification approach to detect both repetitive (generated from ECG, EMG, pulse, respiration, etc.) and transient (generated from eye blinking, eye movement, patient movement, etc.) artifacts. It focuses on artifact detection within high energy burst patterns, instead of detecting artifacts within the complete background EEG with wide pattern variation. The objective is to find true burst patterns, which can later be used to identify the Burst-Suppression (BS) pattern, which is commonly observed during newborn seizure. Such selective artifact detection is proven to be more sensitive to artifacts and specific to bursts, compared to the existing artifact detection approaches applied on the complete background EEG. Several time domain, frequency domain, statistical features, and features generated by wavelet decomposition are analyzed to model the proposed bi-classification between burst and artifact segments. A feature selection method is also applied to select the feature subset producing highest classification accuracy. The suggested feature based classification method is executed using our recorded neonatal EEG dataset, consisting of burst and artifact segments. We obtain 78% sensitivity and 72% specificity as the accuracy measures. The accuracy obtained using the proposed method is found to be about 20% higher than that of the reference approaches. Joint use of the proposed method with our previous work on burst detection outperforms reference methods on simultaneous burst and artifact detection. As the proposed method supports detection of a wide range of artifact patterns, it can be improved to incorporate the detection of artifacts within other seizure patterns and background EEG information as well. © 2013 Elsevier Ltd. All rights reserved.
Golden-ratio rotated stack-of-stars acquisition for improved volumetric MRI.
Zhou, Ziwu; Han, Fei; Yan, Lirong; Wang, Danny J J; Hu, Peng
2017-12-01
To develop and evaluate an improved stack-of-stars radial sampling strategy for reducing streaking artifacts. The conventional stack-of-stars sampling strategy collects the same radial angle for every partition (slice) encoding. In an undersampled acquisition, such an aligned acquisition generates coherent aliasing patterns and introduces strong streaking artifacts. We show that by rotating the radial spokes in a golden-angle manner along the partition-encoding direction, the aliasing pattern is modified, resulting in improved image quality for gridding and more advanced reconstruction methods. Computer simulations were performed and phantom as well as in vivo images for three different applications were acquired. Simulation, phantom, and in vivo experiments confirmed that the proposed method was able to generate images with less streaking artifact and sharper structures based on undersampled acquisitions in comparison with the conventional aligned approach at the same acceleration factors. By combining parallel imaging and compressed sensing in the reconstruction, streaking artifacts were mostly removed with improved delineation of fine structures using the proposed strategy. We present a simple method to reduce streaking artifacts and improve image quality in 3D stack-of-stars acquisitions by re-arranging the radial spoke angles in the 3D partition direction, which can be used for rapid volumetric imaging. Magn Reson Med 78:2290-2298, 2017. © 2017 International Society for Magnetic Resonance in Medicine.
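A minimal sketch of one plausible way to generate the golden-angle-rotated spoke schedule described above; the exact rotation convention used in the paper may differ.

```python
# One plausible golden-angle-rotated stack-of-stars spoke schedule; the exact
# rotation convention used in the paper may differ.
import numpy as np

GOLDEN_ANGLE = np.pi / ((1.0 + np.sqrt(5.0)) / 2.0)   # ~111.25 deg

def stack_of_stars_angles(n_spokes, n_partitions, golden_rotation=True):
    """Return an (n_partitions, n_spokes) array of radial spoke angles in radians."""
    base = np.arange(n_spokes) * np.pi / n_spokes          # uniform spokes over 180 deg
    if not golden_rotation:
        return np.tile(base, (n_partitions, 1))            # conventional aligned stacks
    offsets = (np.arange(n_partitions) * GOLDEN_ANGLE) % (np.pi / n_spokes)
    return (base[None, :] + offsets[:, None]) % np.pi      # rotate each partition
```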
NASA Tech Briefs, February 2003
NASA Technical Reports Server (NTRS)
2003-01-01
Topics covered include: Integrated Electrode Arrays for Neuro-Prosthetic Implants; Eroding Potentiometers; Common/Dependent-Pressure-Vessel Nickel-Hydrogen Batteries; 120-GHz HEMT Oscillator With Surface-Wave-Assisted Antenna; 80-GHz MMIC HEMT Voltage-Controlled Oscillator; High-Energy-Density Capacitors; Microscale Thermal-Transpiration Gas Pump; Instrument for Measuring Temperature of Water; Improved Measurement of Coherence in Presence of Instrument Noise; Compact Instruments Measure Helium-Leak Rates; Irreversible Entropy Production in Two-Phase Mixing Layers; Subsonic and Supersonic Effects in Bose-Einstein Condensate; Nanolaminate Mirrors With "Piston" Figure-Control Actuators; Mixed Conducting Electrodes for Better AMTEC Cells; Process for Encapsulating Protein Crystals; Lightweight, Self-Deployable Wheels; Grease-Resistant O Rings for Joints in Solid Rocket Motors; LabVIEW Serial Driver Software for an Electronic Load; Software Computes Tape-Casting Parameters; Software for Tracking Costs of Mars Projects; Software for Replicating Data Between X.500 and LDAP Directories; The Technical Work Plan Tracking Tool; Improved Multiple-DOF SAW Piezoelectric Motors; Propulsion Flight-Test Fixture; Mechanical Amplifier for a Piezoelectric Transducer; Swell Sleeves for Testing Explosive Devices; Linear Back-Drive Differentials; Miniature Inchworm Actuators Fabricated by Use of LIGA; Using ERF Devices to Control Deployments of Space Structures; High-Temperature Switched-Reluctance Electric Motor; System for Centering a Turbofan in a Nacelle During Tests; Fabricating Composite-Material Structures Containing SMA Ribbons; Optimal Feedback Control of Thermal Networks; Artifacts for Calibration of Submicron Width Measurements; Navigating a Mobile Robot Across Terrain Using Fuzzy Logic; Designing Facilities for Collaborative Operations; and Quantitating Iron in Serum Ferritin by Use of ICP-MS.
Chang, Hing-Chiu; Chen, Nan-kuei
2016-01-01
Diffusion-weighted imaging (DWI) obtained with an interleaved echo-planar imaging (EPI) pulse sequence has great potential for characterizing brain tissue properties at high spatial resolution. However, interleaved EPI based DWI data may be corrupted by various types of aliasing artifacts. First, inconsistencies in k-space data obtained with opposite readout gradient polarities result in Nyquist artifact, which is usually reduced with 1D phase correction in post-processing. When there exist eddy current cross terms (e.g., in oblique-plane EPI), 2D phase correction is needed to effectively reduce Nyquist artifact. Second, minuscule motion induced phase inconsistencies in interleaved DWI scans result in image-domain aliasing artifact, which can be removed with reconstruction procedures that take shot-to-shot phase variations into consideration. In existing interleaved DWI reconstruction procedures, Nyquist artifact and minuscule motion-induced aliasing artifact are typically removed sequentially in two stages. Although the two-stage phase correction generally performs well for non-oblique plane EPI data obtained from a well-calibrated system, the residual artifacts may still be pronounced in oblique-plane EPI data or when there exist eddy current cross terms. To address this challenge, here we report a new composite 2D phase correction procedure, which effectively removes Nyquist artifact and minuscule motion induced aliasing artifact jointly in a single step. Our experimental results demonstrate that the new 2D phase correction method can much more effectively reduce artifacts in interleaved EPI based DWI data as compared with the existing two-stage artifact correction procedures. The new method robustly enables high-resolution DWI, and should prove highly valuable for clinical uses and research studies of DWI. PMID:27114342
Huang, Chao-Tsung; Wang, Yu-Wen; Huang, Li-Ren; Chin, Jui; Chen, Liang-Gee
2017-02-01
Digital refocusing has a tradeoff between complexity and quality when using sparsely sampled light fields for low-storage applications. In this paper, we propose a fast physically correct refocusing algorithm to address this issue in a twofold way. First, view interpolation is adopted to provide photorealistic quality at infocus-defocus hybrid boundaries. Regarding its conventional high complexity, we devised a fast line-scan method specifically for refocusing, and its 1D kernel can be 30× faster than the benchmark View Synthesis Reference Software (VSRS)-1D-Fast. Second, we propose a block-based multi-rate processing flow for accelerating purely infocused or defocused regions, and a further 3-34× speedup can be achieved for high-resolution images. All candidate blocks of variable sizes can interpolate different numbers of rendered views and perform refocusing in different subsampled layers. To avoid visible aliasing and block artifacts, we determine these parameters and the simulated aperture filter through a localized filter response analysis using defocus blur statistics. The final quadtree block partitions are then optimized in terms of computation time. Extensive experimental results are provided to show superior refocusing quality and fast computation speed. In particular, the run time is comparable with the conventional single-image blurring, which causes serious boundary artifacts.
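For contrast with the proposed view-interpolation approach, the sketch below shows the classic shift-and-add refocusing baseline on a densely sampled light field; the array layout and the meaning of alpha are assumptions.

```python
# Classic shift-and-add refocusing baseline on a dense light field (the approach
# the paper accelerates and improves); array layout and alpha are assumptions.
import numpy as np
from scipy.ndimage import shift as nd_shift

def shift_and_add_refocus(light_field, alpha):
    """light_field: array (U, V, H, W) of sub-aperture views; alpha is the
    per-view shift in pixels per unit of angular offset (controls focal depth)."""
    U, V, H, W = light_field.shape
    cu, cv = (U - 1) / 2.0, (V - 1) / 2.0
    out = np.zeros((H, W), dtype=float)
    for u in range(U):
        for v in range(V):
            dy, dx = alpha * (u - cu), alpha * (v - cv)
            out += nd_shift(light_field[u, v], (dy, dx), order=1, mode='nearest')
    return out / (U * V)
```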
LP DAAC MEaSUREs Project Artifact Tracking Via the NASA Earthdata Collaboration Environment
NASA Astrophysics Data System (ADS)
Bennett, S. D.
2015-12-01
The Land Processes Distributed Active Archive Center (LP DAAC) is a NASA Earth Observing System (EOS) Data and Information System (EOSDIS) DAAC that supports selected EOS Community non-standard data products such as the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Emissivity Database (GED), and also supports NASA Earth Science programs such as Making Earth System Data Records for Use in Research Environments (MEaSUREs) to help provide long-term, consistent, and mature data products. As described in The LP DAAC Project Lifecycle Plan (Daucsavage, J.; Bennett, S., 2014), key elements within the Project Inception Phase fuse knowledge among NASA stakeholders, data producers, and NASA data providers. To support and deliver excellence in NASA data stewardship, and to accommodate long-tail data preservation with Community and MEaSUREs products, the LP DAAC is utilizing NASA's own Earthdata Collaboration Environment to bridge stakeholder communication divides. By leveraging a NASA-supported platform, this poster describes how the Atlassian Confluence software, combined with NASA URS/Earthdata support, can maintain each project's members, status, documentation, and artifact checklist. Furthermore, this solution provides a gateway for project communities to become familiar with NASA clients, as well as educating the project's NASA DAAC scientists about NASA client distribution.
Fang, Jieming; Zhang, Da; Wilcox, Carol; Heidinger, Benedikt; Raptopoulos, Vassilios; Brook, Alexander; Brook, Olga R
2017-03-01
To assess single energy metal artifact reduction (SEMAR) and spectral energy metal artifact reduction (MARS) algorithms in reducing artifacts generated by different metal implants. A phantom was scanned with and without SEMAR (Aquilion One, Toshiba) and MARS (Discovery CT750 HD, GE), with various metal implants. Images were evaluated objectively by measuring the standard deviation in regions of interest and subjectively by two independent reviewers grading on a scale of 0 (no artifact) to 4 (severe artifact). Reviewers also graded new artifacts introduced by the metal artifact reduction algorithms. SEMAR and MARS significantly decreased variability of the density measurement adjacent to the metal implant, with a median SD (standard deviation of the density measurement) of 52.1 HU without SEMAR vs. 12.3 HU with SEMAR, p < 0.001. The median SD without MARS of 63.1 HU decreased to 25.9 HU with MARS, p < 0.001. The median SD with SEMAR was significantly lower than the median SD with MARS (p = 0.0011). SEMAR improved subjective image quality, with a reduction in overall artifact grading from 3.2 ± 0.7 to 1.4 ± 0.9, p < 0.001. Improvement of overall image quality by MARS did not reach statistical significance (3.2 ± 0.6 to 2.6 ± 0.8, p = 0.088). New artifacts introduced by the metal artifact reduction algorithm were significant for MARS (2.4 ± 1.0) but minimal for SEMAR (0.4 ± 0.7), p < 0.001. CT iterative reconstruction algorithms with single and spectral energy are both effective in reducing metal artifacts. The single energy-based algorithm provides better overall image quality than the spectral CT-based algorithm. The spectral metal artifact reduction algorithm introduces mild to moderate artifacts in the far field.
Kandala, Sridhar; Nolan, Dan; Laumann, Timothy O.; Power, Jonathan D.; Adeyemo, Babatunde; Harms, Michael P.; Petersen, Steven E.; Barch, Deanna M.
2016-01-01
Like all resting-state functional connectivity data, the data from the Human Connectome Project (HCP) are adversely affected by structured noise artifacts arising from head motion and physiological processes. Functional connectivity estimates (Pearson's correlation coefficients) were inflated for high-motion time points and for high-motion participants. This inflation occurred across the brain, suggesting the presence of globally distributed artifacts. The degree of inflation was further increased for connections between nearby regions compared with distant regions, suggesting the presence of distance-dependent spatially specific artifacts. We evaluated several denoising methods: censoring high-motion time points, motion regression, the FMRIB independent component analysis-based X-noiseifier (FIX), and mean grayordinate time series regression (MGTR; as a proxy for global signal regression). The results suggest that FIX denoising reduced both types of artifacts, but left substantial global artifacts behind. MGTR significantly reduced global artifacts, but left substantial spatially specific artifacts behind. Censoring high-motion time points resulted in a small reduction of distance-dependent and global artifacts, eliminating neither type. All denoising strategies left differences between high- and low-motion participants, but only MGTR substantially reduced those differences. Ultimately, functional connectivity estimates from HCP data showed spatially specific and globally distributed artifacts, and the most effective approach to address both types of motion-correlated artifacts was a combination of FIX and MGTR. PMID:27571276
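A minimal sketch of mean grayordinate time series regression (MGTR) as described above: the across-grayordinate mean is regressed out of every grayordinate's time series, analogous to global signal regression; the array layout is an assumption.

```python
# Mean grayordinate time series regression (MGTR): regress the across-grayordinate
# mean out of every grayordinate's time series; array layout is an assumption.
import numpy as np

def mean_grayordinate_regression(ts):
    """ts: array (n_timepoints, n_grayordinates). Returns the residuals."""
    g = ts.mean(axis=1)                                  # mean grayordinate signal
    X = np.column_stack([np.ones_like(g), g])            # intercept + mean regressor
    beta, *_ = np.linalg.lstsq(X, ts, rcond=None)        # fit per grayordinate
    return ts - X @ beta
```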
NASA Astrophysics Data System (ADS)
Kuniyil Ajith Singh, Mithun; Jaeger, Michael; Frenz, Martin; Steenbergen, Wiendelt
2016-03-01
Reflection artifacts caused by acoustic inhomogeneities are a main challenge to deep-tissue photoacoustic imaging. Photoacoustic transients generated by the skin surface and superficial vasculature will propagate into the tissue and reflect back from echogenic structures to generate reflection artifacts. These artifacts can cause problems in image interpretation and limit imaging depth. In its basic version, PAFUSion mimics the inward travelling wave-field from blood vessel-like PA sources by applying focused ultrasound pulses, and thus provides a way to identify reflection artifacts. In this work, we demonstrate reflection artifact correction in addition to identification, towards obtaining an artifact-free photoacoustic image. In view of clinical applications, we implemented an improved version of PAFUSion in which photoacoustic data is backpropagated to imitate the inward travelling wave-field and thus the reflection artifacts of a more arbitrary distribution of PA sources that also includes the skin melanin layer. The backpropagation is performed in a synthetic way based on the pulse-echo acquisitions after transmission on each single element of the transducer array. We present a phantom experiment and initial in vivo measurements on human volunteers where we demonstrate significant reflection artifact reduction using our technique. The results provide a direct confirmation that reflection artifacts are prominent in clinical epi-photoacoustic imaging, and that PAFUSion can reduce these artifacts significantly to improve the deep-tissue photoacoustic imaging.
Removal of ring artifacts in microtomography by characterization of scintillator variations.
Vågberg, William; Larsson, Jakob C; Hertz, Hans M
2017-09-18
Ring artifacts reduce image quality in tomography, and arise from faulty detector calibration. In microtomography, we have identified that ring artifacts can arise from high-spatial-frequency variations in the scintillator thickness. Such variations are normally removed by a flat-field correction. However, as the spectrum changes, e.g. due to beam hardening, the detector response varies non-uniformly, introducing ring artifacts that persist after flat-field correction. In this paper, we present a method to correct for ring artifacts from variations in scintillator thickness by using a simple method to characterize the local scintillator response. The method addresses the actual physical cause of the ring artifacts, in contrast to many other ring artifact removal methods, which rely only on image post-processing. By applying the technique to an experimental phantom tomography scan, we show that ring artifacts are strongly reduced compared to only making a flat-field correction.
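For reference, the sketch below shows only the standard dark-/flat-field correction that the paper shows to be insufficient once the spectrum changes; the paper's additional characterization of the local scintillator response is not reproduced here.

```python
# Standard dark-/flat-field correction, i.e. the step the paper shows to be
# insufficient once the spectrum changes; the local scintillator-response
# characterization itself is not reproduced here.
import numpy as np

def flat_field_correct(projection, flat, dark, eps=1e-6):
    """Normalise a projection by the detector response and return line integrals."""
    num = projection.astype(float) - dark
    den = np.clip(flat.astype(float) - dark, eps, None)
    transmission = np.clip(num / den, eps, None)
    return -np.log(transmission)
```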
Johari, Masoumeh; Abdollahzadeh, Milad; Esmaeili, Farzad; Sakhamanesh, Vahideh
2018-01-01
Background: Dental cone beam computed tomography (CBCT) images suffer from severe metal artifacts. These artifacts degrade the quality of acquired image and in some cases make it unsuitable to use. Streaking artifacts and cavities around teeth are the main reason of degradation. Methods: In this article, we have proposed a new artifact reduction algorithm which has three parallel components. The first component extracts teeth based on the modeling of image histogram with a Gaussian mixture model. Striking artifact reduction component reduces artifacts using converting image into the polar domain and applying morphological filtering. The third component fills cavities through a simple but effective morphological filtering operation. Results: Finally, results of these three components are combined into a fusion step to create a visually good image which is more compatible to human visual system. Conclusions: Results show that the proposed algorithm reduces artifacts of dental CBCT images and produces clean images. PMID:29535920
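A rough sketch of the streak-reduction component alone: the slice is resampled into polar coordinates and a grey-scale morphological opening elongated along the angular axis suppresses thin bright streaks before resampling back. The teeth-extraction, cavity-filling, and fusion components are omitted, the polar centre and structuring-element size are assumptions, and ideally the transform would be centred on the detected metal.

```python
# Streak-reduction component only: resample the slice to polar coordinates and
# apply a grey-scale opening elongated along the angle axis. The polar centre
# and structuring-element size are assumptions; the other components are omitted.
import numpy as np
from scipy.ndimage import grey_opening, map_coordinates

def reduce_streaks_polar(image, n_angles=720, se_size=(1, 15)):
    h, w = image.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    n_radii = int(np.hypot(cy, cx)) + 1

    # image -> polar grid (rows: radius, cols: angle)
    r = np.arange(n_radii)
    t = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    R, T = np.meshgrid(r, t, indexing='ij')
    polar = map_coordinates(image.astype(float),
                            [cy + R * np.sin(T), cx + R * np.cos(T)],
                            order=1, mode='nearest')

    # an element short in radius and long in angle removes bright structures
    # that are thin in angle, i.e. streaks radiating outwards from metal
    polar = grey_opening(polar, size=se_size)

    # polar -> image grid
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing='ij')
    rr = np.hypot(yy - cy, xx - cx)
    tt = np.mod(np.arctan2(yy - cy, xx - cx), 2 * np.pi) / (2 * np.pi) * n_angles
    return map_coordinates(polar, [rr, tt], order=1, mode='nearest')
```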
2009-01-01
[Excerpt from report figure and surrounding text] Figure 10: CmT reconstructed image showing glandular tissue, an implant, a biopsy clip, a streak artifact, and adipose tissue; radiotracer uptake by the heart and liver is visible near the breast. The cylindrical artifact is due to the offset geometry (further explanation in Task 2(a)). The emission tomography system is composed of a compact 16x20 cm2 field-of-view cadmium zinc telluride (CZT) LumaGEM detector.