Computer-aided detection of initial polyp candidates with level set-based adaptive convolution
NASA Astrophysics Data System (ADS)
Zhu, Hongbin; Duan, Chaijie; Liang, Zhengrong
2009-02-01
In order to eliminate or weaken the interference between different topological structures on the colon wall, adaptive and normalized convolution methods were used to compute the first- and second-order spatial derivatives of computed tomographic colonography images, which is the starting point for various geometric analyses. However, the performance of such methods depends greatly on the single-layer representation of the colon wall, called the starting layer (SL) in the following text. In this paper, we introduce a level set-based adaptive convolution (LSAC) method to compute the spatial derivatives, in which the level set method is employed to determine a more reasonable SL. The LSAC was applied to a computer-aided detection (CAD) scheme to detect initial polyp candidates, and experiments showed that it benefits the CAD scheme in both detection sensitivity and specificity as compared to our previous work.
Computer-Aided Diagnostic (CAD) Scheme by Use of Contralateral Subtraction Technique
NASA Astrophysics Data System (ADS)
Nagashima, Hiroyuki; Harakawa, Tetsumi
We developed a computer-aided diagnostic (CAD) scheme for detecting subtle image findings of acute cerebral infarction in brain computed tomography (CT) by using a contralateral subtraction technique. In our computerized scheme, the lateral inclination of the image was first corrected automatically by rotating and shifting. The contralateral subtraction image was then derived by subtracting the reversed image from the original image. Initial candidates for acute cerebral infarctions were identified using multiple-thresholding and image filtering techniques. In the first step for removing false-positive candidates, fourteen image features were extracted from each of the initial candidates. Halfway candidates were detected by applying a rule-based test with these image features. In the second step, five image features were extracted using the overlapping scale between halfway candidates in the slice of interest and the upper/lower slice images. Finally, acute cerebral infarction candidates were detected by applying a rule-based test with these five image features. The detection sensitivity for 74 training cases was 97.4% with 3.7 false positives per image. The performance of the CAD scheme for 44 testing cases was comparable to that for the training cases. Our CAD scheme using the contralateral subtraction technique can reveal suspected image findings of acute cerebral infarctions in CT images.
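The contralateral subtraction step described above can be sketched with a toy array. The function name and the toy image values below are illustrative assumptions, and the sketch presumes the tilt-correction step has already aligned the midline with the image center:

```python
import numpy as np

def contralateral_subtraction(image):
    """Subtract the left-right mirrored image from the original.

    Symmetric anatomy largely cancels out, while asymmetric findings
    (e.g. a unilateral density change) survive the subtraction.
    Assumes the midline is already aligned with the image center.
    """
    mirrored = image[:, ::-1]
    return image - mirrored

# Toy example: a symmetric background with one asymmetric "lesion".
brain = np.ones((4, 4))
brain[1, 0] = 5.0            # lesion on the left side only
diff = contralateral_subtraction(brain)
# The lesion appears as a positive residue at its own position and a
# negative residue at the mirrored position; symmetric pixels are zero.
```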
Tan, Maxine; Aghaei, Faranak; Wang, Yunzhi; Zheng, Bin
2017-01-01
The purpose of this study is to evaluate a new method to improve the performance of computer-aided detection (CAD) schemes for screening mammograms with two approaches. In the first approach, we developed a new case based CAD scheme using a set of optimally selected global mammographic density, texture, spiculation, and structural similarity features computed from all four full-field digital mammography (FFDM) images of the craniocaudal (CC) and mediolateral oblique (MLO) views, selected by a modified fast and accurate sequential floating forward selection algorithm. Selected features were then applied to a “scoring fusion” artificial neural network (ANN) classification scheme to produce a final case based risk score. In the second approach, we combined the case based risk score with the lesion based scores of a conventional CAD scheme using a new adaptive cueing method. We evaluated our methods using a ten-fold cross-validation scheme on 924 cases (476 cancer and 448 recalled or negative), whereby each case had all four images from the CC and MLO views. The area under the receiver operating characteristic curve was AUC = 0.793±0.015 and the odds ratio monotonically increased from 1 to 37.21 as CAD-generated case based detection scores increased. Using the new adaptive cueing method, the region based and case based sensitivities of the conventional CAD scheme at a false positive rate of 0.71 per image increased by 2.4% and 0.8%, respectively. The study demonstrated that supplementary information can be derived by computing global mammographic density image features to improve CAD-cueing performance on suspicious mammographic lesions. PMID:27997380
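The feature-selection step above can be sketched with scikit-learn. Note that sklearn's SequentialFeatureSelector implements plain forward selection, not the floating (SFFS) variant the paper uses, and the synthetic data below are invented stand-ins for the mammographic features:

```python
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
# Two informative synthetic features plus three pure-noise features
# (stand-ins for density/texture/spiculation/similarity features).
informative = rng.normal(size=(n, 2))
y = (informative[:, 0] + informative[:, 1] > 0).astype(int)
X = np.hstack([informative, rng.normal(size=(n, 3))])

# Plain forward selection with internal cross-validation.
selector = SequentialFeatureSelector(
    LogisticRegression(), n_features_to_select=2, direction="forward", cv=5)
selector.fit(X, y)
mask = selector.get_support()   # boolean mask over the 5 candidate features
```

With a strong enough signal, forward selection recovers the two informative columns and discards the noise ones.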
NASA Astrophysics Data System (ADS)
Gaffney, Kevin P.; Aghaei, Faranak; Battiste, James; Zheng, Bin
2017-03-01
Detection of residual brain tumor is important to evaluate the efficacy of brain cancer surgery, determine the optimal strategy for further radiation therapy if needed, and assess the ultimate prognosis of the patient. Brain MRI is a commonly used imaging modality for this task. In order to distinguish between residual tumor and surgery-induced scar tissue, two sets of MRI scans are conducted pre- and post-gadolinium contrast injection. The residual tumors are only enhanced in the post-contrast injection images. However, subjectively reading and quantifying this type of brain MR image makes it difficult to detect real residual tumor regions and measure the total volume of the residual tumor. In order to help solve this clinical difficulty, we developed and tested a new interactive computer-aided detection scheme, which consists of three consecutive image processing steps, namely: 1) segmentation of the intracranial region, 2) image registration and subtraction, and 3) tumor segmentation and refinement. The scheme also includes a specially designed and implemented graphical user interface (GUI) platform. When using this scheme, the two sets of pre- and post-contrast injection images are first automatically processed to detect and quantify residual tumor volume. Then, a user can visually examine the segmentation results and conveniently guide the scheme to correct any detection or segmentation errors if needed. The scheme has been repeatedly tested using five cases. Given the high performance and robustness observed in the testing results, the scheme is ready for clinical studies and for helping clinicians investigate the association between this quantitative image marker and patient outcome.
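The registration-and-subtraction step can be sketched with a toy exhaustive integer-shift search. This is a deliberately simplified stand-in (the paper's actual registration method is not specified here), and the point images and function name are invented:

```python
import numpy as np

def best_shift(fixed, moving, max_shift=3):
    """Find the integer (dy, dx) shift of `moving` that best matches
    `fixed` by maximizing cross-correlation -- a toy stand-in for the
    scheme's registration step."""
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(moving, dy, axis=0), dx, axis=1)
            score = np.sum(fixed * shifted)
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best

pre = np.zeros((8, 8)); pre[3, 3] = 1.0
post = np.zeros((8, 8)); post[4, 5] = 1.0   # same structure, shifted
dy, dx = best_shift(pre, post)
aligned = np.roll(np.roll(post, dy, axis=0), dx, axis=1)
residual = aligned - pre   # contrast-enhancing tumor would remain here
```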
Park, Sang Cheol; Chapman, Brian E; Zheng, Bin
2011-06-01
This study developed a computer-aided detection (CAD) scheme for pulmonary embolism (PE) detection and investigated several approaches to improve CAD performance. In the study, 20 computed tomography examinations with various lung diseases were selected, which include 44 verified PE lesions. The proposed CAD scheme consists of five basic steps: 1) lung segmentation; 2) PE candidate extraction using an intensity mask and tobogganing region growing; 3) PE candidate feature extraction; 4) false-positive (FP) reduction using an artificial neural network (ANN); and 5) a multifeature-based k-nearest neighbor for positive/negative classification. In this study, we also investigated the following additional methods to improve CAD performance: 1) grouping 2-D detected features into a single 3-D object; 2) selecting features with a genetic algorithm (GA); and 3) limiting the number of suspicious lesions allowed to be cued in one examination. The results showed that 1) the CAD scheme using tobogganing, an ANN, and the grouping method achieved the maximum detection sensitivity of 79.2%; 2) the maximum scoring method achieved superior performance over other scoring fusion methods; 3) the GA was able to delete "redundant" features and further improve CAD performance; and 4) limiting the maximum number of cued lesions in an examination reduced the FP rate by a factor of 5.3. Combining these approaches, the CAD scheme achieved 63.2% detection sensitivity with 18.4 FP lesions per examination. The study suggested that the performance of CAD schemes for PE detection depends on many factors, which include 1) optimizing the 2-D region grouping and scoring methods; 2) selecting the optimal feature set; and 3) limiting the number of cued lesions allowed per examination.
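The cue-limiting step above can be sketched directly: keep only the top-scoring candidates per examination. The function name, threshold, and scores below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def cue_top_candidates(scores, max_cues=5, threshold=0.5):
    """Keep at most `max_cues` highest-scoring suspicious regions per
    examination -- a sketch of the FP-limiting step."""
    idx = np.argsort(scores)[::-1]          # indices, descending by score
    idx = idx[scores[idx] >= threshold]     # drop low-confidence candidates
    return idx[:max_cues]

scores = np.array([0.9, 0.2, 0.7, 0.95, 0.6, 0.55, 0.8])
cued = cue_top_candidates(scores, max_cues=3)
```

Capping the number of cues trades a small sensitivity loss for a large reduction in FPs per examination, as the abstract reports.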
Computer-Aided Diagnosis in Medical Imaging: Historical Review, Current Status and Future Potential
Doi, Kunio
2007-01-01
Computer-aided diagnosis (CAD) has become one of the major research subjects in medical imaging and diagnostic radiology. In this article, the motivation and philosophy for early development of CAD schemes are presented together with the current status and future potential of CAD in a PACS environment. With CAD, radiologists use the computer output as a “second opinion” and make the final decisions. CAD is a concept established by taking into account equally the roles of physicians and computers, whereas automated computer diagnosis is a concept based on computer algorithms only. With CAD, the performance by computers does not have to be comparable to or better than that by physicians, but needs to be complementary to that by physicians. In fact, a large number of CAD systems have been employed for assisting physicians in the early detection of breast cancers on mammograms. A CAD scheme that makes use of lateral chest images has the potential to improve the overall performance in the detection of lung nodules when combined with another CAD scheme for PA chest images. Because vertebral fractures can be detected reliably by computer on lateral chest radiographs, radiologists’ accuracy in the detection of vertebral fractures would be improved by the use of CAD, and thus early diagnosis of osteoporosis would become possible. In MRA, a CAD system has been developed for assisting radiologists in the detection of intracranial aneurysms. On successive bone scan images, a CAD scheme for detection of interval changes has been developed by use of temporal subtraction images. In the future, many CAD schemes could be assembled as packages and implemented as a part of PACS. 
For example, the package for chest CAD may include the computerized detection of lung nodules, interstitial opacities, cardiomegaly, vertebral fractures, and interval changes in chest radiographs as well as the computerized classification of benign and malignant nodules and the differential diagnosis of interstitial lung diseases. In order to assist in the differential diagnosis, it would be possible to search for and retrieve images (or lesions) with known pathology, which would be very similar to a new unknown case, from PACS when a reliable and useful method has been developed for quantifying the similarity of a pair of images for visual comparison by radiologists. PMID:17349778
Shi, Zhenghao; Ma, Jiejue; Feng, Yaning; He, Lifeng; Suzuki, Kenji
2015-11-01
MTANN (massive training artificial neural network) is a promising tool that has been applied in recent years to eliminate false positives in thoracic CT. In order to evaluate whether this method is feasible for eliminating false positives from different CAD schemes, especially when applied to commercial CAD software, this paper evaluates the performance of the method in eliminating false positives produced by three different versions of commercial CAD software for lung nodule detection in chest radiographs. Experimental results demonstrate that the approach is useful in reducing FPs for different computer-aided lung nodule detection software in chest radiographs.
A new approach to develop computer-aided detection schemes of digital mammograms
NASA Astrophysics Data System (ADS)
Tan, Maxine; Qian, Wei; Pu, Jiantao; Liu, Hong; Zheng, Bin
2015-06-01
The purpose of this study is to develop a new global mammographic image feature analysis based computer-aided detection (CAD) scheme and evaluate its performance in detecting positive screening mammography examinations. A dataset that includes images acquired from 1896 full-field digital mammography (FFDM) screening examinations was used in this study. Among them, 812 cases were positive for cancer and 1084 were negative or benign. After segmenting the breast area, a computerized scheme was applied to compute 92 global mammographic tissue density based features on each of four mammograms of the craniocaudal (CC) and mediolateral oblique (MLO) views. After adding three existing popular risk factors (woman’s age, subjectively rated mammographic density, and family breast cancer history) into the initial feature pool, we applied a sequential forward floating selection feature selection algorithm to select relevant features from the bilateral CC and MLO view images separately. The selected CC and MLO view image features were used to train two artificial neural networks (ANNs). The results were then fused by a third ANN to build a two-stage classifier to predict the likelihood of the FFDM screening examination being positive. CAD performance was tested using a ten-fold cross-validation method. The computed area under the receiver operating characteristic curve was AUC = 0.779 ± 0.025 and the odds ratio monotonically increased from 1 to 31.55 as CAD-generated detection scores increased. The study demonstrated that this new global image feature based CAD scheme had a relatively higher discriminatory power to cue the FFDM examinations with high risk of being positive, which may provide a new CAD-cueing method to assist radiologists in reading and interpreting screening mammograms.
NASA Astrophysics Data System (ADS)
Mirniaharikandehei, Seyedehnafiseh; Hollingsworth, Alan B.; Patel, Bhavika; Heidari, Morteza; Liu, Hong; Zheng, Bin
2018-05-01
This study aims to investigate the feasibility of identifying a new quantitative imaging marker based on false-positives generated by a computer-aided detection (CAD) scheme to help predict short-term breast cancer risk. An image dataset including four view mammograms acquired from 1044 women was retrospectively assembled. All mammograms were originally interpreted as negative by radiologists. In the next subsequent mammography screening, 402 women were diagnosed with breast cancer and 642 remained negative. An existing CAD scheme was applied ‘as is’ to process each image. From CAD-generated results, four detection features including the total number of (1) initial detection seeds and (2) the final detected false-positive regions, (3) average and (4) sum of detection scores, were computed from each image. Then, by combining the features computed from two bilateral images of left and right breasts from either craniocaudal or mediolateral oblique view, two logistic regression models were trained and tested using a leave-one-case-out cross-validation method to predict the likelihood of each testing case being positive in the next subsequent screening. The new prediction model yielded the maximum prediction accuracy with an area under a ROC curve of AUC = 0.65 ± 0.017 and the maximum adjusted odds ratio of 4.49 with a 95% confidence interval of (2.95, 6.83). The results also showed an increasing trend in the adjusted odds ratio and risk prediction scores (p < 0.01). Thus, this study demonstrated that CAD-generated false-positives might include valuable information, which needs to be further explored for identifying and/or developing more effective imaging markers for predicting short-term breast cancer risk.
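The risk-prediction step above (logistic regression evaluated with leave-one-case-out cross-validation) can be sketched with scikit-learn. The two synthetic features below are invented stand-ins for the CAD-derived detection features (e.g. FP-region count and mean detection score):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(1)
n = 60
# Hypothetical per-case CAD features (synthetic stand-ins).
X = rng.normal(size=(n, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(int)

# Leave-one-case-out cross-validated risk scores: each case is scored
# by a model trained on all other cases.
probs = cross_val_predict(LogisticRegression(), X, y,
                          cv=LeaveOneOut(), method="predict_proba")[:, 1]
auc = roc_auc_score(y, probs)
```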
Ji-Wook Jeong; Seung-Hoon Chae; Eun Young Chae; Hak Hee Kim; Young Wook Choi; Sooyeul Lee
2016-08-01
A computer-aided detection (CADe) algorithm for clustered microcalcifications (MCs) in reconstructed digital breast tomosynthesis (DBT) images is suggested. The MC-like objects were enhanced by a Hessian-based 3D calcification response function, and a signal-to-noise ratio (SNR) enhanced image was also generated to screen the MC clustering seed objects. A connected component segmentation method was used to detect the cluster seed objects, which were considered as potential clustering centers of MCs. Bounding cubes for the accepted clustering seed candidate were generated and the overlapping cubes were combined and examined. After the MC clustering and false-positive (FP) reduction step, the average number of FPs was estimated to be 0.87 per DBT volume with a sensitivity of 90.5%.
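The connected component step used to find cluster seed objects can be sketched with scipy on a 2D binary map (a toy stand-in for the thresholded SNR-enhanced DBT volume; the mask values are invented):

```python
import numpy as np
from scipy import ndimage

# Binary map after thresholding the enhanced image (toy 2D stand-in
# for the 3D DBT volume).
mask = np.array([
    [1, 1, 0, 0, 0],
    [1, 0, 0, 0, 1],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 0, 0],
], dtype=int)

# Label connected components (4-connectivity by default); each
# component is a potential MC clustering seed object.
labels, n_seeds = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, index=range(1, n_seeds + 1))
```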
A novel scheme to aid coherent detection of GMSK signals in fast Rayleigh fading channels
NASA Technical Reports Server (NTRS)
Leung, Patrick S. K.; Feher, Kamilo
1990-01-01
A novel scheme is proposed to insert a carrier pilot into a Gaussian Minimum Shift Keying (GMSK) signal using a Binary Block Code (BBC) and a highpass filter in baseband. This allows the signal to be coherently demodulated even in a fast Rayleigh fading environment. As an illustrative example, the scheme is applied to a 16 kb/s GMSK signal, and its performance over a fast Rayleigh fading channel is investigated using computer simulation. This modem's 'irreducible error rate' is found to be Pe = 5.5 x 10^(-5), which is more than that of differential detection. The modem's performance in a Rician fading channel is currently under investigation.
NASA Astrophysics Data System (ADS)
Nishikawa, Robert M.; Giger, Maryellen L.; Doi, Kunio; Vyborny, Carl J.; Schmidt, Robert A.; Metz, Charles E.; Wu, Chris Y.; Yin, Fang-Fang; Jiang, Yulei; Huo, Zhimin; Lu, Ping; Zhang, Wei; Ema, Takahiro; Bick, Ulrich; Papaioannou, John; Nagel, Rufus H.
1993-07-01
We are developing an 'intelligent' workstation to assist radiologists in diagnosing breast cancer from mammograms. The hardware for the workstation will consist of a film digitizer, a high speed computer, a large volume storage device, a film printer, and 4 high resolution CRT monitors. The software for the workstation is a comprehensive package of automated detection and classification schemes. Two rule-based detection schemes have been developed, one for breast masses and the other for clustered microcalcifications. The sensitivity of both schemes is 85% with a false-positive rate of approximately 3.0 and 1.5 false detections per image, for the mass and cluster detection schemes, respectively. Computerized classification is performed by an artificial neural network (ANN). The ANN has a sensitivity of 100% with a specificity of 60%. Currently, the ANN, which is a three-layer, feed-forward network, requires as input ratings of 14 different radiographic features of the mammogram that were determined subjectively by a radiologist. We are in the process of developing automated techniques to objectively determine these 14 features. The workstation will be placed in the clinical reading area of the radiology department in the near future, where controlled clinical tests will be performed to measure its efficacy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen Sheng; Suzuki, Kenji; MacMahon, Heber
2011-04-15
Purpose: To develop a computer-aided detection (CADe) scheme for nodules in chest radiographs (CXRs) with a high sensitivity and a low false-positive (FP) rate. Methods: The authors developed a CADe scheme consisting of five major steps designed to improve the overall performance of CADe schemes. First, to segment the lung fields accurately, the authors developed a multisegment active shape model. Then, a two-stage nodule-enhancement technique was developed for improving the conspicuity of nodules. Initial nodule candidates were detected and segmented by using the clustering watershed algorithm. Thirty-one shape-, gray-level-, surface-, and gradient-based features were extracted from each segmented candidate to determine the feature space, including one new feature based on the Canny edge detector to eliminate a major FP source caused by rib crossings. Finally, a nonlinear support vector machine (SVM) with a Gaussian kernel was employed for classification of the nodule candidates. Results: To evaluate and compare the scheme to other published CADe schemes, the authors used a publicly available database containing 140 nodules in 140 CXRs and 93 normal CXRs. The CADe scheme based on the SVM classifier achieved sensitivities of 78.6% (110/140) and 71.4% (100/140) with averages of 5.0 (1165/233) FPs/image and 2.0 (466/233) FPs/image, respectively, in a leave-one-out cross-validation test, whereas the CADe scheme based on a linear discriminant analysis classifier had a sensitivity of 60.7% (85/140) at an FP rate of 5.0 FPs/image. For nodules classified as "very subtle" and "extremely subtle," a sensitivity of 57.1% (24/42) was achieved at an FP rate of 5.0 FPs/image. When the authors used a database developed at the University of Chicago, the sensitivities were 83.3% (40/48) and 77.1% (37/48) at FP rates of 5.0 (240/48) and 2.0 (96/48) FPs/image, respectively.
Conclusions: These results compare favorably to those described for other commercial and noncommercial CADe nodule detection systems.
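The final classification step (a nonlinear SVM with a Gaussian kernel) can be sketched with scikit-learn on synthetic two-feature candidates. The feature values and class geometry below are invented for illustration, not taken from the paper:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
# Hypothetical nodule-candidate features: true nodules vs. rib-crossing
# false positives, drawn as two well-separated synthetic clusters.
nodules = rng.normal(loc=1.0, scale=0.3, size=(40, 2))
ribs = rng.normal(loc=-1.0, scale=0.3, size=(40, 2))
X = np.vstack([nodules, ribs])
y = np.array([1] * 40 + [0] * 40)

# Gaussian (RBF) kernel SVM, as in the paper's classification stage.
clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
pred = clf.predict([[1.0, 1.0], [-1.0, -1.0]])
```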
Computer-aided classification of breast masses using contrast-enhanced digital mammograms
NASA Astrophysics Data System (ADS)
Danala, Gopichandh; Aghaei, Faranak; Heidari, Morteza; Wu, Teresa; Patel, Bhavika; Zheng, Bin
2018-02-01
By taking advantage of both mammography and breast MRI, contrast-enhanced digital mammography (CEDM) has emerged as a promising new imaging modality to improve the efficacy of breast cancer screening and diagnosis. The primary objective of this study is to develop and evaluate a new computer-aided detection and diagnosis (CAD) scheme of CEDM images to classify between malignant and benign breast masses. A CEDM dataset consisting of 111 patients (33 benign and 78 malignant) was retrospectively assembled. Each case includes two types of images, namely low-energy (LE) and dual-energy subtracted (DES) images. First, the CAD scheme applied a hybrid segmentation method to automatically segment masses depicted on LE and DES images separately. Optimal segmentation results from DES images were also mapped to LE images and vice versa. Next, a set of 109 quantitative image features related to mass shape and density heterogeneity was initially computed. Last, four multilayer perceptron-based machine learning classifiers integrated with a correlation-based feature subset evaluator and a leave-one-case-out cross-validation method were built to classify mass regions depicted on LE and DES images, respectively. Initially, when the CAD scheme was applied to the original segmentations of DES and LE images, the areas under the ROC curves were 0.7585±0.0526 and 0.7534±0.0470, respectively. After optimal segmentation mapping from DES to LE images, the AUC value of the CAD scheme significantly increased to 0.8477±0.0376 (p<0.01). Since DES images eliminate the overlapping effect of dense breast tissue on lesions, segmentation accuracy was significantly improved as compared to regular mammograms. The study demonstrated that computer-aided classification of breast masses using CEDM images yielded higher performance.
An Automatic Detection System of Lung Nodule Based on Multi-Group Patch-Based Deep Learning Network.
Jiang, Hongyang; Ma, He; Qian, Wei; Gao, Mengdi; Li, Yan
2017-07-14
High-efficiency lung nodule detection contributes dramatically to the risk assessment of lung cancer. Quickly locating the exact positions of lung nodules is a significant and challenging task. Extensive work has been done by researchers in this domain for approximately two decades. However, previous computer-aided detection (CADe) schemes are mostly intricate and time-consuming, since they may require multiple image processing modules, such as computed tomography (CT) image transformation, lung nodule segmentation and feature extraction, to construct a whole CADe system. It is difficult for those schemes to process and analyze enormous amounts of data as the number of medical images continues to increase. Besides, some state-of-the-art deep learning schemes may impose strict requirements on the database. This study proposes an effective lung nodule detection scheme based on multi-group patches cut out from the lung images, which are enhanced by the Frangi filter. By combining two groups of images, a four-channel convolutional neural network (CNN) model is designed to learn the knowledge of radiologists for detecting nodules of four levels. This CADe scheme achieves a sensitivity of 80.06% with 4.7 false positives per scan and a sensitivity of 94% with 15.1 false positives per scan. The results demonstrate that the multi-group patch-based learning system is efficient in improving the performance of lung nodule detection and greatly reduces false positives under a huge amount of image data.
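The Hessian-based enhancement idea behind the Frangi filter can be sketched with a toy response: bright blob-like structures give strongly negative Hessian eigenvalues, so the negated eigenvalue sum highlights them. This is a crude single-scale sketch, not the full Frangi vesselness, and the function name and test image are invented:

```python
import numpy as np

def hessian_blob_response(img):
    """Toy Hessian-based enhancement in the spirit of the Frangi filter:
    for a bright blob both eigenvalues are strongly negative, so we use
    max(-(l1 + l2), 0) as a crude per-pixel response."""
    gy, gx = np.gradient(img)
    gyy, gyx = np.gradient(gy)
    gxy, gxx = np.gradient(gx)
    # Eigenvalues of the 2x2 Hessian [[gxx, gxy], [gyx, gyy]] per pixel.
    tr = gxx + gyy
    det = gxx * gyy - gxy * gyx
    disc = np.sqrt(np.maximum(tr ** 2 / 4 - det, 0))
    l1, l2 = tr / 2 + disc, tr / 2 - disc
    return np.maximum(-(l1 + l2), 0)

img = np.zeros((7, 7))
img[3, 3] = 1.0                     # a bright nodule-like spot
resp = hessian_blob_response(img)   # peaks exactly at the spot
```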
Robust Spacecraft Component Detection in Point Clouds.
Wei, Quanmao; Jiang, Zhiguo; Zhang, Haopeng
2018-03-21
Automatic component detection of spacecraft can assist in on-orbit operation and space situational awareness. Spacecraft are generally composed of solar panels and cuboidal or cylindrical modules. These components can be simply represented by geometric primitives such as planes, cuboids and cylinders. Based on this prior, we propose a robust scheme to automatically detect such basic components of spacecraft in three-dimensional (3D) point clouds. In the proposed scheme, cylinders are first detected by iterating energy-based geometric model fitting and cylinder parameter estimation. Then, planes are detected by the Hough transform and further described as bounded patches with their minimum bounding rectangles. Finally, cuboids are detected from the detected patches using pairwise geometric relations. After successive detection of cylinders, planar patches and cuboids, a mid-level geometric representation of the spacecraft can be delivered. We tested the proposed component detection scheme on spacecraft 3D point clouds synthesized from computer-aided design (CAD) models and recovered by image-based reconstruction, respectively. Experimental results illustrate that the proposed scheme can detect the basic geometric components effectively and is robust to noise and variations in point distribution density.
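The primitive-fitting idea can be sketched with a minimal RANSAC plane detector. Note this is a simplified stand-in: the paper itself uses energy-based model fitting for cylinders and Hough voting for planes, and the "solar panel" point cloud below is synthetic:

```python
import numpy as np

def fit_plane_ransac(points, n_iter=200, tol=0.05, seed=0):
    """Minimal RANSAC plane detector: repeatedly fit a plane to three
    random points and keep the plane with the most inliers."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:              # degenerate (collinear) sample
            continue
        normal /= norm
        inliers = np.abs((points - p0) @ normal) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers

rng = np.random.default_rng(3)
# A planar "solar panel" at z = 0 plus off-plane clutter points.
panel = np.column_stack([rng.uniform(-1, 1, (100, 2)), np.zeros(100)])
clutter = rng.uniform(-1, 1, (20, 3)) + np.array([0.0, 0.0, 2.0])
cloud = np.vstack([panel, clutter])
inliers = fit_plane_ransac(cloud)    # recovers the panel points
```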
Reduction of bias and variance for evaluation of computer-aided diagnostic schemes.
Li, Qiang; Doi, Kunio
2006-04-01
Computer-aided diagnostic (CAD) schemes have been developed to assist radiologists in detecting various lesions in medical images. In addition to the development, an equally important problem is the reliable evaluation of the performance levels of various CAD schemes. It is good to see that more and more investigators are employing more reliable evaluation methods such as leave-one-out and cross validation, instead of less reliable methods such as resubstitution, for assessing their CAD schemes. However, the common applications of leave-one-out and cross-validation evaluation methods do not necessarily imply that the estimated performance levels are accurate and precise. Pitfalls often occur in the use of leave-one-out and cross-validation evaluation methods, and they lead to unreliable estimation of performance levels. In this study, we first identified a number of typical pitfalls for the evaluation of CAD schemes, and conducted a Monte Carlo simulation experiment for each of the pitfalls to demonstrate quantitatively the extent of bias and/or variance caused by the pitfall. Our experimental results indicate that considerable bias and variance may exist in the estimated performance levels of CAD schemes if one employs various flawed leave-one-out and cross-validation evaluation methods. In addition, for promoting and utilizing a high standard for reliable evaluation of CAD schemes, we attempt to make recommendations, whenever possible, for overcoming these pitfalls. We believe that, with the recommended evaluation methods, we can considerably reduce the bias and variance in the estimated performance levels of CAD schemes.
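One of the classic pitfalls the abstract alludes to can be demonstrated in a few lines: selecting features on the full dataset before cross-validating leaks test information into training and inflates the estimated performance, even on pure noise. The Monte Carlo setup below (sample sizes, feature counts, classifier) is an illustrative assumption, not the paper's exact experiment:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(4)
n, p = 50, 500
X = rng.normal(size=(n, p))          # pure noise: true accuracy is 0.5
y = rng.integers(0, 2, size=n)

# Pitfall: pick the 10 features most correlated with y using ALL cases,
# then cross-validate -- the selection has already seen the test folds.
corr = np.abs((X - X.mean(0)).T @ (y - y.mean()))
top = np.argsort(corr)[-10:]
biased = cross_val_score(KNeighborsClassifier(), X[:, top], y, cv=5).mean()

# Honest baseline: features chosen without looking at y at all (here
# simply the first 10), standing in for proper in-fold selection.
honest = cross_val_score(KNeighborsClassifier(), X[:, :10], y, cv=5).mean()
```

On pure noise the biased estimate is well above chance while the honest one hovers near 0.5, which is exactly the kind of optimistic bias the study quantifies.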
Gong, Jing; Liu, Ji-Yu; Sun, Xi-Wen; Zheng, Bin; Nie, Sheng-Dong
2018-02-05
This study aims to develop a computer-aided diagnosis (CADx) scheme for classification between malignant and benign lung nodules, and also assess whether CADx performance changes in detecting nodules associated with early and advanced stage lung cancer. The study involves 243 biopsy-confirmed pulmonary nodules. Among them, 76 are benign, 81 are stage I and 86 are stage III malignant nodules. The cases are separated into three data sets involving: (1) all nodules, (2) benign and stage I malignant nodules, and (3) benign and stage III malignant nodules. A CADx scheme is applied to segment lung nodules depicted on computed tomography images and we initially computed 66 3D image features. Then, three machine learning models namely, a support vector machine, naïve Bayes classifier and linear discriminant analysis, are separately trained and tested by using three data sets and a leave-one-case-out cross-validation method embedded with a Relief-F feature selection algorithm. When separately using three data sets to train and test three classifiers, the average areas under receiver operating characteristic curves (AUC) are 0.94, 0.90 and 0.99, respectively. When using the classifiers trained using data sets with all nodules, average AUC values are 0.88 and 0.99 for detecting early and advanced stage nodules, respectively. AUC values computed from three classifiers trained using the same data set are consistent without statistically significant difference (p > 0.05). This study demonstrates (1) the feasibility of applying a CADx scheme to accurately distinguish between benign and malignant lung nodules, and (2) a positive trend between CADx performance and cancer progression stage. Thus, in order to increase CADx performance in detecting subtle and early cancer, training data sets should include more diverse early stage cancer cases.
NASA Astrophysics Data System (ADS)
Tan, Maxine; Aghaei, Faranak; Wang, Yunzhi; Qian, Wei; Zheng, Bin
2016-03-01
Current commercialized CAD schemes have high false-positive (FP) detection rates and also correlate highly with radiologists in positive lesion detection. Thus, we recently investigated a new approach to improve the efficacy of applying CAD to assist radiologists in reading and interpreting screening mammograms. Namely, we developed a new global-feature-based CAD approach/scheme that can cue a warning sign on cases at high risk of being positive. In this study, we investigate the possibility of fusing global (case-based) scores with local (lesion-based) CAD scores using an adaptive cueing method. We hypothesize that the information from global feature extraction (features extracted from the whole breast regions) is different from, and can provide supplementary information to, the locally extracted features (computed from the segmented lesion regions only). On a large and diverse full-field digital mammography (FFDM) testing dataset of 785 cases (347 negative and 438 cancer cases with masses only), we ran our lesion-based and case-based CAD schemes "as is" on the whole dataset. To assess the supplementary information provided by the global features, we used an adaptive cueing method to adaptively adjust the original CAD-generated detection score (Sorg) of a detected suspicious mass region based on the computed case-based score (Scase) of the case associated with this detected region. Using the adaptive cueing method, better sensitivity results were obtained at lower FP rates (<= 1 FP per image); namely, increases in sensitivity (in the FROC curves) of up to 6.7% and 8.2% were obtained for the ROI-based and case-based results, respectively.
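The adaptive cueing idea, adjusting a lesion-based score Sorg using the case-based score Scase, might look like the following minimal sketch; the linear adjustment rule and the alpha parameter are assumptions for illustration, since the abstract does not specify the fusion function:

```python
def adaptive_cue(s_org, s_case, alpha=0.5):
    # Shift the lesion-based score toward the case-based risk score:
    # boost regions in high-risk cases, damp them in low-risk cases.
    # The linear form and alpha are assumptions, not the paper's rule.
    s = s_org + alpha * (s_case - 0.5)
    return min(1.0, max(0.0, s))          # keep the score in [0, 1]

boosted = adaptive_cue(0.6, 0.9)   # high-risk case raises the region score
damped = adaptive_cue(0.6, 0.1)    # low-risk case lowers it
```

Any monotone fusion with this boost/damp behavior would reproduce the qualitative effect described above.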
Computer-aided Classification of Mammographic Masses Using Visually Sensitive Image Features
Wang, Yunzhi; Aghaei, Faranak; Zarafshani, Ali; Qiu, Yuchen; Qian, Wei; Zheng, Bin
2017-01-01
Purpose To develop a new computer-aided diagnosis (CAD) scheme that computes the visually sensitive image features routinely used by radiologists, builds a machine learning classifier, and distinguishes between malignant and benign breast masses detected on digital mammograms. Methods An image dataset including 301 breast masses was retrospectively selected. From each segmented mass region, we computed image features that mimic five categories of visually sensitive features routinely used by radiologists in reading mammograms. We then selected five optimal features in the five feature categories and applied logistic regression models for classification. A new CAD interface was also designed to show lesion segmentation, computed feature values, and the classification score. Results Areas under the ROC curve (AUC) were 0.786±0.026 and 0.758±0.027 when classifying the mass regions depicted on the two view images, respectively. By fusing the classification scores computed from the two regions, the AUC increased to 0.806±0.025. Conclusion This study demonstrated a new approach to developing a CAD scheme based on five visually sensitive image features. Combined with a "visual aid" interface, CAD results may be much more easily explained to observers, increasing their confidence in the CAD-generated classification results compared with conventional CAD approaches that involve many complicated and visually insensitive texture features. PMID:27911353
Computer-aided Instructional System for Transmission Line Simulation.
ERIC Educational Resources Information Center
Reinhard, Erwin A.; Roth, Charles H., Jr.
A computer-aided instructional system has been developed which utilizes dynamic computer-controlled graphic displays and which requires student interaction with a computer simulation in an instructional mode. A numerical scheme has been developed for digital simulation of a uniform, distortionless transmission line with resistive terminations and…
Mazurowski, Maciej A; Lo, Joseph Y; Harrawood, Brian P; Tourassi, Georgia D
2011-01-01
Development of a computational decision aid for a new medical imaging modality is typically a long and complicated process. It consists of collecting data in the form of images and annotations, developing image processing and pattern recognition algorithms for analysis of the new images, and finally testing the resulting system. Since new imaging modalities are developed more rapidly than ever before, any effort to decrease the time and cost of this development process could maximize the benefit of the new imaging modality to patients by making the computer aids quickly available to the radiologists who interpret the images. In this paper, we take a step in this direction and investigate the possibility of translating knowledge about the detection problem from one imaging modality to another. Specifically, we present a computer-aided detection (CAD) system for mammographic masses that uses a mutual information-based template matching scheme with intelligently selected templates. We have previously presented the principles of template matching with mutual information for mammography; in this paper, we present an implementation of those principles in a complete computer-aided detection system. The proposed system, through an automatic optimization process, chooses the most useful templates (mammographic regions of interest) using a large database of previously collected and annotated mammograms. Through this process, knowledge about the task of detecting masses in mammograms is incorporated in the system. We then evaluate whether our system, developed for screen-film mammograms, can be successfully applied not only to other mammograms but also to digital breast tomosynthesis (DBT) reconstructed slices without adding any DBT cases for training.
Our rationale is that since mutual information is known to be a robust intermodality image similarity measure, it has high potential of transferring knowledge between modalities in the context of the mass detection task. Experimental evaluation of the system on mammograms showed competitive performance compared to other mammography CAD systems recently published in the literature. When the system was applied “as-is” to DBT, its performance was notably worse than that for mammograms. However, with a simple additional preprocessing step, the performance of the system reached levels similar to that obtained for mammograms. In conclusion, the presented CAD system not only performed competitively on screen-film mammograms but it also performed robustly on DBT showing that direct transfer of knowledge across breast imaging modalities for mass detection is in fact possible. PMID:21554985
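The mutual-information similarity measure at the core of this template matching scheme can be computed from a joint gray-level histogram. A minimal sketch follows; the 16-bin histogram and the 32x32 patch size are arbitrary illustration choices, not the system's actual settings:

```python
import numpy as np

def mutual_information(a, b, bins=16):
    # Mutual information from the joint gray-level histogram of two
    # equally sized patches; higher values mean more shared structure,
    # which is why MI is robust across imaging modalities.
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of patch a
    py = pxy.sum(axis=0, keepdims=True)   # marginal of patch b
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
template = rng.random((32, 32))                      # a "template" ROI
match = template + 0.05 * rng.normal(size=(32, 32))  # same structure + noise
unrelated = rng.random((32, 32))                     # a different region
mi_match = mutual_information(template, match)
mi_unrelated = mutual_information(template, unrelated)
```

A candidate region scoring high MI against many mass templates would be flagged as suspicious; the intensity mapping between the two patches never needs to be linear, which is the property exploited when transferring to DBT.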
Optical tomographic detection of rheumatoid arthritis with computer-aided classification schemes
NASA Astrophysics Data System (ADS)
Klose, Christian D.; Klose, Alexander D.; Netz, Uwe; Beuthan, Jürgen; Hielscher, Andreas H.
2009-02-01
A recent research study has shown that combining multiple parameters drawn from optical tomographic images leads to better classification results in identifying human finger joints that are or are not affected by rheumatoid arthritis (RA). Building on the findings of that study, this article presents an advanced computer-aided classification approach for interpreting optical image data to detect RA in finger joints. Additional data are used including, for example, maximum and minimum values of the absorption coefficient as well as their ratios and image variances. Classification performances obtained by the proposed method were evaluated in terms of sensitivity, specificity, Youden index, and area under the curve (AUC). Results were compared to different benchmarks ("gold standards"): magnetic resonance, ultrasound, and clinical evaluation. Maximum accuracies (AUC = 0.88) were reached when combining minimum/maximum ratios and image variances and using ultrasound as the gold standard.
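The sensitivity, specificity, and Youden index used to evaluate these classifiers can be illustrated with a small worked example; the scores and labels below are invented purely for illustration:

```python
def youden_index(scores, labels, threshold):
    # Youden index J = sensitivity + specificity - 1 at one decision
    # threshold; J = 0 is chance level, J = 1 is a perfect classifier.
    tp = sum(s >= threshold and l == 1 for s, l in zip(scores, labels))
    fn = sum(s < threshold and l == 1 for s, l in zip(scores, labels))
    tn = sum(s < threshold and l == 0 for s, l in zip(scores, labels))
    fp = sum(s >= threshold and l == 0 for s, l in zip(scores, labels))
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens + spec - 1.0

# 6 joints: 4 affected (label 1), 2 unaffected (label 0).
j = youden_index([0.9, 0.8, 0.7, 0.4, 0.3, 0.2], [1, 1, 1, 0, 0, 1], 0.5)
```

Here sensitivity is 3/4 and specificity is 2/2, giving J = 0.75; sweeping the threshold and integrating sensitivity against (1 - specificity) yields the AUC reported above.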
Li, Feng
2015-07-01
This review paper is based on our research experience in the past 30 years. The importance of radiologists' role is discussed in the development or evaluation of new medical images and of computer-aided detection (CAD) schemes in chest radiology. The four main topics include (1) introducing what diseases can be included in a research database for different imaging techniques or CAD systems and what imaging database can be built by radiologists, (2) understanding how radiologists' subjective judgment can be combined with technical objective features to improve CAD performance, (3) sharing our experience in the design of successful observer performance studies, and (4) finally, discussing whether the new images and CAD systems can improve radiologists' diagnostic ability in chest radiology. In conclusion, advanced imaging techniques and detection/classification of CAD systems have a potential clinical impact on improvement of radiologists' diagnostic ability, for both the detection and the differential diagnosis of various lung diseases, in chest radiology.
NASA Astrophysics Data System (ADS)
Viswanath, Satish; Tiwari, Pallavi; Rosen, Mark; Madabhushi, Anant
2008-03-01
Recently, in vivo Magnetic Resonance Imaging (MRI) and Magnetic Resonance Spectroscopy (MRS) have emerged as promising new modalities to aid in prostate cancer (CaP) detection. MRI provides anatomic and structural information of the prostate while MRS provides functional data pertaining to biochemical concentrations of metabolites such as creatine, choline and citrate. We have previously presented a hierarchical clustering scheme for CaP detection on in vivo prostate MRS and have recently developed a computer-aided method for CaP detection on in vivo prostate MRI. In this paper we present a novel scheme to develop a meta-classifier to detect CaP in vivo via quantitative integration of multimodal prostate MRS and MRI by use of non-linear dimensionality reduction (NLDR) methods including spectral clustering and locally linear embedding (LLE). Quantitative integration of multimodal image data (MRI and PET) involves the concatenation of image intensities following image registration. However multimodal data integration is non-trivial when the individual modalities include spectral and image intensity data. We propose a data combination solution wherein we project the feature spaces (image intensities and spectral data) associated with each of the modalities into a lower dimensional embedding space via NLDR. NLDR methods preserve the relationships between the objects in the original high dimensional space when projecting them into the reduced low dimensional space. Since the original spectral and image intensity data are divorced from their original physical meaning in the reduced dimensional space, data at the same spatial location can be integrated by concatenating the respective embedding vectors. Unsupervised consensus clustering is then used to partition objects into different classes in the combined MRS and MRI embedding space. 
Quantitative results of our multimodal computer-aided diagnosis scheme on 16 sets of patient data obtained from the ACRIN trial, for which corresponding histological ground truth for spatial extent of CaP is known, show a marginally higher sensitivity, specificity, and positive predictive value compared to corresponding CAD results with the individual modalities.
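The integration strategy described above, embedding each modality separately and concatenating the per-location embedding vectors before clustering, can be sketched as follows. PCA is used as a simple linear stand-in for the paper's NLDR methods (LLE, spectral clustering), a two-cluster k-means stands in for unsupervised consensus clustering, and the data are synthetic:

```python
import numpy as np

def embed(X, k=2):
    # PCA projection: a linear stand-in for the NLDR step, used purely
    # for illustration of the dimensionality-reduction-then-concatenate idea.
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:k].T

def kmeans2(Z, iters=20, seed=0):
    # Minimal 2-cluster k-means standing in for consensus clustering.
    rng = np.random.default_rng(seed)
    c = Z[rng.choice(len(Z), 2, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(Z[:, None, :] - c[None, :, :], axis=2)
        lab = d.argmin(axis=1)
        c = np.array([Z[lab == j].mean(axis=0) if np.any(lab == j) else c[j]
                      for j in (0, 1)])
    return lab

rng = np.random.default_rng(2)
n = 60
truth = np.array([0] * 30 + [1] * 30)               # benign / CaP locations
mri = rng.normal(truth[:, None] * 3.0, 1.0, (n, 10))   # "image intensities"
mrs = rng.normal(truth[:, None] * 3.0, 1.0, (n, 50))   # "spectral data"
# Integration: concatenate the low-dimensional embeddings per location,
# which is meaningful only after both modalities live in embedding space.
Z = np.hstack([embed(mri), embed(mrs)])
labels = kmeans2(Z)
# Cluster labels are arbitrary, so score agreement up to a label flip.
agree = max((labels == truth).mean(), (labels != truth).mean())
```

The key point mirrored here is that raw spectra (50-dim) and intensities (10-dim) cannot be concatenated directly on equal footing, but their 2-dim embeddings can.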
NASA Astrophysics Data System (ADS)
An, Chan-Ho; Yang, Janghoon; Jang, Seunghun; Kim, Dong Ku
In this letter, a pre-processed lattice reduction (PLR) scheme is developed for the lattice-reduction-aided (LRA) detection of multiple-input multiple-output (MIMO) systems in spatially correlated channels. The PLR computes the LLL-reduced matrix of an equivalent matrix, the product of the present channel matrix and the unimodular transformation matrix from the LR of the spatial correlation matrix, rather than of the present channel matrix itself. In conjunction with PLR followed by the recursive lattice reduction (RLR) scheme [7], the pre-processed RLR (PRLR) is shown to carry out the LR of the channel matrix efficiently, especially for burst packet messages in spatially and temporally correlated channels, while matching the performance of conventional LRA detection.
Improved biliary detection and diagnosis through intelligent machine analysis.
Logeswaran, Rajasvaran
2012-09-01
This paper reports on work undertaken to improve automated detection of bile ducts in magnetic resonance cholangiopancreatography (MRCP) images, with the objective of conducting preliminary classification of the images for diagnosis. The proposed I-BDeDIMA (Improved Biliary Detection and Diagnosis through Intelligent Machine Analysis) scheme is a multi-stage framework consisting of successive phases of image normalization, denoising, structure identification, object labeling, feature selection and disease classification. A combination of multiresolution wavelet, dynamic intensity thresholding, segment-based region growing, region elimination, statistical analysis and neural networks, is used in this framework to achieve good structure detection and preliminary diagnosis. Tests conducted on over 200 clinical images with known diagnosis have shown promising results of over 90% accuracy. The scheme outperforms related work in the literature, making it a viable framework for computer-aided diagnosis of biliary diseases. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Automatic epileptic seizure detection in EEGs using MF-DFA, SVM based on cloud computing.
Zhang, Zhongnan; Wen, Tingxi; Huang, Wei; Wang, Meihong; Li, Chunfeng
2017-01-01
Epilepsy is a chronic disease with transient brain dysfunction that results from the sudden abnormal discharge of neurons in the brain. Since electroencephalography (EEG) is a harmless and noninvasive detection method, it plays an important role in the detection of neurological diseases. However, analyzing EEG to detect neurological diseases is often difficult because the brain's electrical signals are random, non-stationary, and nonlinear. To overcome this difficulty, this study aims to develop a new computer-aided scheme for automatic epileptic seizure detection in EEGs based on multi-fractal detrended fluctuation analysis (MF-DFA) and a support vector machine (SVM). The new scheme first extracts features from the EEG by MF-DFA. It then applies a genetic algorithm (GA) to calculate the parameters used in the SVM and classifies the training data according to the selected features. Finally, the trained SVM classifier is used to detect neurological diseases. The algorithm uses MLlib from the SPARK library and runs on a cloud platform. Applied to a public dataset, the results show that the new feature extraction method and scheme can detect signals with fewer features, and the classification accuracy reached 99%. MF-DFA is a promising approach for extracting features from EEG because of its simple algorithmic procedure and small number of parameters. The features obtained by MF-DFA represent samples as well as traditional wavelet transforms and Lyapunov exponents do. The GA can always find useful parameters for the SVM given enough execution time. The results illustrate that the classification model can achieve comparable accuracy, which means that it is effective in epileptic seizure detection.
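The fluctuation analysis underlying MF-DFA can be illustrated with its mono-fractal special case (q = 2): integrate the signal, detrend it in windows of varying size, and fit the scaling exponent on a log-log plot. This simplified sketch recovers the textbook exponents for white noise (about 0.5) and a random walk (about 1.5); the full MF-DFA repeats this over a range of moments q:

```python
import numpy as np

def dfa_exponent(signal, scales):
    # Mono-fractal (q = 2) detrended fluctuation analysis; MF-DFA
    # generalizes this over many moments q to get a spectrum of exponents.
    y = np.cumsum(signal - np.mean(signal))        # integrated profile
    flucts = []
    for s in scales:
        nseg = len(y) // s
        t = np.arange(s)
        f2 = []
        for k in range(nseg):
            seg = y[k * s:(k + 1) * s]
            coef = np.polyfit(t, seg, 1)           # local linear detrend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    # Scaling exponent alpha from the log-log fit F(s) ~ s^alpha.
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(3)
scales = [16, 32, 64, 128, 256]
a_white = dfa_exponent(rng.normal(size=4096), scales)            # ~0.5
a_walk = dfa_exponent(np.cumsum(rng.normal(size=4096)), scales)  # ~1.5
```

In a seizure detector, exponents like these (computed per EEG epoch and per q) form the feature vector fed to the SVM.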
Computerized Detection of Lung Nodules by Means of “Virtual Dual-Energy” Radiography
Chen, Sheng; Suzuki, Kenji
2014-01-01
Major challenges in current computer-aided detection (CADe) schemes for nodule detection in chest radiographs (CXRs) are to detect nodules that overlap with ribs and/or clavicles and to reduce the frequent false positives (FPs) caused by ribs. Detection of such nodules by a CADe scheme is very important, because radiologists are likely to miss such subtle nodules. Our purpose in this study was to develop a CADe scheme with improved sensitivity and specificity by use of “virtual dual-energy” (VDE) CXRs where ribs and clavicles are suppressed with massive-training artificial neural networks (MTANNs). To reduce rib-induced FPs and detect nodules overlapping with ribs, we incorporated the VDE technology in our CADe scheme. The VDE technology suppressed rib and clavicle opacities in CXRs while maintaining soft-tissue opacity by use of the MTANN technique that had been trained with real dual-energy imaging. Our scheme detected nodule candidates on VDE images by use of a morphologic filtering technique. Sixty morphologic and gray-level-based features were extracted from each candidate from both original and VDE CXRs. A nonlinear support vector classifier was employed for classification of the nodule candidates. A publicly available database containing 140 nodules in 140 CXRs and 93 normal CXRs was used for testing our CADe scheme. All nodules were confirmed by computed tomography examinations, and the average size of the nodules was 17.8 mm. Thirty percent (42/140) of the nodules were rated “extremely subtle” or “very subtle” by a radiologist. The original scheme without VDE technology achieved a sensitivity of 78.6% (110/140) with 5 (1165/233) FPs per image. 
By use of the VDE technology, more nodules overlapping with ribs or clavicles were detected, and the sensitivity was improved substantially to 85.0% (119/140) at the same FP rate in a leave-one-out cross-validation test, whereas the FP rate was reduced to 2.5 (583/233) per image at the same sensitivity level as the original CADe scheme obtained (the difference between the specificities of the original and the VDE-based CADe schemes was statistically significant). In particular, the sensitivity of our VDE-based CADe scheme for subtle nodules (66.7% = 28/42) was statistically significantly higher than that of the original CADe scheme (57.1% = 24/42). Therefore, by use of VDE technology, the sensitivity and specificity of our CADe scheme for detection of nodules, especially subtle nodules, in CXRs were improved substantially. PMID:23193306
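The morphologic filtering step for nodule-candidate detection can be illustrated with a white top-hat filter, one common choice for enhancing small bright structures against a slowly varying background; the abstract does not name the exact filter, so this is an assumed example on a synthetic image:

```python
import numpy as np
from scipy import ndimage

# Synthetic stand-in for a (rib-suppressed) chest image: a small bright
# "nodule" on a smooth background gradient.
img = np.zeros((64, 64))
img[20:26, 20:26] = 1.0                        # 6x6 nodule-like blob
img += np.linspace(0.0, 2.0, 64)[None, :]      # slowly varying background

# The white top-hat (image minus its morphological opening) keeps only
# structures smaller than the structuring element, removing the smooth
# background; thresholding then yields candidate regions.
tophat = ndimage.white_tophat(img, size=15)
n_candidates = ndimage.label(tophat > 0.5)[1]  # connected candidate count
```

Each connected region would then be characterized by morphologic and gray-level features (from both original and VDE images in the scheme above) before the SVM classification stage.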
An interactive system for computer-aided diagnosis of breast masses.
Wang, Xingwei; Li, Lihua; Liu, Wei; Xu, Weidong; Lederman, Dror; Zheng, Bin
2012-10-01
Although mammography is the only clinically accepted imaging modality for screening the general population to detect breast cancer, interpreting mammograms is difficult, with relatively low sensitivity and specificity. To provide radiologists "a visual aid" in interpreting mammograms, we developed and tested an interactive system for computer-aided detection and diagnosis (CAD) of mass-like cancers. Using this system, an observer can view CAD-cued mass regions depicted on an image and then query any suspicious region (either cued or not cued by CAD). The CAD scheme automatically segments the suspicious region (or accepts a manually defined region) and computes a set of image features. Using a content-based image retrieval (CBIR) algorithm, CAD searches for a set of reference images depicting "abnormalities" similar to the queried region. Based on the image retrieval results and a decision algorithm, a classification score is assigned to the queried region. In this study, a reference database with 1,800 malignant mass regions and 1,800 benign and CAD-generated false-positive regions was used. A modified CBIR algorithm, with a new function for stretching the attributes in the multi-dimensional space, and the decision scheme were optimized using a genetic algorithm. Using a leave-one-out testing method to classify suspicious mass regions, we compared the classification performance of two CBIR algorithms with either equally weighted or optimally stretched attributes. Using the modified CBIR algorithm, the area under the receiver operating characteristic curve was significantly increased from 0.865 ± 0.006 to 0.897 ± 0.005 (p < 0.001). This study demonstrated the feasibility of developing an interactive CAD system with a large reference database and achieving improved performance.
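The attribute-stretching idea in the modified CBIR algorithm amounts to a weighted distance in feature space. In this sketch the weights are set by hand, whereas the paper tunes them with a genetic algorithm; the two-feature data are invented for illustration:

```python
import numpy as np

def retrieve(query, library, weights, k=5):
    # Weighted ("stretched") Euclidean distance retrieval: uniform
    # weights reproduce the equally weighted baseline, larger weights
    # stretch the corresponding axis so it dominates similarity.
    d = np.sqrt((((library - query) * weights) ** 2).sum(axis=1))
    return np.argsort(d)[:k]                 # indices of k nearest references

rng = np.random.default_rng(4)
labels = np.array([0] * 50 + [1] * 50)       # 0 = benign, 1 = malignant
features = np.column_stack([
    labels + 0.1 * rng.normal(size=100),     # informative feature
    rng.normal(size=100),                    # noise feature
])
query = np.array([1.0, 0.0])                 # a "malignant" query region
hit_uniform = labels[retrieve(query, features, np.array([1.0, 1.0]))].mean()
hit_stretch = labels[retrieve(query, features, np.array([5.0, 0.2]))].mean()
```

The fraction of retrieved references sharing the query's class is what drives the classification score, so stretching discriminative attributes directly improves the ROC performance reported above.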
NASA Astrophysics Data System (ADS)
de Oliveira, Helder C. R.; Mencattini, Arianna; Casti, Paola; Martinelli, Eugenio; di Natale, Corrado; Catani, Juliana H.; de Barros, Nestor; Melo, Carlos F. E.; Gonzaga, Adilson; Vieira, Marcelo A. C.
2018-02-01
This paper proposes a method to reduce the number of false positives (FPs) in a computer-aided detection (CAD) scheme for automated detection of architectural distortion (AD) in digital mammography. AD is a subtle contraction of breast parenchyma that may represent an early sign of breast cancer. Due to its subtlety and variability, AD is more difficult to detect than microcalcifications and masses, and it is commonly found in retrospective evaluations of false-negative mammograms. Several computer-based systems have been proposed for automated detection of AD in breast images. The usual approach is to automatically detect possible sites of AD in a mammographic image (segmentation step) and then use a classifier to eliminate false positives and identify the suspicious regions (classification step). This paper focuses on optimizing the segmentation step to reduce the number of FPs passed as input to the classifier. The proposal is to use statistical measurements to score the segmented regions and then apply a threshold to select a small number of regions to submit to the classification step, improving the detection performance of the CAD scheme. We evaluated 12 image features for scoring and selecting suspicious regions in 74 clinical full-field digital mammography (FFDM) images. All images in this dataset contained at least one region with AD previously marked by an expert radiologist. The results showed that the proposed method can reduce the false positives of the segmentation step of the CAD scheme from 43.4 FPs per image to 34.5 FPs per image, without increasing the number of false negatives.
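The segmentation-stage FP reduction reduces to scoring each segmented region and thresholding before classification. A minimal sketch, with a single invented "contrast" measurement standing in for the paper's 12 statistical features:

```python
def select_candidates(regions, score_fn, threshold):
    # Keep only regions whose statistical score clears the threshold
    # before they are passed on to the classification step.
    return [r for r in regions if score_fn(r) >= threshold]

# Toy segmented regions; "contrast" is a hypothetical measurement.
regions = [{"id": 1, "contrast": 0.9}, {"id": 2, "contrast": 0.2},
           {"id": 3, "contrast": 0.7}, {"id": 4, "contrast": 0.1}]
kept = select_candidates(regions, lambda r: r["contrast"], 0.5)
```

The threshold is the tuning knob: set too high it introduces false negatives, set appropriately it trims only FPs, which is the trade-off the 43.4 to 34.5 FP/image result quantifies.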
Suzuki, Kenji
2009-09-21
Computer-aided diagnosis (CAD) has been an active area of study in medical image analysis. A filter for the enhancement of lesions plays an important role for improving the sensitivity and specificity in CAD schemes. The filter enhances objects similar to a model employed in the filter; e.g. a blob-enhancement filter based on the Hessian matrix enhances sphere-like objects. Actual lesions, however, often differ from a simple model; e.g. a lung nodule is generally modeled as a solid sphere, but there are nodules of various shapes and with internal inhomogeneities such as a nodule with spiculations and ground-glass opacity. Thus, conventional filters often fail to enhance actual lesions. Our purpose in this study was to develop a supervised filter for the enhancement of actual lesions (as opposed to a lesion model) by use of a massive-training artificial neural network (MTANN) in a CAD scheme for detection of lung nodules in CT. The MTANN filter was trained with actual nodules in CT images to enhance actual patterns of nodules. By use of the MTANN filter, the sensitivity and specificity of our CAD scheme were improved substantially. With a database of 69 lung cancers, nodule candidate detection by the MTANN filter achieved a 97% sensitivity with 6.7 false positives (FPs) per section, whereas nodule candidate detection by a difference-image technique achieved a 96% sensitivity with 19.3 FPs per section. Classification-MTANNs were applied for further reduction of the FPs. The classification-MTANNs removed 60% of the FPs with a loss of one true positive; thus, it achieved a 96% sensitivity with 2.7 FPs per section. Overall, with our CAD scheme based on the MTANN filter and classification-MTANNs, an 84% sensitivity with 0.5 FPs per section was achieved.
Variations in measured performance of CAD schemes due to database composition and scoring protocol
NASA Astrophysics Data System (ADS)
Nishikawa, Robert M.; Yarusso, Laura M.
1998-06-01
There is now a large effort towards developing computer-aided diagnosis (CAD) techniques. It is important to be able to compare the performance of different approaches in order to determine which ones are the most efficacious. There are currently a number of barriers preventing meaningful (statistical) comparisons, two of which are discussed in this paper: database composition and scoring protocol. We have examined how the choice of cases used to test a CAD scheme can affect its performance. We found that our computer scheme's sensitivity varied between 100% and 77%, at a false-positive rate of 1.0 per image, with a complete (100%) change in the composition of the database. To evaluate the performance of a CAD scheme, the output of the computer must be graded, and a number of different grading criteria are being used by different investigators. We have found that, for the same set of detection results, the measured sensitivity can range from 40% to 90% depending on the scoring methodology. Clearly, consensus must be reached on these two issues for the field to make rapid progress. As it stands now, it is not possible to make meaningful comparisons of different techniques.
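The scoring-protocol effect is easy to reproduce: the very same detections yield different measured sensitivities under different hit criteria. A toy example with a centroid-distance criterion (the coordinates and thresholds are invented):

```python
def sensitivity(detections, truths, hit):
    # Fraction of true lesions matched by at least one detection under
    # a given hit criterion.
    return sum(any(hit(d, t) for d in detections) for t in truths) / len(truths)

def dist(d, t):
    return ((d[0] - t[0]) ** 2 + (d[1] - t[1]) ** 2) ** 0.5

# The SAME detections scored under two protocols: detection centroid
# within 10 pixels of truth vs. within 25 pixels.
detections = [(12, 12), (50, 50)]
truths = [(10, 10), (60, 58), (90, 90)]
strict = sensitivity(detections, truths, lambda d, t: dist(d, t) <= 10)
lenient = sensitivity(detections, truths, lambda d, t: dist(d, t) <= 25)
```

Here the strict protocol reports 1/3 sensitivity and the lenient one 2/3, mirroring the 40-90% spread described above; area-overlap criteria behave the same way.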
Deep learning aided decision support for pulmonary nodules diagnosing: a review.
Yang, Yixin; Feng, Xiaoyi; Chi, Wenhao; Li, Zhengyang; Duan, Wenzhe; Liu, Haiping; Liang, Wenhua; Wang, Wei; Chen, Ping; He, Jianxing; Liu, Bo
2018-04-01
Deep learning techniques have recently emerged as promising decision-support approaches for automatically analyzing medical images for different clinical diagnostic purposes. Diagnosis of pulmonary nodules using computer-assisted diagnosis has received considerable theoretical, computational, and empirical research attention, and numerous methods have been developed over the past five decades for the detection and classification of pulmonary nodules on different image formats, including chest radiographs, computed tomography (CT), and positron emission tomography. The recent remarkable progress in deep learning for pulmonary nodules, achieved in both academia and industry, has demonstrated that deep learning techniques are promising alternative decision-support schemes for tackling the central issues in diagnosing pulmonary nodules, including feature extraction, nodule detection, false-positive reduction, and benign-malignant classification for the huge volume of chest scan data. The main goal of this investigation is to provide a comprehensive state-of-the-art review of deep learning-aided decision support for diagnosing pulmonary nodules. As far as the authors know, this is the first review devoted exclusively to deep learning techniques for diagnosing pulmonary nodules.
Can computer-aided diagnosis (CAD) help radiologists find mammographically missed screening cancers?
NASA Astrophysics Data System (ADS)
Nishikawa, Robert M.; Giger, Maryellen L.; Schmidt, Robert A.; Papaioannou, John
2001-06-01
We present data from a pilot observer study whose goal is to design a study testing the hypothesis that computer-aided diagnosis (CAD) can improve radiologists' performance in reading screening mammograms. In a prospective evaluation of our computer detection schemes, we have analyzed over 12,000 clinical exams. Retrospective review of the negative screening mammograms for all cancer cases found an indication of the cancer in 23 of these negative cases. The computer found 54% of these in our prospective testing. We added normal exams to these cases to create a dataset of 75 cases. Four radiologists experienced in mammography read the cases and gave their BI-RADS assessment and their confidence that the patient should be called back for diagnostic mammography. They did so once reading the films only, and a second time reading with the computer aid. Three radiologists had no change in area under the ROC curve (mean Az of 0.73) and one improved from 0.73 to 0.78, but this difference failed to reach statistical significance (p = 0.23). These data are being used to plan a larger, more powerful study.
Learning-based image preprocessing for robust computer-aided detection
NASA Astrophysics Data System (ADS)
Raghupathi, Laks; Devarakota, Pandu R.; Wolf, Matthias
2013-03-01
Recent studies have shown that low-dose computed tomography (LDCT) can be an effective screening tool to reduce lung cancer mortality. Computer-aided detection (CAD) would be a beneficial second reader for radiologists in such cases. Studies demonstrate that while iterative reconstructions (IR) improve LDCT diagnostic quality, they significantly degrade CAD performance (increased false positives) when applied directly. For improving CAD performance, solutions such as retraining with newer data or applying a standard preprocessing technique may not suffice due to the high prevalence of CT scanners and non-uniform acquisition protocols. Here, we present a learning-based framework that can adaptively transform a wide variety of input data to boost an existing CAD's performance. This not only enhances robustness but also applicability in clinical workflows. Our solution consists of automatically applying a suitable preprocessing filter to the given image based on its characteristics. This requires the preparation of ground truth (GT) for choosing the appropriate filter that results in improved CAD performance. Accordingly, we propose an efficient consolidation process with a novel metric. Using key anatomical landmarks, we then derive consistent feature descriptors for the classification scheme, which uses a priority mechanism to automatically choose an optimal preprocessing filter. We demonstrate CAD prototype performance improvement using hospital-scale datasets acquired from North America, Europe, and Asia. Though we demonstrate our results for a lung nodule CAD, this scheme is straightforward to extend to other post-processing tools dedicated to other organs and modalities.
Increasing cancer detection yield of breast MRI using a new CAD scheme of mammograms
NASA Astrophysics Data System (ADS)
Tan, Maxine; Aghaei, Faranak; Hollingsworth, Alan B.; Stough, Rebecca G.; Liu, Hong; Zheng, Bin
2016-03-01
Although breast MRI is the most sensitive imaging modality for detecting early breast cancer, its cancer detection yield in breast cancer screening remains quite low to date (under 3-4%, even for the small group of high-risk women). The purpose of this preliminary study is to test the potential of developing and applying a new computer-aided detection (CAD) scheme for digital mammograms to identify women at high risk of harboring mammography-occult breast cancers, which can be detected by breast MRI. For this purpose, we retrospectively assembled a dataset involving 30 women who had both mammography and breast MRI screening examinations. All mammograms were interpreted as negative, while 5 cancers were detected using breast MRI. We developed a CAD scheme of mammograms, which includes a new quantitative mammographic image feature analysis-based risk model, to stratify women into two groups with high and low risk of harboring mammography-occult cancer. Among the 30 women, 9 were classified into the high-risk group by the CAD scheme, including all 5 women who had cancer detected by breast MRI. All 21 low-risk women remained negative on the breast MRI examinations. The cancer detection yield of breast MRI applied to this dataset substantially increased from 16.7% (5/30) to 55.6% (5/9), while eliminating 84% (21/25) of unnecessary breast MRI screenings. The study demonstrated the potential of applying a new CAD scheme to significantly increase the cancer detection yield of breast MRI while simultaneously reducing the number of negative MRIs in breast cancer screening.
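The reported yield figures follow directly from the counts in the abstract; a minimal sketch verifying the arithmetic (all counts taken from the text above):

```python
# Quick check of the detection-yield arithmetic reported above: 30 women
# screened, 5 MRI-detected cancers, 9 flagged high-risk by the CAD scheme
# (all 5 cancers among them), and 21 low-risk women with negative MRIs.
total, cancers, high_risk = 30, 5, 9
negatives = total - cancers              # 25 women with negative MRI
avoided = total - high_risk              # 21 low-risk women spared an MRI
yield_all = cancers / total              # MRI applied to everyone
yield_cad = cancers / high_risk          # MRI applied to CAD high-risk only
print(round(yield_all * 100, 1))         # 16.7
print(round(yield_cad * 100, 1))         # 55.6
print(round(avoided / negatives * 100))  # 84
```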
Bi-model processing for early detection of breast tumor in CAD system
NASA Astrophysics Data System (ADS)
Mughal, Bushra; Sharif, Muhammad; Muhammad, Nazeer
2017-06-01
Early screening for suspicious masses in mammograms may reduce the mortality rate among women. This rate can be further reduced by developing computer-aided diagnosis systems that decrease false findings in medical informatics. This paper presents a method for early tumor detection in digitized mammograms. For improving the performance of this system, a novel bi-model processing algorithm is introduced. It divides the region of interest into two parts: the first is the pre-segmented region (breast parenchyma) and the other is the post-segmented region (suspicious region). The system follows a preprocessing scheme of contrast enhancement that can be utilized to segment and extract the desired features of the given mammogram. In the next phase, a hybrid feature block is presented to show the effective performance of computer-aided diagnosis. To assess the effectiveness of the proposed method, the database provided by the Mammographic Image Analysis Society is used for testing. Our experimental outcomes on this database exhibit the usefulness and robustness of the proposed method.
A prototype of mammography CADx scheme integrated to imaging quality evaluation techniques
NASA Astrophysics Data System (ADS)
Schiabel, Homero; Matheus, Bruno R. N.; Angelo, Michele F.; Patrocínio, Ana Claudia; Ventura, Liliane
2011-03-01
As all women over the age of 40 are recommended to undergo mammographic exams every two years, the demands on radiologists to evaluate mammographic images in short periods of time have increased considerably. As tools to improve quality and accelerate analysis, CADe/CADx (computer-aided detection/diagnosis) schemes have been investigated, but very few complete schemes have been developed, and most are restricted to detection rather than diagnosis. Existing ones are usually tied to specific mammographic equipment (usually DR), which makes them very expensive. This paper therefore describes a prototype of a complete mammography CADx scheme developed by our research group, integrated with an imaging quality evaluation process. The basic structure consists of pre-processing modules based on image acquisition and digitization procedures (FFDM, CR, or film + scanner), a segmentation tool to detect clustered microcalcifications and suspect masses, and a classification scheme, which evaluates both the presence of microcalcification clusters and possible malignant masses based on their contour. The aim is to provide not only information on the detected structures but also a pre-report with a BI-RADS classification. At this time the system still lacks an interface integrating all the modules. Despite this, it is functional as a prototype for clinical practice testing, with results comparable to others reported in the literature.
NASA Technical Reports Server (NTRS)
Syed, S. A.; Chiappetta, L. M.
1985-01-01
A methodological evaluation of two finite-differencing schemes for computer-aided gas turbine design is presented. The two computational schemes are a bounded skewed upwind differencing scheme (BSUDS) and a quadratic upwind differencing scheme (QUDS). In the evaluation, the derivations of the schemes were incorporated into two-dimensional and three-dimensional versions of the Teaching Axisymmetric Characteristics Heuristically (TEACH) computer code. Assessments were made according to performance criteria for the solution of problems of turbulent, laminar, and coannular turbulent flow. The specific performance criteria used in the evaluation were simplicity, accuracy, and computational economy. It was found that the BSUDS scheme performed better with respect to these criteria than QUDS. Some of the reasons for the more successful performance of BSUDS are discussed.
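For readers unfamiliar with quadratic upwind differencing, the sketch below shows the classic QUICK face interpolation as one concrete instance of this scheme family; the paper's exact discretization is not reproduced here.

```python
# Sketch of a quadratic upwind face interpolation (the classic QUICK form),
# one common instance of the quadratic upwind differencing family evaluated
# in the abstract; the paper's exact formulation may differ.

def quick_face(phi_uu, phi_u, phi_d):
    """Quadratic upwind (QUICK) interpolation of a cell-face value:
    6/8 of the nearest upstream node, 3/8 downstream, -1/8 far upstream."""
    return 0.75 * phi_u + 0.375 * phi_d - 0.125 * phi_uu

# A linear nodal field gives exactly the midpoint face value:
print(quick_face(1.0, 2.0, 3.0))  # 2.5
# The scheme is also exact for quadratics, e.g. phi = x**2 at x = -1, 0, 1
# with the face at x = 0.5:
print(quick_face(1.0, 0.0, 1.0))  # 0.25
```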
A hybrid CNN feature model for pulmonary nodule malignancy risk differentiation.
Wang, Huafeng; Zhao, Tingting; Li, Lihong Connie; Pan, Haixia; Liu, Wanquan; Gao, Haoqi; Han, Fangfang; Wang, Yuehai; Qi, Yifan; Liang, Zhengrong
2018-01-01
The malignancy risk differentiation of pulmonary nodule is one of the most challenge tasks of computer-aided diagnosis (CADx). Most recently reported CADx methods or schemes based on texture and shape estimation have shown relatively satisfactory on differentiating the risk level of malignancy among the nodules detected in lung cancer screening. However, the existing CADx schemes tend to detect and analyze characteristics of pulmonary nodules from a statistical perspective according to local features only. Enlightened by the currently prevailing learning ability of convolutional neural network (CNN), which simulates human neural network for target recognition and our previously research on texture features, we present a hybrid model that takes into consideration of both global and local features for pulmonary nodule differentiation using the largest public database founded by the Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI). By comparing three types of CNN models in which two of them were newly proposed by us, we observed that the multi-channel CNN model yielded the best discrimination in capacity of differentiating malignancy risk of the nodules based on the projection of distributions of extracted features. Moreover, CADx scheme using the new multi-channel CNN model outperformed our previously developed CADx scheme using the 3D texture feature analysis method, which increased the computed area under a receiver operating characteristic curve (AUC) from 0.9441 to 0.9702.
Computerized analysis of sonograms for the detection of breast lesions
NASA Astrophysics Data System (ADS)
Drukker, Karen; Giger, Maryellen L.; Horsch, Karla; Vyborny, Carl J.
2002-05-01
With a renewed interest in using non-ionizing radiation for the screening of high risk women, there is a clear role for a computerized detection aid in ultrasound. Thus, we are developing a computerized detection method for the localization of lesions on breast ultrasound images. The computerized detection scheme utilizes two methods. Firstly, a radial gradient index analysis is used to distinguish potential lesions from normal parenchyma. Secondly, an image skewness analysis is performed to identify posterior acoustic shadowing. We analyzed 400 cases (757 images) consisting of complex cysts, solid benign lesions, and malignant lesions. The detection method yielded an overall sensitivity of 95% by image, and 99% by case at a false-positive rate of 0.94 per image. In 51% of all images, only the lesion itself was detected, while in 5% of the images only the shadowing was identified. For malignant lesions these numbers were 37% and 9%, respectively. In summary, we have developed a computer detection method for lesions on ultrasound images of the breast, which may ultimately aid in breast cancer screening.
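The posterior-shadowing test above relies on gray-level skewness; the following sketch illustrates the principle on synthetic data (the region layout and decision thresholds here are illustrative assumptions, not the paper's).

```python
import numpy as np

# Sketch of the image-skewness idea used to flag posterior acoustic shadowing:
# a dark shadow behind a lesion pulls the gray-level distribution asymmetric.
# All values below are synthetic stand-ins for ultrasound pixel samples.

def skewness(values):
    """Third standardized moment of a 1-D sample."""
    v = np.asarray(values, dtype=float)
    m, s = v.mean(), v.std()
    return float(((v - m) ** 3).mean() / s ** 3)

rng = np.random.default_rng(0)
normal_region = rng.normal(120, 10, 10000)                       # symmetric speckle
shadowed = np.concatenate([normal_region, np.full(3000, 20.0)])  # dark tail
print(abs(skewness(normal_region)) < 0.2)  # True: near-symmetric
print(skewness(shadowed) < -0.5)           # True: strongly left-skewed
```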
A new CAD approach for improving efficacy of cancer screening
NASA Astrophysics Data System (ADS)
Zheng, Bin; Qian, Wei; Li, Lihua; Pu, Jiantao; Kang, Yan; Lure, Fleming; Tan, Maxine; Qiu, Yuchen
2015-03-01
Since performance and clinical utility of current computer-aided detection (CAD) schemes of detecting and classifying soft tissue lesions (e.g., breast masses and lung nodules) is not satisfactory, many researchers in CAD field call for new CAD research ideas and approaches. The purpose of presenting this opinion paper is to share our vision and stimulate more discussions of how to overcome or compensate the limitation of current lesion-detection based CAD schemes in the CAD research community. Since based on our observation that analyzing global image information plays an important role in radiologists' decision making, we hypothesized that using the targeted quantitative image features computed from global images could also provide highly discriminatory power, which are supplementary to the lesion-based information. To test our hypothesis, we recently performed a number of independent studies. Based on our published preliminary study results, we demonstrated that global mammographic image features and background parenchymal enhancement of breast MR images carried useful information to (1) predict near-term breast cancer risk based on negative screening mammograms, (2) distinguish between true- and false-positive recalls in mammography screening examinations, and (3) classify between malignant and benign breast MR examinations. The global case-based CAD scheme only warns a risk level of the cases without cueing a large number of false-positive lesions. It can also be applied to guide lesion-based CAD cueing to reduce false-positives but enhance clinically relevant true-positive cueing. However, before such a new CAD approach is clinically acceptable, more work is needed to optimize not only the scheme performance but also how to integrate with lesion-based CAD schemes in the clinical practice.
Computerized scheme for detection of diffuse lung diseases on CR chest images
NASA Astrophysics Data System (ADS)
Pereira, Roberto R., Jr.; Shiraishi, Junji; Li, Feng; Li, Qiang; Doi, Kunio
2008-03-01
We have developed a new computer-aided diagnostic (CAD) scheme for detection of diffuse lung disease in computed radiographic (CR) chest images. One hundred ninety-four chest images (56 normals and 138 abnormals with diffuse lung diseases) were used. The 138 abnormal cases were classified into three levels of severity (34 mild, 60 moderate, and 44 severe) by an experienced chest radiologist with use of five different patterns, i.e., reticular, reticulonodular, nodular, air-space opacity, and emphysema. In our computerized scheme, the first moment of the power spectrum, the root-mean-square variation, and the average pixel value were determined for each region of interest (ROI), which was selected automatically in the lung fields. The average pixel value and its dependence on the location of the ROI were employed for identifying abnormal patterns due to air-space opacity or emphysema. A rule-based method was used for determining three levels of abnormality for each ROI (0: normal, 1: mild, 2: moderate, and 3: severe). The distinction between normal lungs and abnormal lungs with diffuse lung disease was determined based on the fractional number of abnormal ROIs by taking into account the severity of abnormalities. Preliminary results indicated that the area under the ROC curve was 0.889 for the 44 severe cases, 0.825 for the 104 severe and moderate cases, and 0.794 for all cases. We have identified a number of problems and reasons causing false positives on normal cases, and also false negatives on abnormal cases. In addition, we have discussed potential approaches for improvement of our CAD scheme. In conclusion, the CAD scheme for detection of diffuse lung diseases based on texture features extracted from CR chest images has the potential to assist radiologists in their interpretation of diffuse lung diseases.
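A minimal sketch of the three per-ROI texture measures named above (average pixel value, RMS variation, and the first moment of the power spectrum). The background-trend correction and exact spectral weighting used in the original features are omitted, and the test patterns are synthetic.

```python
import numpy as np

# Sketch of three per-ROI texture measures: average pixel value, RMS
# variation of the mean-subtracted ROI, and the first moment of its 2-D
# power spectrum (a coarse summary of texture "frequency"). Illustrative
# only; the paper's exact feature definitions may include extra filtering.

def roi_features(roi):
    roi = np.asarray(roi, dtype=float)
    avg = roi.mean()
    centered = roi - avg
    rms = np.sqrt((centered ** 2).mean())
    power = np.abs(np.fft.fftshift(np.fft.fft2(centered))) ** 2
    h, w = roi.shape
    yy, xx = np.mgrid[0:h, 0:w]
    radius = np.hypot(yy - h // 2, xx - w // 2)   # spatial-frequency magnitude
    first_moment = (radius * power).sum() / power.sum()
    return avg, rms, first_moment

coarse = np.kron(np.indices((4, 4)).sum(0) % 2, np.ones((8, 8)))  # 8-px blocks
fine = np.indices((32, 32)).sum(0) % 2                            # 1-px checker
_, _, m_coarse = roi_features(coarse)
_, _, m_fine = roi_features(fine)
print(m_coarse < m_fine)  # True: finer texture -> higher spectral first moment
```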
Deep learning aided decision support for pulmonary nodules diagnosing: a review
Yang, Yixin; Feng, Xiaoyi; Chi, Wenhao; Li, Zhengyang; Duan, Wenzhe; Liu, Haiping; Liang, Wenhua; Wang, Wei; Chen, Ping
2018-01-01
Deep learning techniques have recently emerged as promising decision-support approaches to automatically analyze medical images for different clinical diagnostic purposes. Diagnosis of pulmonary nodules using computer-assisted diagnosis has received considerable theoretical, computational, and empirical research attention, and numerous methods have been developed over the past five decades for the detection and classification of pulmonary nodules on different image formats, including chest radiographs, computed tomography (CT), and positron emission tomography. The recent remarkable progress in deep learning for pulmonary nodules, achieved in both academia and industry, has demonstrated that deep learning techniques are promising alternative decision-support schemes to effectively tackle the central issues in pulmonary nodule diagnosis, including feature extraction, nodule detection, false-positive reduction, and benign-malignant classification for huge volumes of chest scan data. The main goal of this investigation is to provide a comprehensive, state-of-the-art review of deep learning-aided decision support for pulmonary nodule diagnosis. To the best of the authors' knowledge, this is the first review devoted exclusively to deep learning techniques for pulmonary nodule diagnosis. PMID:29780633
Han, Guanghui; Liu, Xiabi; Han, Feifei; Santika, I Nyoman Tenaya; Zhao, Yanfeng; Zhao, Xinming; Zhou, Chunwu
2015-02-01
Lung computed tomography (CT) imaging signs play important roles in the diagnosis of lung diseases. In this paper, we review the significance of CT imaging signs in disease diagnosis and determine the inclusion criteria for the CT scans and CT imaging signs of our database. We developed software for annotating abnormal regions and designed the storage scheme for CT images and annotation data. We then present a publicly available database of lung CT imaging signs, called LISS for short, which contains 271 CT scans with 677 abnormal regions. The 677 abnormal regions are divided into nine categories of common CT imaging signs of lung diseases (CISLs). The ground truth of these CISL regions and the corresponding categories are provided. Furthermore, to make the database publicly available, all private data in the CT scans were eliminated or replaced with provisioned values. The main characteristic of our LISS database is that it is developed from the new perspective of CT imaging signs of lung diseases instead of the commonly considered lung nodules. Thus, it holds promise for computer-aided detection and diagnosis research and for medical education.
Deep ensemble learning of virtual endoluminal views for polyp detection in CT colonography
NASA Astrophysics Data System (ADS)
Umehara, Kensuke; Näppi, Janne J.; Hironaka, Toru; Regge, Daniele; Ishida, Takayuki; Yoshida, Hiroyuki
2017-03-01
Robust training of a deep convolutional neural network (DCNN) requires a very large number of annotated datasets, which are currently not available in CT colonography (CTC). We previously demonstrated that deep transfer learning provides an effective approach for robust application of a DCNN in CTC. However, at high detection accuracy, the differentiation of small polyps from non-polyps was still challenging. In this study, we developed and evaluated a deep ensemble learning (DEL) scheme for the review of virtual endoluminal (VE) images to improve the performance of computer-aided detection (CADe) of polyps in CTC. Nine different types of image renderings were generated from VE images of polyp candidates detected by a conventional CADe system. Eleven DCNNs, representing three types of publicly available pre-trained DCNN models, were re-trained by transfer learning to identify polyps from the VE images. A DEL scheme that determines the final detected polyps by a review of the nine types of VE images was developed by combining the DCNNs using a random forest classifier as a meta-classifier. For evaluation, we sampled 154 CTC cases from a large CTC screening trial and divided the cases randomly into a training dataset and a test dataset. At 3.9 false-positive (FP) detections per patient on average, the detection sensitivities of the conventional CADe system, the highest-performing single DCNN, and the DEL scheme were 81.3%, 90.7%, and 93.5%, respectively, for polyps ≥6 mm in size. For small polyps, the DEL scheme reduced the number of false positives by up to 83% compared with using a single DCNN alone. These preliminary results indicate that the DEL scheme provides an effective approach for improving the polyp detection performance of CADe in CTC, especially for small polyps.
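The ensemble structure described above (per-rendering DCNN scores combined by a meta-classifier) can be sketched as follows. To keep the example dependency-free, a learned mean-score threshold stands in for the paper's random-forest meta-classifier, and all scores are synthetic.

```python
import numpy as np

# Sketch of the deep-ensemble idea: each polyp candidate receives one score
# per rendering-specific DCNN, and a meta-classifier over that 9-dimensional
# score vector makes the final call. The paper uses a random forest as the
# meta-classifier; a trivial threshold on the mean score illustrates the
# stacking structure here. Scores below are synthetic.

rng = np.random.default_rng(1)
n_models = 9
polyps = rng.uniform(0.55, 1.0, (40, n_models))      # true-polyp score vectors
nonpolyps = rng.uniform(0.0, 0.45, (60, n_models))   # non-polyp score vectors
X = np.vstack([polyps, nonpolyps])
y = np.array([1] * 40 + [0] * 60)

threshold = X.mean(axis=1).mean()        # stand-in "meta-classifier"
pred = (X.mean(axis=1) > threshold).astype(int)
print((pred == y).mean())  # 1.0 on this cleanly separable toy data
```

In the real scheme, the meta-classifier can weight renderings unequally per candidate, which is what a simple average cannot do.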
NASA Astrophysics Data System (ADS)
Hiramatsu, Yuya; Muramatsu, Chisako; Kobayashi, Hironobu; Hara, Takeshi; Fujita, Hiroshi
2017-03-01
Breast cancer screening with mammography and ultrasonography is expected to improve sensitivity compared with mammography alone, especially for women with dense breasts. An automated breast volume scanner (ABVS) provides operator-independent whole-breast data that facilitate double reading and comparison with past exams, the contralateral breast, and multimodality images. However, the large volumetric data in screening practice increase radiologists' workload. Therefore, our goal is to develop a computer-aided detection scheme for breast masses in ABVS data to assist radiologists' diagnosis and comparison with mammographic findings. In this study, a false-positive (FP) reduction scheme using a deep convolutional neural network (DCNN) was investigated. For training the DCNN, true-positive and FP samples were obtained from the results of our initial mass detection scheme using the vector convergence filter. Regions of interest including the detected regions were extracted from the multiplanar reconstruction slices. We investigated methods to select effective FP samples for training the DCNN. Based on free-response receiver operating characteristic analysis, simple random sampling from the entire candidate set was most effective in this study. Using the DCNN, the number of FPs could be reduced by 60% while retaining 90% of true masses. The result indicates the potential usefulness of DCNNs for FP reduction in automated mass detection on ABVS images.
NASA Astrophysics Data System (ADS)
Uchiyama, Yoshikazu; Asano, Tatsunori; Hara, Takeshi; Fujita, Hiroshi; Kinosada, Yasutomi; Asano, Takahiko; Kato, Hiroki; Kanematsu, Masayuki; Hoshi, Hiroaki; Iwama, Toru
2009-02-01
The detection of cerebrovascular diseases such as unruptured aneurysms, stenosis, and occlusion is a major application of magnetic resonance angiography (MRA). However, their accurate detection is often difficult for radiologists. Therefore, several computer-aided diagnosis (CAD) schemes have been developed to assist radiologists with image interpretation. The purpose of this study was to develop a computerized method for segmenting cerebral arteries, which is an essential component of CAD schemes. For the segmentation of vessel regions, we first used a gray-level transformation to calibrate voxel values. To adjust for variations in patient positioning, registration was subsequently employed to maximize the overlap of the vessel regions in the target and reference images. The vessel regions were then segmented from the background using gray-level thresholding and region-growing techniques. Finally, rule-based schemes with features such as size, shape, and anatomical location were employed to distinguish between vessel regions and false positives. Our method was applied to 854 clinical cases obtained from two different hospitals. Acceptable segmentation of the cerebral arteries was attained in 97.1% (829/854) of the MRA studies. Therefore, our computerized method would be useful in CAD schemes for the detection of cerebrovascular diseases in MRA images.
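The thresholding and region-growing step can be sketched as follows; the calibration, registration, and rule-based false-positive removal stages are omitted, and the growth tolerance is an illustrative assumption.

```python
from collections import deque

import numpy as np

# Minimal sketch of gray-level-based region growing, the segmentation step
# named in the abstract: starting from a seed voxel/pixel, accept connected
# neighbors whose intensity is close to the seed's. 2-D, 4-connected here
# for brevity; the tolerance value is illustrative.

def region_grow(image, seed, tol):
    """Grow a 4-connected region from `seed`, accepting pixels whose value
    differs from the seed value by at most `tol`."""
    image = np.asarray(image, dtype=float)
    h, w = image.shape
    ref = image[seed]
    mask = np.zeros((h, w), dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx] \
                    and abs(image[ny, nx] - ref) <= tol:
                mask[ny, nx] = True
                queue.append((ny, nx))
    return mask

img = np.zeros((7, 7))
img[2:5, 1:6] = 200.0                      # bright "vessel" cross-section
mask = region_grow(img, (3, 3), tol=50.0)
print(int(mask.sum()))  # 15 pixels: the 3x5 bright block
```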
Computer-aided diagnosis in radiological imaging: current status and future challenges
NASA Astrophysics Data System (ADS)
Doi, Kunio
2009-10-01
Computer-aided diagnosis (CAD) has become one of the major research subjects in medical imaging and diagnostic radiology. Many different types of CAD schemes are being developed for detection and/or characterization of various lesions in medical imaging, including conventional projection radiography, CT, MRI, and ultrasound imaging. Commercial systems for detection of breast lesions on mammograms have been developed and have received FDA approval for clinical use. CAD may be defined as a diagnosis made by a physician who takes into account the computer output as a "second opinion". The purpose of CAD is to improve the quality and productivity of physicians in their interpretation of radiologic images. The quality of their work can be improved in terms of the accuracy and consistency of their radiologic diagnoses. In addition, the productivity of radiologists is expected to be improved by a reduction in the time required for their image readings. The computer output is derived from quantitative analysis of radiologic images by use of various methods and techniques in computer vision, artificial intelligence, and artificial neural networks (ANNs). The computer output may indicate a number of important parameters, for example, the locations of potential lesions such as lung cancer and breast cancer, the likelihood of malignancy of detected lesions, and the likelihood of various diseases based on differential diagnosis in a given image and clinical parameters. In this review article, the basic concept of CAD is first defined, and the current status of CAD research is then described. In addition, the potential of CAD in the future is discussed and predicted.
NASA Astrophysics Data System (ADS)
Ikedo, Yuji; Fukuoka, Daisuke; Hara, Takeshi; Fujita, Hiroshi; Takada, Etsuo; Endo, Tokiko; Morita, Takako
2007-03-01
The comparison of left and right mammograms is a common technique used by radiologists for the detection and diagnosis of masses. In mammography, computer-aided detection (CAD) schemes using bilateral subtraction technique have been reported. However, in breast ultrasonography, there are no reports on CAD schemes using comparison of left and right breasts. In this study, we propose a scheme of false positive reduction based on bilateral subtraction technique in whole breast ultrasound images. Mass candidate regions are detected by using the information of edge directions. Bilateral breast images are registered with reference to the nipple positions and skin lines. A false positive region is detected based on a comparison of the average gray values of a mass candidate region and a region with the same position and same size as the candidate region in the contralateral breast. In evaluating the effectiveness of the false positive reduction method, three normal and three abnormal bilateral pairs of whole breast images were employed. These abnormal breasts included six masses larger than 5 mm in diameter. The sensitivity was 83% (5/6) with 13.8 (165/12) false positives per breast before applying the proposed reduction method. By applying the method, false positives were reduced to 4.5 (54/12) per breast without removing a true positive region. This preliminary study indicates that the bilateral subtraction technique is effective for improving the performance of a CAD scheme in whole breast ultrasound images.
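The bilateral comparison rule can be sketched as follows, assuming the nipple/skin-line registration described above has already been applied; the decision threshold is an illustrative assumption.

```python
import numpy as np

# Sketch of the bilateral-comparison rule: a candidate is kept only if its
# mean gray value differs enough from the region at the same position and
# of the same size in the registered contralateral breast. The threshold is
# illustrative; registration is assumed done.

def is_true_candidate(image, registered_contra, region, min_diff):
    """`region` is (row, col, height, width) in registered coordinates."""
    r, c, h, w = region
    cand = image[r:r + h, c:c + w].mean()
    contra = registered_contra[r:r + h, c:c + w].mean()
    return abs(cand - contra) >= min_diff

left = np.full((100, 100), 80.0)
right = np.full((100, 100), 80.0)
left[40:50, 40:50] = 160.0                        # mass only in left breast
print(is_true_candidate(left, right, (40, 40, 10, 10), 30.0))  # True
print(is_true_candidate(left, right, (10, 10, 10, 10), 30.0))  # False
```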
Influence of Computer-Aided Detection on Performance of Screening Mammography
Fenton, Joshua J.; Taplin, Stephen H.; Carney, Patricia A.; Abraham, Linn; Sickles, Edward A.; D'Orsi, Carl; Berns, Eric A.; Cutter, Gary; Hendrick, R. Edward; Barlow, William E.; Elmore, Joann G.
2011-01-01
Background Computer-aided detection identifies suspicious findings on mammograms to assist radiologists. Since the Food and Drug Administration approved the technology in 1998, it has been disseminated into practice, but its effect on the accuracy of interpretation is unclear. Methods We determined the association between the use of computer-aided detection at mammography facilities and the performance of screening mammography from 1998 through 2002 at 43 facilities in three states. We had complete data for 222,135 women (a total of 429,345 mammograms), including 2351 women who received a diagnosis of breast cancer within 1 year after screening. We calculated the specificity, sensitivity, and positive predictive value of screening mammography with and without computer-aided detection, as well as the rates of biopsy and breast-cancer detection and the overall accuracy, measured as the area under the receiver-operating-characteristic (ROC) curve. Results Seven facilities (16%) implemented computer-aided detection during the study period. Diagnostic specificity decreased from 90.2% before implementation to 87.2% after implementation (P<0.001), the positive predictive value decreased from 4.1% to 3.2% (P = 0.01), and the rate of biopsy increased by 19.7% (P<0.001). The increase in sensitivity from 80.4% before implementation of computer-aided detection to 84.0% after implementation was not significant (P = 0.32). The change in the cancer-detection rate (including invasive breast cancers and ductal carcinomas in situ) was not significant (4.15 cases per 1000 screening mammograms before implementation and 4.20 cases after implementation, P = 0.90). Analyses of data from all 43 facilities showed that the use of computer-aided detection was associated with significantly lower overall accuracy than was nonuse (area under the ROC curve, 0.871 vs. 0.919; P = 0.005). 
Conclusions The use of computer-aided detection is associated with reduced accuracy of interpretation of screening mammograms. The increased rate of biopsy with the use of computer-aided detection is not clearly associated with improved detection of invasive breast cancer. PMID:17409321
Direct measurement of nonlocal entanglement of two-qubit spin quantum states.
Cheng, Liu-Yong; Yang, Guo-Hui; Guo, Qi; Wang, Hong-Fu; Zhang, Shou
2016-01-18
We propose efficient schemes for the direct measurement of concurrence for two-qubit spin and photon-polarization entangled states via the interaction between single-photon pulses and nitrogen-vacancy (NV) centers in diamond embedded in optical microcavities. For different entangled-state types, diversified quantum devices and operations are designed accordingly. The initial unknown entangled states are possessed by two spatially separated participants, and nonlocal spin (polarization) entanglement can be measured with the aid of the detection probabilities of photon (NV-center) states. This non-demolition measurement approach prevents the initial entangled particle pair from being completely destroyed; instead, it evolves into the corresponding maximally entangled state. Moreover, no joint inter-qubit operation or global qubit readout is required for the presented schemes, and the final analyses indicate favorable performance under currently achievable laboratory parameters. The unique advantages of spin qubits ensure that our schemes have wide potential applications in spin-based solid-state quantum information and computation.
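The quantity these schemes measure is the concurrence; for a pure two-qubit state |psi> = a|00> + b|01> + c|10> + d|11> it reduces to C = 2|ad - bc|, independent of the optical readout itself:

```python
import numpy as np

# Numerical check of the pure-state two-qubit concurrence C = 2|ad - bc|,
# the entanglement measure targeted by the measurement schemes above.

def concurrence(a, b, c, d):
    return 2 * abs(a * d - b * c)

bell = 1 / np.sqrt(2)
print(round(concurrence(bell, 0, 0, bell), 6))  # 1.0: maximally entangled
print(concurrence(1, 0, 0, 0))                  # 0: product state
```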
Correlation Filters for Detection of Cellular Nuclei in Histopathology Images.
Ahmad, Asif; Asif, Amina; Rajpoot, Nasir; Arif, Muhammad; Minhas, Fayyaz Ul Amir Afsar
2017-11-21
Nuclei detection in histology images is an essential part of computer-aided diagnosis of cancers and tumors. It is a challenging task due to the diverse and complicated structures of cells. In this work, we present an automated technique for the detection of cellular nuclei in hematoxylin and eosin stained histopathology images. Our proposed approach is based on kernelized correlation filters. Correlation filters have been widely used in object detection and tracking applications, but their strength has not been explored in the medical imaging domain until now. Our experimental results show that the proposed scheme gives state-of-the-art accuracy and can learn complex nuclear morphologies. Like deep learning approaches, the proposed filters do not require engineering of image features, as they can operate directly on histopathology images without significant preprocessing. However, unlike deep learning methods, the large-margin correlation filters developed in this work are interpretable, computationally efficient, and do not require specialized or expensive computing hardware. A cloud-based webserver of the proposed method and its Python implementation can be accessed at the following URL: http://faculty.pieas.edu.pk/fayyaz/software.html#corehist .
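As background, a minimal single-template correlation filter in the frequency domain (a MOSSE-style closed form) is sketched below; the kernelization and large-margin training used in the paper are not reproduced, and all data are synthetic.

```python
import numpy as np

# Minimal frequency-domain correlation filter: train a filter that maps a
# template to a desired Gaussian response peaked at the object center, then
# locate the object in a new image from the response peak. Illustrative of
# the detection principle only.

def train_filter(template, target_response, eps=1e-4):
    F = np.fft.fft2(template)
    G = np.fft.fft2(target_response)
    return G * np.conj(F) / (F * np.conj(F) + eps)   # H* in MOSSE notation

def correlate(image, H):
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

size = 32
yy, xx = np.mgrid[0:size, 0:size]
blob = np.exp(-((yy - 16) ** 2 + (xx - 16) ** 2) / 18.0)   # the "nucleus"
target = np.exp(-((yy - 16) ** 2 + (xx - 16) ** 2) / 4.0)  # desired response

H = train_filter(blob, target)
shifted = np.roll(np.roll(blob, 5, axis=0), -3, axis=1)    # nucleus moved
resp = correlate(shifted, H)
peak = np.unravel_index(resp.argmax(), resp.shape)
print(peak)  # the response peak follows the shifted nucleus (near 21, 13)
```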
Hybrid detection of lung nodules on CT scan images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Lin; Tan, Yongqiang; Schwartz, Lawrence H.
Purpose: The diversity of lung nodules poses difficulty for current computer-aided diagnostic (CAD) schemes for lung nodule detection on computed tomography (CT) scan images, especially in large-scale CT screening studies. We proposed a novel CAD scheme based on a hybrid method to address the challenges of detecting diverse lung nodules. Methods: The hybrid method proposed in this paper integrates several existing and widely used algorithms in the field of nodule detection, including morphological operation, dot enhancement based on the Hessian matrix, fuzzy connectedness segmentation, the local density maximum algorithm, geodesic distance maps, and regression tree classification. All of the adopted algorithms were organized into tree structures with multiple nodes. Each node in the tree structure aimed to deal with one type of lung nodule. Results: The method has been evaluated on 294 CT scans from the Lung Image Database Consortium (LIDC) dataset. The CT scans were randomly divided into two independent subsets: a training set (196 scans) and a test set (98 scans). In total, the 294 CT scans contained 631 lung nodules, which were annotated by at least two radiologists participating in the LIDC project. The sensitivity and false-positive rate for the training set were 87% and 2.61 per scan, respectively; for the test set, they were 85.2% and 3.13 per scan. Conclusions: The proposed hybrid method yielded high performance on the evaluation dataset and exhibits advantages over existing CAD schemes. We believe that the present method would be useful for a wide variety of CT imaging protocols used in both routine diagnosis and screening studies.
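One node of the hybrid tree, Hessian-based dot enhancement, can be sketched as follows; the smoothing scale is omitted and the enhancement formula is one common choice (a Li-style dot measure), not necessarily the paper's exact filter.

```python
import numpy as np

# Sketch of Hessian-based dot enhancement: at a bright nodule center both
# Hessian eigenvalues are strongly negative and of similar magnitude,
# whereas a line-like vessel has one eigenvalue near zero. The enhancement
# formula below (|smaller eig|^2 / |larger eig| when both are negative) is
# a common choice, assumed here for illustration.

def hessian_dot_measure(image):
    gy, gx = np.gradient(image)
    gyy, gyx = np.gradient(gy)
    gxy, gxx = np.gradient(gx)
    tr = gxx + gyy
    det = gxx * gyy - gxy * gyx
    disc = np.sqrt(np.maximum(tr ** 2 - 4 * det, 0))
    l1, l2 = (tr + disc) / 2, (tr - disc) / 2          # l1 >= l2
    big = np.where(np.abs(l1) >= np.abs(l2), l1, l2)   # larger |eigenvalue|
    small = np.where(np.abs(l1) >= np.abs(l2), l2, l1)
    return np.where((l1 < 0) & (l2 < 0),
                    small ** 2 / np.abs(big).clip(1e-9), 0.0)

yy, xx = np.mgrid[0:64, 0:64].astype(float)
nodule = np.exp(-((yy - 20) ** 2 + (xx - 20) ** 2) / 20.0)   # dot-like
vessel = np.exp(-((xx - 45) ** 2) / 20.0)                    # line-like
dot = hessian_dot_measure(nodule + vessel)
peak = np.unravel_index(dot.argmax(), dot.shape)
print(peak)  # the measure peaks at the nodule, not along the vessel
```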
Kai, Chiharu; Uchiyama, Yoshikazu; Shiraishi, Junji; Fujita, Hiroshi; Doi, Kunio
2018-05-10
In the post-genome era, a novel research field, 'radiomics', has been developed to offer a new viewpoint for the use of genotypes in radiology and medical research, which has traditionally focused on the analysis of imaging phenotypes. The present study analyzed brain morphological changes related to an individual's genotype. Our data consisted of magnetic resonance (MR) images of patients with mild cognitive impairment (MCI) and Alzheimer's disease (AD), as well as their apolipoprotein E (APOE) genotypes. First, statistical parametric mapping (SPM) 12 was used for three-dimensional anatomical standardization of the brain MR images. A total of 30 normal images were used to create a standard normal brain image. Z-score maps were generated to identify the differences between an abnormal image and the standard normal brain. Our experimental results revealed that cerebral atrophies, depending on genotype, can occur in different locations and that morphological changes may differ between MCI and AD. Using a classifier to characterize cerebral atrophies related to an individual's genotype, we developed a computer-aided diagnosis (CAD) scheme to identify the disease. For the early detection of cerebral diseases, a screening program using MR images, called Brain Check-up, is widely performed in Japan. Our proposed CAD scheme could therefore be used in Brain Check-up.
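The Z-score mapping step can be sketched directly from its definition, z = (patient - mean_normal) / sd_normal computed voxel-wise over the standardized normal database; shapes and values below are synthetic stand-ins for standardized brain MR volumes.

```python
import numpy as np

# Sketch of voxel-wise Z-score mapping against a standardized normal
# database, as used to highlight atrophy after anatomical standardization.
# Volume sizes and intensities are synthetic.

rng = np.random.default_rng(2)
normals = rng.normal(100, 5, (30, 8, 8, 8))    # 30 standardized normal scans
mean = normals.mean(axis=0)
sd = normals.std(axis=0, ddof=1).clip(1e-6)

patient = rng.normal(100, 5, (8, 8, 8))
patient[2:4, 2:4, 2:4] -= 40                   # simulated focal atrophy

z = (patient - mean) / sd
print(bool((np.abs(z[2:4, 2:4, 2:4]) > 3).all()))  # True: atrophy flagged
print(float(np.abs(z).mean()) < 2)                 # True: quiet elsewhere
```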
Symmetry-based detection and diagnosis of DCIS in breast MRI
NASA Astrophysics Data System (ADS)
Srikantha, Abhilash; Harz, Markus T.; Newstead, Gillian; Wang, Lei; Platel, Bram; Hegenscheid, Katrin; Mann, Ritse M.; Hahn, Horst K.; Peitgen, Heinz-Otto
2013-02-01
The delineation and diagnosis of non-mass-like lesions, most notably DCIS (ductal carcinoma in situ), is among the most challenging tasks in breast MRI reading. Even for human observers, DCIS is not always easy to differentiate from patterns of active parenchymal enhancement or from benign alterations of breast tissue. In this light, it is no surprise that CADe/CADx approaches often completely fail to classify DCIS. Of the several approaches that have tried to devise such computer aid, none achieve performances similar to mass detection and classification in terms of sensitivity and specificity. In our contribution, we show a novel approach to combine a newly proposed metric of anatomical breast symmetry calculated on subtraction images of dynamic contrast-enhanced (DCE) breast MRI, descriptive kinetic parameters, and lesion candidate morphology to achieve performances comparable to computer-aided methods used for masses. We have based the development of the method on DCE MRI data of 18 DCIS cases with hand-annotated lesions, complemented by DCE-MRI data of nine normal cases. We propose a novel metric to quantify the symmetry of contralateral breasts and derive a strong indicator for potentially malignant changes from this metric. Also, we propose a novel metric for the orientation of a finding towards a fixed point (the nipple). Our combined scheme then achieves a sensitivity of 89% with a specificity of 78%, matching CAD results for breast MRI on masses. The processing pipeline is intended to run on a CAD server, hence we designed all processing to be automated and free of per-case parameters. We expect that the detection results of our proposed non-mass aimed algorithm will complement other CAD algorithms, or ideally be joined with them in a voting scheme.
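The abstract does not disclose the exact form of the proposed symmetry metric, but the underlying idea, comparing each breast against the mirrored contralateral side of the subtraction image and scoring the residual, can be sketched as follows. Everything here (mirroring across the midline, absolute difference, percentile score) is an illustrative assumption, not the authors' formula.

```python
import numpy as np

def asymmetry_score(left, right, pct=99):
    """Mirror the right-breast subtraction image onto the left side and
    score the residual difference; a large score suggests one-sided
    (potentially malignant) enhancement."""
    mirrored = right[:, ::-1]
    diff = np.abs(left - mirrored)
    return np.percentile(diff, pct), diff

# symmetric bilateral enhancement -> near-zero score
base = np.zeros((32, 32))
base[10:20, 5:15] = 1.0
score_sym, _ = asymmetry_score(base, base[:, ::-1])

# a one-sided "lesion" breaks the symmetry and drives the score up
lesion = base.copy()
lesion[22:26, 8:12] += 2.0
score_asym, diff = asymmetry_score(lesion, base[:, ::-1])
```

In a full pipeline this score would be one feature alongside kinetic and morphological descriptors rather than a detector on its own.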
A computerized scheme for lung nodule detection in multiprojection chest radiography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo Wei; Li Qiang; Boyce, Sarah J.
2012-04-15
Purpose: Our previous study indicated that multiprojection chest radiography could significantly improve radiologists' performance for lung nodule detection in clinical practice. In this study, the authors further verify that multiprojection chest radiography can greatly improve the performance of a computer-aided diagnostic (CAD) scheme. Methods: Our database consisted of 59 subjects, including 43 subjects with 45 nodules and 16 subjects without nodules. The 45 nodules included 7 real and 38 simulated ones. The authors developed a conventional CAD scheme and a new fusion CAD scheme to detect lung nodules. The conventional CAD scheme consisted of four steps for (1) identification of initial nodule candidates inside lungs, (2) nodule candidate segmentation based on dynamic programming, (3) extraction of 33 features from nodule candidates, and (4) false positive reduction using a piecewise linear classifier. The conventional CAD scheme processed each of the three projection images of a subject independently and discarded the correlation information between the three images. The fusion CAD scheme included the four steps in the conventional CAD scheme and two additional steps for (5) registration of all candidates in the three images of a subject, and (6) integration of correlation information between the registered candidates in the three images. The integration step retained all candidates detected at least twice in the three images of a subject and removed those detected only once in the three images as false positives. A leave-one-subject-out testing method was used for evaluation of the performance levels of the two CAD schemes. Results: At the sensitivities of 70%, 65%, and 60%, our conventional CAD scheme reported 14.7, 11.3, and 8.6 false positives per image, respectively, whereas our fusion CAD scheme reported 3.9, 1.9, and 1.2 false positives per image, and 5.5, 2.8, and 1.7 false positives per patient, respectively.
The low performance of the conventional CAD scheme may be attributed to the high noise level in chest radiography, and the small size and low contrast of most nodules. Conclusions: This study indicated that the fusion of correlation information in multiprojection chest radiography can markedly improve the performance of the CAD scheme for lung nodule detection.
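The integration step described above, keep a candidate only if it appears in at least two of the three registered projection images, amounts to a simple proximity-based voting rule. The sketch below assumes candidates are already registered into a common coordinate frame and uses a greedy distance grouping; the threshold and grouping strategy are illustrative, not the paper's exact procedure.

```python
import math

def fuse_candidates(views, dist=10.0):
    """Keep candidate groups detected in at least two of the views.

    views: list (one per projection image) of lists of (x, y) candidate
    positions, assumed already registered into a common frame.
    Returns the mean position of each retained group.
    """
    pts = [(v, x, y) for v, cands in enumerate(views) for (x, y) in cands]
    used = [False] * len(pts)
    kept = []
    for i, (vi, xi, yi) in enumerate(pts):
        if used[i]:
            continue
        group = [i]
        for j in range(i + 1, len(pts)):
            vj, xj, yj = pts[j]
            if not used[j] and math.hypot(xi - xj, yi - yj) <= dist:
                group.append(j)
        if len({pts[k][0] for k in group}) >= 2:   # seen in >= 2 views
            for k in group:
                used[k] = True
            kept.append((sum(pts[k][1] for k in group) / len(group),
                         sum(pts[k][2] for k in group) / len(group)))
    return kept

# one true nodule seen in views 0 and 1; a lone false positive in view 2
views = [[(10.0, 10.0)], [(12.0, 11.0)], [(80.0, 80.0)]]
fused = fuse_candidates(views)
```

The lone detection at (80, 80) is discarded as a false positive, which is exactly how the fusion scheme trades a small sensitivity cost for the large false-positive reduction reported in the Results.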
NASA Astrophysics Data System (ADS)
Qiu, Yuchen; Lu, Xianglan; Yan, Shiju; Tan, Maxine; Cheng, Samuel; Li, Shibo; Liu, Hong; Zheng, Bin
2016-03-01
Automated high throughput scanning microscopy is a fast developing screening technology used in cytogenetic laboratories for the diagnosis of leukemia or other genetic diseases. However, one of the major challenges of using this new technology is how to efficiently detect the analyzable metaphase chromosomes during the scanning process. The purpose of this investigation is to develop a computer aided detection (CAD) scheme based on deep learning technology, which can identify the metaphase chromosomes with high accuracy. The CAD scheme includes an eight-layer neural network. The first six layers form an automatic feature extraction module with an architecture of three convolution-max-pooling layer pairs. The 1st, 2nd, and 3rd pairs contain 30, 20, and 20 feature maps, respectively. The seventh and eighth layers form a multilayer perceptron (MLP) based classifier, which is used to identify the analyzable metaphase chromosomes. The performance of the new CAD scheme was assessed by the receiver operating characteristic (ROC) method. A total of 150 regions of interest (ROIs) were selected to test the performance of our new CAD scheme. Each ROI contains either an interphase cell or metaphase chromosomes. The results indicate that the new scheme is able to achieve an area under the ROC curve (AUC) of 0.886+/-0.043. This investigation demonstrates that applying a deep learning technique may significantly improve the accuracy of metaphase chromosome detection using a scanning microscopic imaging technology in the future.
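The AUC figure reported above can be computed without fitting any curve, using the rank-statistic (Mann-Whitney) identity: AUC equals the probability that a randomly chosen positive ROI scores higher than a randomly chosen negative one. A minimal sketch, with made-up scores rather than the study's classifier outputs:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (positive, negative) pairs the classifier ranks
    correctly, counting ties as half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            wins += 1.0 if p > n else (0.5 if p == n else 0.0)
    return wins / (len(scores_pos) * len(scores_neg))

perfect = auc([0.9, 0.8, 0.7], [0.1, 0.2, 0.3])   # fully separated scores
chance = auc([0.5, 0.5], [0.5, 0.5])              # indistinguishable scores
```

Fully separated scores give AUC = 1.0 and indistinguishable scores give 0.5, which brackets the 0.886 achieved by the CAD scheme.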
A High Order Finite Difference Scheme with Sharp Shock Resolution for the Euler Equations
NASA Technical Reports Server (NTRS)
Gerritsen, Margot; Olsson, Pelle
1996-01-01
We derive a high-order finite difference scheme for the Euler equations that satisfies a semi-discrete energy estimate, and present an efficient strategy for the treatment of discontinuities that leads to sharp shock resolution. The formulation of the semi-discrete energy estimate is based on a symmetrization of the Euler equations that preserves the homogeneity of the flux vector, a canonical splitting of the flux derivative vector, and the use of difference operators that satisfy a discrete analogue to the integration by parts procedure used in the continuous energy estimate. Around discontinuities or sharp gradients, refined grids are created on which the discrete equations are solved after adding a newly constructed artificial viscosity. The positioning of the sub-grids and computation of the viscosity are aided by a detection algorithm which is based on a multi-scale wavelet analysis of the pressure grid function. The wavelet theory provides easy to implement mathematical criteria to detect discontinuities, sharp gradients and spurious oscillations quickly and efficiently.
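The wavelet-based detection algorithm above rests on a simple property: detail coefficients of the pressure grid function are small wherever the solution is smooth and spike at discontinuities. A one-level Haar sketch illustrates the criterion; the threshold rule (a multiple of the median detail magnitude) is an illustrative choice, not the paper's exact multi-scale criterion.

```python
import numpy as np

def haar_step(s):
    """One level of the orthonormal Haar transform: pairwise averages
    (approximation) and pairwise differences (detail)."""
    s = np.asarray(s, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2.0)
    detail = (s[1::2] - s[0::2]) / np.sqrt(2.0)
    return approx, detail

# pressure-like grid function: smooth ramp with an embedded shock (jump)
x = np.linspace(0.0, 1.0, 128)
p = x.copy()
p[63:] += 5.0                      # discontinuity between samples 62 and 63
_, d1 = haar_step(p)

# flag cells whose detail magnitude dwarfs the typical (smooth) level
flagged = np.flatnonzero(np.abs(d1) > 10 * np.median(np.abs(d1)))
```

Only the pair straddling the jump is flagged; in the paper's scheme such flags steer both the sub-grid placement and the artificial-viscosity computation.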
Automatic rectum limit detection by anatomical markers correlation.
Namías, R; D'Amato, J P; del Fresno, M; Vénere, M
2014-06-01
Several diseases affect the final portion of the digestive system. Many of them can be diagnosed by means of different medical imaging modalities together with computer aided detection (CAD) systems. These CAD systems mainly focus on the complete segmentation of the digestive tube. However, the detection of limits between different sections could provide important information to these systems. In this paper we present an automatic method for detecting the rectum and sigmoid colon limit using a novel global curvature analysis over the centerline of the segmented digestive tube in different imaging modalities. The results are compared with the gold standard rectum upper limit through a validation scheme comprising two different anatomical markers: the third sacral vertebra and the average rectum length. Experimental results in both magnetic resonance imaging (MRI) and computed tomography colonography (CTC) acquisitions show the efficacy of the proposed strategy in automatic detection of rectum limits. The method is intended for application to rectum segmentation in MRI for geometrical modeling and as a contextual information source in virtual colonoscopies and CAD systems. Copyright © 2014 Elsevier Ltd. All rights reserved.
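A curvature analysis over a centerline, as used above, needs a discrete curvature estimate at each centerline point. One standard choice (an illustrative sketch here, not necessarily the authors' estimator) is the Menger curvature of consecutive point triplets, i.e., the reciprocal of the circumradius of each triangle:

```python
import math

def centerline_curvature(points):
    """Discrete (Menger) curvature at each interior point of a 2D
    polyline: 4 * triangle_area / (a * b * c) for consecutive triplets,
    which equals 1/R of the circumscribed circle."""
    curv = []
    for (x1, y1), (x2, y2), (x3, y3) in zip(points, points[1:], points[2:]):
        a = math.dist((x1, y1), (x2, y2))
        b = math.dist((x2, y2), (x3, y3))
        c = math.dist((x1, y1), (x3, y3))
        area2 = abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))  # 2*area
        curv.append(2.0 * area2 / (a * b * c) if a * b * c else 0.0)
    return curv

# points sampled from a circle of radius 5 -> curvature ~ 1/5 everywhere
circle = [(5 * math.cos(t / 10), 5 * math.sin(t / 10)) for t in range(5)]
k_circle = centerline_curvature(circle)
k_line = centerline_curvature([(0, 0), (1, 0), (2, 0)])   # straight -> 0
```

A global profile of such values along the centerline is then searched for the characteristic curvature pattern marking the rectum-sigmoid transition.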
Dictionary learning-based CT detection of pulmonary nodules
NASA Astrophysics Data System (ADS)
Wu, Panpan; Xia, Kewen; Zhang, Yanbo; Qian, Xiaohua; Wang, Ge; Yu, Hengyong
2016-10-01
Segmentation of lung features is one of the most important steps for computer-aided detection (CAD) of pulmonary nodules with computed tomography (CT). However, irregular shapes, complicated anatomical background and poor pulmonary nodule contrast make CAD a very challenging problem. Here, we propose a novel scheme for feature extraction and classification of pulmonary nodules through dictionary learning from training CT images, which does not require accurately segmented pulmonary nodules. Specifically, two classification-oriented dictionaries and one background dictionary are learnt to solve a two-category problem. In terms of the classification-oriented dictionaries, we calculate sparse coefficient matrices to extract intrinsic features for pulmonary nodule classification. The support vector machine (SVM) classifier is then designed to optimize the performance. Our proposed methodology is evaluated with the lung image database consortium and image database resource initiative (LIDC-IDRI) database, and the results demonstrate that the proposed strategy is promising.
NASA Astrophysics Data System (ADS)
Traverso, A.; Lopez Torres, E.; Fantacci, M. E.; Cerello, P.
2017-05-01
Lung cancer is one of the most lethal types of cancer, largely because it is rarely diagnosed early enough. In fact, the detection of pulmonary nodules, potential lung cancers, in Computed Tomography scans is a very challenging and time-consuming task for radiologists. To support radiologists, researchers have developed Computer-Aided Diagnosis (CAD) systems for the automated detection of pulmonary nodules in chest Computed Tomography scans. Despite the high level of technological development and the proven benefits on overall detection performance, the use of Computer-Aided Diagnosis in clinical practice is far from being a common procedure. In this paper we investigate the causes underlying this discrepancy and present a solution to tackle it: the M5L Web- and Cloud-based on-demand Computer-Aided Diagnosis. In addition, we show how combining traditional image processing techniques with state-of-the-art classification algorithms allows us to build a system whose performance can be much higher than that of any Computer-Aided Diagnosis developed so far. This outcome opens the possibility of using CAD as clinical decision support for radiologists.
Towards a Low-Cost Remote Memory Attestation for the Smart Grid
Yang, Xinyu; He, Xiaofei; Yu, Wei; Lin, Jie; Li, Rui; Yang, Qingyu; Song, Houbing
2015-01-01
In the smart grid, measurement devices may be compromised by adversaries, and their operations could be disrupted by attacks. A number of schemes to efficiently and accurately detect these compromised devices remotely have been proposed. Nonetheless, most of the existing schemes detecting compromised devices depend on the incremental response time in the attestation process, which are sensitive to data transmission delay and lead to high computation and network overhead. To address the issue, in this paper, we propose a low-cost remote memory attestation scheme (LRMA), which can efficiently and accurately detect compromised smart meters considering real-time network delay and achieve low computation and network overhead. In LRMA, the impact of real-time network delay on detecting compromised nodes can be eliminated via investigating the time differences reported from relay nodes. Furthermore, the attestation frequency in LRMA is dynamically adjusted with the compromised probability of each node, and then, the total number of attestations could be reduced while low computation and network overhead can be achieved. Through a combination of extensive theoretical analysis and evaluations, our data demonstrate that our proposed scheme can achieve better detection capacity and lower computation and network overhead in comparison to existing schemes. PMID:26307998
Levman, Jacob E D; Gallego-Ortiz, Cristina; Warner, Ellen; Causer, Petrina; Martel, Anne L
2016-02-01
Magnetic resonance imaging (MRI)-enabled cancer screening has been shown to be a highly sensitive method for the early detection of breast cancer. Computer-aided detection systems have the potential to improve the screening process by standardizing radiologists to a high level of diagnostic accuracy. This retrospective study was approved by the institutional review board of Sunnybrook Health Sciences Centre. This study compares the performance of a proposed method for computer-aided detection (based on the second-order spatial derivative of the relative signal intensity) with the signal enhancement ratio (SER) on MRI-based breast screening examinations. Comparison is performed using receiver operating characteristic (ROC) curve analysis as well as free-response receiver operating characteristic (FROC) curve analysis. A modified computer-aided detection system combining the proposed approach with the SER method is also presented. The proposed method provides improvements in the rates of false positive markings over the SER method in the detection of breast cancer (as assessed by FROC analysis). The modified computer-aided detection system that incorporates both the proposed method and the SER method yields ROC results equal to that produced by SER while simultaneously providing improvements over the SER method in terms of false positives per noncancerous exam. The proposed method for identifying malignancies outperforms the SER method in terms of false positives on a challenging dataset containing many small lesions and may play a useful role in breast cancer screening by MRI as part of a computer-aided detection system.
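The SER baseline referenced above is, in the DCE-MRI literature, conventionally defined from the pre-contrast, early post-contrast, and late post-contrast signal intensities; the sketch below uses that standard definition and illustrative cutoff interpretations, not this study's exact implementation.

```python
def signal_enhancement_ratio(s_pre, s_early, s_late, eps=1e-9):
    """Kinetic signal enhancement ratio: early enhancement relative to
    late enhancement. SER > 1 indicates washout kinetics (suspicious);
    SER well below 1 suggests persistent, more likely benign, enhancement."""
    return (s_early - s_pre) / (s_late - s_pre + eps)

# washout: strong early uptake that fades by the late phase
washout = signal_enhancement_ratio(100.0, 200.0, 150.0)
# persistent: enhancement that keeps rising into the late phase
persistent = signal_enhancement_ratio(100.0, 150.0, 200.0)
```

The proposed method instead scores voxels by the second-order spatial derivative of the relative signal intensity, so a combined system can use SER's kinetic evidence and the derivative map's spatial evidence together, which is what the modified detection system in the abstract does.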
Deep learning of contrast-coated serrated polyps for computer-aided detection in CT colonography
NASA Astrophysics Data System (ADS)
Näppi, Janne J.; Pickhardt, Perry; Kim, David H.; Hironaka, Toru; Yoshida, Hiroyuki
2017-03-01
Serrated polyps were previously believed to be benign lesions with no cancer potential. However, recent studies have revealed a novel molecular pathway by which serrated polyps, too, can develop into colorectal cancer. CT colonography (CTC) can detect serrated polyps using the radiomic biomarker of contrast coating, but this requires expertise from the reader, and current computer-aided detection (CADe) systems have not been designed to detect the contrast coating. The purpose of this study was to develop a novel CADe method that makes use of deep learning to detect serrated polyps based on their contrast-coating biomarker in CTC. In the method, volumetric shape-based features are used to detect polyp sites over soft-tissue and fecal-tagging surfaces of the colon. The detected sites are imaged using multi-angular 2D image patches. A deep convolutional neural network (DCNN) is used to review the image patches for the presence of polyps. The DCNN-based polyp-likelihood estimates are merged into an aggregate likelihood index where the highest values indicate the presence of a polyp. For pilot evaluation, the proposed DCNN-CADe method was evaluated with a 10-fold cross-validation scheme using 101 colonoscopy-confirmed cases with 144 biopsy-confirmed serrated polyps from a CTC screening program, where the patients had been prepared for CTC with saline laxative and fecal tagging by barium and iodine-based diatrizoate. The average per-polyp sensitivity for serrated polyps >=6 mm in size was 93+/-7% at 0.8+/-1.8 false positives per patient on average. The detection accuracy was substantially higher than that of a conventional CADe system. Our results indicate that serrated polyps can be detected automatically at high accuracy in CTC.
NASA Astrophysics Data System (ADS)
Lartizien, Carole; Marache-Francisco, Simon; Prost, Rémy
2012-02-01
Positron emission tomography (PET) using fluorine-18 deoxyglucose (18F-FDG) has become an increasingly recommended tool in clinical whole-body oncology imaging for the detection, diagnosis, and follow-up of many cancers. One way to improve the diagnostic utility of PET oncology imaging is to assist physicians facing difficult cases of residual or low-contrast lesions. This study aimed at evaluating different schemes of computer-aided detection (CADe) systems for the guided detection and localization of small and low-contrast lesions in PET. These systems are based on two supervised classifiers, linear discriminant analysis (LDA) and the nonlinear support vector machine (SVM). The image feature sets that serve as input data consisted of the coefficients of an undecimated wavelet transform. An optimization study was conducted to select the best combination of parameters for both the SVM and the LDA. Different false-positive reduction (FPR) methods were evaluated to reduce the number of false-positive detections per image (FPI). This includes the removal of small detected clusters and the combination of the LDA and SVM detection maps. The different CAD schemes were trained and evaluated based on a simulated whole-body PET image database containing 250 abnormal cases with 1230 lesions and 250 normal cases with no lesion. The detection performance was measured on a separate series of 25 testing images with 131 lesions. The combination of the LDA and SVM score maps was shown to produce very encouraging detection performance for both the lung lesions, with 91% sensitivity and 18 FPIs, and the liver lesions, with 94% sensitivity and 10 FPIs. Comparison with human performance indicated that the different CAD schemes significantly outperformed human detection sensitivities, especially regarding the low-contrast lesions.
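One of the false-positive reduction steps named above is the removal of small detected clusters from the binary detection map. A self-contained sketch of that step, using 4-connected flood fill (the connectivity and minimum size are illustrative choices, not the study's parameters):

```python
import numpy as np

def remove_small_clusters(mask, min_size):
    """Suppress detections whose 4-connected cluster has fewer than
    min_size pixels - a simple false-positive reduction step."""
    mask = np.asarray(mask, dtype=bool)
    out = np.zeros_like(mask)
    seen = np.zeros_like(mask)
    rows, cols = mask.shape
    for r0 in range(rows):
        for c0 in range(cols):
            if mask[r0, c0] and not seen[r0, c0]:
                stack, comp = [(r0, c0)], []
                seen[r0, c0] = True
                while stack:                       # flood fill one cluster
                    r, c = stack.pop()
                    comp.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < rows and 0 <= cc < cols
                                and mask[rr, cc] and not seen[rr, cc]):
                            seen[rr, cc] = True
                            stack.append((rr, cc))
                if len(comp) >= min_size:          # keep only large clusters
                    for r, c in comp:
                        out[r, c] = True
    return out

det = np.zeros((8, 8), dtype=bool)
det[1, 1] = True          # isolated single-pixel detection (likely noise)
det[4:6, 3:6] = True      # 6-pixel cluster (plausible lesion)
cleaned = remove_small_clusters(det, min_size=3)
```

The same idea extends to 3D detection maps by adding the through-slice neighbors; combining it with the LDA/SVM map intersection gives the two-pronged FPR strategy evaluated in the study.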
Development of a fully automatic scheme for detection of masses in whole breast ultrasound images.
Ikedo, Yuji; Fukuoka, Daisuke; Hara, Takeshi; Fujita, Hiroshi; Takada, Etsuo; Endo, Tokiko; Morita, Takako
2007-11-01
Ultrasonography has been used for breast cancer screening in Japan. Screening using a conventional hand-held probe is operator dependent, and thus it is possible that some areas of the breast may not be scanned. To overcome such problems, a mechanical whole breast ultrasound (US) scanner has been proposed and developed for screening purposes. However, another issue is that radiologists might tire while interpreting all images in a large-volume screening; this increases the likelihood that masses may remain undetected. Therefore, the aim of this study is to develop a fully automatic scheme for the detection of masses in whole breast US images in order to assist the interpretations of radiologists and potentially improve the screening accuracy. The authors' database comprised 109 whole breast US images, which include 36 masses (16 malignant masses, 5 fibroadenomas, and 15 cysts). A whole breast US image with 84 slice images (interval between two slice images: 2 mm) was obtained by the ASU-1004 US scanner (ALOKA Co., Ltd., Japan). The feature based on the edge directions in each slice and a method for subtracting between the slice images were used for the detection of masses in the authors' proposed scheme. The Canny edge detector was applied to detect edges in US images; these edges were classified as near-vertical edges or near-horizontal edges using a morphological method. The positions of mass candidates were located using the near-vertical edges as a cue. Then, the located positions were segmented by the watershed algorithm, and mass candidate regions were detected using the segmented regions and the low-density regions extracted by the slice subtraction method. For the removal of false positives (FPs), rule-based schemes and a quadratic discriminant analysis were applied to discriminate between masses and FPs. As a result, the sensitivity of the authors' scheme for the detection of masses was 80.6% (29/36) with 3.8 FPs per whole breast image.
The authors' scheme for computer-aided detection may be useful in improving the screening performance and efficiency.
MIMO transmit scheme based on morphological perceptron with competitive learning.
Valente, Raul Ambrozio; Abrão, Taufik
2016-08-01
This paper proposes a new multi-input multi-output (MIMO) transmit scheme aided by an artificial neural network (ANN). The morphological perceptron with competitive learning (MP/CL) concept is deployed as a decision rule in the MIMO detection stage. The proposed MIMO transmission scheme is able to achieve double spectral efficiency; hence, in each time-slot the receiver decodes two symbols at a time instead of one as in the Alamouti scheme. Another advantage of the proposed transmit scheme with the MP/CL-aided detector is its polynomial complexity in the modulation order, which becomes linear when the data stream length is greater than the modulation order. The performance of the proposed scheme is compared to that of traditional MIMO schemes, namely the Alamouti scheme and the maximum-likelihood MIMO (ML-MIMO) detector. Also, the proposed scheme is evaluated in a scenario with variable channel information along the frame. Numerical results have shown that the diversity gain of the space-time coding Alamouti scheme is partially lost, which slightly degrades the bit-error rate (BER) performance of the proposed MP/CL-NN MIMO scheme. Copyright © 2016 Elsevier Ltd. All rights reserved.
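The Alamouti baseline referenced above is a standard 2x1 space-time block code: two symbols are sent over two time slots and recovered by linear combining with diversity gain |h1|^2 + |h2|^2. A minimal noiseless sketch of that baseline (not of the proposed MP/CL scheme, which the abstract does not specify in detail):

```python
import numpy as np

def alamouti_encode(s1, s2):
    """Rows are time slots, columns are transmit antennas:
    slot 1 -> (s1, s2); slot 2 -> (-s2*, s1*)."""
    return np.array([[s1, s2], [-np.conj(s2), np.conj(s1)]])

def alamouti_combine(y1, y2, h1, h2):
    """Linear combining that restores s1, s2 with diversity gain
    |h1|^2 + |h2|^2 (flat channel, constant over both slots)."""
    g = abs(h1) ** 2 + abs(h2) ** 2
    s1_hat = (np.conj(h1) * y1 + h2 * np.conj(y2)) / g
    s2_hat = (np.conj(h2) * y1 - h1 * np.conj(y2)) / g
    return s1_hat, s2_hat

h1, h2 = 0.5 + 0.2j, -0.3 + 0.9j                    # channel coefficients
s1, s2 = (1 + 1j) / np.sqrt(2), (-1 + 1j) / np.sqrt(2)
X = alamouti_encode(s1, s2)
y1 = h1 * X[0, 0] + h2 * X[0, 1]                    # received, slot 1
y2 = h1 * X[1, 0] + h2 * X[1, 1]                    # received, slot 2
est1, est2 = alamouti_combine(y1, y2, h1, h2)
```

Note that Alamouti spends two slots on two symbols (rate one); the proposed scheme's "double spectral efficiency" claim means decoding two symbols per slot, at the cost of the partial diversity loss the numerical results describe.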
NASA Astrophysics Data System (ADS)
Park, Sang Cheol; Zheng, Bin; Wang, Xiao-Hui; Gur, David
2008-03-01
Digital breast tomosynthesis (DBT) has emerged as a promising imaging modality for screening mammography. However, visually detecting micro-calcification clusters depicted on DBT images is a difficult task. Computer-aided detection (CAD) schemes for detecting micro-calcification clusters depicted on mammograms can achieve high performance and the use of CAD results can assist radiologists in detecting subtle micro-calcification clusters. In this study, we compared the performance of an available 2D based CAD scheme with one that includes a new grouping and scoring method when applied to both projection and reconstructed DBT images. We selected a dataset involving 96 DBT examinations acquired on 45 women. Each DBT image set included 11 low dose projection images and a varying number of reconstructed image slices ranging from 18 to 87. In this dataset 20 true-positive micro-calcification clusters were visually detected on the projection images and 40 were visually detected on the reconstructed images, respectively. We first applied the CAD scheme that was previously developed in our laboratory to the DBT dataset. We then tested a new grouping method that defines an independent cluster by grouping the same cluster detected on different projection or reconstructed images. We then compared four scoring methods to assess the CAD performance. The maximum sensitivity level observed for the different grouping and scoring methods were 70% and 88% for the projection and reconstructed images with a maximum false-positive rate of 4.0 and 15.9 per examination, respectively. 
This preliminary study demonstrates that (1) among the maximum, the minimum, or the average of the CAD-generated scores, using the maximum score of the grouped cluster regions achieved the highest performance level, (2) the histogram-based scoring method is reasonably effective in reducing false-positive detections on the projection images, but the overall CAD sensitivity is lower due to the lower signal-to-noise ratio, and (3) CAD achieved a higher sensitivity and a higher false-positive rate (per examination) on the reconstructed images. We concluded that without changing the detection threshold or performing pre-filtering to possibly increase detection sensitivity, current CAD schemes developed and optimized for 2D mammograms perform relatively poorly and need to be re-optimized using DBT datasets, and new grouping and scoring methods need to be incorporated into the schemes if these are to be used on DBT examinations.
NASA Astrophysics Data System (ADS)
Heidari, Morteza; Zargari Khuzani, Abolfazl; Danala, Gopichandh; Qiu, Yuchen; Zheng, Bin
2018-02-01
The objective of this study is to develop and test a new computer-aided detection (CAD) scheme with improved region of interest (ROI) segmentation combined with an image feature extraction framework to improve performance in predicting short-term breast cancer risk. A dataset involving 570 sets of "prior" negative mammography screening cases was retrospectively assembled. In the next sequential "current" screening, 285 cases were positive and 285 cases remained negative. A CAD scheme was applied to all 570 "prior" negative images to stratify cases into high and low risk groups for having cancer detected in the "current" screening. First, a new ROI segmentation algorithm was used to automatically remove non-informative areas of the mammograms. Second, from the matched bilateral craniocaudal view images, a set of 43 image features related to the frequency characteristics of the ROIs was initially computed from the discrete cosine transform and the spatial domain of the images. Third, a support vector machine based machine learning classifier was used to optimally classify the selected optimal image features to build a CAD-based risk prediction model. The classifier was trained using a leave-one-case-out cross-validation method. Applying this improved CAD scheme to the testing dataset yielded an area under the ROC curve, AUC = 0.70+/-0.04, which was significantly higher than extracting features directly from the dataset without the improved ROI segmentation step (AUC = 0.63+/-0.04). This study demonstrated that the proposed approach could improve accuracy in predicting short-term breast cancer risk, which may play an important role in helping eventually establish an optimal personalized breast cancer screening paradigm.
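Frequency-domain features of the kind described above are commonly taken from the low-frequency corner of the 2D discrete cosine transform of each ROI. The sketch below builds an orthonormal DCT-II by matrix multiplication for a square ROI; the block size and the 2x2 coefficient selection are illustrative, not the study's 43-feature definition.

```python
import numpy as np

def dct2(block):
    """Orthonormal 2D DCT-II of a square block via matrix multiplication."""
    N = block.shape[0]
    k = np.arange(N)[:, None]
    n = np.arange(N)[None, :]
    C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n + 1) * k / (2 * N))
    C[0, :] = np.sqrt(1.0 / N)          # DC row has its own normalization
    return C @ block @ C.T

def frequency_features(roi, k=4):
    """Keep the k x k lowest-frequency DCT coefficients as a feature vector."""
    return dct2(roi)[:k, :k].ravel()

roi = 2.0 * np.ones((8, 8))             # a flat ROI: all energy in the DC term
feats = frequency_features(roi, k=2)
```

For a flat 8x8 block of value 2 the orthonormal DCT concentrates all energy in the DC coefficient (2 * 8 = 16); in real ROIs the remaining low-frequency coefficients encode the coarse density pattern that the SVM classifier learns from.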
A hybrid deep learning approach to predict malignancy of breast lesions using mammograms
NASA Astrophysics Data System (ADS)
Wang, Yunzhi; Heidari, Morteza; Mirniaharikandehei, Seyedehnafiseh; Gong, Jing; Qian, Wei; Qiu, Yuchen; Zheng, Bin
2018-03-01
Applying deep learning technology to the medical imaging informatics field has recently been attracting extensive research interest. However, the limited size of medical image datasets often reduces the performance and robustness of deep learning based computer-aided detection and/or diagnosis (CAD) schemes. In an attempt to address this technical challenge, this study aims to develop and evaluate a new hybrid deep learning based CAD approach to predict the likelihood that a breast lesion detected on a mammogram is malignant. In this approach, a deep Convolutional Neural Network (CNN) was first pre-trained using the ImageNet dataset and served as a feature extractor. A pseudo-color Region of Interest (ROI) method was used to generate ROIs with RGB channels from the mammographic images as the input to the pre-trained deep network. The transferred CNN features from different layers of the CNN were then obtained, and a linear support vector machine (SVM) was trained for the prediction task. Applying this to a dataset involving 301 suspicious breast lesions and using a leave-one-case-out validation method, the areas under the ROC curve (AUC) were 0.762 and 0.792 for the traditional CAD scheme and the proposed deep learning based CAD scheme, respectively. An ensemble classifier that combines the classification scores generated by the two schemes yielded an improved AUC value of 0.813. The study results demonstrated the feasibility and potentially improved performance of applying a new hybrid deep learning approach to develop a CAD scheme using a relatively small dataset of medical images.
Solution of 3-dimensional time-dependent viscous flows. Part 2: Development of the computer code
NASA Technical Reports Server (NTRS)
Weinberg, B. C.; Mcdonald, H.
1980-01-01
There is considerable interest in developing a numerical scheme for solving the time dependent viscous compressible three dimensional flow equations to aid in the design of helicopter rotors. The development of a computer code to solve a three dimensional unsteady approximate form of the Navier-Stokes equations employing a linearized block implicit technique in conjunction with a QR operator scheme is described. Results of calculations of several Cartesian test cases are presented. The computer code can be applied to more complex flow fields such as those encountered on rotating airfoils.
NASA Astrophysics Data System (ADS)
Qiu, Yuchen; Lu, Xianglan; Tan, Maxine; Li, Shibo; Liu, Hong; Zheng, Bin
2015-03-01
The purpose of this study is to investigate the feasibility of applying an automatic interphase FISH cell analysis method for detecting residual malignancy in post-chemotherapy leukemia patients. In the experiment, two clinical specimens with translocations between chromosomes No. 9 and 22 or No. 11 and 14 were selected from patients who underwent leukemia diagnosis and treatment. The entire slide of each specimen was first digitized by a commercial fluorescent microscope using a 40× objective lens. Then, the scanned images were processed by a computer-aided detection (CAD) scheme to identify the analyzable FISH cells, which is accomplished by applying a series of features including the region size, Brenner gradient, and maximum intensity. For each identified cell, the scheme detected and counted the number of FISH signal dots inside the nucleus, using adaptive thresholds on the region size and the distance between the labeled FISH dots. The results showed that the new CAD scheme detected 8093 and 6675 suspicious regions of interest (ROIs) in the two specimens, among which 4546 and 3807 ROIs contained analyzable interphase FISH cells. In these analyzable ROIs, CAD selected 334 and 405 residual malignant cancer cells, substantially more than the numbers visually detected in a cytogenetic laboratory of our medical center (334 vs. 122, 405 vs. 160). This investigation indicates that an automatic interphase FISH cell scanning and CAD method has the potential to improve the accuracy and efficiency of prognostic assessment for leukemia and other genetically related cancer patients in the future.
Decision-aided ICI mitigation with time-domain average approximation in CO-OFDM
NASA Astrophysics Data System (ADS)
Ren, Hongliang; Cai, Jiaxing; Ye, Xin; Lu, Jin; Cao, Quanjun; Guo, Shuqin; Xue, Lin-lin; Qin, Yali; Hu, Weisheng
2015-07-01
We introduce and investigate the feasibility of a novel iterative blind phase-noise inter-carrier interference (ICI) mitigation scheme for coherent optical orthogonal frequency division multiplexing (CO-OFDM) systems. The scheme combines frequency-domain symbol-decision-aided estimation with a time-average approximation of the ICI phase noise. An additional initial decision step with a suitable threshold is introduced to suppress decision-error symbols. The proposed scheme proved effective in removing ICI in a simulated CO-OFDM system with a 16-QAM modulation format. At the cost of slightly higher computational complexity, it outperforms the time-domain average blind ICI (Avg-BL-ICI) algorithm at relatively wide laser linewidths and high OSNR.
Liu, Jiamin; Kabadi, Suraj; Van Uitert, Robert; Petrick, Nicholas; Deriche, Rachid; Summers, Ronald M.
2011-01-01
Purpose: Surface curvatures are important geometric features for the computer-aided analysis and detection of polyps in CT colonography (CTC). However, the general kernel approach for curvature computation can yield erroneous results for small polyps and for polyps that lie on haustral folds. These erroneous curvatures reduce the performance of polyp detection. This paper presents an analysis of the effect of interpolation on curvature estimation for thin structures and its application to computer-aided detection of small polyps in CTC. Methods: The authors demonstrated that a simple technique, image interpolation, can improve the accuracy of curvature estimation for thin structures and thus significantly improve the sensitivity of small polyp detection in CTC. Results: Our experiments showed that the merits of interpolation included more accurate curvature values for simulated data and isolation of polyps near folds for clinical data. Testing on a large clinical data set showed that linear, quadratic B-spline, and cubic B-spline interpolations all significantly improved the sensitivity of small polyp detection. Conclusions: Image interpolation can improve the accuracy of curvature estimation for thin structures and thus improve the computer-aided detection of small polyps in CTC. PMID:21859029
A novel computer-aided detection system for pulmonary nodule identification in CT images
NASA Astrophysics Data System (ADS)
Han, Hao; Li, Lihong; Wang, Huafeng; Zhang, Hao; Moore, William; Liang, Zhengrong
2014-03-01
Computer-aided detection (CADe) of pulmonary nodules from computed tomography (CT) scans is critical for assisting radiologists in identifying lung lesions at an early stage. In this paper, we propose a novel approach for CADe of lung nodules using a two-stage vector quantization (VQ) scheme. The first-stage VQ aims to extract the lungs from the chest volume, while the second-stage VQ is designed to extract initial nodule candidates (INCs) within the lung volume. Rule-based expert filtering is then employed to prune obvious false positives (FPs) from the INCs, and the commonly used support vector machine (SVM) classifier is adopted to further reduce the FPs. The proposed system was validated on 100 CT scans randomly selected from the 262 scans that have at least one juxta-pleural nodule annotation in the publicly available Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI) database. The two-stage VQ missed only 2 of the 207 nodules at agreement level 1, and INC detection took about 30 seconds per scan on average. Expert filtering reduced FPs by more than a factor of 18 while maintaining a sensitivity of 93.24%. As it is trivial to distinguish INCs attached to the pleural wall from those that are not, we investigated the feasibility of training different SVM classifiers to further reduce FPs from these two kinds of INCs. Experimental results indicated that SVM classification over the entire set of INCs was preferable, with the optimal operating point of our CADe system achieving a sensitivity of 89.4% at a specificity of 86.8%.
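The vector quantization at the heart of the two-stage scheme can be illustrated with a one-dimensional Lloyd (k-means style) iteration that separates lung-like from tissue-like intensities. A toy sketch under assumed Hounsfield-like values, not the paper's actual VQ design:

```python
def vector_quantize(values, codebook, iterations=20):
    """One-dimensional Lloyd (k-means) vector quantization: assign each
    value to its nearest codeword, then move each codeword to the mean
    of its assigned values."""
    codebook = list(codebook)
    for _ in range(iterations):
        clusters = [[] for _ in codebook]
        for v in values:
            idx = min(range(len(codebook)), key=lambda i: abs(v - codebook[i]))
            clusters[idx].append(v)
        for i, c in enumerate(clusters):
            if c:
                codebook[i] = sum(c) / len(c)
    return codebook

# Toy CT-like intensities: lung/air around -800 HU, soft tissue around 40 HU.
voxels = [-820, -790, -805, -770, 30, 45, 60, 20]
print(vector_quantize(voxels, [-1000, 0]))  # [-796.25, 38.75]
```

The first-stage VQ would use such codewords to label voxels as lung versus non-lung; the second stage repeats the idea inside the lung volume to isolate candidate structures.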
NASA Technical Reports Server (NTRS)
Mathur, F. P.
1972-01-01
Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation), which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially, CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. Under program control, these equations are interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The model is then supplied with ground instances of its variables and evaluated to generate values for the reliability-theoretic functions applied to it.
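The kind of basic redundancy equation CARE stores can be illustrated with textbook reliability formulas, for example series combination and triple modular redundancy (TMR). A minimal sketch, not CARE's actual code:

```python
def r_series(*modules):
    """Series system reliability: all modules must survive."""
    r = 1.0
    for m in modules:
        r *= m
    return r

def r_tmr(r):
    """Triple modular redundancy with a perfect voter:
    at least 2 of 3 identical modules must survive."""
    return 3 * r**2 - 2 * r**3

# TMR improves on a single module whenever r > 0.5.
print(round(r_tmr(0.9), 3))           # 0.972
print(round(r_series(0.9, 0.95), 3))  # 0.855
```

Interrelating such equations (e.g., a series chain of TMR stages) is exactly the kind of model composition the abstract describes, here reduced to function composition: `r_series(r_tmr(0.9), r_tmr(0.9))`.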
The Use of Computers to Aid the Teaching of Creative Writing.
ERIC Educational Resources Information Center
Sharples, Mike
1983-01-01
An analysis of the writing process is followed by a description of programs used in a computer-based creative writing scheme developed at Edinburgh University. An account of a project to study the program's effect on the creative writing of 11-year-old pupils concludes the article. (EAO)
Chen, Sheng; Yao, Liping; Chen, Bao
2016-11-01
The enhancement of lung nodules in chest radiographs (CXRs) plays an important role in both manual and computer-aided detection (CADe) of lung cancer. In this paper, we propose a parameterized logarithmic image processing (PLIP) method combined with a Laplacian of Gaussian (LoG) filter to enhance lung nodules in CXRs. We first applied several LoG filters with varying parameters to an original CXR to enhance nodule-like structures as well as edges in the image. We then applied the PLIP model, which enhances lung nodule images with high contrast and is beneficial for extracting effective features for nodule detection in a CADe scheme. Our method thus combines the advantages of both the PLIP and LoG algorithms. To evaluate our nodule enhancement method, we tested a CADe scheme with relatively high nodule detection performance on a publicly available database containing 140 nodules in 140 CXRs enhanced by our method. The CADe scheme attained sensitivities of 81% and 70% at an average of 5.0 and 2.0 false positives (FPs) per image, respectively, in a leave-one-out cross-validation test. By contrast, the CADe scheme based on the original images recorded sensitivities of 77% and 63% at 5.0 and 2.0 FPs per image, respectively. We introduced a measure of enhancement based on entropy evaluation to objectively assess our method. Experimental results show that the proposed method achieves effective enhancement of lung nodules in CXRs for both radiologists and CADe schemes.
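A LoG filter of the kind applied here can be sketched by sampling the analytic Laplacian-of-Gaussian and normalizing the kernel to zero mean, so flat regions give no response while blob-like structures respond strongly. A simplified illustration with assumed parameters, not the authors' filter bank:

```python
import math

def log_kernel(sigma, radius):
    """Sampled Laplacian-of-Gaussian kernel, shifted to zero mean so a
    constant image yields zero response."""
    size = 2 * radius + 1
    k = [[((x * x + y * y - 2 * sigma * sigma) / sigma**4)
          * math.exp(-(x * x + y * y) / (2 * sigma * sigma))
          for x in range(-radius, radius + 1)]
         for y in range(-radius, radius + 1)]
    mean = sum(map(sum, k)) / size**2
    return [[v - mean for v in row] for row in k]

def convolve_at(image, kernel, r, c, radius):
    """Kernel response at one interior pixel (no padding)."""
    return sum(kernel[i][j] * image[r - radius + i][c - radius + j]
               for i in range(2 * radius + 1) for j in range(2 * radius + 1))

# A bright blob produces a strong (negative) LoG response at its center,
# while a flat background produces essentially none.
img = [[0] * 9 for _ in range(9)]
img[4][4] = 100
k = log_kernel(sigma=1.4, radius=2)
blob_response = convolve_at(img, k, 4, 4, 2)
flat_response = convolve_at([[7] * 9] * 9, k, 4, 4, 2)
print(blob_response < 0, abs(flat_response) < 1e-6)  # True True
```

Varying `sigma` tunes the filter to different nodule sizes, which is the role of the "varying parameters" in the abstract.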
Statistical process control based chart for information systems security
NASA Astrophysics Data System (ADS)
Khan, Mansoor S.; Cui, Lirong
2015-07-01
Intrusion detection systems have a highly significant role in securing computer networks and information systems. To assure the reliability and quality of computer networks and information systems, it is highly desirable to develop techniques that detect intrusions into information systems. We apply the concept of statistical process control (SPC) to intrusions in computer networks and information systems. In this article we propose an exponentially weighted moving average (EWMA) type quality monitoring scheme. Our proposed scheme has only one parameter, which differentiates it from past versions. We construct the control limits for the proposed scheme and investigate their effectiveness. We provide an industrial example for the sake of clarity for practitioners. We compare the proposed scheme with existing EWMA schemes and the p chart, and finally we provide some recommendations for future work.
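A standard EWMA chart updates z_i = λx_i + (1−λ)z_{i−1} and signals when z_i leaves μ0 ± Lσ√(λ/(2−λ)·(1−(1−λ)^{2i})). A minimal monitoring sketch on toy data (the textbook two-parameter chart, not the single-parameter scheme proposed in the article):

```python
import math

def ewma_chart(data, mu0, sigma, lam=0.2, L=3.0):
    """Standard EWMA monitoring: return the 1-based indices of samples
    whose EWMA statistic falls outside the time-varying control limits."""
    z, alarms = mu0, []
    for i, x in enumerate(data, start=1):
        z = lam * x + (1 - lam) * z
        half_width = L * sigma * math.sqrt(
            lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        if abs(z - mu0) > half_width:
            alarms.append(i)
    return alarms

# In-control measurements stay inside the limits; a sustained shift
# (e.g., anomalous traffic) trips the chart shortly after it begins.
normal = [0.1, -0.2, 0.05, 0.0, -0.1, 0.15]
attack = normal + [2.0, 2.2, 1.9, 2.1]
print(ewma_chart(normal, mu0=0.0, sigma=1.0))  # []
print(ewma_chart(attack, mu0=0.0, sigma=1.0))
```

The small λ gives the chart memory, which is what makes EWMA sensitive to the sustained small-to-moderate shifts typical of stealthy intrusions.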
Machine Learning in Ultrasound Computer-Aided Diagnostic Systems: A Survey
Zhang, Fan; Li, Xuelong
2018-01-01
Ultrasound imaging is one of the most common schemes for detecting diseases in clinical practice. It offers many advantages, such as safety, convenience, and low cost. However, reading ultrasound images is not easy. To support clinicians' diagnoses and reduce doctors' workload, many ultrasound computer-aided diagnosis (CAD) systems have been proposed. In recent years, the success of deep learning in image classification and segmentation has led more and more scholars to realize the potential performance improvement of applying deep learning in ultrasound CAD systems. This paper summarizes recent research on ultrasound CAD systems that utilize machine learning technology. The study divides ultrasound CAD systems into two categories: traditional systems that employ hand-crafted features, and deep learning systems. The major features and classifiers employed by traditional ultrasound CAD systems are introduced, and the newest deep learning applications are summarized. This paper will be useful for researchers who focus on ultrasound CAD systems. PMID:29687000
Machine Learning in Ultrasound Computer-Aided Diagnostic Systems: A Survey.
Huang, Qinghua; Zhang, Fan; Li, Xuelong
2018-01-01
Ultrasound imaging is one of the most common schemes for detecting diseases in clinical practice. It offers many advantages, such as safety, convenience, and low cost. However, reading ultrasound images is not easy. To support clinicians' diagnoses and reduce doctors' workload, many ultrasound computer-aided diagnosis (CAD) systems have been proposed. In recent years, the success of deep learning in image classification and segmentation has led more and more scholars to realize the potential performance improvement of applying deep learning in ultrasound CAD systems. This paper summarizes recent research on ultrasound CAD systems that utilize machine learning technology. The study divides ultrasound CAD systems into two categories: traditional systems that employ hand-crafted features, and deep learning systems. The major features and classifiers employed by traditional ultrasound CAD systems are introduced, and the newest deep learning applications are summarized. This paper will be useful for researchers who focus on ultrasound CAD systems.
Applying a CAD-generated imaging marker to assess short-term breast cancer risk
NASA Astrophysics Data System (ADS)
Mirniaharikandehei, Seyedehnafiseh; Zarafshani, Ali; Heidari, Morteza; Wang, Yunzhi; Aghaei, Faranak; Zheng, Bin
2018-02-01
Although whether using computer-aided detection (CAD) helps improve radiologists' performance in reading and interpreting mammograms remains controversial due to higher false-positive detection rates, the objective of this study is to investigate and test a new hypothesis that CAD-generated false positives, in particular the bilateral summation of false positives, constitute a potential imaging marker associated with short-term breast cancer risk. An image dataset of negative screening mammograms acquired from 1,044 women was retrospectively assembled. Each case involves four images: the craniocaudal (CC) and mediolateral oblique (MLO) views of the left and right breasts. In the subsequent mammography screening, 402 cases were positive for cancer and 642 remained negative. A CAD scheme was applied to process all "prior" negative mammograms, and several features were extracted, including detection seeds, the total number of false-positive regions, the average detection score, and the sum of detection scores in the CC and MLO view images. The features computed from the bilateral left- and right-breast images of either the CC or MLO view were then combined. To predict the likelihood of each testing case being positive in the subsequent screening, two logistic regression models were trained and tested using a leave-one-case-out cross-validation method. Data analysis demonstrated a maximum prediction accuracy with an area under the ROC curve of AUC = 0.65 ± 0.017 and a maximum adjusted odds ratio of 4.49 with a 95% confidence interval of [2.95, 6.83]. The results also illustrated an increasing trend between the adjusted odds ratio and the risk prediction scores (p < 0.01). Thus, the study showed that CAD-generated false positives might provide a new quantitative imaging marker to help assess short-term breast cancer risk.
A New Approach for Constructing Highly Stable High Order CESE Schemes
NASA Technical Reports Server (NTRS)
Chang, Sin-Chung
2010-01-01
A new approach is devised to construct high order CESE schemes which avoid the common shortcomings of traditional high order schemes, including: (a) susceptibility to computational instabilities; (b) computational inefficiency due to their local implicit nature (i.e., at each mesh point, a system of linear/nonlinear equations involving all the mesh variables associated with that mesh point must be solved); (c) use of large and elaborate stencils, which complicates boundary treatments and also makes efficient parallel computing much harder; (d) difficulties in applications involving complex geometries; and (e) use of problem-specific techniques which are needed to overcome stability problems but often cause undesirable side effects. In fact it will be shown that, with the aid of a conceptual leap, one can build from a given 2nd-order CESE scheme its 4th-, 6th-, 8th-,... order versions which have the same stencil and same stability conditions as the 2nd-order scheme, and also retain all other advantages of the latter scheme. A sketch of multidimensional extensions will also be provided.
Computing Support for Basic Research in Perception and Cognition
1988-12-07
hearing aids and cochlear implants, this suggests that certain types of proposed coding schemes, specifically those employing periodicity tuning ... developing a computer model of the interaction of declarative and procedural knowledge in skill acquisition. In the Visual Psychophysics Laboratory ... In the Psycholinguistics Laboratory a computer model of text comprehension and recall has been constructed and several experiments have been completed that verify basic
Hua, Kai-Lung; Hsu, Che-Hao; Hidayati, Shintami Chusnul; Cheng, Wen-Huang; Chen, Yu-Jen
2015-01-01
Lung cancer has a poor prognosis when not diagnosed early and unresectable lesions are present. The management of small lung nodules noted on computed tomography scans is controversial due to uncertain tumor characteristics. A conventional computer-aided diagnosis (CAD) scheme requires several image processing and pattern recognition steps to accomplish a quantitative tumor differentiation result. In such an ad hoc image analysis pipeline, every step depends heavily on the performance of the previous step. Accordingly, tuning the classification performance of a conventional CAD scheme is very complicated and arduous. Deep learning techniques, on the other hand, have the intrinsic advantage of automatic feature exploitation and seamless performance tuning. In this study, we attempted to simplify the image analysis pipeline of conventional CAD with deep learning techniques. Specifically, we introduced models of a deep belief network and a convolutional neural network in the context of nodule classification in computed tomography images. Two baseline methods with feature computing steps were implemented for comparison. The experimental results suggest that deep learning methods can achieve better discriminative results and hold promise in the CAD application domain. PMID:26346558
Hua, Kai-Lung; Hsu, Che-Hao; Hidayati, Shintami Chusnul; Cheng, Wen-Huang; Chen, Yu-Jen
2015-01-01
Lung cancer has a poor prognosis when not diagnosed early and unresectable lesions are present. The management of small lung nodules noted on computed tomography scans is controversial due to uncertain tumor characteristics. A conventional computer-aided diagnosis (CAD) scheme requires several image processing and pattern recognition steps to accomplish a quantitative tumor differentiation result. In such an ad hoc image analysis pipeline, every step depends heavily on the performance of the previous step. Accordingly, tuning the classification performance of a conventional CAD scheme is very complicated and arduous. Deep learning techniques, on the other hand, have the intrinsic advantage of automatic feature exploitation and seamless performance tuning. In this study, we attempted to simplify the image analysis pipeline of conventional CAD with deep learning techniques. Specifically, we introduced models of a deep belief network and a convolutional neural network in the context of nodule classification in computed tomography images. Two baseline methods with feature computing steps were implemented for comparison. The experimental results suggest that deep learning methods can achieve better discriminative results and hold promise in the CAD application domain.
[Detection of lung nodules. New opportunities in chest radiography].
Pötter-Lang, S; Schalekamp, S; Schaefer-Prokop, C; Uffmann, M
2014-05-01
Chest radiography still represents the most commonly performed X-ray examination because it is readily available, requires a low radiation dose, and is relatively inexpensive. However, as previously published, many initially undetected lung nodules are retrospectively visible in chest radiographs. Great improvements in detector technology, with increasing dose efficiency and improved contrast resolution, provide better image quality and reduced dose requirements. The dual-energy acquisition technique and advanced image processing methods (e.g., digital bone subtraction and temporal subtraction) reduce the anatomical background noise by suppressing overlapping structures in chest radiography. Computer-aided detection (CAD) schemes increase radiologists' awareness of suspicious areas. These advanced image processing methods show clear improvements in the detection of pulmonary nodules in chest radiography and strengthen the role of this method in comparison to 3D acquisition techniques such as computed tomography (CT). Many of these methods will probably be integrated into standard clinical practice in the near future. Digital software solutions offer advantages as they can be easily incorporated into radiology departments and are often more affordable compared to hardware solutions.
NASA Astrophysics Data System (ADS)
Jenuwine, Natalia M.; Mahesh, Sunny N.; Furst, Jacob D.; Raicu, Daniela S.
2018-02-01
Early detection of lung nodules from CT scans is key to improving lung cancer treatment, but poses a significant challenge for radiologists due to the high throughput required of them. Computer-Aided Detection (CADe) systems aim to automatically detect these nodules with computer algorithms, thus improving diagnosis. These systems typically use a candidate selection step, which identifies all objects that resemble nodules, followed by a machine learning classifier which separates true nodules from false positives. We create a CADe system that uses a 3D convolutional neural network (CNN) to detect nodules in CT scans without a candidate selection step. Using data from the LIDC database, we train a 3D CNN to analyze subvolumes from anywhere within a CT scan and output the probability that each subvolume contains a nodule. Once trained, we apply our CNN to detect nodules from entire scans, by systematically dividing the scan into overlapping subvolumes which we input into the CNN to obtain the corresponding probabilities. By enabling our network to process an entire scan, we expect to streamline the detection process while maintaining its effectiveness. Our results imply that with continued training using an iterative training scheme, the one-step approach has the potential to be highly effective.
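The systematic division of a scan into overlapping subvolumes can be sketched as a tiling of start indices, clamped so the final window still fits inside the volume. A minimal sketch with assumed window size and stride (not the authors' exact tiling; `subvolume_origins` is a hypothetical name):

```python
def subvolume_origins(shape, size, stride):
    """Start indices of overlapping cubic subvolumes covering a 3D scan,
    clamped so the last window along each axis still fits."""
    def axis_starts(extent):
        starts = list(range(0, max(extent - size, 0) + 1, stride))
        if starts[-1] != extent - size:
            starts.append(extent - size)  # clamp final window to the border
        return starts
    return [(z, y, x)
            for z in axis_starts(shape[0])
            for y in axis_starts(shape[1])
            for x in axis_starts(shape[2])]

# A 64^3 scan tiled with 32^3 windows at stride 16 gives 3 starts per axis.
origins = subvolume_origins((64, 64, 64), size=32, stride=16)
print(len(origins))  # 27
```

Each origin would index a subvolume fed to the CNN; the per-subvolume probabilities are then reassembled into scan-level detections.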
An Interaction of Screen Colour and Lesson Task in CAL
ERIC Educational Resources Information Center
Clariana, Roy B.
2004-01-01
Colour is a common feature in computer-aided learning (CAL), though the instructional effects of screen colour are not well understood. This investigation considers the effects of different CAL study tasks with feedback on posttest performance and on posttest memory of the lesson colour scheme. Graduate students (n=68) completed a computer-based…
Xu, X W; Doi, K; Kobayashi, T; MacMahon, H; Giger, M L
1997-09-01
Lung cancer is the leading cause of cancer deaths in men and women in the United States, with a 5-year survival rate of only about 13%. However, this survival rate can be improved to 47% if the disease is diagnosed and treated at an early stage. In this study, we developed an improved computer-aided diagnosis (CAD) scheme for the automated detection of lung nodules in digital chest images to assist radiologists, who may miss up to 30% of the actually positive cases in their daily practice. Two hundred PA chest radiographs, 100 normal and 100 abnormal, were used as the database for our study. The presence of nodules in the 100 abnormal cases was confirmed by two experienced radiologists on the basis of CT scans or radiographic follow-up. In our CAD scheme, nodule candidates were selected initially by multiple gray-level thresholding of the difference image (which corresponds to the subtraction of a signal-enhanced image and a signal-suppressed image) and then classified into six groups. A large number of false positives were eliminated by adaptive rule-based tests and an artificial neural network (ANN). The CAD scheme achieved, on average, a sensitivity of 70% with 1.7 false positives per chest image, a performance substantially better than that reported in other studies. The CPU time for processing one chest image was about 20 seconds on an IBM RISC/6000 Powerstation 590. We believe that the CAD scheme with its current performance is ready for initial clinical evaluation.
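The multiple gray-level thresholding of the difference image can be sketched as forming one binary candidate mask per threshold level. A toy illustration with made-up intensities, not the original CAD code:

```python
def multilevel_threshold(difference, levels):
    """One binary candidate mask per gray-level threshold of the
    difference image (signal-enhanced minus signal-suppressed)."""
    return {t: [[1 if v >= t else 0 for v in row] for row in difference]
            for t in levels}

# Toy images: a nodule-like bright column survives in the difference image.
enhanced   = [[10, 80, 12], [9, 95, 11], [10, 78, 10]]
suppressed = [[ 8, 20, 10], [9, 25, 10], [ 9, 22,  9]]
diff = [[e - s for e, s in zip(er, sr)] for er, sr in zip(enhanced, suppressed)]
masks = multilevel_threshold(diff, levels=[30, 50, 70])
print([sum(map(sum, masks[t])) for t in (30, 50, 70)])  # [3, 3, 1]
```

Candidates that persist across several threshold levels are stronger nodule candidates; the rule-based tests and ANN would then prune the rest.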
NASA Astrophysics Data System (ADS)
Kim, Do-Bin; Kwon, Dae Woong; Kim, Seunghyun; Lee, Sang-Ho; Park, Byung-Gook
2018-02-01
To obtain a high channel boosting potential and reduce program disturbance in channel-stacked NAND flash memory with layer selection by multilevel (LSM) operation, a new program scheme using a boosted common source line (CSL) is proposed. The proposed scheme is achieved by applying a proper bias to each layer through its own CSL. Technology computer-aided design (TCAD) simulations are performed to verify the validity of the new method in LSM. The simulations reveal that the program disturbance characteristics are effectively improved by the proposed scheme.
Enabling an Integrated Rate-temporal Learning Scheme on Memristor
NASA Astrophysics Data System (ADS)
He, Wei; Huang, Kejie; Ning, Ning; Ramanathan, Kiruthika; Li, Guoqi; Jiang, Yu; Sze, Jiayin; Shi, Luping; Zhao, Rong; Pei, Jing
2014-04-01
The learning scheme is the key to utilizing spike-based computation and emulating neural/synaptic behaviors toward the realization of cognition. Biological observations reveal an integrated spike time- and spike rate-dependent plasticity as a function of presynaptic firing frequency. However, this integrated rate-temporal learning scheme had not been realized on any nano device. In this paper, such a scheme is successfully demonstrated on a memristor. Great robustness against spiking-rate fluctuation is achieved by waveform engineering with the aid of the good analog properties exhibited by the iron oxide-based memristor. Spike-timing-dependent plasticity (STDP) occurs at moderate presynaptic firing frequencies, and spike-rate-dependent plasticity (SRDP) dominates in other regions. This demonstration provides a novel approach to neural coding implementation, which facilitates the development of bio-inspired computing systems.
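The STDP component of such a rate-temporal scheme is commonly modeled with the standard pair-based exponential window: potentiation when the presynaptic spike precedes the postsynaptic one, depression otherwise. A minimal sketch with assumed amplitudes and time constant, not the memristor's measured kinetics:

```python
import math

def stdp_dw(dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Pair-based STDP weight change for spike-time difference dt (ms):
    potentiate when pre precedes post (dt > 0), depress otherwise."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)

print(round(stdp_dw(10.0), 4))   # 0.0607 (potentiation)
print(round(stdp_dw(-10.0), 4))  # -0.0728 (depression)
```

In an integrated rate-temporal scheme, the effective amplitudes of this window would additionally depend on the presynaptic firing frequency, which is the dependence the memristor demonstration reproduces.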
Computer Aided Detection of Breast Masses in Digital Tomosynthesis
2008-06-01
the suspicious CAD location were extracted. For the second set, 256x256 ROIs representing the summed slab of 5 slices (5 mm) were extracted... region Hotelling observer, digital tomosynthesis, multi-slice CAD algorithms, biopsy... developing computer-aided detection (CAD) tools for mammography. Although these tools have shown promise in identifying calcifications, detecting
NASA Astrophysics Data System (ADS)
Qiu, Yuchen; Tan, Maxine; McMeekin, Scott; Thai, Theresa; Moore, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin
2015-03-01
The purpose of this study is to identify and apply quantitative image biomarkers for early prediction of tumor response to chemotherapy among ovarian cancer patients participating in clinical trials of new drugs. In the experiment, we retrospectively selected 30 cases from patients who participated in Phase I clinical trials of new drugs or drug agents for ovarian cancer treatment. Each case is composed of two sets of CT images acquired pre- and post-treatment (4-6 weeks after starting treatment). A computer-aided detection (CAD) scheme was developed to extract and analyze the quantitative image features of the metastatic tumors previously tracked by the radiologists using the standard Response Evaluation Criteria in Solid Tumors (RECIST) guideline. The CAD scheme first segmented 3-D tumor volumes from the background using a hybrid tumor segmentation scheme. Then, for each segmented tumor, CAD computed three quantitative image features: the change in tumor volume, tumor CT number (density), and density variance. The feature changes were calculated between the matched tumors tracked on the CT images acquired pre- and post-treatment. Finally, CAD predicted each patient's 6-month progression-free survival (PFS) using a decision-tree based classifier. The performance of the CAD scheme was compared with the RECIST category. The results show that the CAD scheme achieved a prediction accuracy of 76.7% (23/30 cases) with a Kappa coefficient of 0.493, which is significantly higher than the performance of RECIST prediction, with a prediction accuracy and Kappa coefficient of 60% (17/30) and 0.062, respectively. This study demonstrated the feasibility of analyzing quantitative image features to improve the accuracy of early prediction of tumor response to new drugs or therapeutic methods for ovarian cancer patients.
Semi-autonomous parking for enhanced safety and efficiency.
DOT National Transportation Integrated Search
2017-06-01
This project focuses on the use of tools from a combination of computer vision and localization based navigation schemes to aid the process of efficient and safe parking of vehicles in high density parking spaces. The principles of collision avoidanc...
NASA Technical Reports Server (NTRS)
Chuang, C.-H.; Goodson, Troy D.; Ledsinger, Laura A.
1995-01-01
This report describes current work in the numerical computation of multiple burn, fuel-optimal orbit transfers and presents an analysis of the second variation for extremal multiple burn orbital transfers as well as a discussion of a guidance scheme which may be implemented for such transfers. The discussion of numerical computation focuses on the use of multivariate interpolation to aid the computation in the numerical optimization. The second variation analysis includes the development of the conditions for the examination of both fixed and free final time transfers. Evaluations for fixed final time are presented for extremal one, two, and three burn solutions of the first variation. The free final time problem is considered for an extremal two burn solution. In addition, corresponding changes of the second variation formulation over thrust arcs and coast arcs are included. The guidance scheme discussed is an implicit scheme which implements a neighboring optimal feedback guidance strategy to calculate both thrust direction and thrust on-off times.
NASA Astrophysics Data System (ADS)
Danala, Gopichandh; Wang, Yunzhi; Thai, Theresa; Gunderson, Camille C.; Moxley, Katherine M.; Moore, Kathleen; Mannel, Robert S.; Cheng, Samuel; Liu, Hong; Zheng, Bin; Qiu, Yuchen
2017-02-01
Accurate tumor segmentation is a critical step in the development of a computer-aided detection (CAD) based quantitative image analysis scheme for early-stage prognostic evaluation of ovarian cancer patients. The purpose of this investigation is to assess the efficacy of several different methods for segmenting the metastatic tumors occurring in different organs of ovarian cancer patients. In this study, we developed a segmentation scheme consisting of eight different algorithms, which can be divided into three groups: 1) region growth based methods; 2) Canny operator based methods; and 3) partial differential equation (PDE) based methods. A total of 138 tumors acquired from 30 ovarian cancer patients were used to test the performance of these eight segmentation algorithms. The results demonstrate that each of the tested tumors can be successfully segmented by at least one of the eight algorithms without manual boundary correction. Furthermore, the modified region growth, classical Canny detector, fast marching, and threshold level set algorithms are recommended for the future development of ovarian cancer related CAD schemes. This study may provide a meaningful reference for developing a novel quantitative image feature analysis scheme to more accurately predict the response of ovarian cancer patients to chemotherapy at an early stage.
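A region growth based method, the first of the three groups above, can be sketched as a breadth-first flood fill that accepts 4-connected neighbors whose intensity is within a tolerance of the seed intensity. A toy sketch, not one of the eight tested algorithms:

```python
from collections import deque

def region_grow(image, seed, tolerance):
    """Grow a region from a seed pixel, accepting 4-connected neighbours
    whose intensity is within `tolerance` of the seed intensity."""
    rows, cols = len(image), len(image[0])
    base = image[seed[0]][seed[1]]
    region, frontier = {seed}, deque([seed])
    while frontier:
        r, c = frontier.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in region
                    and abs(image[nr][nc] - base) <= tolerance):
                region.add((nr, nc))
                frontier.append((nr, nc))
    return region

# A bright 2x2 "tumor" in a dark background is captured from one seed.
img = [[0, 0, 0, 0],
       [0, 9, 8, 0],
       [0, 9, 9, 0],
       [0, 0, 0, 0]]
print(len(region_grow(img, seed=(1, 1), tolerance=2)))  # 4
```

The "modified region growth" recommended in the abstract would refine this basic idea (e.g., adaptive tolerance or gradient-based stopping), but the growth mechanism is the same.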
Locally adaptive decision in detection of clustered microcalcifications in mammograms.
Sainz de Cea, María V; Nishikawa, Robert M; Yang, Yongyi
2018-02-15
In computer-aided detection or diagnosis of clustered microcalcifications (MCs) in mammograms, the performance often suffers from not only the presence of false positives (FPs) among the detected individual MCs but also large variability in detection accuracy among different cases. To address this issue, we investigate a locally adaptive decision scheme in MC detection by exploiting the noise characteristics in a lesion area. Instead of developing a new MC detector, we propose a decision scheme on how to best decide whether a detected object is an MC or not in the detector output. We formulate the individual MCs as statistical outliers compared to the many noisy detections in a lesion area so as to account for the local image characteristics. To identify the MCs, we first consider a parametric method for outlier detection, the Mahalanobis distance detector, which is based on a multi-dimensional Gaussian distribution on the noisy detections. We also consider a non-parametric method which is based on a stochastic neighbor graph model of the detected objects. We demonstrated the proposed decision approach with two existing MC detectors on a set of 188 full-field digital mammograms (95 cases). The results, evaluated using free response operating characteristic (FROC) analysis, showed a significant improvement in detection accuracy by the proposed outlier decision approach over traditional thresholding (the partial area under the FROC curve increased from 3.95 to 4.25, p-value < 10^-4). There was also a reduction in case-to-case variability in detected FPs at a given sensitivity level. The proposed adaptive decision approach could not only reduce the number of FPs in detected MCs but also improve case-to-case consistency in detection.
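The Mahalanobis distance detector treats each detected object's feature vector as a sample from a multivariate Gaussian fitted to the noisy detections in the lesion area; objects far from that distribution are kept as MCs. A two-feature toy sketch using the closed-form 2x2 inverse (hypothetical feature values, not the authors' detector):

```python
def mahalanobis_2d(points):
    """Squared Mahalanobis distance of each 2-D point from the sample
    mean, using the sample covariance of the points themselves."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    det = sxx * syy - sxy * sxy  # 2x2 inverse via the determinant
    out = []
    for x, y in points:
        dx, dy = x - mx, y - my
        out.append((syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det)
    return out

# Noise-like detections cluster together; the one detection with outlying
# feature values gets by far the largest distance.
detections = [(1.0, 1.1), (0.9, 1.0), (1.1, 0.9), (1.0, 1.0), (5.0, 5.0)]
d2 = mahalanobis_2d(detections)
print(d2.index(max(d2)))  # 4
```

Because the covariance is re-estimated per lesion area, the decision threshold adapts to the local noise, which is the point of the locally adaptive scheme.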
Evaluation of computer-aided detection and diagnosis systems.
Petrick, Nicholas; Sahiner, Berkman; Armato, Samuel G; Bert, Alberto; Correale, Loredana; Delsanto, Silvia; Freedman, Matthew T; Fryd, David; Gur, David; Hadjiiski, Lubomir; Huo, Zhimin; Jiang, Yulei; Morra, Lia; Paquerault, Sophie; Raykar, Vikas; Samuelson, Frank; Summers, Ronald M; Tourassi, Georgia; Yoshida, Hiroyuki; Zheng, Bin; Zhou, Chuan; Chan, Heang-Ping
2013-08-01
Computer-aided detection and diagnosis (CAD) systems are increasingly being used as an aid by clinicians for detection and interpretation of diseases. Computer-aided detection systems mark regions of an image that may reveal specific abnormalities and are used to alert clinicians to these regions during image interpretation. Computer-aided diagnosis systems provide an assessment of a disease using image-based information alone or in combination with other relevant diagnostic data and are used by clinicians as a decision support in developing their diagnoses. While CAD systems are commercially available, standardized approaches for evaluating and reporting their performance have not yet been fully formalized in the literature or in a standardization effort. This deficiency has led to difficulty in the comparison of CAD devices and in understanding how the reported performance might translate into clinical practice. To address these important issues, the American Association of Physicists in Medicine (AAPM) formed the Computer Aided Detection in Diagnostic Imaging Subcommittee (CADSC), in part, to develop recommendations on approaches for assessing CAD system performance. The purpose of this paper is to convey the opinions of the AAPM CADSC members and to stimulate the development of consensus approaches and "best practices" for evaluating CAD systems. Both the assessment of a standalone CAD system and the evaluation of the impact of CAD on end-users are discussed. It is hoped that awareness of these important evaluation elements and the CADSC recommendations will lead to further development of structured guidelines for CAD performance assessment. 
Proper assessment of CAD system performance is expected to increase the understanding of a CAD system's effectiveness and limitations, which is expected to stimulate further research and development efforts on CAD technologies, reduce problems due to improper use, and eventually improve the utility and efficacy of CAD in clinical practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teramoto, Atsushi, E-mail: teramoto@fujita-hu.ac.jp; Fujita, Hiroshi; Yamamuro, Osamu
Purpose: Automated detection of solitary pulmonary nodules using positron emission tomography (PET) and computed tomography (CT) images shows good sensitivity; however, it is difficult to detect nodules in contact with normal organs, and additional efforts are needed so that the number of false positives (FPs) can be further reduced. In this paper, the authors propose an improved FP-reduction method for the detection of pulmonary nodules in PET/CT images by means of convolutional neural networks (CNNs). Methods: The overall scheme detects pulmonary nodules using both CT and PET images. In the CT images, a massive region is first detected using an active contour filter, which is a type of contrast enhancement filter that has a deformable kernel shape. Subsequently, high-uptake regions detected by the PET images are merged with the regions detected by the CT images. FP candidates are eliminated using an ensemble method; it consists of two feature extractions, one by shape/metabolic feature analysis and the other by a CNN, followed by a two-step classifier, one step being rule based and the other being based on support vector machines. Results: The authors evaluated the detection performance using 104 PET/CT images collected by a cancer-screening program. The sensitivity in detecting candidates at an initial stage was 97.2%, with 72.8 FPs/case. After performing the proposed FP-reduction method, the sensitivity of detection was 90.1%, with 4.9 FPs/case; the proposed method eliminated approximately half the FPs existing in the previous study. Conclusions: An improved FP-reduction scheme using the CNN technique has been developed for the detection of pulmonary nodules in PET/CT images. The authors' ensemble FP-reduction method eliminated 93% of the FPs; their proposed method using the CNN technique eliminates approximately half the FPs existing in the previous study.
These results indicate that their method may be useful in the computer-aided detection of pulmonary nodules using PET/CT images.
Computer-aided diagnosis of pulmonary diseases using x-ray darkfield radiography
NASA Astrophysics Data System (ADS)
Einarsdóttir, Hildur; Yaroshenko, Andre; Velroyen, Astrid; Bech, Martin; Hellbach, Katharina; Auweter, Sigrid; Yildirim, Önder; Meinel, Felix G.; Eickelberg, Oliver; Reiser, Maximilian; Larsen, Rasmus; Kjær Ersbøll, Bjarne; Pfeiffer, Franz
2015-12-01
In this work we develop a computer-aided diagnosis (CAD) scheme for the classification of pulmonary disease in grating-based x-ray radiography. In addition to conventional transmission radiography, the grating-based technique provides a dark-field imaging modality, which utilizes the scattering properties of the x-rays. This modality has shown great potential for diagnosing early stage emphysema and fibrosis in mouse lungs in vivo. The CAD scheme is developed to assist radiologists and other medical experts in developing new diagnostic methods when evaluating grating-based images. The scheme consists of three stages: (i) automatic lung segmentation; (ii) feature extraction from lung shape and dark-field image intensities; (iii) classification between healthy, emphysema and fibrosis lungs. A study of 102 mice was conducted with 34 healthy, 52 emphysema and 16 fibrosis subjects. Each image was manually annotated to build an experimental dataset. System performance was assessed by: (i) determining the quality of the segmentations; (ii) validating emphysema and fibrosis recognition by a linear support vector machine using leave-one-out cross-validation. In terms of segmentation quality, we obtained an overlap percentage (Ω) of 92.63 ± 3.65%, a Dice Similarity Coefficient (DSC) of 89.74 ± 8.84%, and a Jaccard Similarity Coefficient of 82.39 ± 12.62%. For classification, the accuracy, sensitivity and specificity of diseased lung recognition were all 100%. Classification between emphysema and fibrosis resulted in an accuracy of 93%, with a sensitivity of 94% and a specificity of 88%. In addition to the automatic classification of lungs, deviation maps created by the CAD scheme provide a visual aid for medical experts to further assess the severity of pulmonary disease and highlight the affected regions.
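The segmentation-quality metrics reported above are straightforward to compute from binary masks. A minimal sketch, with small toy masks standing in for the automatic and manual annotations:

```python
import numpy as np

def dice(a, b):
    """Dice Similarity Coefficient between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def jaccard(a, b):
    """Jaccard Similarity Coefficient (intersection over union)."""
    a, b = a.astype(bool), b.astype(bool)
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

# two 4-pixel toy masks sharing 2 pixels: Dice = 0.5, Jaccard = 1/3
auto = np.zeros((4, 4), dtype=bool)
auto[1, 0:4] = True
manual = np.zeros((4, 4), dtype=bool)
manual[1, 2:4] = True
manual[2, 2:4] = True
d, j = dice(auto, manual), jaccard(auto, manual)
```

The two coefficients are monotonically related (J = D / (2 - D)), which is why papers often report both alongside a simple overlap percentage.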
Thermal radiation view factor: Methods, accuracy and computer-aided procedures
NASA Technical Reports Server (NTRS)
Kadaba, P. V.
1982-01-01
Computer-aided thermal analysis programs that predict whether orbiting equipment will remain within a predetermined acceptable temperature range, in various attitudes with respect to the Sun and the Earth, were examined. The complexity of the surface geometries suggests the use of numerical schemes for the determination of these view factors. Basic definitions and the standard methods that form the basis for various digital computer methods, together with various numerical methods, are presented. The physical models and the mathematical methods on which a number of available programs are built are summarized. The strengths and weaknesses of the methods employed, the accuracy of the calculations, and the time required for computations are evaluated. The situations where accuracies are important for energy calculations are identified, and methods to save computational time are proposed. A guide to the best use of the available programs at several centers and future choices for the efficient use of digital computers are included in the recommendations.
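As a small example of the kind of numerical scheme such programs use, a Monte Carlo estimate of the view factor from a differential surface element to a parallel, coaxial disk can be checked against the known analytic value R^2 / (R^2 + H^2). This sketch is illustrative only and is not one of the programs the report evaluates:

```python
import math
import random

def view_factor_to_disk(radius, height, n=200_000, seed=1):
    """Monte Carlo view-factor estimate via cosine-weighted hemisphere
    sampling from a differential element; the fraction of rays hitting
    the disk is the view factor. Analytic answer: R^2 / (R^2 + H^2)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        u = rng.random()                      # cosine weighting: sin^2(theta) ~ U(0,1)
        phi = 2.0 * math.pi * rng.random()
        sin_t, cos_t = math.sqrt(u), math.sqrt(1.0 - u)
        t = height / cos_t                    # ray length to the disk's plane
        x = t * sin_t * math.cos(phi)
        y = t * sin_t * math.sin(phi)
        if x * x + y * y <= radius * radius:
            hits += 1
    return hits / n

f = view_factor_to_disk(1.0, 1.0)             # analytic value: 1 / (1 + 1) = 0.5
```

The trade-off the report discusses is visible here: accuracy improves only as 1/sqrt(n), so ray-tracing schemes trade computation time for view-factor accuracy.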
Optimizing Scheme for Remote Preparation of Four-particle Cluster-like Entangled States
NASA Astrophysics Data System (ADS)
Wang, Dong; Ye, Liu
2011-09-01
Recently, Ma et al. (Opt. Commun. 283:2640, 2010) proposed a novel scheme for preparing a class of cluster-like entangled states based on a four-particle projective measurement. In this paper, we put forward a new and optimal scheme to realize the remote preparation of this class of cluster-like states with the aid of two bipartite partially entangled channels. Different from the previous scheme, we employ a two-particle projective measurement instead of a four-particle projective measurement during the preparation. Besides, the resource consumption of our scheme is computed, including the classical communication cost and the quantum resource consumption. Moreover, we discuss the features of our scheme and compare the resource consumption and operation complexity of the previous scheme and ours. The results show that our scheme is more economical and feasible than the previous one.
NASA Astrophysics Data System (ADS)
Liu, George S.; Kim, Jinkyung; Applegate, Brian E.; Oghalai, John S.
2017-07-01
Diseases that cause hearing loss and/or vertigo in humans such as Meniere's disease are often studied using animal models. The volume of endolymph within the inner ear varies with these diseases. Here, we used a mouse model of increased endolymph volume, endolymphatic hydrops, to develop a computer-aided objective approach to measure endolymph volume from images collected in vivo using optical coherence tomography. The displacement of Reissner's membrane from its normal position was measured in cochlear cross sections. We validated our computer-aided measurements with manual measurements and with trained observer labels. This approach allows for computer-aided detection of endolymphatic hydrops in mice, with test performance showing sensitivity of 91% and specificity of 87% using a running average of five measurements. These findings indicate that this approach is accurate and reliable for classifying endolymphatic hydrops and quantifying endolymph volume.
2006-06-01
Hadjiiski, and N. Petrick, "Computerized nipple identification for multiple image analysis in computer-aided diagnosis," Medical Physics 31, 2871...candidates, (3) identification of suspicious objects, (4) feature extraction and analysis, and (5) FP reduction by classification of normal tissue...detection of microcalcifications on digitized mammograms.41 An illustration of a Laplacian decomposition tree is shown on the left-hand side of Fig. 4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suzuki, Kenji; Yoshida, Hiroyuki; Naeppi, Janne
2006-10-15
One of the limitations of the current computer-aided detection (CAD) of polyps in CT colonography (CTC) is a relatively large number of false-positive (FP) detections. Rectal tubes (RTs) are one of the typical sources of FPs because a portion of an RT, especially a portion of a bulbous tip, often exhibits a cap-like shape that closely mimics the appearance of a small polyp. Radiologists can easily recognize and dismiss RT-induced FPs; thus, they may lose their confidence in CAD as an effective tool if the CAD scheme generates such "obvious" FPs due to RTs consistently. In addition, RT-induced FPs may distract radiologists from less common true positives in the rectum. Therefore, removal of RT-induced FPs as well as other types of FPs is desirable while maintaining a high sensitivity in the detection of polyps. We developed a three-dimensional (3D) massive-training artificial neural network (MTANN) for distinction between polyps and RTs in 3D CTC volumetric data. The 3D MTANN is a supervised volume-processing technique which is trained with input CTC volumes and the corresponding "teaching" volumes. The teaching volume for a polyp contains a 3D Gaussian distribution, and that for an RT contains zeros, for enhancement of polyps and suppression of RTs, respectively. For distinction between polyps and nonpolyps including RTs, a 3D scoring method based on a 3D Gaussian weighting function is applied to the output of the trained 3D MTANN. Our database consisted of CTC examinations of 73 patients, scanned in both supine and prone positions (146 CTC data sets in total), with optical colonoscopy as a reference standard for the presence of polyps. Fifteen patients had 28 polyps, 15 of which were 5-9 mm and 13 were 10-25 mm in size.
These CTC cases were subjected to our previously reported CAD scheme that included centerline-based segmentation of the colon, shape-based detection of polyps, and reduction of FPs by use of a Bayesian neural network based on geometric and texture features. Application of this CAD scheme yielded 96.4% (27/28) by-polyp sensitivity with 3.1 (224/73) FPs per patient, among which 20 FPs were caused by RTs. To eliminate the FPs due to RTs and possibly other normal structures, we trained a 3D MTANN with ten representative polyps and ten RTs, and applied the trained 3D MTANN to the above CAD true- and false-positive detections. In the output volumes of the 3D MTANN, polyps were represented by distributions of bright voxels, whereas RTs and other normal structures partly similar to RTs appeared as darker voxels, indicating the ability of the 3D MTANN to suppress RTs as well as other normal structures effectively. Application of the 3D MTANN to the CAD detections showed that the 3D MTANN eliminated all 20 RT-induced FPs, as well as 53 FPs due to other causes, without removal of any true positives. Overall, the 3D MTANN was able to reduce the FP rate of the CAD scheme from 3.1 to 2.1 FPs per patient (33% reduction), while the original by-polyp sensitivity of 96.4% was maintained.
Cai, Wenli; Lee, June-Goo; Fikry, Karim; Yoshida, Hiroyuki; Novelline, Robert; de Moya, Marc
2013-01-01
It is commonly believed that the size of a pneumothorax is an important determinant of the treatment decision, in particular regarding whether chest tube drainage (CTD) is required. However, volumetric quantification of pneumothoraces has not routinely been performed in clinics. In this paper, we introduce an automated computer-aided volumetry (CAV) scheme for quantification of the volume of pneumothoraces in chest multi-detector CT (MDCT) images. Moreover, we investigated the impact of accurate pneumothorax volume on the improvement of performance in decision-making regarding CTD in the management of traumatic pneumothoraces. For this purpose, an occurrence frequency map was calculated for quantitative analysis of the importance of each clinical parameter in the decision-making regarding CTD, by a computer simulation of decision-making using a genetic algorithm (GA) and a support vector machine (SVM). A total of 14 clinical parameters, including the volume of pneumothorax calculated by our CAV scheme, were collected as parameters available for decision-making. The results showed that volume was the dominant parameter in decision-making regarding CTD, with an occurrence frequency value of 1.00. The results also indicated that the inclusion of volume provided the best performance, statistically significantly better than the tests in which volume was excluded from the clinical parameters. This study provides scientific evidence for the application of the CAV scheme in MDCT volumetric quantification of pneumothoraces in the management of clinically stable chest trauma patients with traumatic pneumothorax. PMID:22560899
A concatenated coding scheme for error control
NASA Technical Reports Server (NTRS)
Lin, S.
1985-01-01
A concatenated coding scheme for error control in data communications was analyzed. The inner code is used for both error correction and detection, whereas the outer code is used only for error detection. A retransmission is requested if either the inner code decoder fails to make a successful decoding or the outer code decoder detects the presence of errors after the inner code decoding. The probability of undetected error of the proposed scheme is derived, and an efficient method for computing this probability is presented. The throughput efficiency of the proposed error control scheme, incorporated with a selective-repeat ARQ retransmission strategy, is analyzed.
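The retransmission logic described above (inner code for correction and detection, outer code for detection only) can be illustrated with deliberately tiny stand-in codes: a (3,1) repetition inner code and a single even-parity outer check. The real scheme uses far stronger codes; this is only a sketch of the decision flow:

```python
def inner_encode(bits):
    # toy (3,1) repetition inner code
    return [b for b in bits for _ in range(3)]

def inner_decode(received):
    """Majority-vote correction; any non-unanimous triple is also flagged,
    so the inner code does both correction and detection."""
    out, flagged = [], False
    for i in range(0, len(received), 3):
        triple = received[i:i + 3]
        if len(set(triple)) > 1:
            flagged = True
        out.append(max(set(triple), key=triple.count))
    return out, flagged

def outer_parity(bits):
    # toy outer code: one even-parity bit, detection only
    return sum(bits) % 2

def transmit(frame, channel):
    """One transmission attempt: returns (decoded frame, retransmit request).
    Retransmission is requested if the inner decoder flags an error or the
    outer check detects residual errors, mirroring the scheme's logic."""
    data = frame + [outer_parity(frame)]
    decoded, inner_flag = inner_decode(channel(inner_encode(data)))
    outer_flag = outer_parity(decoded[:-1]) != decoded[-1]
    return decoded[:-1], inner_flag or outer_flag

clean, retx0 = transmit([1, 0, 1, 1], lambda bits: bits)            # noiseless
flip_bit = lambda bits: [bits[0] ^ 1] + bits[1:]                    # 1-bit error
fixed, retx1 = transmit([1, 0, 1, 1], flip_bit)
kill_symbol = lambda bits: [b ^ 1 for b in bits[:3]] + bits[3:]     # whole triple
wrong, retx2 = transmit([1, 0, 1, 1], kill_symbol)
```

An undetected error occurs only when both layers are fooled at once, which is why the probability of undetected error of the concatenated scheme is so small.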
Image Motion Detection And Estimation: The Modified Spatio-Temporal Gradient Scheme
NASA Astrophysics Data System (ADS)
Hsin, Cheng-Ho; Inigo, Rafael M.
1990-03-01
The detection and estimation of motion generally involve computing a velocity field of time-varying images. A completely new modified spatio-temporal gradient scheme to determine motion is proposed, derived by using gradient methods and properties of biological vision. A set of general constraints is proposed to derive motion constraint equations: the second directional derivatives of image intensity at an edge point in the smoothed image are required to be constant at times t and t+L. The scheme basically has two stages: spatio-temporal filtering, and velocity estimation. Initially, image sequences are processed by a set of oriented spatio-temporal filters designed using a Gaussian derivative model. The velocity is then estimated from these filtered image sequences based on the gradient approach. From a computational standpoint, this scheme offers at least three advantages over current methods. The greatest advantage of the modified spatio-temporal gradient scheme over traditional ones is that an infinite number of motion constraint equations are derived instead of only one; it therefore solves the aperture problem without requiring any additional assumptions and remains a simple local process. The second advantage is that, because of the spatio-temporal filtering, the direct computation of image gradients (discrete derivatives) is avoided, so the error in gradient measurement is reduced significantly. The third advantage is that, during motion detection and estimation, image features (edges) are produced concurrently with motion information. The reliable range of detected velocity is determined by the parameters of the oriented spatio-temporal filters.
Knowing the velocity sensitivity of a single motion detection channel, a multiple-channel mechanism for estimating image velocity, seldom addressed by other motion schemes in machine vision, can be constructed by appropriately choosing and combining different sets of parameters. By applying this mechanism, a great range of velocity can be detected. The scheme has been tested for both synthetic and real images. The results of simulations are very satisfactory.
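A basic spatio-temporal gradient estimator, the baseline the modified scheme improves upon, can be sketched by pooling the constraint Ix*u + Iy*v + It = 0 over the image in a least-squares sense. The synthetic Gaussian frames below are illustrative, not the paper's test images:

```python
import numpy as np

def estimate_velocity(frame0, frame1):
    """Least-squares solution of the gradient constraint Ix*u + Iy*v + It = 0
    pooled over the whole image. This is the classic single-equation gradient
    method; the modified scheme described above adds oriented spatio-temporal
    filtering and per-point systems of equations on top of this idea."""
    Iy, Ix = np.gradient(frame0.astype(float))        # spatial derivatives
    It = frame1.astype(float) - frame0.astype(float)  # temporal derivative
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# smooth blob translated one pixel to the right between frames
rr, cc = np.mgrid[0:32, 0:32]
frame0 = np.exp(-((rr - 16.0) ** 2 + (cc - 16.0) ** 2) / 20.0)
frame1 = np.exp(-((rr - 16.0) ** 2 + (cc - 17.0) ** 2) / 20.0)
u, v = estimate_velocity(frame0, frame1)
```

Pooling over the whole image sidesteps the aperture problem only for global translation; the paper's contribution is to obtain many constraints locally instead.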
From Three-Photon Greenberger-Horne-Zeilinger States to Ballistic Universal Quantum Computation.
Gimeno-Segovia, Mercedes; Shadbolt, Pete; Browne, Dan E; Rudolph, Terry
2015-07-10
Single photons, manipulated using integrated linear optics, constitute a promising platform for universal quantum computation. A series of increasingly efficient proposals have shown linear-optical quantum computing to be formally scalable. However, existing schemes typically require extensive adaptive switching, which is experimentally challenging and noisy, thousands of photon sources per renormalized qubit, and/or large quantum memories for repeat-until-success strategies. Our work overcomes all these problems. We present a scheme to construct a cluster state universal for quantum computation, which uses no adaptive switching, no large memories, and which is at least an order of magnitude more resource efficient than previous passive schemes. Unlike previous proposals, it is constructed entirely from loss-detecting gates and offers a robustness to photon loss. Even without the use of an active loss-tolerant encoding, our scheme naturally tolerates a total loss rate ∼1.6% in the photons detected in the gates. This scheme uses only 3 Greenberger-Horne-Zeilinger states as a resource, together with a passive linear-optical network. We fully describe and model the iterative process of cluster generation, including photon loss and gate failure. This demonstrates that building a linear-optical quantum computer needs to be less challenging than previously thought.
Computer-aided diagnosis and artificial intelligence in clinical imaging.
Shiraishi, Junji; Li, Qiang; Appelbaum, Daniel; Doi, Kunio
2011-11-01
Computer-aided diagnosis (CAD) is rapidly entering the radiology mainstream. It has already become a part of the routine clinical work for the detection of breast cancer with mammograms. The computer output is used as a "second opinion" in assisting radiologists' image interpretations. The computer algorithm generally consists of several steps that may include image processing, image feature analysis, and data classification via the use of tools such as artificial neural networks (ANN). In this article, we will explore these and other current processes that have come to be referred to as "artificial intelligence." One element of CAD, temporal subtraction, has been applied for enhancing interval changes and for suppressing unchanged structures (e.g., normal structures) between 2 successive radiologic images. To reduce misregistration artifacts on the temporal subtraction images, a nonlinear image warping technique for matching the previous image to the current one has been developed. Development of the temporal subtraction method originated with chest radiographs, with the method subsequently being applied to chest computed tomography (CT) and nuclear medicine bone scans. The usefulness of the temporal subtraction method for bone scans was demonstrated by an observer study in which reading times and diagnostic accuracy improved significantly. An additional prospective clinical study verified that the temporal subtraction image could be used as a "second opinion" by radiologists with negligible detrimental effects. ANN was first used in 1990 for computerized differential diagnosis of interstitial lung diseases in CAD. Since then, ANN has been widely used in CAD schemes for the detection and diagnosis of various diseases in different imaging modalities, including the differential diagnosis of lung nodules and interstitial lung diseases in chest radiography, CT, and positron emission tomography/CT.
It is likely that CAD will be integrated into picture archiving and communication systems and will become a standard of care for diagnostic examinations in daily clinical work. Copyright © 2011 Elsevier Inc. All rights reserved.
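The temporal subtraction idea, align the previous image to the current one and then subtract so that unchanged anatomy cancels, can be illustrated with a toy global translation standing in for the nonlinear warping described above:

```python
import numpy as np

def temporal_subtraction(current, previous, shift):
    """Toy temporal subtraction: align the previous image by a global integer
    translation (a stand-in for the nonlinear warping technique described in
    the article), then subtract it from the current image so that interval
    changes stand out while unchanged structures cancel."""
    aligned = np.roll(previous, shift, axis=(0, 1))
    return current.astype(float) - aligned.astype(float)

previous = np.zeros((10, 10))
previous[2:5, 2:5] = 100.0                        # unchanged "normal structure"
current = np.roll(previous, (1, 1), axis=(0, 1))  # patient shifted between exams
current[7, 7] = 50.0                              # new interval finding
diff = temporal_subtraction(current, previous, (1, 1))
```

With perfect alignment the difference image is zero except at the new finding; misregistration is exactly what produces the artifacts the nonlinear warping is designed to reduce.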
Colonic polyps: application value of computer-aided detection in computed tomographic colonography.
Zhang, Hui-Mao; Guo, Wei; Liu, Gui-Feng; An, Dong-Hong; Gao, Shuo-Hui; Sun, Li-Bo; Yang, Hai-Shan
2011-02-01
Colonic polyps are frequently encountered in clinics. Computed tomographic colonography (CTC), a painless and rapid examination, has high clinical value. In this study, we evaluated the application value of computer-aided detection (CAD) in CTC detection of colonic polyps in the Chinese population. CTC was performed with a GE 64-row multidetector computed tomography (MDCT) scanner. Data of 50 CTC patients (39 patients positive for at least one polyp ≥ 0.5 cm in size and the other 11 patients negative by endoscopic detection) were retrospectively reviewed, first without CAD and then with CAD, by four radiologists (two experienced and two inexperienced) blinded to the colonoscopy findings. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of detected colonic polyps, as well as the areas under the ROC curves (Az value), with and without CAD were calculated. CAD increased the overall sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the colonic polyps detected by both experienced and inexperienced readers. The sensitivity in detecting small polyps (5-9 mm) with CAD increased from 82% to 93% for experienced readers and from 44% to 82% for inexperienced readers (P > 0.05 and P < 0.001, respectively). With the use of CAD, the overall false positive rate and false negative rate for the detection of polyps by experienced and inexperienced readers decreased to different degrees. Among 13 sessile polyps not detected by CAD, two were ≥ 1.0 cm, eleven were 5-9 mm in diameter, and nine were flat-shaped lesions. The application of CAD in combination with CTC can increase the ability to detect colonic polyps, particularly for inexperienced readers. However, CAD is of limited value for the detection of flat polyps.
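The reader-performance figures above derive from standard confusion-matrix quantities. A minimal helper (the example counts below are invented for illustration, not the study's data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, NPV, and accuracy from the counts of
    true/false positives and negatives, as used to compare readings with
    and without CAD."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# hypothetical reading session: 45 TP, 5 FP, 40 TN, 10 FN
m = diagnostic_metrics(45, 5, 40, 10)
```

Sensitivity and specificity describe the test itself, while PPV and NPV also depend on disease prevalence in the study population, which is why all five are reported.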
Computer-aided diagnostic detection system of venous beading in retinal images
NASA Astrophysics Data System (ADS)
Yang, Ching-Wen; Ma, DyeJyun; Chao, ShuennChing; Wang, ChuinMu; Wen, Chia-Hsien; Lo, ChienShun; Chung, Pau-Choo; Chang, Chein-I.
2000-05-01
The detection of venous beading in retinal images provides an early sign of diabetic retinopathy and plays an important role as a preprocessing step in diagnosing ocular diseases. We present a computer-aided diagnostic system to automatically detect venous beading of blood vessels. It comprises two modules, referred to as the blood vessel extraction module and the venous beading detection module. The former uses a bell-shaped Gaussian kernel with 12 azimuths to extract blood vessels, while the latter applies a neural network-based shape cognitron to detect venous beading among the extracted blood vessels for diagnosis. Both modules are fully computer-automated. To evaluate the proposed system, 61 retinal images (32 beaded and 29 normal images) are used for performance evaluation.
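A kernel bank of the kind used in the blood vessel extraction module, Gaussian-profile line detectors at 12 azimuths, can be sketched as follows. The kernel size and sigma are assumed values for illustration, not the system's actual parameters:

```python
import numpy as np

def oriented_kernels(sigma=1.5, length=7, n_orient=12):
    """Bank of zero-mean, Gaussian-profile line kernels at 12 azimuths: the
    matched-filter idea behind oriented vessel extraction. At each pixel one
    would take the maximum response over the bank."""
    half = length // 2
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    kernels = []
    for k in range(n_orient):
        theta = k * np.pi / n_orient
        # perpendicular distance from the line through the origin at angle theta
        d = -xs * np.sin(theta) + ys * np.cos(theta)
        kern = np.exp(-d ** 2 / (2.0 * sigma ** 2))
        kern -= kern.mean()   # zero mean: a uniform background responds ~0
        kernels.append(kern)
    return kernels

bank = oriented_kernels()
```

Twelve azimuths spaced 15 degrees apart are enough because the Gaussian profile makes each kernel's angular response broad; neighboring orientations overlap.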
Discriminative Cooperative Networks for Detecting Phase Transitions
NASA Astrophysics Data System (ADS)
Liu, Ye-Hua; van Nieuwenburg, Evert P. L.
2018-04-01
The classification of states of matter and their corresponding phase transitions is a special kind of machine-learning task, where physical data allow for the analysis of new algorithms, which have not been considered in the general computer-science setting so far. Here we introduce an unsupervised machine-learning scheme for detecting phase transitions with a pair of discriminative cooperative networks (DCNs). In this scheme, a guesser network and a learner network cooperate to detect phase transitions from fully unlabeled data. The new scheme is efficient enough for dealing with phase diagrams in two-dimensional parameter spaces, where we can utilize an active contour model—the snake—from computer vision to host the two networks. The snake, with a DCN "brain," moves and learns actively in the parameter space, and locates phase boundaries automatically.
The Use of Computer-Aided Decision Support Systems for Complex Source Selection Decisions
1989-09-01
unique low noise interferometer developed at Fusetech Inc. by using divided Fabry-Perot fiber optic cells, common-mode rejection, matched path lengths and...potential techniques for a demodulation scheme. They proposed a detailed investigation of the approaches as part of the program. For mine applications
NASA Astrophysics Data System (ADS)
Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; Moore, Kathleen; Liu, Hong; Zheng, Bin
2017-03-01
Abdominal obesity is strongly associated with a number of diseases, and accurate assessment of the subtypes of adipose tissue volume plays a significant role in predicting disease risk, diagnosis, and prognosis. The objective of this study is to develop and evaluate a new computer-aided detection (CAD) scheme based on deep learning models to automatically segment subcutaneous fat areas (SFA) and visceral fat areas (VFA) depicted on CT images. A dataset involving CT images from 40 patients was retrospectively collected and equally divided into two independent groups (i.e., training and testing groups). The new CAD scheme consisted of two sequential convolutional neural networks (CNNs), namely a Selection-CNN and a Segmentation-CNN. The Selection-CNN was trained using 2,240 CT slices to automatically select CT slices belonging to abdomen areas, and the Segmentation-CNN was trained using 84,000 fat-pixel patches to classify fat pixels as belonging to SFA or VFA. Then, data from the testing group were used to evaluate the performance of the optimized CAD scheme. Compared with manually labeled results, the classification accuracy of CT slice selection generated by the Selection-CNN was 95.8%, while the accuracy of fat pixel segmentation using the Segmentation-CNN was 96.8%. This study therefore demonstrates the feasibility of using a deep learning based CAD scheme to recognize the abdominal section from CT scans and segment SFA and VFA from CT slices, with high agreement with subjective segmentation results.
Qiu, Yuchen; Yan, Shiju; Gundreddy, Rohith Reddy; Wang, Yunzhi; Cheng, Samuel; Liu, Hong; Zheng, Bin
2017-01-01
PURPOSE To develop and test a deep learning based computer-aided diagnosis (CAD) scheme of mammograms for classifying between malignant and benign masses. METHODS An image dataset involving 560 regions of interest (ROIs) extracted from digital mammograms was used. After down-sampling each ROI from 512×512 to 64×64 pixels, we applied an 8-layer deep learning network, involving 3 pairs of convolution and max-pooling layers for automatic feature extraction and a multilayer perceptron (MLP) classifier for feature categorization, to process the ROIs. The 3 convolution layers contain 20, 10, and 5 feature maps, respectively. Each convolution layer is connected to a max-pooling layer to improve feature robustness. The output of the sixth layer is fully connected to the MLP classifier, which is composed of one hidden layer and one logistic regression layer. The network then generates a classification score to predict the likelihood that the ROI depicts a malignant mass. A four-fold cross-validation method was applied to train and test this deep learning network. RESULTS The results revealed that this CAD scheme yields an area under the receiver operating characteristic curve (AUC) of 0.696±0.044, 0.802±0.037, 0.836±0.036, and 0.822±0.035 for the fold 1 to 4 testing datasets, respectively. The overall AUC for the entire dataset is 0.790±0.019. CONCLUSIONS This study demonstrates the feasibility of applying a deep learning based CAD scheme to classify between malignant and benign breast masses without lesion segmentation or image feature computation and selection processes. PMID:28436410
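The abstract specifies the network shape (64×64 input; 3 conv/max-pooling pairs with 20, 10, and 5 feature maps; an MLP head) but not the kernel sizes or padding. A back-of-the-envelope sketch, assuming size-preserving ("same") convolutions and 2×2 pooling, shows how many flattened features the MLP would receive:

```python
# Feature-map shape calculation for the 8-layer network described above.
# Assumption (not stated in the abstract): 'same' convolutions and 2x2 pooling.

def shapes_through_network(side=64, feature_maps=(20, 10, 5)):
    """Return (channels, height, width) after each conv + max-pool pair."""
    shapes = []
    for maps in feature_maps:
        side //= 2          # convolution keeps size; 2x2 pooling halves it
        shapes.append((maps, side, side))
    return shapes

shapes = shapes_through_network()
# After the three pairs: (20, 32, 32), (10, 16, 16), (5, 8, 8);
# the MLP head would then see 5 * 8 * 8 = 320 flattened features.
flattened = shapes[-1][0] * shapes[-1][1] * shapes[-1][2]
print(shapes, flattened)
```

Under these assumptions the sixth layer's output is 320 values, a modest input size for a one-hidden-layer MLP.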
Unsupervised iterative detection of land mines in highly cluttered environments.
Batman, Sinan; Goutsias, John
2003-01-01
An unsupervised iterative scheme is proposed for land mine detection in heavily cluttered scenes. This scheme is based on iterating hybrid multispectral filters that consist of a decorrelating linear transform coupled with a nonlinear morphological detector. Detections extracted from the first pass are used to improve results in subsequent iterations. The procedure stops after a predetermined number of iterations. The proposed scheme addresses several weaknesses associated with previous adaptations of morphological approaches to land mine detection. Improvement in detection performance, robustness with respect to clutter inhomogeneities, a completely unsupervised operation, and computational efficiency are the main highlights of the method. Experimental results reveal excellent performance.
Computer-aided interpretation approach for optical tomographic images
NASA Astrophysics Data System (ADS)
Klose, Christian D.; Klose, Alexander D.; Netz, Uwe J.; Scheel, Alexander K.; Beuthan, Jürgen; Hielscher, Andreas H.
2010-11-01
A computer-aided interpretation approach is proposed to detect rheumatoid arthritis (RA) in human finger joints using optical tomographic images. The image interpretation method employs a classification algorithm that makes use of a so-called self-organizing mapping scheme to classify fingers as either affected or unaffected by RA. Unlike previous studies, this allows for combining multiple image features, such as minimum and maximum values of the absorption coefficient, for identifying affected and unaffected joints. Classification performances obtained by the proposed method were evaluated in terms of sensitivity, specificity, Youden index, and mutual information. Different methods (i.e., clinical diagnostics, ultrasound imaging, magnetic resonance imaging, and inspection of optical tomographic images) were used to produce ground truth benchmarks for determining the performance of the image interpretations. Using data from 100 finger joints, findings suggest that some parameter combinations lead to higher sensitivities, and others to higher specificities, when compared to the single-parameter classifications employed in previous studies. Maximum performance is reached when combining the minimum/maximum ratio of the absorption coefficient and image variance; in this case, sensitivities and specificities over 0.9 can be achieved. These values are much higher than those obtained with single-parameter classifications, where sensitivities and specificities remained well below 0.8.
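The evaluation metrics named above are simple to compute from a confusion matrix; a minimal sketch (with illustrative labels, not the study's data) of sensitivity, specificity, and the Youden index J = sensitivity + specificity − 1:

```python
# Sensitivity, specificity, and Youden index from binary labels/predictions.
# The toy labels below are illustrative only.

def sensitivity_specificity_youden(labels, predictions):
    tp = sum(1 for y, p in zip(labels, predictions) if y == 1 and p == 1)
    fn = sum(1 for y, p in zip(labels, predictions) if y == 1 and p == 0)
    tn = sum(1 for y, p in zip(labels, predictions) if y == 0 and p == 0)
    fp = sum(1 for y, p in zip(labels, predictions) if y == 0 and p == 1)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, spec, sens + spec - 1

# Toy example: 4 affected joints (1) and 4 unaffected joints (0).
labels      = [1, 1, 1, 1, 0, 0, 0, 0]
predictions = [1, 1, 1, 0, 0, 0, 0, 1]
sens, spec, youden = sensitivity_specificity_youden(labels, predictions)
print(sens, spec, youden)  # 0.75 0.75 0.5
```

A perfect classifier gives J = 1, while a chance-level one gives J = 0, which is why the index is a convenient single-number summary when comparing parameter combinations.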
Study on computer-aided diagnosis of hepatic MR imaging and mammography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang Xuejun
2005-04-01
It is well known that the liver is an organ easily attacked by disease. The purpose of this study is to develop a computer-aided diagnosis (CAD) scheme for helping radiologists differentiate hepatic diseases more efficiently. Our software, named LIVERANN, integrated magnetic resonance (MR) imaging findings from different pulse sequences to classify five categories of hepatic disease using an artificial neural network (ANN). The intensity and homogeneity within the region of interest (ROI) delineated by a radiologist were automatically calculated by the program to provide numerical input signals to the ANN. Outputs were the five pathological categories of hepatic disease (hepatic cyst, hepatocellular carcinoma, dysplasia in cirrhosis, cavernous hemangioma, and metastasis). The experiment demonstrated a testing accuracy of 93% on 80 patients. To differentiate cirrhosis from normal liver, the volume ratio of left lobe to whole liver (LTW) was proposed to quantify the degree of cirrhosis by three-dimensional (3D) volume analysis. The liver region was first extracted from computed tomography (CT) or MR slices using edge detection algorithms, and then separated into left and right lobes by the hepatic umbilical fissure. The LTW ratio significantly improved differentiation performance, at 25.6%±4.3% in cirrhosis versus 16.4%±5.4% in normal liver. In addition, the application of the ANN method to detecting clustered microcalcifications in masses on mammograms is described here as well. A new structural ANN, the so-called shift-invariant artificial neural network (SIANN), was integrated with our triple-ring filter (TRF) method in our CAD system. As a result, the sensitivity of detecting clusters was improved from 90% with our previous TRF method to 95% using both SIANN and TRF.
Max-AUC Feature Selection in Computer-Aided Detection of Polyps in CT Colonography
Xu, Jian-Wu; Suzuki, Kenji
2014-01-01
We propose a feature selection method based on a sequential forward floating selection (SFFS) procedure to improve the performance of a classifier in computerized detection of polyps in CT colonography (CTC). The feature selection method is coupled with a nonlinear support vector machine (SVM) classifier. Unlike the conventional linear method based on Wilks' lambda, the proposed method selected the most relevant features that would maximize the area under the receiver operating characteristic curve (AUC), which directly maximizes classification performance, evaluated based on AUC value, in the computer-aided detection (CADe) scheme. We presented two variants of the proposed method with different stopping criteria used in the SFFS procedure. The first variant searched all feature combinations allowed in the SFFS procedure and selected the subsets that maximize the AUC values. The second variant performed a statistical test at each step during the SFFS procedure, and it was terminated if the increase in the AUC value was not statistically significant. The advantage of the second variant is its lower computational cost. To test the performance of the proposed method, we compared it against the popular stepwise feature selection method based on Wilks' lambda for a colonic-polyp database (25 polyps and 2624 nonpolyps). We extracted 75 morphologic, gray-level-based, and texture features from the segmented lesion candidate regions. The two variants of the proposed feature selection method chose 29 and 7 features, respectively. Two SVM classifiers trained with these selected features yielded a 96% by-polyp sensitivity at false-positive (FP) rates of 4.1 and 6.5 per patient, respectively. Experiments showed a significant improvement in the performance of the classifier with the proposed feature selection method over that with the popular stepwise feature selection based on Wilks' lambda that yielded 18.0 FPs per patient at the same sensitivity level. PMID:24608058
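The SFFS procedure above alternates a forward inclusion step with a conditional "floating" exclusion step, both scored by AUC. A simplified sketch: the paper wraps a nonlinear SVM, but here the plain sum of the selected feature values stands in as the classifier score, and the tiny synthetic dataset is illustrative only.

```python
# Simplified AUC-driven sequential forward floating selection (SFFS).
# Stand-in scorer: sum of selected feature values (the paper uses an SVM).

def auc(scores_pos, scores_neg):
    """Mann-Whitney estimate of the area under the ROC curve."""
    wins = sum(1.0 if sp > sn else 0.5 if sp == sn else 0.0
               for sp in scores_pos for sn in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

def subset_auc(X_pos, X_neg, subset):
    score = lambda x: sum(x[j] for j in subset)
    return auc([score(x) for x in X_pos], [score(x) for x in X_neg])

def sffs(X_pos, X_neg, n_features, n_select):
    selected = []
    while len(selected) < n_select:
        # Forward step: add the feature that maximizes the AUC.
        best = max((j for j in range(n_features) if j not in selected),
                   key=lambda j: subset_auc(X_pos, X_neg, selected + [j]))
        selected.append(best)
        # Floating step: drop an earlier feature if doing so improves the AUC.
        if len(selected) > 2:
            current = subset_auc(X_pos, X_neg, selected)
            for j in selected[:-1]:
                without_j = [k for k in selected if k != j]
                if subset_auc(X_pos, X_neg, without_j) > current:
                    selected.remove(j)
                    break
    return selected

# Features 0 and 1 separate the classes; feature 2 is noise.
X_pos = [[1.0, 0.9, 0.1], [0.8, 1.1, 0.2], [1.2, 0.8, 0.0], [0.9, 1.0, 0.3]]
X_neg = [[0.1, 0.2, 0.2], [0.0, 0.1, 0.1], [0.2, 0.0, 0.3], [0.1, 0.3, 0.0]]
print(sffs(X_pos, X_neg, n_features=3, n_select=2))  # [0, 1]
```

The statistical-test stopping rule of the paper's second variant would replace the fixed `n_select` with a check that each forward step yields a significant AUC increase.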
Use of Parallel Micro-Platform for the Simulation the Space Exploration
NASA Astrophysics Data System (ADS)
Velasco Herrera, Victor Manuel; Velasco Herrera, Graciela; Rosano, Felipe Lara; Rodriguez Lozano, Salvador; Lucero Roldan Serrato, Karen
The purpose of this work is to create a parallel micro-platform that simulates the virtual movements of space exploration in 3D. One of the innovations presented in this design is the application of a lever mechanism for the transmission of movement. The development of such a robot is a challenging task, very different from that of industrial manipulators, due to a totally different set of target requirements. This work presents the computer-aided study and simulation of the movement of this parallel manipulator. The model was developed using the Unigraphics computer-aided design platform, in which the geometric modeling of each component and of the final assembly (CAD) was carried out, the files for computer-aided manufacture (CAM) of each piece were generated, and the kinematic simulation of the system was performed under different driving schemes. We used the MATLAB aerospace toolbox and created an adaptive control module to simulate the system.
Toward the detection of abnormal chest radiographs the way radiologists do it
NASA Astrophysics Data System (ADS)
Alzubaidi, Mohammad; Patel, Ameet; Panchanathan, Sethuraman; Black, John A., Jr.
2011-03-01
Computer Aided Detection (CADe) and Computer Aided Diagnosis (CADx) are relatively recent areas of research that attempt to employ feature extraction, pattern recognition, and machine learning algorithms to aid radiologists in detecting and diagnosing abnormalities in medical images. These computational methods are based on the assumption that there are distinct classes of abnormalities, and that each class has some distinguishing features that set it apart from other classes. In practice, however, abnormalities in chest radiographs tend to be very heterogeneous. The literature suggests that thoracic (chest) radiologists develop their ability to detect abnormalities by developing a sense of what is normal, so that anything abnormal attracts their attention. This paper discusses an approach to CADe that is based on a technique called anomaly detection (which aims to detect outliers in data sets) for the purpose of detecting atypical regions in chest radiographs. Applying anomaly detection to chest radiographs, however, requires a basis for extracting features from corresponding anatomical locations in different radiographs. This paper proposes a method for doing this and describes how it can be used to support CADe.
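The anomaly-detection idea above can be sketched very simply: model the distribution of a feature at one anatomical location using normal exams only, then flag a new exam as atypical when its feature lies far from that distribution. The values and the z-score threshold below are toy assumptions, not the paper's features.

```python
# Outlier-style atypicality score for one anatomical location.
# Toy data; a real system would use many features at many locations.
from statistics import mean, stdev

def atypicality(normal_values, new_value):
    """Z-score of new_value against the feature values from normal exams."""
    mu, sigma = mean(normal_values), stdev(normal_values)
    return abs(new_value - mu) / sigma

# Mean intensity at one anatomical location across 8 normal radiographs.
normals = [0.52, 0.49, 0.51, 0.50, 0.53, 0.48, 0.50, 0.51]
print(atypicality(normals, 0.50))  # small -> typical
print(atypicality(normals, 0.80))  # large -> flagged as atypical
```

No model of specific abnormality classes is needed: anything far from "normal" is flagged, which matches the heterogeneity argument made in the abstract.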
PRESAGE: Protecting Structured Address Generation against Soft Errors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram
Modern computer scaling trends in pursuit of larger component counts and power efficiency have, unfortunately, led to less reliable hardware and consequently to soft errors escaping into application data ("silent data corruptions"). Techniques to enhance system resilience hinge on the availability of efficient error detectors that have high detection rates, low false-positive rates, and low computational overhead. Unfortunately, efficient detectors for faults during address generation (to index large arrays) have not been widely researched. We present a novel lightweight compiler-driven technique called PRESAGE for detecting bit-flips affecting structured address computations. A key insight underlying PRESAGE is that any address computation scheme that propagates an already incurred error is better than a scheme that corrupts one particular array access but otherwise (falsely) appears to compute perfectly. Enabling the flow of errors allows one to situate detectors at loop exit points, and helps turn silent corruptions into easily detectable error situations. Our experiments using the PolyBench benchmark suite indicate that PRESAGE-based error detectors have a high error-detection rate while incurring low overheads.
Texture classification of lung computed tomography images
NASA Astrophysics Data System (ADS)
Pheng, Hang See; Shamsuddin, Siti M.
2013-03-01
The development of algorithms for computer-aided diagnosis (CAD) schemes is growing rapidly to assist radiologists in medical image interpretation. Texture analysis of computed tomography (CT) scans is one of the important preliminary stages in computerized detection and classification systems for lung cancer. Among the different types of image feature analysis, Haralick texture with a variety of statistical measures has been widely used in image texture description. The extraction of texture feature values is essential for a CAD scheme, especially in the classification of normal and abnormal tissue on cross-sectional CT images. This paper compares experimental results using texture extraction and different machine learning methods for classifying normal and abnormal tissues in lung CT images. The machine learning methods involved in this assessment are the Artificial Immune Recognition System (AIRS), Naive Bayes, Decision Tree (J48), and a backpropagation neural network. AIRS is found to provide high accuracy (99.2%) and sensitivity (98.0%) in the assessment. For experiment and testing purposes, publicly available datasets from the Reference Image Database to Evaluate Therapy Response (RIDER) are used as study cases.
NASA Astrophysics Data System (ADS)
Gur, David
2018-03-01
We tested whether a case-based CADe scheme, developed only on negatively interpreted screening mammograms, has predictive value for cancer detection during subsequent screening, and how this approach may affect radiologists' performance when alerting them to a small subset (~15%) of exams on which radiologists tend to miss cancers. A series of six-parameter, case-based CADe schemes was optimized using 200 negative mammograms (800 images; 100 women with breast cancer at subsequent screening and 100 women who remained negative), carefully matched by age and breast density. The CADe-alone schemes performed at AUC = 0.68 (±0.01). Five radiologists and 4 residents interpreted the same cases and performed at AUC = 0.71 (experienced radiologists) and AUC = 0.61 (residents). With the "CADe warnings" shown to the interpreters only if they did not recall one of the 24 highest CADe-scoring cases, the assisted performances of radiologists and residents were 0.71 and 0.63, respectively (p > 0.05). However, when the CADe-alone performance was raised to an AUC of 0.78, by artificially increasing the number of possible warnings from 16 to 24, radiologists' performance significantly improved from an AUC of 0.68 to 0.72 (p < 0.05). In conclusion, the use of case-based information other than breast density could highlight a small fraction of women whose cancers are more likely to be missed by radiologists and later detected during subsequent mammograms, thereby leading to an assisted approach that improves radiologists' performance. However, to be effective, the performance of the CADe scheme alone should be substantially higher (e.g., ΔAUC >= 0.07) than that of the unassisted radiologist.
DOT National Transportation Integrated Search
1989-01-01
Future levels of air traffic control automation plan to incorporate computer aiding features designed to alert the controller to upcoming problem situations by displaying information that will identify the situation and suggest possible solutions. Co...
Reducing false-positive detections by combining two stage-1 computer-aided mass detection algorithms
NASA Astrophysics Data System (ADS)
Bedard, Noah D.; Sampat, Mehul P.; Stokes, Patrick A.; Markey, Mia K.
2006-03-01
In this paper we present a strategy for reducing the number of false-positives in computer-aided mass detection. Our approach is to only mark "consensus" detections from among the suspicious sites identified by different "stage-1" detection algorithms. By "stage-1" we mean that each of the Computer-aided Detection (CADe) algorithms is designed to operate with high sensitivity, allowing for a large number of false positives. In this study, two mass detection methods were used: (1) Heath and Bowyer's algorithm based on the average fraction under the minimum filter (AFUM) and (2) a low-threshold bi-lateral subtraction algorithm. The two methods were applied separately to a set of images from the Digital Database for Screening Mammography (DDSM) to obtain paired sets of mass candidates. The consensus mass candidates for each image were identified by a logical "and" operation of the two CADe algorithms so as to eliminate regions of suspicion that were not independently identified by both techniques. It was shown that by combining the evidence from the AFUM filter method with that obtained from bi-lateral subtraction, the same sensitivity could be reached with fewer false-positives per image relative to using the AFUM filter alone.
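The consensus ("and") step described above reduces to keeping only candidates from one stage-1 detector that are confirmed by the other. A minimal sketch, with illustrative pixel coordinates and a hypothetical matching tolerance (the paper does not state one):

```python
# Consensus of two stage-1 mass detectors: keep a candidate from detector A
# only if detector B found a candidate nearby. Coordinates are toy values.

def consensus(candidates_a, candidates_b, tol=10.0):
    """Candidates from A confirmed by some B candidate within tol pixels."""
    def near(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= tol ** 2
    return [p for p in candidates_a if any(near(p, q) for q in candidates_b)]

afum_hits      = [(120, 340), (400, 90), (250, 610)]   # stage-1 detector 1
bilateral_hits = [(118, 344), (255, 605)]              # stage-1 detector 2
print(consensus(afum_hits, bilateral_hits))  # [(120, 340), (250, 610)]
```

Because each detector runs at high sensitivity with many false positives, intersecting their outputs preserves most true detections while discarding false positives that only one method produced.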
Computer-Aided Detection of Prostate Cancer with MRI: Technology and Applications
Liu, Lizhi; Tian, Zhiqiang; Zhang, Zhenfeng; Fei, Baowei
2016-01-01
One in six men will develop prostate cancer in his lifetime. Early detection and accurate diagnosis of the disease can improve cancer survival and reduce treatment costs. Recently, imaging of prostate cancer has greatly advanced since the introduction of multi-parametric magnetic resonance imaging (mp-MRI). Mp-MRI consists of T2-weighted sequences combined with functional sequences including dynamic contrast-enhanced MRI, diffusion-weighted MRI, and MR spectroscopy imaging. Due to the large volume of data and the variations in imaging sequences, detection can be affected by multiple factors such as observer variability and the visibility and complexity of the lesions. In order to improve quantitative assessment of the disease, various computer-aided detection systems have been designed to help radiologists in their clinical practice. This review paper presents an overview of the literature on computer-aided detection of prostate cancer with mp-MRI, covering both the technology and its applications. The aim of the survey is threefold: an introduction for those new to the field, an overview for those working in the field, and a reference for those searching for literature on a specific application. PMID:27133005
Development of a new first-aid biochemical detector
NASA Astrophysics Data System (ADS)
Hu, Jingfei; Liao, Haiyang; Su, Shilin; Ding, Hao; Liu, Suquan
2016-10-01
Traditional biochemical detectors exhibit poor adaptability, inconvenient portability, and slow detection, which cannot meet the needs of first aid under field conditions such as natural or man-made disasters. Therefore, a scheme for a first-aid biochemical detector based on a MOMES micro spectrometer, UV LED, and photodiode was proposed. An optical detection structure combining a continuous spectrum sweep with fixed-wavelength measurement was designed, which adopted a mobile detection optical path consisting of a micro spectrometer and halogen lamp to detect chloride (Cl-), creatinine (Cre), glucose (Glu), and hemoglobin (Hb). The UV LED and photodiode were designed to detect potassium (K+), carbon dioxide (CO2), and sodium (Na+). According to field diagnosis and treatment requirements, we designed the embedded control hardware circuit and software system, developed a prototype of the first-aid biochemical detector, and conducted clinical trials. Experimental results show that the repeatability of the sample absorbance is within 2%, the maximum coefficient of variation (CV) in the batch repeatability test of all 7 biochemical parameters in blood samples is 4.68%, below the clinical requirement of 10%, and the correlation coefficient (R2) in the clinical comparison test against the AU5800 exceeds 0.97 for nearly all parameters. In summary, the prototype meets the requirements of clinical application.
Induction-detection electron spin resonance with spin sensitivity of a few tens of spins
DOE Office of Scientific and Technical Information (OSTI.GOV)
Artzi, Yaron; Twig, Ygal; Blank, Aharon
2015-02-23
Electron spin resonance (ESR) is a spectroscopic method that addresses electrons in paramagnetic materials directly through their spin properties. ESR has many applications, ranging from semiconductor characterization to structural biology and even quantum computing. Although it is very powerful and informative, ESR traditionally suffers from low sensitivity, requiring many millions of spins to get a measurable signal with commercial systems using the Faraday induction-detection principle. In view of this disadvantage, significant efforts were made recently to develop alternative detection schemes based, for example, on force, optical, or electrical detection of spins, all of which can reach single electron spin sensitivity. This sensitivity, however, comes at the price of limited applicability and usefulness with regard to real scientific and technological issues facing modern ESR which are currently dealt with conventional induction-detection ESR on a daily basis. Here, we present the most sensitive experimental induction-detection ESR setup and results ever recorded that can detect the signal from just a few tens of spins. They were achieved thanks to the development of an ultra-miniature micrometer-sized microwave resonator that was operated at ∼34 GHz at cryogenic temperatures in conjunction with a unique cryogenically cooled low noise amplifier. The test sample used was isotopically enriched phosphorus-doped silicon, which is of significant relevance to spin-based quantum computing. The sensitivity was experimentally verified with the aid of a unique high-resolution ESR imaging approach. These results represent a paradigm shift with respect to the capabilities and possible applications of induction-detection-based ESR spectroscopy and imaging.
NASA Technical Reports Server (NTRS)
Oliger, Joseph
1997-01-01
Topics considered include: high-performance computing; cognitive and perceptual prostheses (computational aids designed to leverage human abilities); autonomous systems. Also included: development of a 3D unstructured grid code based on a finite volume formulation and applied to the Navier-Stokes equations; Cartesian grid methods for complex geometry; multigrid methods for solving elliptic problems on unstructured grids; algebraic non-overlapping domain decomposition methods for compressible fluid flow problems on unstructured meshes; numerical methods for the compressible Navier-Stokes equations with application to aerodynamic flows; research in aerodynamic shape optimization; S-HARP: a parallel dynamic spectral partitioner; numerical schemes for the Hamilton-Jacobi and level set equations on triangulated domains; application of high-order shock capturing schemes to direct simulation of turbulence; multicast technology; network testbeds; supercomputer consolidation project.
Cai, Wenli; Lee, June-Goo; Fikry, Karim; Yoshida, Hiroyuki; Novelline, Robert; de Moya, Marc
2012-07-01
It is commonly believed that the size of a pneumothorax is an important determinant of treatment decisions, in particular regarding whether chest tube drainage (CTD) is required. However, volumetric quantification of pneumothoraces has not routinely been performed in clinics. In this paper, we introduce an automated computer-aided volumetry (CAV) scheme for quantifying the volume of pneumothoraces in chest multi-detector CT (MDCT) images. Moreover, we investigated the impact of accurate pneumothorax volume on improving decision-making performance regarding CTD in the management of traumatic pneumothoraces. For this purpose, an occurrence frequency map was calculated for quantitative analysis of the importance of each clinical parameter in the decision regarding CTD, using a computer simulation of decision-making with a genetic algorithm (GA) and a support vector machine (SVM). A total of 14 clinical parameters, including the pneumothorax volume calculated by our CAV scheme, were collected as parameters available for decision-making. The results showed that volume was the dominant parameter in decision-making regarding CTD, with an occurrence frequency value of 1.00. The results also indicated that the inclusion of volume provided the best performance, statistically significantly better than the other tests in which volume was excluded from the clinical parameters. This study provides scientific evidence for the application of a CAV scheme to MDCT volumetric quantification of pneumothoraces in the management of clinically stable chest trauma patients with traumatic pneumothorax. Copyright © 2012 Elsevier Ltd. All rights reserved.
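The occurrence-frequency idea above can be sketched with a much-simplified stand-in: the paper evolves parameter subsets with a GA scored by an SVM, whereas here random subsets are scored by a toy function in which parameter 0 ("volume") is decisive, and each parameter's frequency among the best-scoring subsets is then counted.

```python
# Simplified occurrence-frequency map over parameter subsets.
# Toy scorer stands in for the paper's GA + SVM simulation.
import random

random.seed(0)
N_PARAMS, N_TRIALS, TOP_K = 14, 200, 20

def toy_score(subset):
    # Parameter 0 ("volume") dominates; the rest contribute small noise.
    return (1.0 if 0 in subset else 0.0) + random.random() * 0.1

trials = []
for _ in range(N_TRIALS):
    subset = frozenset(random.sample(range(N_PARAMS), 5))
    trials.append((toy_score(subset), subset))

best = [s for _, s in sorted(trials, key=lambda t: t[0], reverse=True)[:TOP_K]]
freq = [sum(p in s for s in best) / TOP_K for p in range(N_PARAMS)]
print(freq[0])  # occurrence frequency of "volume" among the best subsets
```

Because every high-scoring subset contains the dominant parameter, its occurrence frequency comes out at 1.0, mirroring the value reported for volume in the study.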
NASA Astrophysics Data System (ADS)
Aghaei, Faranak; Tan, Maxine; Hollingsworth, Alan B.; Zheng, Bin; Cheng, Samuel
2016-03-01
Dynamic contrast-enhanced breast magnetic resonance imaging (DCE-MRI) has been used increasingly in breast cancer diagnosis and assessment of cancer treatment efficacy. In this study, we applied a computer-aided detection (CAD) scheme to automatically segment breast regions depicted on MR images and used the kinetic image features computed from the global breast MR images acquired before neoadjuvant chemotherapy to build a new quantitative model to predict the response of breast cancer patients to the chemotherapy. To assess the performance and robustness of this new prediction model, an image dataset involving breast MR images acquired from 151 cancer patients before undergoing neoadjuvant chemotherapy was retrospectively assembled and used. Among them, 63 patients had a "complete response" (CR) to chemotherapy, in which the enhanced contrast levels inside the tumor volume (pre-treatment) were reduced to the level of the normally enhanced background parenchymal tissues (post-treatment), while 88 patients had a "partial response" (PR), in which high contrast enhancement remained in the tumor regions after treatment. We analyzed the correlations among the 22 global kinetic image features and then selected a set of 4 optimal features. Applying an artificial neural network trained with the fusion of these 4 kinetic image features, the prediction model yielded an area under the ROC curve (AUC) of 0.83±0.04. This study demonstrated that, by avoiding tumor segmentation, which is often difficult and unreliable, the fusion of kinetic image features computed from global breast MR images can also generate a useful clinical marker for predicting the efficacy of chemotherapy.
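The pruning from 22 correlated features to a small optimal set can be illustrated with a greedy correlation filter: keep a feature only if it is weakly correlated with every feature already kept. This is a simplified stand-in for the study's correlation analysis, and the feature matrix below is toy data, not the MRI features.

```python
# Greedy selection of mutually weakly-correlated features (toy data).
from statistics import mean

def pearson(a, b):
    ma, mb = mean(a), mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

def select_uncorrelated(features, n_select, max_corr=0.9):
    """Keep feature i only if |r| < max_corr against every kept feature."""
    kept = []
    for i, f in enumerate(features):
        if all(abs(pearson(f, features[j])) < max_corr for j in kept):
            kept.append(i)
        if len(kept) == n_select:
            break
    return kept

features = [
    [1, 2, 3, 4, 5, 6],     # feature 0
    [2, 4, 6, 8, 10, 12],   # perfectly correlated with feature 0 -> skipped
    [6, 1, 5, 2, 4, 3],     # weakly correlated -> kept
    [1, 3, 2, 5, 4, 6],     # kept
]
print(select_uncorrelated(features, 3))  # [0, 2, 3]
```

Dropping redundant (highly correlated) features before training the neural network reduces overfitting risk, which matters with only 151 cases.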
An Intrinsically Digital Amplification Scheme for Hearing Aids
NASA Astrophysics Data System (ADS)
Blamey, Peter J.; Macfarlane, David S.; Steele, Brenton R.
2005-12-01
Results for linear and wide-dynamic range compression were compared with a new 64-channel digital amplification strategy in three separate studies. The new strategy addresses the requirements of the hearing aid user with efficient computations on an open-platform digital signal processor (DSP). The new amplification strategy is not modeled on prior analog strategies like compression and linear amplification, but uses statistical analysis of the signal to optimize the output dynamic range in each frequency band independently. Using the open-platform DSP processor also provided the opportunity for blind trial comparisons of the different processing schemes in BTE and ITE devices of a high commercial standard. The speech perception scores and questionnaire results show that it is possible to provide improved audibility for sound in many narrow frequency bands while simultaneously improving comfort, speech intelligibility in noise, and sound quality.
A concatenated coding scheme for error control
NASA Technical Reports Server (NTRS)
Kasami, T.; Fujiwara, T.; Lin, S.
1986-01-01
In this paper, a concatenated coding scheme for error control in data communications is presented and analyzed. In this scheme, the inner code is used for both error correction and detection; the outer code is used only for error detection. A retransmission is requested if either the inner code decoder fails to make a successful decoding or the outer code decoder detects the presence of errors after inner code decoding. The probability of undetected error (or decoding error) of the proposed scheme is derived, and an efficient method for computing this probability is presented. The throughput efficiency of the proposed error control scheme, incorporated with a selective-repeat ARQ retransmission strategy, is also analyzed. Three specific examples are presented, one of which is proposed for error control in the NASA Telecommand System.
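A minimal sketch of the retransmission logic described above, using a toy 3x-repetition inner code (error correction) and a single parity bit standing in for the outer code (error detection only); the actual codes in the paper are far stronger, so this only illustrates the decision flow:

```python
def inner_encode(bits):
    # Inner code: 3x repetition of each bit (can correct one flip per triple)
    return [b for bit in bits for b in (bit, bit, bit)]

def inner_decode(chan_bits):
    # Majority vote per triple corrects any single error in that triple
    out = []
    for i in range(0, len(chan_bits), 3):
        triple = chan_bits[i:i + 3]
        out.append(1 if sum(triple) >= 2 else 0)
    return out

def outer_parity(data):
    # Outer code: single parity bit, used only for error *detection*
    return sum(data) % 2

def receive(chan_bits, sent_parity):
    data = inner_decode(chan_bits)
    if outer_parity(data) != sent_parity:
        return None          # detected residual error -> request retransmission (ARQ)
    return data              # accepted

msg = [1, 0, 1, 1]
parity = outer_parity(msg)
cw = inner_encode(msg)
cw[0] ^= 1                   # one channel error: corrected by the inner code
print(receive(cw, parity))   # → [1, 0, 1, 1]
```

Two errors in the same triple defeat the inner code; the outer parity then detects the residual error and triggers a retransmission request (`receive` returns `None`).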
ERIC Educational Resources Information Center
Concheiro, A. Alonso, Ed.; And Others
The following papers in English from this international conference may be of particular interest to those in the field of education. T. Nakahara, A. Tsukamota, and M. Matsumoto describe a computer-aided design technique for an economical urban cable television system. W. D. Wasson and R. K. Chitkara outline a recognition scheme based on analysis…
Deep learning of symmetrical discrepancies for computer-aided detection of mammographic masses
NASA Astrophysics Data System (ADS)
Kooi, Thijs; Karssemeijer, Nico
2017-03-01
When humans identify objects in images, context is an important cue; a cheetah is more likely to be a domestic cat when a television set is recognised in the background. Similar principles apply to the analysis of medical images. The detection of diseases that manifest unilaterally in symmetrical organs or organ pairs can in part be facilitated by a search for symmetrical discrepancies in or between the organs in question. During a mammographic exam, images are recorded of each breast, and the absence of a certain structure around the same location in the contralateral image will render the area under scrutiny more suspicious and, conversely, the presence of similar tissue less so. In this paper, we present a fusion scheme for a deep Convolutional Neural Network (CNN) architecture with the goal of optimally capturing such asymmetries. The method is applied to the domain of mammography CAD, but can be relevant to other medical image analysis tasks where symmetry is important, such as lung, prostate or brain images.
Computational technique for stepwise quantitative assessment of equation correctness
NASA Astrophysics Data System (ADS)
Othman, Nuru'l Izzah; Bakar, Zainab Abu
2017-04-01
Many of the computer-aided mathematics assessment systems available today can implement stepwise correctness checking of a working scheme for solving equations. The computational technique for assessing the correctness of each response in the scheme mainly involves checking mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts certain techniques from textual information retrieval involving tokenization, document modelling and similarity evaluation. The performance of the SCCS technique was tested using worked solutions on solving linear algebraic equations in one variable. 350 working schemes comprising 1385 responses were collected using a marking engine prototype developed based on the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation and a high degree of agreement with manual scores, with small average absolute and mixed errors.
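The tokenization-and-multiset comparison that the SCCS technique builds on can be sketched as follows; the character-level tokenizer and the overlap score here are simplified assumptions, not the paper's exact algorithm:

```python
from collections import Counter

def tokenize(equation):
    """Split an equation string into single-character tokens,
    ignoring whitespace (a simplification of a real tokenizer)."""
    return [ch for ch in equation if not ch.isspace()]

def multiset_similarity(resp, key):
    """Multiset (bag) overlap between a student response and the
    expected equation, scored in [0, 1]. Counter's & operator gives
    the multiset intersection (element-wise minimum of counts)."""
    a, b = Counter(tokenize(resp)), Counter(tokenize(key))
    overlap = sum((a & b).values())
    total = max(sum(a.values()), sum(b.values()))
    return overlap / total if total else 1.0

print(multiset_similarity("2x + 4 = 10", "2x+4=10"))  # → 1.0 (same tokens)
print(multiset_similarity("2x = 6", "2x+4=10"))       # partial structural overlap
```

A multiset (rather than a set) keeps repeated symbols distinct, so "x+x" and "x" do not score as identical.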
Biologically inspired binaural hearing aid algorithms: Design principles and effectiveness
NASA Astrophysics Data System (ADS)
Feng, Albert
2002-05-01
Despite rapid advances in the sophistication of hearing aid technology and microelectronics, listening in noise remains problematic for people with hearing impairment. To solve this problem two algorithms were designed for use in binaural hearing aid systems. The signal processing strategies are based on principles in auditory physiology and psychophysics: (a) the location/extraction (L/E) binaural computational scheme determines the directions of source locations and cancels noise by applying a simple subtraction method over every frequency band; and (b) the frequency-domain minimum-variance (FMV) scheme extracts a target sound from a known direction amidst multiple interfering sound sources. Both algorithms were evaluated using standard metrics such as signal-to-noise-ratio gain and articulation index. Results were compared with those from conventional adaptive beam-forming algorithms. In free-field tests with multiple interfering sound sources our algorithms performed better than conventional algorithms. Preliminary intelligibility and speech reception results in multitalker environments showed gains for every listener with normal or impaired hearing when the signals were processed in real time with the FMV binaural hearing aid algorithm. [Work supported by NIH-NIDCD Grant No. R21DC04840 and the Beckman Institute.]
Macedo, Gleicy A.; Gonin, Michelle Luiza C.; Pone, Sheila M.; Cruz, Oswaldo G.; Nobre, Flávio F.; Brasil, Patrícia
2014-01-01
Background: The clinical definition of severe dengue fever remains a challenge for researchers in hyperendemic areas like Brazil. The ability of the traditional (1997) as well as the revised (2009) World Health Organization (WHO) dengue case classification schemes to detect severe dengue cases was evaluated in 267 children admitted to hospital with laboratory-confirmed dengue. Principal Findings: Using the traditional scheme, 28.5% of patients could not be assigned to any category, while the revised scheme categorized all patients. Intensive therapeutic interventions were used as the reference standard to evaluate the ability of both the traditional and revised schemes to detect severe dengue cases. Analyses of the classified cases (n = 183) demonstrated that the revised scheme had better sensitivity (86.8%, P<0.001), while the traditional scheme had better specificity (93.4%, P<0.001) for the detection of severe forms of dengue. Conclusions/Significance: This improved sensitivity of the revised scheme allows for better case capture and increased ICU admission, which may aid pediatricians in avoiding deaths due to severe dengue among children, but, in turn, it may also result in the misclassification of the patients' condition as severe, reflected in the observed lower positive predictive value (61.6%, P<0.001) when compared with the traditional scheme (82.6%, P<0.001). The inclusion of unusual dengue manifestations in the revised scheme has not shifted the emphasis from the most important aspects of dengue disease and the major factors contributing to fatality in this study: shock with consequent organ dysfunction. PMID:24777054
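The sensitivity, specificity and positive predictive value compared above all follow from a standard 2x2 confusion matrix; the counts below are illustrative only, since the abstract reports rates rather than the raw table:

```python
def classifier_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and positive predictive value from a
    2x2 confusion matrix (here the reference standard would be
    'intensive therapeutic intervention required')."""
    sens = tp / (tp + fn)   # fraction of truly severe cases flagged
    spec = tn / (tn + fp)   # fraction of non-severe cases not flagged
    ppv  = tp / (tp + fp)   # fraction of flagged cases that are truly severe
    return sens, spec, ppv

# Illustrative counts only, chosen to roughly echo the reported trade-off
sens, spec, ppv = classifier_metrics(tp=33, fp=20, fn=5, tn=125)
print(round(sens, 3), round(spec, 3), round(ppv, 3))  # → 0.868 0.862 0.623
```

The trade-off in the abstract is visible here: raising sensitivity (fewer false negatives) tends to add false positives, which directly lowers the PPV.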
NASA Astrophysics Data System (ADS)
Horiba, Kazuki; Muramatsu, Chisako; Hayashi, Tatsuro; Fukui, Tatsumasa; Hara, Takeshi; Katsumata, Akitoshi; Fujita, Hiroshi
2015-03-01
Findings on dental panoramic radiographs (DPRs) have shown that the mandibular cortical index (MCI), based on the morphology of the mandibular inferior cortex, is significantly correlated with osteoporosis. MCI on DPRs can be categorized into one of three groups and has high potential for identifying patients with osteoporosis. However, most DPRs are used only for diagnosing dental conditions by dentists in their routine clinical work. Moreover, MCI is generally not quantified but assessed subjectively. In this study, we investigated a computer-aided diagnosis (CAD) system that automatically classifies mandibular cortical bone for detection of osteoporotic patients at an early stage. First, the inferior border of the mandibular bone was detected by use of an active contour method. Second, regions of interest including the cortical bone were extracted and analyzed for thickness and roughness. Finally, a support vector machine (SVM) differentiated cases into the three MCI categories using features including the thickness and roughness. Ninety-eight DPRs were used to evaluate our proposed scheme. The numbers of cases classified into Classes I, II, and III by a dental radiologist were 56, 25, and 17, respectively. Experimental results based on leave-one-out cross-validation showed that the sensitivities for Classes I, II, and III were 94.6%, 57.7%, and 94.1%, respectively. The distribution of the groups in the feature space indicates the possibility of MCI quantification by the proposed method. Therefore, our scheme has potential for identifying osteoporotic patients at an early stage.
On resilience studies of system detection and recovery techniques against stealthy insider attacks
NASA Astrophysics Data System (ADS)
Wei, Sixiao; Zhang, Hanlin; Chen, Genshe; Shen, Dan; Yu, Wei; Pham, Khanh D.; Blasch, Erik P.; Cruz, Jose B.
2016-05-01
With the explosive growth of network technologies, insider attacks have become a major concern for business operations that largely rely on computer networks. To better detect insider attacks that marginally manipulate network traffic over time, and to recover the system from attacks, in this paper we implement a temporal-based detection scheme using the sequential hypothesis testing technique. Two hypothetical states are considered: the null hypothesis that the collected information comes from benign historical traffic, and the alternative hypothesis that the network is under attack. The objective of such a detection scheme is to recognize the change within the shortest time by comparing the two defined hypotheses. In addition, once an attack is detected, a server migration-based system recovery scheme can be triggered to restore the system to its state prior to the attack. To understand mitigation of insider attacks, a multi-functional web display of the detection analysis was developed for real-time analytics. Experiments using real-world traffic traces evaluate the effectiveness of the Detection System and Recovery (DeSyAR) scheme. The evaluation data validate that the detection scheme based on sequential hypothesis testing and the server migration-based system recovery scheme perform well in detecting insider attacks and recovering the system under attack.
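The sequential hypothesis test described above can be sketched with Wald's classic SPRT, which accumulates per-sample log-likelihood ratios until one of two thresholds is crossed; this is a generic illustration, not the paper's implementation, and the likelihood ratios would in practice come from traffic models:

```python
import math

def sprt(log_lrs, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test.
    log_lrs: per-observation log-likelihood ratios log(P(x|H1)/P(x|H0)).
    alpha, beta: target false-alarm and miss probabilities.
    Returns ('attack'|'benign', samples consumed), or ('undecided', n)
    if the data run out before a threshold is crossed."""
    upper = math.log((1 - beta) / alpha)   # cross upward  -> accept H1 (attack)
    lower = math.log(beta / (1 - alpha))   # cross downward -> accept H0 (benign)
    s = 0.0
    for i, llr in enumerate(log_lrs, start=1):
        s += llr
        if s >= upper:
            return "attack", i
        if s <= lower:
            return "benign", i
    return "undecided", len(log_lrs)

# Each observation slightly favours H1 (log-likelihood ratio +0.8 per sample)
print(sprt([0.8] * 20))  # → ('attack', 6)
```

The appeal for "marginal" manipulation is that weak per-sample evidence still accumulates, so the change is recognized as soon as the cumulative evidence clears the threshold.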
A cache-aided multiprocessor rollback recovery scheme
NASA Technical Reports Server (NTRS)
Wu, Kun-Lung; Fuchs, W. Kent
1989-01-01
This paper demonstrates how previous uniprocessor cache-aided recovery schemes can be applied to multiprocessor architectures, for recovering from transient processor failures, utilizing private caches and a global shared memory. As with cache-aided uniprocessor recovery, the multiprocessor cache-aided recovery scheme of this paper can be easily integrated into standard bus-based snoopy cache coherence protocols. A consistent shared memory state is maintained without the necessity of global checkpointing.
Computer aided detection system for lung cancer using computer tomography scans
NASA Astrophysics Data System (ADS)
Mahesh, Shanthi; Rakesh, Spoorthi; Patil, Vidya C.
2018-04-01
Lung cancer can be defined as uncontrolled cell growth in the tissues of the lung, and detecting it at an early stage is key to its cure. This work studies non-invasive methods for assisting in nodule detection and presents a computer-aided diagnosis (CAD) system for early detection of lung cancer nodules from computed tomography (CT) images. A CAD system helps improve the diagnostic performance of radiologists in their image interpretations. The main aim of this work is to develop a CAD system that finds lung cancer in lung CT images and classifies each nodule as benign or malignant. An SVM classifier is used to classify the cancer cells. Image processing techniques are used for de-noising, enhancement, segmentation, and edge detection, from which the area, perimeter, and shape of each nodule are extracted. The core factors of this research are image quality and accuracy.
Effect of segmentation algorithms on the performance of computerized detection of lung nodules in CT
Guo, Wei; Li, Qiang
2014-01-01
Purpose: The purpose of this study is to reveal how the performance of a lung nodule segmentation algorithm impacts the performance of lung nodule detection, and to provide guidelines for choosing an appropriate segmentation algorithm with appropriate parameters in a computer-aided detection (CAD) scheme. Methods: The database consisted of 85 CT scans with 111 nodules of 3 mm or larger in diameter from the standard CT lung nodule database created by the Lung Image Database Consortium. The initial nodule candidates were identified as those with strong response to a selective nodule enhancement filter. A uniform viewpoint reformation technique was applied to each three-dimensional nodule candidate to generate 24 two-dimensional (2D) reformatted images, which were used to effectively distinguish between true nodules and false positives. Six different algorithms were employed to segment the initial nodule candidates in the 2D reformatted images. Finally, 2D features from the segmented areas in the 24 reformatted images were determined, selected, and classified for removal of false positives. Therefore, there were six similar CAD schemes, in which only the segmentation algorithms were different. The six segmentation algorithms included fixed thresholding (FT), Otsu thresholding (OTSU), fuzzy C-means (FCM), the Gaussian mixture model (GMM), the Chan and Vese model (CV), and local binary fitting (LBF). The mean Jaccard index and the mean absolute distance (Dmean) were employed to evaluate the performance of the segmentation algorithms, and the number of false positives at a fixed sensitivity was employed to evaluate the performance of the CAD schemes. Results: For the segmentation algorithms FT, OTSU, FCM, GMM, CV, and LBF, the highest mean Jaccard indices between the segmented nodule and the ground truth were 0.601, 0.586, 0.588, 0.563, 0.543, and 0.553, respectively, and the corresponding Dmean values were 1.74, 1.80, 2.32, 2.80, 3.48, and 3.18 pixels, respectively. 
With these segmentation results of the six segmentation algorithms, the six CAD schemes reported 4.4, 8.8, 3.4, 9.2, 13.6, and 10.4 false positives per CT scan at a sensitivity of 80%. Conclusions: When multiple algorithms are available for segmenting nodule candidates in a CAD scheme, the “optimal” segmentation algorithm did not necessarily lead to the “optimal” CAD detection performance. PMID:25186393
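The Jaccard index used above as the segmentation quality measure is straightforward to compute when regions are represented as pixel coordinate sets; a minimal sketch on a toy example:

```python
def jaccard_index(seg, truth):
    """Jaccard overlap between a segmented region and the ground truth,
    each given as a set of (x, y) pixel coordinates:
    |intersection| / |union|, in [0, 1]."""
    union = len(seg | truth)
    return len(seg & truth) / union if union else 1.0

truth = {(x, y) for x in range(10) for y in range(10)}     # 10x10 "nodule"
seg   = {(x, y) for x in range(1, 10) for y in range(10)}  # misses one column
print(jaccard_index(seg, truth))  # → 0.9
```

A Jaccard index of 1.0 means perfect overlap; the study's best algorithms reached about 0.6 against the consortium ground truth.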
Computer-Aided Methodology for Syndromic Strabismus Diagnosis.
Sousa de Almeida, João Dallyson; Silva, Aristófanes Corrêa; Teixeira, Jorge Antonio Meireles; Paiva, Anselmo Cardoso; Gattass, Marcelo
2015-08-01
Strabismus is a pathology that affects approximately 4% of the population, causing aesthetic problems reversible at any age and irreversible sensory alterations that modify the vision mechanism. The Hirschberg test is one type of examination for detecting this pathology. Computer-aided detection/diagnosis is being used with relative success to aid health professionals. Nevertheless, the routine use of high-tech devices for aiding ophthalmological diagnosis and therapy is not a reality within the subspecialty of strabismus. Thus, this work presents a methodology to aid the diagnosis of syndromic strabismus through digital imaging. Two hundred images belonging to 40 patients previously diagnosed by a specialist were tested. The method was demonstrated to be 88% accurate in identifying esotropias (ET), 100% for exotropias (XT), 80.33% for hypertropias (HT), and 83.33% for hypotropias (HoT). The overall average error was 5.6Δ and 3.83Δ for horizontal and vertical deviations, respectively, against the measures presented by the specialist.
NASA Technical Reports Server (NTRS)
Weinberg, B. C.; Mcdonald, H.
1982-01-01
A numerical scheme is developed for solving the time dependent, three dimensional compressible viscous flow equations to be used as an aid in the design of helicopter rotors. In order to further investigate the numerical procedure, the computer code developed to solve an approximate form of the three dimensional unsteady Navier-Stokes equations employing a linearized block implicit technique in conjunction with a QR operator scheme is tested. Results of calculations are presented for several two dimensional boundary layer flows including steady turbulent and unsteady laminar cases. A comparison of fourth order and second order solutions indicate that increased accuracy can be obtained without any significant increases in cost (run time). The results of the computations also indicate that the computer code can be applied to more complex flows such as those encountered on rotating airfoils. The geometry of a symmetric NACA four digit airfoil is considered and the appropriate geometrical properties are computed.
Computer-aided Detection of Prostate Cancer with MRI: Technology and Applications.
Liu, Lizhi; Tian, Zhiqiang; Zhang, Zhenfeng; Fei, Baowei
2016-08-01
One in six men will develop prostate cancer in his lifetime. Early detection and accurate diagnosis of the disease can improve cancer survival and reduce treatment costs. Recently, imaging of prostate cancer has greatly advanced since the introduction of multiparametric magnetic resonance imaging (mp-MRI). Mp-MRI consists of T2-weighted sequences combined with functional sequences including dynamic contrast-enhanced MRI, diffusion-weighted MRI, and magnetic resonance spectroscopy imaging. Because of the large data volumes and variations in imaging sequences, detection can be affected by multiple factors such as observer variability and the visibility and complexity of the lesions. To improve quantitative assessment of the disease, various computer-aided detection systems have been designed to help radiologists in their clinical practice. This review paper presents an overview of the literature on computer-aided detection of prostate cancer with mp-MRI, covering both the technology and its applications. The aim of the survey is threefold: an introduction for those new to the field, an overview for those working in the field, and a reference for those searching for literature on a specific application. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Computer-Aided Diagnostic System For Mass Survey Chest Images
NASA Astrophysics Data System (ADS)
Yasuda, Yoshizumi; Kinoshita, Yasuhiro; Emori, Yasufumi; Yoshimura, Hitoshi
1988-06-01
In order to support screening of chest radiographs in mass surveys, a computer-aided diagnostic system that automatically detects abnormality in candidate images using digital image analysis techniques has been developed. Extracting the boundary lines of the lung fields and examining their shapes allowed various kinds of abnormalities to be detected. Correction and expansion were facilitated by describing the system control, image analysis control, and judgement of abnormality in a rule-type programming language. In experiments using typical samples of students' radiographs, good results were obtained for the detection of abnormal lung-field shape, cardiac hypertrophy, and scoliosis. As for the detection of diaphragmatic abnormality, relatively good results were obtained, but further improvements will be necessary.
Symbolic manipulation techniques for vibration analysis of laminated elliptic plates
NASA Technical Reports Server (NTRS)
Andersen, C. M.; Noor, A. K.
1977-01-01
A computational scheme is presented for the free vibration analysis of laminated composite elliptic plates. The scheme is based on Hamilton's principle, the Rayleigh-Ritz technique and symmetry considerations and is implemented with the aid of the MACSYMA symbolic manipulation system. The MACSYMA system, through differentiation, integration, and simplification of analytic expressions, produces highly efficient FORTRAN code for the evaluation of the stiffness and mass coefficients. Multiple use is made of this code to obtain not only the frequencies and mode shapes of the plate, but also the derivatives of the frequencies with respect to various material and geometric parameters.
Computer-aided prediction of xenobiotic metabolism in the human body
NASA Astrophysics Data System (ADS)
Bezhentsev, V. M.; Tarasova, O. A.; Dmitriev, A. V.; Rudik, A. V.; Lagunin, A. A.; Filimonov, D. A.; Poroikov, V. V.
2016-08-01
The review describes the major databases containing information about the metabolism of xenobiotics, including data on drug metabolism, metabolic enzymes, schemes of biotransformation and the structures of some substrates and metabolites. Computational approaches used to predict the interaction of xenobiotics with metabolic enzymes, prediction of metabolic sites in the molecule, generation of structures of potential metabolites for subsequent evaluation of their properties are considered. The advantages and limitations of various computational methods for metabolism prediction and the prospects for their applications to improve the safety and efficacy of new drugs are discussed. Bibliography — 165 references.
Frost Forecasting for Fruitgrowers
NASA Technical Reports Server (NTRS)
Martsolf, J. D.; Chen, E.
1983-01-01
Progress in forecasting from satellite data reviewed. University study found data from satellites displayed in color and used to predict frost are valuable aid to agriculture. Study evaluated scheme to use Earth-temperature data from Geostationary Operational Environmental Satellite in computer model that determines when and where freezing temperatures endanger developing fruit crops, such as apples, peaches and cherries in spring and citrus crops in winter.
2007-06-01
…the masses were identified by an experienced Mammography Quality Standards Act (MQSA) radiologist. The no-mass data set contained 98 cases. The time… …force, and the difference in time between the two acquisitions would cause differences in the subtlety of the masses on the FFDMs and SFMs. However… …images," Medical Physics 18, 955-963 (1991). [20] A. J. Mendez, P. G. Tahoces, M. J. Lado, M. Souto, and J. J. Vidal, "Computer-aided diagnosis: Automatic…
Zhang, Z L; Li, J P; Li, G; Ma, X C
2017-02-09
Objective: To establish and validate a computer program to aid the detection of dental proximal caries in cone beam computed tomography (CBCT) images. Methods: According to the characteristics of caries lesions in X-ray images, a computer-aided detection program for proximal caries was established with Matlab and Visual C++. The whole process of caries lesion detection included image import and preprocessing, measuring the average gray value of the air area, choosing a region of interest and calculating its gray value, and defining the caries areas. The program was used to examine 90 proximal surfaces from 45 extracted human teeth collected from Peking University School and Hospital of Stomatology. The teeth were then scanned with a CBCT scanner (Promax 3D). The proximal surfaces of the teeth were respectively assessed by the caries detection program and scored by a human observer for the extent of lesions on a 6-level scale. With histologic examination serving as the reference standard, the performances of the caries detection program and the human observer were assessed with receiver operating characteristic (ROC) curves. Student's t-test was used to analyze the areas under the ROC curves (AUC) for differences between the caries detection program and the human observer. The Spearman correlation coefficient was used to analyze the detection accuracy of caries depth. Results: For the diagnosis of proximal caries in CBCT images, the AUC values of the human observer and the caries detection program were 0.632 and 0.703, respectively, a statistically significant difference (P = 0.023). The correlation between program performance and the gold standard (rs = 0.525) was higher than that between observer performance and the gold standard (rs = 0.457), and the difference between the correlation coefficients was statistically significant (P = 0.000). 
Conclusions: The program that automatically detects dental proximal caries lesions could improve the diagnostic value of CBCT images.
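In outline, the gray-value-based detection step could look like the following; the cut-off rule, its `fraction` parameter, and the reference means are hypothetical, since the abstract does not give the exact criterion:

```python
def detect_low_density(roi, air_mean, enamel_mean, fraction=0.6):
    """Flag ROI pixels whose gray value falls below a cut-off placed
    between the measured air mean and a sound-enamel mean.
    roi: 2-D list of gray values. Returns (row, col) positions of
    candidate demineralized (caries-like) pixels."""
    cutoff = air_mean + fraction * (enamel_mean - air_mean)
    return [(r, c) for r, row in enumerate(roi)
                   for c, v in enumerate(row) if v < cutoff]

# Toy region of interest: sound enamel ~200, a low-density pocket near one edge
air_mean, enamel_mean = 50.0, 200.0
roi = [[210, 205,  90],
       [208, 120,  85],
       [207, 206, 204]]
print(detect_low_density(roi, air_mean, enamel_mean))  # → [(0, 2), (1, 1), (1, 2)]
```

Measuring the air mean per image, as the paper's pipeline does, normalizes the cut-off against scanner and exposure variation.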
NASA Astrophysics Data System (ADS)
Fetita, C.; Chang-Chien, K. C.; Brillet, P. Y.; Prêteux, F.; Chang, R. F.
2012-03-01
Our study aims at developing a computer-aided diagnosis (CAD) system for fully automatic detection and classification of pathological lung parenchyma patterns in idiopathic interstitial pneumonias (IIP) and emphysema using multi-detector computed tomography (MDCT). The proposed CAD system is based on three-dimensional (3-D) mathematical morphology, texture and fuzzy logic analysis, and can be divided into four stages: (1) a multi-resolution decomposition scheme based on a 3-D morphological filter was exploited to discriminate the lung region patterns at different analysis scales. (2) An additional spatial lung partitioning based on the lung tissue texture was introduced to reinforce the spatial separation between patterns extracted at the same resolution level in the decomposition pyramid. Then, (3) a hierarchic tree structure was exploited to describe the relationship between patterns at different resolution levels, and for each pattern, six fuzzy membership functions were established for assigning a probability of association with a normal tissue or a pathological target. Finally, (4) a decision step exploiting the fuzzy-logic assignments selects the target class of each lung pattern among the following categories: normal (N), emphysema (EM), fibrosis/honeycombing (FHC), and ground glass (GDG). According to a preliminary evaluation on an extended database, the proposed method can overcome the drawbacks of a previously developed approach and achieve higher sensitivity and specificity.
Automated segmentations of skin, soft-tissue, and skeleton, from torso CT images
NASA Astrophysics Data System (ADS)
Zhou, Xiangrong; Hara, Takeshi; Fujita, Hiroshi; Yokoyama, Ryujiro; Kiryu, Takuji; Hoshi, Hiroaki
2004-05-01
We have been developing a computer-aided diagnosis (CAD) scheme for automatically recognizing human tissue and organ regions in high-resolution torso CT images. We show some initial results for extracting skin, soft-tissue and skeleton regions. 139 patient cases of torso CT images (92 male, 47 female; age: 12-88) were used in this study. Each case was imaged with a common protocol (120 kV/320 mA) and covered the whole torso with an isotropic spatial resolution of about 0.63 mm and a density resolution of 12 bits. A gray-level thresholding based procedure was applied to separate the human body from the background. Density features and the distance to the body surface were used to determine the skin and to separate soft tissue from the other tissues. A 3-D region growing based method was used to extract the skeleton. We applied this system to the 139 cases and found that the skin, soft-tissue and skeleton regions were recognized correctly in 93% of the patient cases. The accuracy of the segmentation results was judged acceptable by evaluating the results slice by slice. This scheme will be included in CAD systems for detecting and diagnosing abnormal lesions in multi-slice torso CT images.
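A minimal sketch of 3-D region growing of the kind used above for skeleton extraction, on a toy volume stored as a dict of voxel intensities; the data structure and the intensity window are illustrative assumptions, as the abstract does not specify them:

```python
from collections import deque

def region_grow(volume, seed, lo, hi):
    """Grow a region from a seed voxel, accepting 6-connected neighbours
    whose intensity lies in [lo, hi] (e.g. a high-density window for bone).
    volume: dict mapping (z, y, x) -> intensity; absent voxels are rejected."""
    if not (lo <= volume[seed] <= hi):
        return set()
    region, queue = {seed}, deque([seed])
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            nxt = (z + dz, y + dy, x + dx)
            # default lo - 1 makes missing voxels fail the window test
            if nxt not in region and lo <= volume.get(nxt, lo - 1) <= hi:
                region.add(nxt)
                queue.append(nxt)
    return region

# Toy volume: two adjacent high-density ("bone") voxels next to soft tissue
volume = {(0, 0, 0): 400, (0, 0, 1): 450, (0, 0, 2): 40}
print(sorted(region_grow(volume, (0, 0, 0), lo=200, hi=2000)))
# → [(0, 0, 0), (0, 0, 1)]
```

A breadth-first queue keeps memory proportional to the region frontier, which matters at the volume sizes (whole-torso, sub-millimetre voxels) described above.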
Revocable identity-based proxy re-signature against signing key exposure.
Yang, Xiaodong; Chen, Chunlin; Ma, Tingchun; Wang, Jinli; Wang, Caifen
2018-01-01
Identity-based proxy re-signature (IDPRS) is a novel cryptographic primitive that allows a semi-trusted proxy to convert a signature under one identity into another signature under another identity on the same message by using a re-signature key. Due to this transformation function, IDPRS is very useful in constructing privacy-preserving schemes for various information systems. Key revocation functionality is important in practical IDPRS for managing users dynamically; however, the existing IDPRS schemes do not provide revocation mechanisms that allow the removal of misbehaving or compromised users from the system. In this paper, we first introduce a notion called revocable identity-based proxy re-signature (RIDPRS) to achieve the revocation functionality. We provide a formal definition of RIDPRS as well as its security model. Then, we present a concrete RIDPRS scheme that can resist signing key exposure and prove that the proposed scheme is existentially unforgeable against adaptive chosen identity and message attacks in the standard model. To further improve the performance of signature verification in RIDPRS, we introduce a notion called server-aided revocable identity-based proxy re-signature (SA-RIDPRS). Moreover, we extend the proposed RIDPRS scheme to the SA-RIDPRS scheme and prove that this extended scheme is secure against adaptive chosen message and collusion attacks. The analysis results show that our two schemes remain efficient in terms of computational complexity when implementing user revocation procedures. In particular, in the SA-RIDPRS scheme, the verifier needs to perform only a bilinear pairing and four exponentiation operations to verify the validity of the signature. Compared with other IDPRS schemes in the standard model, our SA-RIDPRS scheme greatly reduces the computation overhead of verification.
Towards developing Kentucky's landscape change maps
Zourarakis, D.P.; Lambert, S.C.; Palmer, M.
2003-01-01
The Kentucky Landscape Snapshot Project, a NASA-funded project, was established to provide a first baseline land cover/land use map for Kentucky. Through this endeavor, change detection will be institutionalized, thus aiding decision-making at the local, state, and federal planning levels. Landsat 7 imagery from 2002 was classified following an Anderson Level III scheme, providing an enhancement over the 1992 USGS National Land Cover Data Set. Also as part of the deliverables, imperviousness and canopy closure layers were produced with the aid of IKONOS high-resolution, multispectral imagery.
NASA Astrophysics Data System (ADS)
Wiemker, Rafael; Rogalla, Patrik; Opfer, Roland; Ekin, Ahmet; Romano, Valentina; Bülow, Thomas
2006-03-01
The performance of computer-aided lung nodule detection (CAD) and computer-aided nodule volumetry is compared between standard-dose (70-100 mAs) and ultra-low-dose CT images (5-10 mAs). A direct quantitative performance comparison was possible, since for each patient both an ultra-low-dose and a standard-dose CT scan were acquired within the same examination session. The data sets were recorded with a multi-slice CT scanner at the Charite university hospital Berlin with 1 mm slice thickness. Our computer-aided nodule detection and segmentation algorithms were deployed on both ultra-low-dose and standard-dose CT data without any dose-specific fine-tuning or preprocessing. As a reference standard, 292 nodules from 20 patients were visually identified, each nodule in both the ultra-low-dose and standard-dose data sets. The CAD performance was analyzed by means of multiple FROC curves for different lower thresholds of the nodule diameter. For nodules with a volume-equivalent diameter equal to or larger than 4 mm (149 nodule pairs), we observed a detection rate of 88% at a median false positive rate of 2 per patient in standard-dose images, and a detection rate of 86% in ultra-low-dose images, also at 2 FPs per patient. Including even smaller nodules equal to or larger than 2 mm (272 nodule pairs), we observed a detection rate of 86% in standard-dose images and 84% in ultra-low-dose images, both at a rate of 5 FPs per patient. Moreover, we observed a correlation of 94% between the volume-equivalent nodule diameter as automatically measured on ultra-low-dose versus standard-dose images, indicating that ultra-low-dose CT is also feasible for growth-rate assessment in follow-up examinations. The comparable performance of lung nodule CAD in ultra-low-dose and standard-dose images is of particular interest with respect to lung cancer screening of asymptomatic patients.
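The FROC analysis above plots detection sensitivity against the mean number of false positives per patient as the decision threshold sweeps over candidate scores. A minimal sketch of that computation, with an invented candidate format and toy data (not the study's):

```python
# Hypothetical sketch of FROC operating points from pooled CAD candidate
# scores. Each candidate is a (score, is_true_nodule) pair; data is invented.

def froc_points(candidates, n_patients, n_true_nodules):
    """Return (mean FPs per patient, sensitivity) pairs over all thresholds."""
    points = []
    for t in sorted({s for s, _ in candidates}, reverse=True):
        kept = [(s, is_tp) for s, is_tp in candidates if s >= t]
        tp = sum(1 for _, is_tp in kept if is_tp)
        fp = len(kept) - tp
        points.append((fp / n_patients, tp / n_true_nodules))
    return points

# toy example: 2 patients contributing 2 true nodules and 2 false candidates
cands = [(0.9, True), (0.8, False), (0.7, True), (0.4, False)]
print(froc_points(cands, n_patients=2, n_true_nodules=2))
# → [(0.0, 0.5), (0.5, 0.5), (0.5, 1.0), (1.0, 1.0)]
```

Reporting "88% at 2 FPs per patient" amounts to reading one such point off the curve.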
Structural analysis of paintings based on brush strokes
NASA Astrophysics Data System (ADS)
Sablatnig, Robert; Kammerer, Paul; Zolda, Ernestine
1998-05-01
The origin of works of art often cannot be attributed to a certain artist. Likewise, it is difficult to say whether paintings or drawings are originals or forgeries. In various fields of art, new technical methods are used to examine the age, the state of preservation, and the origin of the materials used. For the examination of paintings, radiological methods like X-ray and infra-red diagnosis, digital radiography, computer-tomography, etc., together with color analyses, are employed to authenticate art. However, none of these methods relates characteristics of an artwork to a specific artist -- the artist's personal style. In order to study this personal style of a painter, experts in art history and image processing try to examine the 'structural signature' based on brush strokes within paintings, in particular in portrait miniatures. A computer-aided classification and recognition system for portrait miniatures is developed, which enables a semi-automatic classification and forgery detection based on content, color, and brush strokes. A hierarchically structured classification scheme is introduced which separates the classification into three different levels of information: color, shape of region, and structure of brush strokes.
Evaluating Imaging and Computer-aided Detection and Diagnosis Devices at the FDA
Gallas, Brandon D.; Chan, Heang-Ping; D’Orsi, Carl J.; Dodd, Lori E.; Giger, Maryellen L.; Gur, David; Krupinski, Elizabeth A.; Metz, Charles E.; Myers, Kyle J.; Obuchowski, Nancy A.; Sahiner, Berkman; Toledano, Alicia Y.; Zuley, Margarita L.
2017-01-01
This report summarizes the Joint FDA-MIPS Workshop on Methods for the Evaluation of Imaging and Computer-Assist Devices. The purpose of the workshop was to gather information on the current state of the science and facilitate consensus development on statistical methods and study designs for the evaluation of imaging devices to support US Food and Drug Administration submissions. Additionally, participants were expected to identify gaps in knowledge and unmet needs that should be addressed in future research. This summary is intended to document the topics that were discussed at the meeting and disseminate the lessons that have been learned through past studies of imaging and computer-aided detection and diagnosis device performance. PMID:22306064
Efficient Simulation of Tropical Cyclone Pathways with Stochastic Perturbations
NASA Astrophysics Data System (ADS)
Webber, R.; Plotkin, D. A.; Abbot, D. S.; Weare, J.
2017-12-01
Global Climate Models (GCMs) are known to statistically underpredict intense tropical cyclones (TCs) because they fail to capture the rapid intensification and high wind speeds characteristic of the most destructive TCs. Stochastic parametrization schemes have the potential to improve the accuracy of GCMs. However, current analysis of these schemes through direct sampling is limited by the computational expense of simulating a rare weather event at fine spatial gridding. The present work introduces a stochastically perturbed parametrization tendency (SPPT) scheme to increase simulated intensity of TCs. We adapt the Weighted Ensemble algorithm to simulate the distribution of TCs at a fraction of the computational effort required in direct sampling. We illustrate the efficiency of the SPPT scheme by comparing simulations at different spatial resolutions and stochastic parameter regimes. Stochastic parametrization and rare event sampling strategies have great potential to improve TC prediction and aid understanding of tropical cyclogenesis. Since rising sea surface temperatures are postulated to increase the intensity of TCs, these strategies can also improve predictions about climate change-related weather patterns. The rare event sampling strategies used in the current work are not only a novel tool for studying TCs, but they may also be applied to sampling any range of extreme weather events.
NASA Astrophysics Data System (ADS)
Qiu, Yuchen; Wang, Yunzhi; Yan, Shiju; Tan, Maxine; Cheng, Samuel; Liu, Hong; Zheng, Bin
2016-03-01
In order to establish a new personalized breast cancer screening paradigm, it is critically important to accurately predict the short-term risk of a woman having image-detectable cancer after a negative mammographic screening. In this study, we developed and tested a novel short-term risk assessment model based on a deep learning method. For the experiment, a set of 270 "prior" negative screening cases was assembled. In the next sequential ("current") screening mammography, 135 cases were positive and 135 remained negative. These cases were randomly divided into a training set of 200 cases and a testing set of 70 cases. A deep learning based computer-aided diagnosis (CAD) scheme was then developed for the risk assessment, consisting of two modules: an adaptive feature identification module and a risk prediction module. The adaptive feature identification module is composed of three pairs of convolution and max-pooling layers, which contain 20, 10, and 5 feature maps, respectively. The risk prediction module is implemented by a multilayer perceptron (MLP) classifier, which produces a risk score to predict the likelihood of the woman developing short-term mammography-detectable cancer. The results show that the new CAD-based risk model yielded a positive predictive value of 69.2% and a negative predictive value of 74.2%, with a total prediction accuracy of 71.4%. This study demonstrated that applying new deep learning technology may have significant potential for developing a short-term risk prediction scheme with improved performance in detecting early abnormal signs in negative mammograms.
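The abstract specifies only the number of feature maps per convolution and max-pooling pair (20, 10, 5). Under assumed kernel and pooling sizes (5x5 valid convolution, 2x2 non-overlapping pooling, 64x64 input; all assumptions, not the paper's values), the spatial size of the maps through the three pairs can be sketched as:

```python
# Illustrative shape arithmetic for three conv + max-pool pairs.
# Kernel size 5, pool size 2, and input size 64 are assumptions; the abstract
# only fixes the feature-map counts 20, 10, 5.

def conv_pool_size(size, kernel=5, pool=2):
    """Side length after a valid convolution then non-overlapping max pooling."""
    return (size - kernel + 1) // pool

size = 64                      # assumed input ROI side length
for n_maps in (20, 10, 5):     # feature maps per pair, as in the abstract
    size = conv_pool_size(size)
    print(n_maps, size)        # map count, spatial side length
```

The final maps would then feed the MLP classifier as a flattened feature vector.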
NASA Astrophysics Data System (ADS)
Costaridou, Lena
Although a wide variety of Computer-Aided Diagnosis (CADx) schemes have been proposed across breast imaging modalities, and especially in mammography, research is still ongoing to meet the high performance CADx requirements. In this chapter, methodological contributions to CADx in mammography and adjunct breast imaging modalities are reviewed, as they play a major role in early detection, diagnosis and clinical management of breast cancer. At first, basic terms and definitions are provided. Then, emphasis is given to lesion content derivation, both anatomical and functional, considering only quantitative image features of micro-calcification clusters and masses across modalities. Additionally, two CADx application examples are provided. The first example investigates the effect of segmentation accuracy on micro-calcification cluster morphology derivation in X-ray mammography. The second one demonstrates the efficiency of texture analysis in quantification of enhancement kinetics, related to vascular heterogeneity, for mass classification in dynamic contrast-enhanced magnetic resonance imaging.
Reduction of false-positive recalls using a computerized mammographic image feature analysis scheme
NASA Astrophysics Data System (ADS)
Tan, Maxine; Pu, Jiantao; Zheng, Bin
2014-08-01
The high false-positive recall rate is one of the major dilemmas that significantly reduce the efficacy of screening mammography, which harms a large fraction of women and increases healthcare cost. This study aims to investigate the feasibility of helping reduce false-positive recalls by developing a new computer-aided diagnosis (CAD) scheme based on the analysis of global mammographic texture and density features computed from four-view images. Our database includes full-field digital mammography (FFDM) images acquired from 1052 recalled women (669 positive for cancer and 383 benign). Each case has four images: two craniocaudal (CC) and two mediolateral oblique (MLO) views. Our CAD scheme first computed global texture features related to the mammographic density distribution on the segmented breast regions of four images. Second, the computed features were given to two artificial neural network (ANN) classifiers that were separately trained and tested in a ten-fold cross-validation scheme on CC and MLO view images, respectively. Finally, two ANN classification scores were combined using a new adaptive scoring fusion method that automatically determined the optimal weights to assign to both views. CAD performance was tested using the area under a receiver operating characteristic curve (AUC). The AUC = 0.793 ± 0.026 was obtained for this four-view CAD scheme, which was significantly higher at the 5% significance level than the AUCs achieved when using only CC (p = 0.025) or MLO (p = 0.0004) view images, respectively. This study demonstrates that a quantitative assessment of global mammographic image texture and density features could provide useful and/or supplementary information to classify between malignant and benign cases among the recalled cases, which may eventually help reduce the false-positive recall rate in screening mammography.
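The adaptive scoring fusion step above combines the CC-view and MLO-view classifier scores with view weights chosen to maximize performance. A hedged sketch of that idea, using a rank-based AUC and a simple grid search over the weight (the grid search is an assumption; the paper's adaptive method may determine weights differently):

```python
# Sketch of two-view score fusion: fused = w * cc + (1 - w) * mlo,
# with w chosen to maximize AUC. Grid search is an illustrative stand-in.

def auc(scores, labels):
    """Wilcoxon-Mann-Whitney AUC for binary labels (1 = positive)."""
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def fuse(cc, mlo, labels, steps=101):
    """Return (best weight for the CC view, AUC achieved at that weight)."""
    best_w, best_auc = 0.0, -1.0
    for i in range(steps):
        w = i / (steps - 1)
        fused = [w * a + (1 - w) * b for a, b in zip(cc, mlo)]
        a = auc(fused, labels)
        if a > best_auc:
            best_w, best_auc = w, a
    return best_w, best_auc
```

In the study the fused score significantly outperformed either single view, which is exactly what such a weight search can exploit.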
Sihong Chen; Jing Qin; Xing Ji; Baiying Lei; Tianfu Wang; Dong Ni; Jie-Zhi Cheng
2017-03-01
The gap between computational and semantic features is one of the major factors that keep computer-aided diagnosis (CAD) performance from clinical usage. To bridge this gap, we exploit three multi-task learning (MTL) schemes to leverage heterogeneous computational features derived from deep learning models of stacked denoising autoencoder (SDAE) and convolutional neural network (CNN), as well as hand-crafted Haar-like and HoG features, for the description of 9 semantic features for lung nodules in CT images. We hypothesize that relations may exist among the semantic features of "spiculation", "texture", "margin", etc., that can be explored with the MTL. The Lung Image Database Consortium (LIDC) data is adopted in this study for its rich annotation resources. The LIDC nodules were quantitatively scored w.r.t. 9 semantic features by 12 radiologists from several institutes in the U.S. By treating each semantic feature as an individual task, the MTL schemes select and map the heterogeneous computational features toward the radiologists' ratings with cross-validation evaluation schemes on 2400 randomly selected nodules from the LIDC dataset. The experimental results suggest that the predicted semantic scores from the three MTL schemes are closer to the radiologists' ratings than the scores from single-task LASSO and elastic net regression methods. The proposed semantic attribute scoring scheme may provide richer quantitative assessments of nodules for better support of diagnostic decision making and management. Meanwhile, the capability of automatically associating medical image content with clinical semantic terms may also assist the development of medical search engines.
Tourassi, Georgia D; Harrawood, Brian; Singh, Swatee; Lo, Joseph Y; Floyd, Carey E
2007-01-01
The purpose of this study was to evaluate image similarity measures employed in an information-theoretic computer-assisted detection (IT-CAD) scheme. The scheme was developed for content-based retrieval and detection of masses in screening mammograms. The study is aimed toward an interactive clinical paradigm where physicians query the proposed IT-CAD scheme on mammographic locations that are either visually suspicious or indicated as suspicious by other cuing CAD systems. The IT-CAD scheme provides an evidence-based, second opinion for query mammographic locations using a knowledge database of mass and normal cases. In this study, eight entropy-based similarity measures were compared with respect to retrieval precision and detection accuracy using a database of 1820 mammographic regions of interest. The IT-CAD scheme was then validated on a separate database for false positive reduction of progressively more challenging visual cues generated by an existing, in-house mass detection system. The study showed that the image similarity measures fall into one of two categories; one category is better suited to the retrieval of semantically similar cases while the second is more effective with knowledge-based decisions regarding the presence of a true mass in the query location. In addition, the IT-CAD scheme yielded a substantial reduction in false-positive detections while maintaining high detection rate for malignant masses.
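Mutual information is a representative entropy-based image similarity measure of the kind the IT-CAD framework compares (whether it is among the eight evaluated is not stated here). A minimal sketch over flattened gray-level ROIs, with toy data:

```python
# Illustrative mutual information between two regions of interest, treated as
# flat lists of discrete gray levels. Toy images are invented for illustration.
import math
from collections import Counter

def mutual_information(img_a, img_b):
    """MI in bits between two equal-length gray-level sequences."""
    n = len(img_a)
    pa = Counter(img_a)                 # marginal counts for image A
    pb = Counter(img_b)                 # marginal counts for image B
    pab = Counter(zip(img_a, img_b))    # joint counts
    mi = 0.0
    for (a, b), c in pab.items():
        p_ab = c / n
        mi += p_ab * math.log2(p_ab * n * n / (pa[a] * pb[b]))
    return mi

print(mutual_information([0, 0, 1, 1], [0, 0, 1, 1]))  # identical → 1.0 bit
```

A query ROI would be scored against every mass and normal case in the knowledge database, and the retrieved similarities pooled into a decision.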
Vairavan, S; Ulusar, U D; Eswaran, H; Preissl, H; Wilson, J D; Mckelvey, S S; Lowery, C L; Govindan, R B
2016-02-01
We propose a novel computational approach to automatically identify the fetal heart rate patterns (fHRPs), which are reflective of sleep/awake states. By combining these patterns with the presence or absence of movements, a fetal behavioral state (fBS) was determined. The expert scores were used as the gold standard, and objective thresholds for the detection procedure were obtained using Receiver Operating Characteristic (ROC) analysis. To assess the performance, intraclass correlation was computed between the proposed approach and the mutually agreed expert scores. The detected fHRPs were then associated with their corresponding fBS based on the fetal movement obtained from fetal magnetocardiographic (fMCG) signals. This approach may aid clinicians in objectively assessing the fBS and monitoring fetal wellbeing.
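Choosing an objective threshold from ROC analysis is often done by maximizing Youden's J = sensitivity + specificity - 1; the sketch below assumes that criterion (the paper does not specify one) and invented scores:

```python
# Minimal sketch: pick a detection threshold from expert-labeled examples by
# maximizing Youden's J over the ROC. Data and criterion are assumptions.

def best_threshold(scores, labels):
    """Return the score threshold maximizing sensitivity - false positive rate."""
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and not y)
        j = tp / n_pos - fp / n_neg
        if j > best_j:
            best_t, best_j = t, j
    return best_t

print(best_threshold([0.2, 0.4, 0.6, 0.9], [0, 0, 1, 1]))  # → 0.6
```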
Ferreira Junior, José Raniery; Oliveira, Marcelo Costa; de Azevedo-Marques, Paulo Mazzoncini
2016-12-01
Lung cancer is the leading cause of cancer-related deaths in the world, and its main manifestation is pulmonary nodules. Detection and classification of pulmonary nodules are challenging tasks that must be done by qualified specialists, but image interpretation errors make those tasks difficult. In order to aid radiologists with those hard tasks, it is important to integrate computer-based tools with the lesion detection, pathology diagnosis, and image interpretation processes. However, computer-aided diagnosis research faces the problem of not having enough shared medical reference data for the development, testing, and evaluation of computational methods for diagnosis. To minimize this problem, this paper presents a public nonrelational, document-oriented, cloud-based database of pulmonary nodules characterized by 3D texture attributes, identified by experienced radiologists and classified in nine different subjective characteristics by the same specialists. Our goal with the development of this database is to improve computer-aided lung cancer diagnosis and pulmonary nodule detection and classification research through the deployment of this database in a cloud Database as a Service framework. Pulmonary nodule data were provided by the Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI), image descriptors were acquired by volumetric texture analysis, and the database schema was developed using a document-oriented Not only Structured Query Language (NoSQL) approach. The proposed database currently holds 379 exams, 838 nodules, and 8237 images, of which 4029 are CT scans and 4208 are manually segmented nodules, and it is allocated in a MongoDB instance on a cloud infrastructure.
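A document-oriented schema stores each nodule as a self-describing record rather than as rows across tables. The sketch below shows what one such document might look like; every field name and value is invented for illustration, not taken from the actual database:

```python
# Hypothetical nodule document for a document-oriented (NoSQL) store.
# All field names and values are illustrative assumptions.
import json

nodule_doc = {
    "exam_id": "LIDC-0001",            # source exam identifier (assumed format)
    "nodule_id": 17,
    "texture_attributes_3d": {          # volumetric texture descriptors
        "energy": 0.25,
        "contrast": 0.5,
        "homogeneity": 0.81,
    },
    "radiologist_ratings": {            # subjective characteristics (subset)
        "malignancy": 3,
        "spiculation": 2,
    },
    "segmentation": [[12, 40, 33], [13, 40, 33]],  # voxel coordinates
}

# Documents serialize directly to JSON, the format a document store ingests.
payload = json.dumps(nodule_doc, sort_keys=True)
print(json.loads(payload)["nodule_id"])  # → 17
```

The appeal for CAD research is that nodules with different numbers of ratings or descriptors fit the same collection without schema migrations.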
Li, Congcong; Zhang, Xi; Wang, Haiping; Li, Dongfeng
2018-01-11
Vehicular sensor networks have been widely applied in intelligent traffic systems in recent years. Because of the specificity of vehicular sensor networks, they require an enhanced, secure and efficient authentication scheme. Existing authentication protocols suffer from several problems, such as a high computational overhead for certificate distribution and revocation, strong reliance on tamper-proof devices, limited scalability when building many secure channels, and an inability to detect hardware tampering attacks. In this paper, an improved authentication scheme using certificateless public key cryptography is proposed to address these problems. A security analysis of our scheme shows that our protocol provides enhanced secure anonymous authentication, which is resilient against major security threats. Furthermore, the proposed scheme reduces the incidence of node compromise and replication attacks. The scheme also provides a malicious-node detection and warning mechanism, which can quickly identify compromised static nodes and immediately alert the administrative department. Performance evaluations show that the scheme obtains better trade-offs between security and efficiency than well-known available schemes.
Tartar, A; Akan, A; Kilic, N
2014-01-01
Computer-aided detection systems can help radiologists to detect pulmonary nodules at an early stage. In this paper, a novel computer-aided diagnosis (CAD) system is proposed for the classification of pulmonary nodules as malignant or benign. The proposed CAD system, which uses ensemble learning classifiers, provides important support to radiologists in the diagnostic process and achieves high classification performance. The proposed approach with a bagging classifier yields classification sensitivities of 94.7%, 90.0%, and 77.8% for the benign, malignant, and undetermined classes, respectively (89.5% accuracy).
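Per-class sensitivity and overall accuracy follow directly from a confusion matrix. A small sketch with an invented matrix (the counts are illustrative, not the study's data):

```python
# Hedged sketch: per-class sensitivity and overall accuracy from a 3-class
# confusion matrix. The counts below are invented for illustration.

def sensitivities(confusion):
    """confusion[i][j] = number of class-i cases predicted as class j."""
    sens = [row[i] / sum(row) for i, row in enumerate(confusion)]
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return sens, correct / total

conf_matrix = [[18, 1, 1],   # benign
               [1, 9, 0],    # malignant
               [1, 1, 7]]    # undetermined
sens, acc = sensitivities(conf_matrix)
print([round(s, 3) for s in sens], round(acc, 3))  # → [0.9, 0.9, 0.778] 0.872
```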
NASA Astrophysics Data System (ADS)
Qiu, Junchao; Zhang, Lin; Li, Diyang; Liu, Xingcheng
2016-06-01
Chaotic sequences can be applied to realize multiple user access and improve the system security for a visible light communication (VLC) system. However, since the map patterns of chaotic sequences are usually well known, eavesdroppers can possibly derive the key parameters of chaotic sequences and subsequently retrieve the information. We design an advanced encryption standard (AES) interleaving aided multiple user access scheme to enhance the security of a chaotic code division multiple access-based visible light communication (C-CDMA-VLC) system. We propose to spread the information with chaotic sequences, and then the spread information is interleaved by an AES algorithm and transmitted over VLC channels. Since the computation complexity of performing inverse operations to deinterleave the information is high, the eavesdroppers in a high speed VLC system cannot retrieve the information in real time; thus, the system security will be enhanced. Moreover, we build a mathematical model for the AES-aided VLC system and derive the theoretical information leakage to analyze the system security. The simulations are performed over VLC channels, and the results demonstrate the effectiveness and high security of our presented AES interleaving aided chaotic CDMA-VLC system.
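The two stages the scheme combines, chaotic spreading followed by interleaving, can be sketched as below. A fixed permutation stands in for the AES-driven interleaver (the real scheme derives the interleaving from an AES algorithm), and the logistic-map parameters are illustrative:

```python
# Sketch: spread bits with a logistic-map chaotic chip sequence, then
# interleave. The permutation interleaver is a stand-in for the AES stage;
# x0 and r are illustrative parameters.

def logistic_chips(x0, r, n):
    """Bipolar chip sequence from the logistic map x -> r*x*(1-x)."""
    chips, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        chips.append(1 if x >= 0.5 else -1)
    return chips

def spread(bits, chips_per_bit, x0=0.37, r=3.99):
    """Multiply each repeated bit (+1/-1) by the chaotic chip sequence."""
    chips = logistic_chips(x0, r, len(bits) * chips_per_bit)
    expanded = [bit for b in bits for bit in [b] * chips_per_bit]
    return [b * c for b, c in zip(expanded, chips)]

def interleave(seq, perm):
    """Reorder a sequence by a permutation (AES-derived in the real scheme)."""
    return [seq[p] for p in perm]
```

A receiver knowing x0, r, and the permutation reverses both stages; an eavesdropper without them sees a noise-like stream, which is the security argument the abstract makes.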
NASA Astrophysics Data System (ADS)
Aghaei, Faranak; Mirniaharikandehei, Seyedehnafiseh; Hollingsworth, Alan B.; Stoug, Rebecca G.; Pearce, Melanie; Liu, Hong; Zheng, Bin
2018-03-01
Although breast magnetic resonance imaging (MRI) has been used as a breast cancer screening modality for high-risk women, its cancer detection yield remains low (i.e., <= 3%). Thus, increasing breast MRI screening efficacy and cancer detection yield is an important clinical issue in breast cancer screening. In this study, we investigated the association between the background parenchymal enhancement (BPE) of breast MRI and the change of diagnostic (BIRADS) status in the next subsequent breast MRI screening. A dataset of 65 breast MRI screening cases was retrospectively assembled. All cases were rated BIRADS-2 (benign findings). In the subsequent screening, 4 cases were malignant (BIRADS-6), 48 remained BIRADS-2, and 13 were downgraded to negative (BIRADS-1). A computer-aided detection scheme was applied to process images of the first set of breast MRI screenings. A total of 33 features was computed, including texture features and global BPE features. Texture features were computed from either a gray-level co-occurrence matrix or a gray-level run-length matrix. Ten global BPE features were also initially computed from two breast regions and the bilateral difference between the left and right breasts. Box-plot based analysis shows a positive association between texture features and BIRADS rating levels in the second screening. Furthermore, a logistic regression model was built using optimal features selected by a CFS based feature selection method. Using a leave-one-case-out based cross-validation method, classification yielded an overall 75% accuracy in predicting the improvement (or downgrade) of diagnostic status (to BIRADS-1) in the subsequent breast MRI screening. This study demonstrated the potential of developing a new quantitative imaging marker to predict diagnostic status change in the short term, which may help eliminate a high fraction of unnecessary repeated breast MRI screenings and increase the cancer detection yield.
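A gray-level co-occurrence matrix (GLCM), one of the texture-feature sources named above, counts how often pairs of gray levels occur at a fixed spatial offset. A minimal sketch for a horizontal offset, with classic energy and contrast features (the tiny image and offset are invented, not from the study):

```python
# Illustrative GLCM for a horizontal pixel offset, plus two classic texture
# features derived from it. The toy image and dx=1 offset are assumptions.
from collections import Counter

def glcm(image, dx=1):
    """Normalized co-occurrence probabilities for pixel pairs (x, x+dx)."""
    pairs = Counter()
    for row in image:
        for a, b in zip(row, row[dx:]):
            pairs[(a, b)] += 1
    total = sum(pairs.values())
    return {k: v / total for k, v in pairs.items()}

def energy(p):
    """Sum of squared probabilities; high for uniform texture."""
    return sum(v * v for v in p.values())

def contrast(p):
    """Intensity-difference weighted sum; high for coarse texture."""
    return sum(((a - b) ** 2) * v for (a, b), v in p.items())

img = [[0, 0, 1],
       [1, 1, 0]]
p = glcm(img)
print(round(energy(p), 3), round(contrast(p), 3))  # → 0.25 0.5
```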
Evaluation of hardware costs of implementing PSK signal detection circuit based on "system on chip"
NASA Astrophysics Data System (ADS)
Sokolovskiy, A. V.; Dmitriev, D. D.; Veisov, E. A.; Gladyshev, A. B.
2018-05-01
The article deals with the choice of architecture for the digital signal processing units implementing a PSK signal detection scheme. The effectiveness of each architecture is assessed by the number of shift registers and computational resources required for a "system on a chip" implementation. A statistical estimation of the normalized code sequence offset in the signal synchronization scheme is used for the various hardware block architectures.
NASA Technical Reports Server (NTRS)
Reed, M. A.
1974-01-01
The need for an obstacle detection system on the Mars roving vehicle was assumed, and a practical scheme was investigated and simulated. The principal sensing device on this vehicle was taken to be a laser range finder. Both existing and original algorithms, ending with thresholding operations, were used to obtain the outlines of obstacles from the raw data of this laser scan. A theoretical analysis was carried out to show how proper value of threshold may be chosen. Computer simulations considered various mid-range boulders, for which the scheme was quite successful. The extension to other types of obstacles, such as craters, was considered. The special problems of bottom edge detection and scanning procedure are discussed.
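The thresholding idea behind the obstacle-outline extraction can be sketched very simply: an obstacle edge is flagged wherever the range returned by successive laser samples jumps by more than a chosen threshold. The scan values and threshold below are invented for illustration:

```python
# Minimal sketch of threshold-based edge detection on a laser range scan.
# A boulder in front of the vehicle truncates the returned range, producing
# large jumps at its edges. Scan data and threshold are invented.

def obstacle_edges(ranges, threshold):
    """Indices i where |ranges[i+1] - ranges[i]| exceeds the threshold."""
    return [i for i, (a, b) in enumerate(zip(ranges, ranges[1:]))
            if abs(b - a) > threshold]

scan = [5.0, 5.1, 5.1, 3.2, 3.3, 5.2]       # ranges in meters (toy data)
print(obstacle_edges(scan, threshold=1.0))  # → [2, 4]
```

The theoretical analysis in the report concerns how to pick that threshold so noise does not trigger false edges while real boulders still do.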
Efficient seeding and defragmentation of curvature streamlines for colonic polyp detection
NASA Astrophysics Data System (ADS)
Zhao, Lingxiao; Botha, Charl P.; Truyen, Roel; Vos, Frans M.; Post, Frits H.
2008-03-01
Many computer aided diagnosis (CAD) schemes have been developed for colon cancer detection using Virtual Colonoscopy (VC). In earlier work, we developed an automatic polyp detection method integrating flow visualization techniques, which forms part of the CAD functionality of an existing Virtual Colonoscopy pipeline. Curvature streamlines were used to characterize polyp surface shape. Features derived from curvature streamlines correlated highly with true polyp detections. During testing with a large number of patient data sets, we found that the correlation between streamline features and true polyps could be affected by noise and our streamline generation technique. The seeding and spacing constraints and CT noise could lead to streamline fragmentation, which reduced the discriminating power of our streamline features. In this paper, we present two major improvements of our curvature streamline generation. First, we adapted our streamline seeding strategy to the local surface properties and made the streamline generation faster. It generates a significantly smaller number of seeds but still results in a comparable and suitable streamline distribution. Second, based on our observation that longer streamlines are better surface shape descriptors, we improved our streamline tracing algorithm to produce longer streamlines. Our improved techniques are more efficient and also guide the streamline geometry to correspond better to colonic surface shape. These two adaptations support a robust and high correlation between our streamline features and true positive detections and lead to better polyp detection results.
NASA Astrophysics Data System (ADS)
Chen, Shih-Hao; Chow, Chi-Wai
2015-01-01
Multiple-input and multiple-output (MIMO) scheme can extend the transmission capacity for the light-emitting-diode (LED) based visible light communication (VLC) systems. The MIMO VLC system that uses the mobile-phone camera as the optical receiver (Rx) to receive MIMO signal from the n×n Red-Green-Blue (RGB) LED array is desirable. The key step of decoding this signal is to detect the signal direction. If the LED transmitter (Tx) is rotated, the Rx may not realize the rotation and transmission error can occur. In this work, we propose and demonstrate a novel hierarchical transmission scheme which can reduce the computation complexity of rotation detection in LED array VLC system. We use the n×n RGB LED array as the MIMO Tx. In our study, a novel two dimensional Hadamard coding scheme is proposed. Using the different LED color layers to indicate the rotation, a low complexity rotation detection method can be used for improving the quality of received signal. The detection correction rate is above 95% in the indoor usage distance. Experimental results confirm the feasibility of the proposed scheme.
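The two-dimensional Hadamard coding the scheme builds on starts from a standard Hadamard matrix, whose mutually orthogonal rows make misalignment detectable. A sketch of the Sylvester construction follows; how the rows are mapped onto the RGB color layers of the LED array is not shown here:

```python
# Hedged sketch: Sylvester construction of a Hadamard matrix of order n
# (n a power of 2). Rows are mutually orthogonal +1/-1 vectors.

def hadamard(n):
    """Build H(n) by recursive doubling: H(2k) = [[H, H], [H, -H]]."""
    h = [[1]]
    while len(h) < n:
        h = ([row + row for row in h] +
             [row + [-x for x in row] for row in h])
    return h

for row in hadamard(4):
    print(row)
```

Because every pair of distinct rows has zero dot product, a receiver correlating a captured LED pattern against the known rows can detect which orientation (and hence rotation) it is seeing.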
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Bing; Tian, Xuedong; Wang, Qian
Purpose: Accurate detection of pulmonary nodules remains a technical challenge in computer-aided diagnosis systems because some nodules may adhere to the blood vessels or the lung wall, which have low contrast compared to the surrounding tissues. In this paper, the analysis of typical shape features of candidate nodules based on a shape constraint Chan–Vese (CV) model combined with calculation of the number of blood branches adhered to nodule candidates is proposed to reduce false positive (FP) nodules from candidate nodules. Methods: The proposed scheme consists of three major stages: (1) Segmentation of lung parenchyma from computed tomography images. (2) Extraction of candidate nodules. (3) Reduction of FP nodules. A gray level enhancement combined with a spherical shape enhancement filter is introduced to extract the candidate nodules and their sphere-like contour regions. FPs are removed by analysis of the typical shape features of nodule candidates based on the CV model using spherical constraint and by investigating the number of blood branches adhered to the candidate nodules. The constrained shapes of CV model are automatically achieved from the extracted candidate nodules. Results: The detection performance was evaluated on 127 nodules of 103 cases including three types of challenging nodules, which are juxta-pleural nodules, juxta-vascular nodules, and ground glass opacity nodules. The free-receiver operating characteristic (FROC) curve shows that the proposed method is able to detect 88% of all the nodules in the data set with 4 FPs per case. Conclusions: Evaluation shows that the authors' method is feasible and effective for detection of three types of nodules in this study.
Computer Aided Synthesis or Measurement Schemes for Telemetry applications
1997-09-02
5.2.5. Frame structure generation The algorithm generating the frame structure should take as inputs the sampling frequency requirements of the channels...these channels into the frame structure. Generally there can be a lot of ways to divide channels among groups. The algorithm implemented in...groups) first. The algorithm uses the function "try_permutation" recursively to distribute channels among the groups, and the function "try_subtable
Research on computer-aided design of modern marine power systems
NASA Astrophysics Data System (ADS)
Ding, Dongdong; Zeng, Fanming; Chen, Guojun
2004-03-01
To make the marine power system (MPS) design process more economical and easier, a new CAD scheme is proposed that takes advantage of virtual reality (VR) and artificial intelligence (AI) technologies. This CAD system can shorten the design period and greatly reduce the demands on designers' experience. Key issues, such as the selection of hardware and software for such a system, are also discussed.
Automated detection scheme of architectural distortion in mammograms using adaptive Gabor filter
NASA Astrophysics Data System (ADS)
Yoshikawa, Ruriha; Teramoto, Atsushi; Matsubara, Tomoko; Fujita, Hiroshi
2013-03-01
Breast cancer is a serious health concern for all women. Computer-aided detection for mammography has been used for detecting masses and micro-calcifications; however, the sensitivity of automated detection of architectural distortion remains a challenge. In this study, we propose a novel automated method for detecting architectural distortion. Our method consists of analysis of the mammary gland structure, detection of the distorted region, and reduction of false positive results. We developed an adaptive Gabor filter for analyzing the mammary gland structure that selects filter parameters depending on the thickness of the gland structure. In post-processing, healthy mammary glands that run from the nipple to the chest wall are eliminated by angle analysis, and background mammary glands are removed based on the intensity output image obtained from the adaptive Gabor filter. The distorted region of the mammary gland is then detected as an initial candidate using a concentration index followed by binarization and labeling. False positives among the initial candidates are eliminated using 23 types of characteristic features and a support vector machine. In the experiments, we compared the automated detection results with interpretations by a radiologist on 50 cases (200 images) from the Digital Database for Screening Mammography (DDSM). The true positive rate was 82.72%, with 1.39 false positives per image. These results indicate that the proposed method may be useful for detecting architectural distortion in mammograms.
Wu, Yu-Hsiang; Stangl, Elizabeth
2013-01-01
The acceptable noise level (ANL) test determines the maximum noise level that an individual is willing to accept while listening to speech. The first objective of the present study was to systematically investigate the effect of wide dynamic range compression processing (WDRC), and its combined effect with digital noise reduction (DNR) and directional processing (DIR), on ANL. Because ANL represents the lowest signal-to-noise ratio (SNR) that a listener is willing to accept, the second objective was to examine whether the hearing aid output SNR could predict aided ANL across different combinations of hearing aid signal-processing schemes. Twenty-five adults with sensorineural hearing loss participated in the study. ANL was measured monaurally in two unaided and seven aided conditions, in which the status of the hearing aid processing schemes (enabled or disabled) and the location of noise (front or rear) were manipulated. The hearing aid output SNR was measured for each listener in each condition using a phase-inversion technique. The aided ANL was predicted by unaided ANL and hearing aid output SNR, under the assumption that the lowest acceptable SNR at the listener's eardrum is a constant across different ANL test conditions. Study results revealed that, on average, WDRC increased (worsened) ANL by 1.5 dB, while DNR and DIR decreased (improved) ANL by 1.1 and 2.8 dB, respectively. Because the effects of WDRC and DNR on ANL were opposite in direction but similar in magnitude, the ANL of linear/DNR-off was not significantly different from that of WDRC/DNR-on. The results further indicated that the pattern of ANL change across different aided conditions was consistent with the pattern of hearing aid output SNR change created by processing schemes. Compared with linear processing, WDRC creates a noisier sound image and makes listeners less willing to accept noise. However, this negative effect on noise acceptance can be offset by DNR, regardless of microphone mode. 
The hearing aid output SNR derived using the phase-inversion technique can predict aided ANL across different combinations of signal-processing schemes. These results suggest a close relationship between aided ANL, signal-processing scheme, and hearing aid output SNR.
NASA Astrophysics Data System (ADS)
Mostapha, Mahmoud; Khalifa, Fahmi; Alansary, Amir; Soliman, Ahmed; Gimel'farb, Georgy; El-Baz, Ayman
2013-10-01
Early detection of renal transplant rejection is important to implement appropriate medical and immune therapy in patients with transplanted kidneys. In literature, a large number of computer-aided diagnostic (CAD) systems using different image modalities, such as ultrasound (US), magnetic resonance imaging (MRI), computed tomography (CT), and radionuclide imaging, have been proposed for early detection of kidney diseases. A typical CAD system for kidney diagnosis consists of a set of processing steps including: motion correction, segmentation of the kidney and/or its internal structures (e.g., cortex, medulla), construction of agent kinetic curves, functional parameter estimation, diagnosis, and assessment of the kidney status. In this paper, we survey the current state-of-the-art CAD systems that have been developed for kidney disease diagnosis using dynamic MRI. In addition, the paper addresses several challenges that researchers face in developing efficient, fast and reliable CAD systems for the early detection of kidney diseases.
Computer-Aided Recognition of Facial Attributes for Fetal Alcohol Spectrum Disorders.
Valentine, Matthew; Bihm, Dustin C J; Wolf, Lior; Hoyme, H Eugene; May, Philip A; Buckley, David; Kalberg, Wendy; Abdul-Rahman, Omar A
2017-12-01
To compare the detection of facial attributes by computer-based facial recognition software of 2-D images against standard, manual examination in fetal alcohol spectrum disorders (FASD). Participants were gathered from the Fetal Alcohol Syndrome Epidemiology Research database. Standard frontal and oblique photographs of children were obtained during a manual, in-person dysmorphology assessment. Images were submitted for facial analysis conducted by the facial dysmorphology novel analysis technology (an automated system), which assesses ratios of measurements between various facial landmarks to determine the presence of dysmorphic features. Manual blinded dysmorphology assessments were compared with those obtained via the computer-aided system. Areas under the curve values for individual receiver-operating characteristic curves revealed the computer-aided system (0.88 ± 0.02) to be comparable to the manual method (0.86 ± 0.03) in detecting patients with FASD. Interestingly, cases of alcohol-related neurodevelopmental disorder (ARND) were identified more efficiently by the computer-aided system (0.84 ± 0.07) in comparison to the manual method (0.74 ± 0.04). A facial gestalt analysis of patients with ARND also identified more generalized facial findings compared to the cardinal facial features seen in more severe forms of FASD. We found there was an increased diagnostic accuracy for ARND via our computer-aided method. As this category has been historically difficult to diagnose, we believe our experiment demonstrates that facial dysmorphology novel analysis technology can potentially improve ARND diagnosis by introducing a standardized metric for recognizing FASD-associated facial anomalies. Earlier recognition of these patients will lead to earlier intervention with improved patient outcomes. Copyright © 2017 by the American Academy of Pediatrics.
Computational Electromagnetic Modeling of SansEC(Trade Mark) Sensors
NASA Technical Reports Server (NTRS)
Smith, Laura J.; Dudley, Kenneth L.; Szatkowski, George N.
2011-01-01
This paper describes the preliminary effort to apply computational design tools to aid in the development of an electromagnetic SansEC resonant sensor composite materials damage detection system. The computational methods and models employed on this research problem will evolve in complexity over time and will lead to the development of new computational methods and experimental sensor systems that demonstrate the capability to detect, diagnose, and monitor the damage of composite materials and structures on aerospace vehicles.
NASA Astrophysics Data System (ADS)
Ciany, Charles M.; Zurawski, William; Kerfoot, Ian
2001-10-01
The performance of Computer Aided Detection/Computer Aided Classification (CAD/CAC) fusion algorithms on side-scan sonar images was evaluated using data taken at the Navy's Fleet Battle Exercise-Hotel, held in Panama City, Florida, in August 2000. A 2-of-3 binary fusion algorithm is shown to provide robust performance. The algorithm accepts the classification decisions and associated contact locations from three different CAD/CAC algorithms, clusters the contacts based on Euclidean distance, and then declares a valid target when a clustered contact is declared by at least 2 of the 3 individual algorithms. This simple binary fusion provided a 96 percent probability of correct classification at a false alarm rate of 0.14 false alarms per image per side. This performance represents a 3.8:1 reduction in false alarms over the best-performing single CAD/CAC algorithm, with no loss in probability of correct classification.
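The 2-of-3 fusion rule described above can be sketched as follows (a hypothetical illustration, not the authors' code; the clustering radius and the greedy centroid-based clustering are assumptions):

```python
import math

def fuse_contacts(contacts, radius=5.0, min_votes=2):
    """contacts: list of (algorithm_id, x, y) detections.
    Greedy clustering by Euclidean distance to the running cluster
    centroid; a cluster is declared a valid target when it contains
    detections from at least min_votes distinct algorithms."""
    clusters = []  # each cluster: list of (algorithm_id, x, y)
    for alg, x, y in contacts:
        for cl in clusters:
            cx = sum(p[1] for p in cl) / len(cl)
            cy = sum(p[2] for p in cl) / len(cl)
            if math.hypot(x - cx, y - cy) <= radius:
                cl.append((alg, x, y))
                break
        else:
            clusters.append([(alg, x, y)])
    targets = []
    for cl in clusters:
        if len({p[0] for p in cl}) >= min_votes:  # 2-of-3 vote
            targets.append((sum(p[1] for p in cl) / len(cl),
                            sum(p[2] for p in cl) / len(cl)))
    return targets
```

Requiring agreement from two of three independent classifiers is what suppresses false alarms: a spurious contact reported by only one algorithm never forms a qualifying cluster.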
Quantum Iterative Deepening with an Application to the Halting Problem
Tarrataca, Luís; Wichert, Andreas
2013-01-01
Classical models of computation traditionally resort to halting schemes in order to enquire about the state of a computation. In such schemes, a computational process is responsible for signaling an end of a calculation by setting a halt bit, which needs to be systematically checked by an observer. The capacity of quantum computational models to operate on a superposition of states requires an alternative approach. From a quantum perspective, any measurement of an equivalent halt qubit would have the potential to inherently interfere with the computation by provoking a random collapse amongst the states. This issue is exacerbated by undecidable problems such as the Entscheidungsproblem which require universal computational models, e.g. the classical Turing machine, to be able to proceed indefinitely. In this work we present an alternative view of quantum computation based on production system theory in conjunction with Grover's amplitude amplification scheme that allows for (1) a detection of halt states without interfering with the final result of a computation; (2) the possibility of non-terminating computation and (3) an inherent speedup to occur during computations susceptible of parallelization. We discuss how such a strategy can be employed in order to simulate classical Turing machines. PMID:23520465
Wang, Shuihua; Yang, Ming; Du, Sidan; Yang, Jiquan; Liu, Bin; Gorriz, Juan M.; Ramírez, Javier; Yuan, Ti-Fei; Zhang, Yudong
2016-01-01
Highlights: We develop a computer-aided diagnosis system for unilateral hearing loss detection in structural magnetic resonance imaging. Wavelet entropy is introduced to extract global features from brain images. A directed acyclic graph is employed to give the support vector machine the ability to handle multi-class problems. The developed computer-aided diagnosis system achieves an overall accuracy of 95.1% on the three-class problem of differentiating left-sided and right-sided hearing loss from healthy controls. Aim: Sensorineural hearing loss (SNHL) is correlated with many neurodegenerative diseases, and increasingly computer-vision-based methods are being used to detect it automatically. Materials: We had in total 49 subjects, scanned by 3.0 T MRI (Siemens Medical Solutions, Erlangen, Germany): 14 patients with right-sided hearing loss (RHL), 15 patients with left-sided hearing loss (LHL), and 20 healthy controls (HC). Method: We treat this as a three-class classification problem: RHL, LHL, and HC. Wavelet entropy (WE) was extracted from the magnetic resonance images of each subject and then submitted to a directed acyclic graph support vector machine (DAG-SVM). Results: Ten repetitions of 10-fold cross validation showed that 3-level decomposition yields an overall accuracy of 95.10% on this three-class classification problem, higher than a feedforward neural network, decision tree, or naive Bayesian classifier. Conclusions: This computer-aided diagnosis system is promising. We hope this study will attract more computer vision methods for detecting hearing loss. PMID:27807415
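Wavelet entropy as a global feature can be sketched in one dimension: decompose the signal with a Haar wavelet and take the Shannon entropy of the relative energy per sub-band. This is a simplified stand-in (a 1-D Haar transform instead of the paper's 2-D image decomposition):

```python
import math

def haar_step(signal):
    """One level of the 1-D Haar transform: approximation and detail."""
    approx = [(signal[2 * i] + signal[2 * i + 1]) / math.sqrt(2)
              for i in range(len(signal) // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / math.sqrt(2)
              for i in range(len(signal) // 2)]
    return approx, detail

def wavelet_entropy(signal, levels=3):
    """Shannon entropy of the relative energy per wavelet sub-band.
    Signal length must be divisible by 2**levels."""
    energies = []
    approx = list(signal)
    for _ in range(levels):
        approx, detail = haar_step(approx)
        energies.append(sum(d * d for d in detail))
    energies.append(sum(a * a for a in approx))  # final approximation band
    total = sum(energies) or 1.0
    probs = [e / total for e in energies if e > 0]  # 0*log(0) := 0
    return -sum(p * math.log(p) for p in probs)
```

A flat signal concentrates all energy in the approximation band (entropy 0), while structure spread across scales raises the entropy, which is why it serves as a compact texture descriptor.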
NASA Astrophysics Data System (ADS)
Gundreddy, Rohith Reddy; Tan, Maxine; Qui, Yuchen; Zheng, Bin
2015-03-01
The purpose of this study is to develop and test a new content-based image retrieval (CBIR) scheme that achieves higher reproducibility when implemented in an interactive computer-aided diagnosis (CAD) system without significantly reducing lesion classification performance. This new Fourier-transform-based CBIR algorithm determines the image similarity of two regions of interest (ROIs) from the difference of the average regional image pixel value distributions in the two Fourier-transformed images under comparison. A reference image database of 227 ROIs depicting verified soft-tissue breast lesions was used. For each testing ROI, the queried lesion center was systematically shifted from 10 to 50 pixels to simulate inter-user variation in querying a suspicious lesion center with an interactive CAD system. The lesion classification performance and its reproducibility under queried-lesion-center shifts were assessed and compared among three CBIR schemes based on the Fourier transform, mutual information, and Pearson correlation. Each CBIR scheme retrieved the 10 most similar reference ROIs and computed a likelihood score of the queried ROI depicting a malignant lesion. The experimental results showed that the three CBIR schemes yielded very comparable lesion classification performance as measured by the areas under the ROC curves (p > 0.498). However, the Fourier-transform-based CBIR scheme yielded the highest invariance to both queried-lesion-center shift and lesion size change. This study demonstrated the feasibility of improving the robustness of interactive CAD systems by adding a new Fourier-transform-based image feature to CBIR schemes.
NASA Astrophysics Data System (ADS)
Zhao, Shengmei; Wang, Le; Liang, Wenqiang; Cheng, Weiwen; Gong, Longyan
2015-10-01
In this paper, we propose a high-performance optical encryption (OE) scheme based on computational ghost imaging (GI) with a QR code and the compressive sensing (CS) technique, named the QR-CGI-OE scheme. N random phase screens generated by Alice form a secret key shared with the authorized user, Bob. Alice first encodes the information with a QR code, and the QR-coded image is then encrypted with the aid of a computational ghost imaging optical system. The measurement results from the GI optical system's bucket detector constitute the encrypted information and are transmitted to Bob. With the key, Bob decrypts the encrypted information to obtain the QR-coded image using GI and CS techniques, and further recovers the information by QR decoding. Experimental and numerically simulated results show that authorized users can recover the original image completely, whereas eavesdroppers cannot acquire any information about the image even when the eavesdropping ratio (ER) is up to 60% at the given number of measurements. In the proposed scheme, the number of bits sent from Alice to Bob is reduced considerably and the robustness is enhanced significantly. Meanwhile, the number of measurements in the GI system is reduced and the quality of the reconstructed QR-coded image is improved.
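The computational-GI encryption step can be sketched as follows: the shared key is the seed that regenerates the random patterns, the ciphertext is the sequence of bucket-detector values, and a simple correlation reconstruction stands in for the paper's CS recovery (the uniform-random patterns and all names are illustrative assumptions):

```python
import random

def gi_encrypt(image, n_patterns, seed):
    """Encrypt a flattened image: one bucket value per random pattern.
    The shared key is the seed that regenerates the patterns."""
    rng = random.Random(seed)
    patterns = [[rng.random() for _ in image] for _ in range(n_patterns)]
    # Bucket detector: total light after the pattern illuminates the object.
    return [sum(p * v for p, v in zip(pat, image)) for pat in patterns]

def gi_decrypt(buckets, n_pixels, seed):
    """Correlation-based GI reconstruction using regenerated patterns:
    recon[i] ~ <(b - <b>) * pattern[i]>."""
    rng = random.Random(seed)
    patterns = [[rng.random() for _ in range(n_pixels)]
                for _ in range(len(buckets))]
    mean_b = sum(buckets) / len(buckets)
    recon = [0.0] * n_pixels
    for pat, b in zip(patterns, buckets):
        for i in range(n_pixels):
            recon[i] += (b - mean_b) * pat[i]
    return [r / len(buckets) for r in recon]
```

Without the seed, an eavesdropper sees only scalar bucket values, which carry no spatial information on their own; this is the intuition behind the scheme's security claim.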
A soft-hard combination-based cooperative spectrum sensing scheme for cognitive radio networks.
Do, Nhu Tri; An, Beongku
2015-02-13
In this paper we propose a soft-hard combination scheme, called the SHC scheme, for cooperative spectrum sensing in cognitive radio networks. The SHC scheme deploys a cluster-based network in which Likelihood Ratio Test (LRT)-based soft combination is applied within each cluster, and a weighted decision-fusion-rule-based hard combination is utilized at the fusion center. The novelties of the SHC scheme are as follows: its structure reduces the complexity of cooperative detection, an inherent limitation of soft combination schemes. By using the LRT, we can detect primary signals in a low signal-to-noise ratio regime (around an average of -15 dB). In addition, the computational complexity of the LRT is reduced since we derive a closed-form expression for the probability density function of the LRT value. The SHC scheme also takes into account the different effects of large-scale fading on different users in a wide-area network. The simulation results show that the SHC scheme not only provides better sensing performance than the conventional hard combination schemes, but also reduces sensing overhead in terms of reporting time compared to the conventional soft combination scheme using the LRT.
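A minimal sketch of the two-tier combination, with a per-cluster energy statistic standing in for the paper's LRT and a weighted vote at the fusion center (the thresholds and weights are illustrative assumptions):

```python
def cluster_soft_decision(samples, noise_var, threshold):
    """Per-cluster soft combination: a normalized energy statistic
    (a simplified stand-in for the LRT) over the members' samples."""
    energy = sum(s * s for s in samples) / (noise_var * len(samples))
    return 1 if energy >= threshold else 0

def weighted_hard_fusion(decisions, weights, threshold):
    """Fusion center: weighted sum of the clusters' one-bit decisions
    compared against a global threshold."""
    return sum(w * d for d, w in zip(decisions, weights)) >= threshold
```

Only one bit per cluster travels to the fusion center, which is the source of the reporting-time savings over a pure soft scheme, while the in-cluster statistic retains most of the soft-combination sensitivity.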
Welter, Petra; Riesmeier, Jörg; Fischer, Benedikt; Grouls, Christoph; Kuhl, Christiane; Deserno, Thomas M
2011-01-01
It is widely accepted that content-based image retrieval (CBIR) can be extremely useful for computer-aided diagnosis (CAD). However, CBIR has not been established in clinical practice yet. As a widely unattended gap of integration, a unified data concept for CBIR-based CAD results and reporting is lacking. Picture archiving and communication systems and the workflow of radiologists must be considered for successful data integration to be achieved. We suggest that CBIR systems applied to CAD should integrate their results in a picture archiving and communication systems environment such as Digital Imaging and Communications in Medicine (DICOM) structured reporting documents. A sample DICOM structured reporting template adaptable to CBIR and an appropriate integration scheme is presented. The proposed CBIR data concept may foster the promulgation of CBIR systems in clinical environments and, thereby, improve the diagnostic process.
An evaluation of computer-aided disproportionality analysis for post-marketing signal detection.
Lehman, H P; Chen, J; Gould, A L; Kassekert, R; Beninger, P R; Carney, R; Goldberg, M; Goss, M A; Kidos, K; Sharrar, R G; Shields, K; Sweet, A; Wiholm, B E; Honig, P K
2007-08-01
To understand the value of computer-aided disproportionality analysis (DA) in relation to current pharmacovigilance signal detection methods, four products were retrospectively evaluated by applying an empirical Bayes method to Merck's post-marketing safety database. Findings were compared with the prior detection of labeled post-marketing adverse events. Disproportionality ratios (empirical Bayes geometric mean lower 95% bounds for the posterior distribution (EBGM05)) were generated for product-event pairs. Overall (1993-2004 data, EBGM05> or =2, individual terms) results of signal detection using DA compared to standard methods were sensitivity, 31.1%; specificity, 95.3%; and positive predictive value, 19.9%. Using groupings of synonymous labeled terms, sensitivity improved (40.9%). More of the adverse events detected by both methods were detected earlier using DA and grouped (versus individual) terms. With 1939-2004 data, diagnostic properties were similar to those from 1993 to 2004. DA methods using Merck's safety database demonstrate sufficient sensitivity and specificity to be considered for use as an adjunct to conventional signal detection methods.
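Disproportionality statistics of this kind compare a product's event reporting rate against the rate for all other products. As a simplified stand-in for the empirical Bayes EBGM used in the study, the proportional reporting ratio (PRR) from a 2x2 contingency table looks like this:

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 contingency table:
    a: target drug & target event,  b: target drug & other events,
    c: other drugs & target event,  d: other drugs & other events."""
    return (a / (a + b)) / (c / (c + d))
```

A PRR well above 1 flags a product-event pair reported disproportionately often; empirical Bayes methods such as EBGM additionally shrink estimates for sparse counts, which plain ratios like this do not.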
Computer-Aided Detection of Mammographic Masses in Dense Breast Images
2005-06-01
Kinnard, Ph.D. Contracting organization: Howard University, Washington, DC 20059. Report date: June 2005. Type of report: Annual Summary.
Irvine, Michael A; Hollingsworth, T Déirdre
2018-05-26
Fitting complex models to epidemiological data is a challenging problem: methodologies can be inaccessible to all but specialists, it can be difficult to describe uncertainty in model fitting adequately, complex models may take a long time to run, and it can be difficult to fully capture the heterogeneity in the data. We develop an adaptive approximate Bayesian computation (ABC) scheme that fits a variety of epidemiologically relevant data with minimal hyper-parameter tuning by using an adaptive tolerance schedule. We implement a novel kernel density estimation scheme to capture both dispersed and multi-dimensional data, and directly compare this technique to standard Bayesian approaches. We then apply the procedure to a complex individual-based simulation of lymphatic filariasis, a human parasitic disease. The procedure and examples are released alongside this article as an open-access library, with examples to help researchers rapidly fit models to data. This demonstrates that an adaptive ABC scheme with a general summary and distance metric is capable of performing model fitting for a variety of epidemiological data. It also does not require significant theoretical background to use and can be made accessible to the diverse epidemiological research community. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
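An adaptive-tolerance rejection-ABC loop of the kind described can be sketched as follows (a simplified sketch, not the authors' library; the quantile-based tolerance update and plain rejection sampling are assumptions):

```python
import random

def adaptive_abc(observed, simulate, prior_sample, distance,
                 n_accept=100, quantile=0.5, rounds=3, seed=0):
    """Adaptive-tolerance rejection ABC. Each round accepts n_accept
    parameter draws whose simulated data lie within the current
    tolerance of the observation, then tightens the tolerance to the
    given quantile of the accepted distances."""
    rng = random.Random(seed)
    tol = float("inf")  # first round accepts everything
    accepted = []
    for _ in range(rounds):
        accepted = []
        while len(accepted) < n_accept:
            theta = prior_sample(rng)
            d = distance(simulate(theta, rng), observed)
            if d <= tol:
                accepted.append((theta, d))
        dists = sorted(d for _, d in accepted)
        tol = dists[int(quantile * (len(dists) - 1))]  # shrink tolerance
    return [theta for theta, _ in accepted]
```

Shrinking the tolerance from the accepted distances themselves is what removes the need to hand-tune an epsilon schedule, at the cost of extra simulations in later rounds.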
Alam, Daniel; Ali, Yaseen; Klem, Christopher; Coventry, Daniel
2016-11-01
Orbito-malar reconstruction after oncological resection represents one of the most challenging facial reconstructive procedures. Until the last few decades, rehabilitation was typically prosthesis based with a limited role for surgery. The advent of microsurgical techniques allowed large-volume tissue reconstitution from a distant donor site, revolutionizing the potential approaches to these defects. The authors report a novel surgery-based algorithm and a classification scheme for complete midface reconstruction with a foundation in the Gillies principles of like-to-like reconstruction and with a significant role of computer-aided virtual planning. With this approach, the authors have been able to achieve significantly better patient outcomes. Copyright © 2016 Elsevier Inc. All rights reserved.
Ren, Fulong; Cao, Peng; Li, Wei; Zhao, Dazhe; Zaiane, Osmar
2017-01-01
Diabetic retinopathy (DR) is a progressive disease, and its detection at an early stage is crucial for saving a patient's vision. An automated screening system for DR can help reduce the chance of complete blindness due to DR while lowering the workload on ophthalmologists. Among the earliest signs of DR are microaneurysms (MAs). However, current schemes for MA detection report many false positives because the detection algorithms are tuned for high sensitivity; inevitably, some non-MA structures are labeled as MAs in the initial MA identification step. This is a typical class imbalance problem, and class-imbalanced data have detrimental effects on the performance of conventional classifiers. In this work, we propose an ensemble-based adaptive over-sampling algorithm for overcoming the class imbalance problem in false positive reduction, and we use Boosting, Bagging, and Random Subspace as the ensemble frameworks to improve microaneurysm detection. The ensemble-based over-sampling methods we propose combine the strengths of adaptive over-sampling and ensembles. The objective of this amalgamation is to reduce the induction bias introduced by imbalanced data and to enhance the generalization performance of extreme learning machines (ELM). Experimental results show that our ASOBoost method achieves higher area under the ROC curve (AUC) and G-mean values than many existing class imbalance learning methods. Copyright © 2016 Elsevier Ltd. All rights reserved.
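The idea of combining over-sampling with an ensemble can be sketched with bagging plus random minority replication; a trivial nearest-centroid learner on 1-D features stands in here for the ELM base classifier, and all parameters are illustrative:

```python
import random

def fit_centroids(X, y):
    """Per-class mean of 1-D features."""
    c0 = [x for x, t in zip(X, y) if t == 0]
    c1 = [x for x, t in zip(X, y) if t == 1]
    return sum(c0) / len(c0), sum(c1) / len(c1)

def ensemble_oversampled(X, y, n_models=11, seed=0):
    """Bagging in which each bootstrap sample is re-balanced by
    randomly replicating minority-class samples before fitting."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]  # bootstrap
        bx = [X[i] for i in idx]
        by = [y[i] for i in idx]
        if 0 not in by or 1 not in by:
            continue  # degenerate bootstrap missing a class; skip it
        minority = 1 if by.count(1) < by.count(0) else 0
        pool = [x for x, t in zip(bx, by) if t == minority]
        while by.count(minority) < by.count(1 - minority):
            bx.append(rng.choice(pool))  # over-sample the minority class
            by.append(minority)
        models.append(fit_centroids(bx, by))
    def predict(x):
        votes = sum(1 for c0, c1 in models if abs(x - c1) < abs(x - c0))
        return 1 if votes * 2 > len(models) else 0
    return predict
```

Balancing inside each bootstrap, rather than once up front, injects diversity across ensemble members while still correcting the bias each base learner would inherit from the skewed class ratio.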
Convolution neural-network-based detection of lung structures
NASA Astrophysics Data System (ADS)
Hasegawa, Akira; Lo, Shih-Chung B.; Freedman, Matthew T.; Mun, Seong K.
1994-05-01
Chest radiography is one of the most fundamental and widely used techniques in diagnostic imaging. With the advent of digital radiology, digital image processing techniques for chest radiographs have attracted considerable attention, and several studies on computer-aided diagnosis (CADx) as well as on conventional image processing techniques for chest radiographs have been reported. In the automatic diagnostic process for chest radiographs, it is important to outline the areas of the lungs, the heart, and the diaphragm, because the chest radiograph is composed of important anatomic structures and, without knowing the exact positions of the organs, automatic diagnosis may produce unexpected detections. The automatic extraction of anatomical structures from digital chest radiographs can be a useful tool for (1) evaluation of heart size, (2) automatic detection of interstitial lung diseases, (3) automatic detection of lung nodules, and (4) data compression. Based on clearly defined boundaries of the heart area, rib spaces, rib positions, and the extracted rib cage, this information can be used to facilitate CADx tasks on chest radiographs. In this paper, we present an automatic scheme for the detection of the lung field from chest radiographs using a shift-invariant convolution neural network. A novel algorithm for smoothing lung boundaries is also presented.
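The shift-invariant building block of such a network is plain 2-D convolution (implemented, as in most CNNs, as cross-correlation with an unflipped kernel); a minimal valid-mode sketch:

```python
def conv2d(image, kernel):
    """Valid-mode 2-D cross-correlation of a 2-D image with a kernel:
    the same weights are applied at every position, which is what
    makes the layer's response shift-invariant."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(ih - kh + 1):
        row = []
        for c in range(iw - kw + 1):
            row.append(sum(image[r + i][c + j] * kernel[i][j]
                           for i in range(kh) for j in range(kw)))
        out.append(row)
    return out
```

Because the kernel slides over the whole image, a lung-boundary pattern is detected wherever it occurs, without retraining position-specific weights.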
Mission Management Computer and Sequencing Hardware for RLV-TD HEX-01 Mission
NASA Astrophysics Data System (ADS)
Gupta, Sukrat; Raj, Remya; Mathew, Asha Mary; Koshy, Anna Priya; Paramasivam, R.; Mookiah, T.
2017-12-01
The Reusable Launch Vehicle-Technology Demonstrator Hypersonic Experiment (RLV-TD HEX-01) mission posed some unique challenges in the design and development of avionics hardware. This work presents the details of the mission-critical avionics hardware, mainly the Mission Management Computer (MMC) and the sequencing hardware. The Navigation, Guidance and Control (NGC) chain for RLV-TD is dual redundant, with cross-strapped Remote Terminals (RTs) interfaced through a MIL-STD-1553B bus. The MMC is the bus controller on the 1553 bus and performs GPS-aided navigation, guidance, digital autopilot, and sequencing functions for the RLV-TD launch vehicle at different periodicities (10, 20, 500 ms). Digital autopilot execution in the MMC with a periodicity of 10 ms (in the ascent phase) was introduced for the first time and successfully demonstrated in flight. The MMC is built around the Intel i960 processor and has built-in fault tolerance features such as ECC for memories. Fault detection and isolation schemes are implemented to isolate a failed MMC. The sequencing hardware comprises the Stage Processing System (SPS) and the Command Execution Module (CEM). The SPS is an RT on the 1553 bus that receives sequencing and control related commands from the MMCs and posts them to downstream modules, after proper error handling, for final execution. The SPS is designed as a high-reliability system by incorporating various fault tolerance and fault detection features. The CEM is a relay-based module for sequence command execution.
Program for computer aided reliability estimation
NASA Technical Reports Server (NTRS)
Mathur, F. P. (Inventor)
1972-01-01
A computer program for estimating the reliability of self-repair and fault-tolerant systems with respect to selected system and mission parameters is presented. The computer program is capable of operation in an interactive conversational mode as well as in a batch mode and is characterized by maintenance of several general equations representative of basic redundancy schemes in an equation repository. Selected reliability functions applicable to any mathematical model formulated with the general equations, used singly or in combination with each other, are separately stored. One or more system and/or mission parameters may be designated as a variable. Data in the form of values for selected reliability functions is generated in a tabular or graphic format for each formulated model.
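The kind of basic redundancy-scheme equation such an equation repository stores can be illustrated with k-of-n redundancy: the system survives if at least k of n identical modules, each with reliability r, survive. For triple modular redundancy (k=2, n=3) this reduces to the familiar 3r^2 - 2r^3. A minimal sketch, not the program's actual model set:

```python
from math import comb

def k_of_n_reliability(k, n, r):
    """Probability that at least k of n identical modules survive,
    each module surviving independently with reliability r."""
    return sum(comb(n, i) * r**i * (1 - r)**(n - i)
               for i in range(k, n + 1))
```

For example, triple modular redundancy with r = 0.9 gives 3(0.81)(0.1) + 0.729 = 0.972, higher than a single module; sweeping r or n here mirrors the program's treatment of system and mission parameters as variables.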
Computer-aided diagnosis of leukoencephalopathy in children treated for acute lymphoblastic leukemia
NASA Astrophysics Data System (ADS)
Glass, John O.; Li, Chin-Shang; Helton, Kathleen J.; Reddick, Wilburn E.
2005-04-01
The purpose of this study was to use objective quantitative MR imaging methods to develop a computer-aided diagnosis tool to differentiate white matter (WM) hyperintensities as either leukoencephalopathy (LE) or normal maturational processes in children treated for acute lymphoblastic leukemia with intravenous high dose methotrexate. A combined imaging set consisting of T1, T2, PD, and FLAIR MR images and WM, gray matter, and cerebrospinal fluid a priori maps from a spatially normalized atlas were analyzed with a neural network segmentation based on a Kohonen Self-Organizing Map. Segmented regions were manually classified to identify the most hyperintense WM region and the normal appearing genu region. Signal intensity differences normalized to the genu within each examination were generated for two time points in 203 children. An unsupervised hierarchical clustering algorithm with the agglomeration method of McQuitty was used to divide data from the first examination into normal appearing or LE groups. A C-support vector machine (C-SVM) was then trained on the first examination data and used to classify the data from the second examination. The overall accuracy of the computer-aided detection tool was 83.5% (299/358) with sensitivity to normal WM of 86.9% (199/229) and specificity to LE of 77.5% (100/129) when compared to the readings of two expert observers. These results suggest that subtle therapy-induced leukoencephalopathy can be objectively and reproducibly detected in children treated for cancer using this computer-aided detection approach based on relative differences in quantitative signal intensity measures normalized within each examination.
Li, Congcong; Zhang, Xi; Wang, Haiping; Li, Dongfeng
2018-01-01
Vehicular sensor networks have been widely applied in intelligent traffic systems in recent years. Because of their specific characteristics, vehicular sensor networks require an enhanced, secure and efficient authentication scheme. Existing authentication protocols suffer from several problems, such as high computational overhead for certificate distribution and revocation, strong reliance on tamper-proof devices, limited scalability when building many secure channels, and an inability to detect hardware tampering attacks. In this paper, an improved authentication scheme using certificateless public key cryptography is proposed to address these problems. A security analysis shows that our protocol provides enhanced secure anonymous authentication and is resilient against major security threats. Furthermore, the proposed scheme reduces the incidence of node compromise and replication attacks. The scheme also provides a malicious-node detection and warning mechanism, which can quickly identify compromised static nodes and immediately alert the administrative department. Performance evaluations show that the scheme obtains better trade-offs between security and efficiency than well-known existing schemes. PMID:29324719
Analysis and control of supersonic vortex breakdown flows
NASA Technical Reports Server (NTRS)
Kandil, Osama A.
1990-01-01
Analysis and computation of steady, compressible, quasi-axisymmetric flow of an isolated, slender vortex are considered. The compressible, Navier-Stokes equations are reduced to a simpler set by using the slenderness and quasi-axisymmetry assumptions. The resulting set along with a compatibility equation are transformed from the diverging physical domain to a rectangular computational domain. Solving for a compatible set of initial profiles and specifying a compatible set of boundary conditions, the equations are solved using a type-differencing scheme. Vortex breakdown locations are detected by the failure of the scheme to converge. Computational examples include isolated vortex flows at different Mach numbers, external axial-pressure gradients and swirl ratios.
Computer-Aided Engineering of Semiconductor Integrated Circuits
1979-07-01
equation using a five-point finite difference approximation. Section 4.3.6 describes the numerical techniques and iterative algorithms which are used...neighbor points. This is generally referred to as a five-point finite difference scheme on a rectangular grid, as described below. The finite difference ...problems in steady state have been analyzed by the finite difference method [4.16], [4.17] or the finite element method [4.18], [4.19], as reported last
NASA Astrophysics Data System (ADS)
He, Jing; Wen, Xuejie; Chen, Ming; Chen, Lin; Su, Jinshu
2015-01-01
To improve the transmission performance of multiband orthogonal frequency division multiplexing (MB-OFDM) ultra-wideband (UWB) over optical fiber, a pre-coding scheme based on low-density parity-check (LDPC) codes is adopted and experimentally demonstrated in an intensity-modulation and direct-detection MB-OFDM UWB over fiber system. Meanwhile, a symbol synchronization and pilot-aided channel estimation scheme is implemented at the receiver of the MB-OFDM UWB over fiber system. The experimental results show that the LDPC pre-coding scheme works effectively in the MB-OFDM UWB over fiber system. After 70 km standard single-mode fiber (SSMF) transmission, at a bit error rate of 1 × 10⁻³, the receiver sensitivity is improved by about 4 dB when the LDPC code rate is 75%.
Song, Min Su; Lee, Jae Dong; Jeong, Young-Sik; Jeong, Hwa-Young; Park, Jong Hyuk
2014-01-01
Despite its convenience, ubiquitous computing suffers from many threats and security risks. Security considerations in the ubiquitous network are required to create enriched and more secure ubiquitous environments. The address resolution protocol (ARP) is a protocol used to map an IP address to the physical address of the associated network card. ARP is designed to work without problems in general environments. However, since its design does not include security measures against malicious attacks, an attacker can impersonate another host using ARP spoofing and access important information. In this paper, we propose a new detection scheme for ARP spoofing attacks using a routing trace, which can be used to protect the internal network. The routing trace can detect changes in the network path. The proposed scheme provides high consistency and compatibility because it does not alter the ARP protocol. In addition, it is simple and stable, as it does not use a complex algorithm or impose an extra load on the computer system.
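The routing-trace algorithm itself is not detailed in this abstract. As a hedged illustration of the kind of consistency check an ARP-spoofing detector performs, the sketch below flags any MAC address that answers for more than one IP, a common symptom of spoofing; the table contents are hypothetical and this is not the paper's scheme.

```python
def find_arp_spoofing(arp_table):
    """Flag MAC addresses claimed by more than one IP address -- a common
    symptom of ARP spoofing, where the attacker's MAC answers for other hosts."""
    mac_to_ips = {}
    for ip, mac in arp_table.items():
        mac_to_ips.setdefault(mac, []).append(ip)
    return {mac: ips for mac, ips in mac_to_ips.items() if len(ips) > 1}

# Hypothetical ARP table: 192.168.0.66 impersonates the gateway 192.168.0.1.
table = {
    "192.168.0.1":  "aa:bb:cc:dd:ee:ff",
    "192.168.0.10": "11:22:33:44:55:66",
    "192.168.0.66": "aa:bb:cc:dd:ee:ff",
}
print(find_arp_spoofing(table))  # the duplicated MAC is reported
```

Like the paper's approach, such a check leaves the ARP protocol itself untouched and imposes almost no load on the host.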
Blackmon, Kevin N; Florin, Charles; Bogoni, Luca; McCain, Joshua W; Koonce, James D; Lee, Heon; Bastarrika, Gorka; Thilo, Christian; Costello, Philip; Salganicoff, Marcos; Joseph Schoepf, U
2011-06-01
To evaluate the effect of a computer-aided detection (CAD) algorithm on the performance of novice readers for detection of pulmonary embolism (PE) at CT pulmonary angiography (CTPA). We included CTPA examinations of 79 patients (50 female, 52 ± 18 years). Studies were evaluated by two independent inexperienced readers who marked all vessels containing PE. After 3 months, all studies were reevaluated by the same two readers, this time aided by a CAD prototype. A consensus read by three expert radiologists served as the reference standard. Statistical analysis used χ² and McNemar tests. Expert consensus revealed 119 PEs in 32 studies. For PE detection, the sensitivity of CAD alone was 78%. The inexperienced readers' initial interpretations had an average per-PE sensitivity of 50%, which improved to 71% (p < 0.001) with CAD as a second reader. False positives increased from 0.18 to 0.25 per study (p = 0.03). Per study, the readers initially detected 27/32 positive studies (84%); with CAD this number increased to an average of 29.5 studies (92%; p = 0.125). Our results suggest that CAD significantly improves the sensitivity of PE detection for inexperienced readers, with a small but appreciable increase in the rate of false positives.
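McNemar's test, used above for the paired with/without-CAD comparisons, reduces to a simple statistic on the discordant pairs. The sketch below uses invented discordant counts for illustration; they are not figures from the study.

```python
def mcnemar_chi2(b: int, c: int) -> float:
    """McNemar's chi-square statistic with continuity correction for paired
    binary outcomes; b and c are the two discordant-pair counts."""
    return (abs(b - c) - 1) ** 2 / (b + c)

# Hypothetical discordant counts (illustrative only, not from the study):
# 52 PEs detected only with CAD assistance, 3 detected only without it.
print(round(mcnemar_chi2(52, 3), 2))  # → 41.89
```

Only the discordant pairs enter the statistic, which is why the test suits before/after comparisons on the same readers and cases.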
Pereira, Danilo Cesar; Ramos, Rodrigo Pereira; do Nascimento, Marcelo Zanchetta
2014-04-01
In Brazil, the National Cancer Institute (INCA) reports more than 50,000 new cases of breast cancer, with an estimated risk of 51 cases per 100,000 women. Radiographic images obtained from mammography equipment are among the techniques most frequently used to aid early diagnosis. Due to factors related to cost and professional experience, in the last two decades computer systems to support detection (Computer-Aided Detection, CADe) and diagnosis (Computer-Aided Diagnosis, CADx) have been developed to assist experts in detecting abnormalities in their initial stages. Despite the large body of research on CADe and CADx systems, there is still a need for improved computerized methods. There is a growing concern with the sensitivity and reliability of abnormality diagnosis in both views of breast mammographic images, namely cranio-caudal (CC) and medio-lateral oblique (MLO). This paper presents a set of computational tools to aid segmentation and detection of masses in CC and MLO views of mammograms. An artifact removal algorithm is first applied, followed by an image denoising and gray-level enhancement method based on the wavelet transform and the Wiener filter. Finally, a method for detection and segmentation of masses using multiple thresholding, the wavelet transform and a genetic algorithm is applied to mammograms randomly selected from the Digital Database for Screening Mammography (DDSM). The developed method was quantitatively evaluated using the area overlap metric (AOM). The mean ± standard deviation of the AOM for the proposed method was 79.2 ± 8%. The experiments demonstrate that the proposed method has strong potential to be used as the basis for mammogram mass segmentation in CC and MLO views. Another important aspect is that the method overcomes the limitation of analyzing only CC and MLO views. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
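The AOM used for evaluation is commonly defined as the intersection-over-union of the computed and reference regions; a minimal sketch, assuming binary segmentation masks (the toy masks below are invented for illustration):

```python
import numpy as np

def area_overlap_metric(seg, truth):
    """Overlap between two binary masks: |A ∩ B| / |A ∪ B|, in [0, 1]."""
    inter = np.logical_and(seg, truth).sum()
    union = np.logical_or(seg, truth).sum()
    return inter / union if union else 1.0

a = np.zeros((10, 10), bool); a[2:8, 2:8] = True   # computed mass region
b = np.zeros((10, 10), bool); b[3:9, 3:9] = True   # reference outline
print(round(area_overlap_metric(a, b), 3))  # → 0.532
```

A value of 1 means the computed segmentation coincides exactly with the reference outline; 0 means no overlap at all.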
Deep Learning for Computer Vision: A Brief Review
Doulamis, Nikolaos; Doulamis, Anastasios; Protopapadakis, Eftychios
2018-01-01
In recent years, deep learning methods have been shown to outperform previous state-of-the-art machine learning techniques in several fields, with computer vision being one of the most prominent cases. This review paper provides a brief overview of some of the most significant deep learning schemes used in computer vision problems, namely Convolutional Neural Networks, Deep Boltzmann Machines and Deep Belief Networks, and Stacked Denoising Autoencoders. A brief account of their history, structure, advantages, and limitations is given, followed by a description of their applications in various computer vision tasks, such as object detection, face recognition, action and activity recognition, and human pose estimation. Finally, a brief overview is given of future directions in designing deep learning schemes for computer vision problems and the challenges involved therein. PMID:29487619
Pre-compensation combined with TS-aided and ISFA-enhanced scheme for UWB system
NASA Astrophysics Data System (ADS)
He, Jing; Xiang, Changqing; Long, Fengting; Wu, Kaiquan; Chen, Lin
2017-08-01
In this paper, a pre-compensation scheme combined with training-sequence (TS)-aided and intra-symbol frequency-domain averaging (ISFA)-enhanced channel estimation is proposed to improve the transmission performance of a 64-quadrature-amplitude-modulation multiband orthogonal-frequency-division-multiplexing ultra-wideband over fiber (64QAM MB-OFDM UWBoF) system. We theoretically analyze and experimentally demonstrate that the proposed scheme suits the 64QAM MB-OFDM UWBoF system better than two other cases: (I) only pilot-aided channel estimation, and (II) pilot-aided channel estimation with pre-compensation combined with ISFA enhancement. The experimental results demonstrate that, at a BER of 3.8 × 10⁻³ after 70 km transmission over standard single-mode fiber (SSMF), the proposed scheme improves the system performance by about 1.25 dB and 0.37 dB compared with case I and case II, respectively.
Association between mammogram density and background parenchymal enhancement of breast MRI
NASA Astrophysics Data System (ADS)
Aghaei, Faranak; Danala, Gopichandh; Wang, Yunzhi; Zarafshani, Ali; Qian, Wei; Liu, Hong; Zheng, Bin
2018-02-01
Breast density has been widely considered an important risk factor for breast cancer. The purpose of this study is to examine the association between mammographic density results and background parenchymal enhancement (BPE) of breast MRI. A dataset of breast MR images was acquired from 65 high-risk women. Based on mammographic density (BI-RADS) results, the dataset was divided into two groups of low and high breast density cases. The low-density group contains 15 cases rated by radiologists as mammographic density BI-RADS 1 or 2, while the high-density group contains 50 cases rated as BI-RADS 3 or 4. A computer-aided detection (CAD) scheme was applied to segment and register the breast regions depicted on sequential images of the breast MRI scans. The CAD scheme computed 20 global BPE features from the two breast regions together, from the left and right breast regions separately, and from the bilateral difference between the left and right breast regions. A feature selection method, namely the CFS method, was applied to remove the most redundant features and select optimal features from the initial feature pool. A logistic regression classifier was then built using the optimal features to predict mammographic density from the BPE features. Using a leave-one-case-out validation method, the classifier yields an accuracy of 82% and an area under the ROC curve of AUC = 0.81 ± 0.09. A box-plot-based analysis also shows a negative association between mammographic density results and BPE features in the MRI images. This study demonstrated a negative association between mammographic density and BPE of breast MRI images.
NASA Astrophysics Data System (ADS)
Zargari Khuzani, Abolfazl; Danala, Gopichandh; Heidari, Morteza; Du, Yue; Mashhadi, Najmeh; Qiu, Yuchen; Zheng, Bin
2018-02-01
Higher recall rates are a major challenge in mammography screening. Thus, developing a computer-aided diagnosis (CAD) scheme to classify between malignant and benign breast lesions can play an important role in improving the efficacy of mammography screening. The objective of this study is to develop and test a unique image feature fusion framework to improve performance in classifying suspicious mass-like breast lesions depicted on mammograms. The image dataset consists of 302 suspicious masses detected on both craniocaudal and mediolateral-oblique view images. Among them, 151 were malignant and 151 were benign. The study consists of the following three image processing and feature analysis steps. First, an adaptive region growing segmentation algorithm was used to automatically segment mass regions. Second, a set of 70 image features related to spatial and frequency characteristics of the mass regions was initially computed. Third, a generalized linear regression model (GLM) based machine learning classifier, combined with a bat optimization algorithm, was used to optimally fuse the selected image features based on a predefined performance assessment index. The area under the ROC curve (AUC) was used as the performance assessment index. Applying the CAD scheme to the testing dataset yielded an AUC of 0.75 ± 0.04, which was significantly higher than using a single best feature (AUC = 0.69 ± 0.05) or the classifier with equally weighted features (AUC = 0.73 ± 0.05). This study demonstrated that, compared to the conventional equal-weighted approach, an unequal-weighted feature fusion approach has the potential to significantly improve accuracy in classifying between malignant and benign breast masses.
Driving a car with custom-designed fuzzy inferencing VLSI chips and boards
NASA Technical Reports Server (NTRS)
Pin, Francois G.; Watanabe, Yutaka
1993-01-01
Vehicle control in a priori unknown, unpredictable, and dynamic environments requires many calculational and reasoning schemes to operate on the basis of very imprecise, incomplete, or unreliable data. For such systems, in which all the uncertainties cannot be engineered away, approximate reasoning may provide an alternative to the complexity and computational requirements of conventional uncertainty analysis and propagation techniques. Two types of computer boards incorporating custom-designed VLSI chips were developed to add a fuzzy inferencing capability to real-time control systems. All inferencing rules on a chip are processed in parallel, allowing execution of the entire rule base in about 30 microseconds and therefore making control of 'reflex-type' motions feasible. The use of these boards, and the approach using superposition of elemental sensor-based behaviors for developing qualitative reasoning schemes emulating human-like navigation in a priori unknown environments, are first discussed. We then describe how the human-like navigation scheme implemented on one of the qualitative inferencing boards was installed on a test-bed platform to investigate two control modes for driving a car in a priori unknown environments on the basis of sparse and imprecise sensor data. In the first mode, the car navigates fully autonomously, while in the second mode, the system acts as a driver's aid, providing the driver with linguistic (fuzzy) commands to turn left or right and speed up or slow down depending on the obstacles perceived by the sensors. Experiments with both modes of control are described in which the system uses only three acoustic range (sonar) sensor channels to perceive the environment. Simulation results as well as indoor and outdoor experiments are presented and discussed to illustrate the feasibility and robustness of autonomous navigation and/or a safety-enhancing driver's aid using the new fuzzy inferencing hardware system and human-like reasoning schemes that may include as few as six elemental behaviors embodied in fourteen qualitative rules.
Xing, Fuyong; Yang, Lin
2016-01-01
Digital pathology and microscopy image analysis is widely used for comprehensive studies of cell morphology or tissue structure. Manual assessment is labor intensive and prone to interobserver variation. Computer-aided methods, which can significantly improve objectivity and reproducibility, have attracted a great deal of interest in the recent literature. Within the pipeline of building a computer-aided diagnosis system, nucleus or cell detection and segmentation play a very important role in describing molecular morphological information. In the past few decades, many efforts have been devoted to automated nucleus/cell detection and segmentation. In this review, we provide a comprehensive summary of the recent state-of-the-art nucleus/cell segmentation approaches for different types of microscopy images, including bright-field, phase-contrast, differential interference contrast, fluorescence, and electron microscopy. In addition, we discuss the challenges for current methods and potential future work in nucleus/cell detection and segmentation.
PACS-Based Computer-Aided Detection and Diagnosis
NASA Astrophysics Data System (ADS)
Huang, H. K. (Bernie); Liu, Brent J.; Le, Anh HongTu; Documet, Jorge
The ultimate goal of Picture Archiving and Communication System (PACS)-based Computer-Aided Detection and Diagnosis (CAD) is to integrate CAD results into daily clinical practice so that CAD becomes a second reader aiding the radiologist's diagnosis. Integration of CAD with a Hospital Information System (HIS), Radiology Information System (RIS) or PACS requires certain basic ingredients: the Health Level 7 (HL7) standard for textual data, the Digital Imaging and Communications in Medicine (DICOM) standard for images, and Integrating the Healthcare Enterprise (IHE) workflow profiles, in order to comply with the Health Insurance Portability and Accountability Act (HIPAA) requirements for a healthcare information system. Among the DICOM standards and IHE workflow profiles, DICOM Structured Reporting (DICOM-SR) and the IHE Key Image Note (KIN), Simple Image and Numeric Report (SINR) and Post-processing Work Flow (PWF) profiles are utilized in CAD-HIS/RIS/PACS integration. These topics are presented in this chapter with examples.
An adaptive morphological gradient lifting wavelet for detecting bearing defects
NASA Astrophysics Data System (ADS)
Li, Bing; Zhang, Pei-lin; Mi, Shuang-shan; Hu, Ren-xi; Liu, Dong-sheng
2012-05-01
This paper presents a novel wavelet decomposition scheme, named the adaptive morphological gradient lifting wavelet (AMGLW), for detecting bearing defects. The adaptability of the AMGLW lies in the scheme's ability to select between two filters, namely the average filter and the morphological gradient filter, to update the approximation signal based on the local gradient of the analyzed signal. Both a simulated signal and vibration signals acquired from a bearing are employed to evaluate and compare the proposed AMGLW scheme with the traditional linear wavelet transform (LWT) and another adaptive lifting wavelet (ALW) developed in the literature. Experimental results reveal that the AMGLW clearly outperforms the LWT and the ALW for detecting bearing defects. The impulsive components can be enhanced and the noise suppressed simultaneously by the presented AMGLW scheme, so the fault characteristic frequencies of the bearing can be clearly identified. Furthermore, the AMGLW has an advantage over the LWT in computational efficiency, making it quite suitable for online condition monitoring of bearings and other rotating machinery.
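The filter-selection idea (average filter where the signal is smooth, morphological gradient filter where the local gradient is large, so impulses are enhanced while noise is averaged) can be sketched loosely as follows. This is not the published AMGLW lifting decomposition; the window sizes, threshold, and function names are assumptions for illustration.

```python
import numpy as np

def morph_gradient(x, w=3):
    """1-D morphological gradient: dilation minus erosion over a sliding window."""
    xp = np.pad(x, w // 2, mode="edge")
    win = np.lib.stride_tricks.sliding_window_view(xp, w)
    return win.max(axis=1) - win.min(axis=1)

def adaptive_filter(x, thresh=1.0):
    """Per sample, keep the 3-point mean output where the local gradient is
    small and the morphological gradient output where it is large, so
    impulsive components are enhanced while flat regions stay smooth."""
    xp = np.pad(x, 1, mode="edge")
    mean3 = (xp[:-2] + xp[1:-1] + xp[2:]) / 3.0
    grad = morph_gradient(x)
    return np.where(grad > thresh, grad, mean3)

# A flat signal with a single impulse, standing in for a bearing-fault spike:
# the impulse region passes through the gradient branch, the rest is averaged.
x = np.zeros(12); x[6] = 5.0
print(adaptive_filter(x))
```

The switch between the two branches is what makes the decomposition signal-adaptive rather than a fixed linear filter bank.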
New-Sum: A Novel Online ABFT Scheme For General Iterative Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tao, Dingwen; Song, Shuaiwen; Krishnamoorthy, Sriram
Emerging high-performance computing platforms, with large component counts and lower power margins, are anticipated to be more susceptible to soft errors in both logic circuits and memory subsystems. We present an online algorithm-based fault tolerance (ABFT) approach to efficiently detect and recover soft errors for general iterative methods. We design a novel checksum-based encoding scheme for matrix-vector multiplication that is resilient to both arithmetic and memory errors. Our design decouples the checksum updating process from the actual computation, and allows adaptive checksum overhead control. Building on this new encoding mechanism, we propose two online ABFT designs that can effectively recover from errors when combined with a checkpoint/rollback scheme.
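The classical column-checksum idea that ABFT for matrix-vector products builds on can be sketched as follows. This is only the baseline encoding, not the paper's New-Sum scheme, which further decouples checksum updates from the computation and tolerates memory errors; the fault injection below is illustrative.

```python
import numpy as np

def checked_matvec(A, x, tol=1e-8):
    """Checksum-protected y = A @ x: the column-checksum row c = 1^T A is
    encoded once; after the product, 1^T y must equal c @ x."""
    c = A.sum(axis=0)                  # checksum row (kept alongside A)
    y = A @ x                          # the protected computation
    if abs(y.sum() - c @ x) > tol * (abs(c @ x) + 1.0):
        raise RuntimeError("soft error detected in matrix-vector product")
    return y

A = np.arange(9.0).reshape(3, 3)
x = np.array([1.0, 2.0, 3.0])
y = checked_matvec(A, x)               # clean run passes the check

y_bad = A @ x
y_bad[1] += 0.5                        # inject a silent bit-flip-style fault
c = A.sum(axis=0)
print(abs(y_bad.sum() - c @ x) > 1e-8)  # → True: the checksum exposes it
```

Detection costs only two extra dot products per multiply; recovery is then delegated to the checkpoint/rollback layer, as in the designs above.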
Investigation of an Optimum Detection Scheme for a Star-Field Mapping System
NASA Technical Reports Server (NTRS)
Aldridge, M. D.; Credeur, L.
1970-01-01
An investigation was made to determine the optimum detection scheme for a star-field mapping system that uses coded detection resulting from starlight shining through specially arranged multiple slits of a reticle. The computer solution of equations derived from a theoretical model showed that the greatest probability of detection for a given star and background intensity occurred with the use of a single transparent slit. However, use of multiple slits improved the system's ability to reject the detection of undesirable lower intensity stars, but only by decreasing the probability of detection for lower intensity stars to be mapped. Also, it was found that the coding arrangement affected the root-mean-square star-position error and that detection is possible with error in the system's detected spin rate, though at a reduced probability.
Computerized breast cancer analysis system using three stage semi-supervised learning method.
Sun, Wenqing; Tseng, Tzu-Liang Bill; Zhang, Jianying; Qian, Wei
2016-10-01
A large amount of labeled medical image data is usually required to train a well-performing computer-aided detection (CAD) system. However, data labeling is time-consuming, and potential ethical and logistical problems may also present complications. As a result, incorporating unlabeled data into a CAD system can be a feasible way to combat these obstacles. In this study we developed a three-stage semi-supervised learning (SSL) scheme that combines a small amount of labeled data and a larger amount of unlabeled data. The scheme modified our existing CAD system using the following three stages: data weighing, feature selection, and a newly proposed dividing co-training data labeling algorithm. Global density asymmetry features were incorporated into the feature pool to reduce the false positive rate. The area under the curve (AUC) and accuracy were computed using 10-fold cross-validation to evaluate the performance of our CAD system. The image dataset includes mammograms from 400 women who underwent routine screening examinations; each pair contains either two cranio-caudal (CC) or two mediolateral-oblique (MLO) view mammograms from the right and left breasts. From these mammograms, 512 regions were extracted and used in this study; among them, 90 regions were treated as labeled while the rest were treated as unlabeled. Using our proposed scheme, the highest AUC observed in our research was 0.841, which included the 90 labeled data and all the unlabeled data. It was 7.4% higher than using labeled data only. With an increasing amount of labeled data, the AUC difference between using mixed data and using labeled data only reached its peak when the amount of labeled data was around 60. This study demonstrated that our proposed three-stage semi-supervised learning scheme can improve CAD performance by incorporating unlabeled data. Using unlabeled data is promising in computerized cancer research and may have a significant impact on future CAD system applications. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
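Performance above is reported as AUC. One way to compute AUC directly from classifier scores is as the Mann-Whitney probability that a positive case outranks a negative one; this is a generic sketch with invented scores, not the study's evaluation code.

```python
def auc(pos_scores, neg_scores):
    """AUC as the Mann-Whitney statistic: the probability that a randomly
    chosen positive case scores above a randomly chosen negative case
    (ties count half)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

print(auc([0.9, 0.8, 0.4], [0.7, 0.3, 0.2]))  # 8 of 9 pairs ranked correctly
```

This rank-based formulation is threshold-free, which is why AUC is a natural summary when comparing classifiers trained on different labeled/unlabeled mixes.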
NASA Technical Reports Server (NTRS)
Lilley, D. G.; Rhode, D. L.
1982-01-01
A primitive pressure-velocity variable finite difference computer code was developed to predict swirling recirculating inert turbulent flows in axisymmetric combustors in general, and for application to a specific idealized combustion chamber with sudden or gradual expansion. The technique involves a staggered grid system for axial and radial velocities, a line relaxation procedure for efficient solution of the equations, a two-equation k-epsilon turbulence model, a stairstep boundary representation of the expansion flow, and realistic accommodation of swirl effects. A user's manual dealing with the computational problem is presented, showing how the mathematical basis and computational scheme may be translated into a computer program. A flow chart, a FORTRAN IV listing, notes about the various subroutines, and a user's guide are supplied as an aid to prospective users of the code.
1990-06-01
the form of structured objects was first pioneered by Marvin Minsky. In his seminal article "A Framework for Representing Knowledge" he introduced... Minsky felt that the existing methods of knowledge representation were too finely grained and he proposed that knowledge is more than just a...not work" in realistic, complex domains. (Minsky, 1981, pp. 95-128) According to Minsky, "A frame is a data-structure for representing a stereotyped
FPGA design of correlation-based pattern recognition
NASA Astrophysics Data System (ADS)
Jridi, Maher; Alfalou, Ayman
2017-05-01
Optical/digital pattern recognition and tracking based on optical/digital correlation are well-known techniques to detect, identify and localize a target object in a scene. Despite the limited number of operations required by the correlation scheme, computational time and resources are relatively high. The most computationally intensive operation required by the correlation is the transformation from the spatial to the spectral domain and back from the spectral to the spatial domain. Furthermore, these transformations are used in optical/digital encryption schemes such as double random phase encryption (DRPE). In this paper, we present a VLSI architecture for the correlation scheme based on the fast Fourier transform (FFT). One interesting feature of the proposed scheme is its ability to stream image processing in order to perform correlation for video sequences. A trade-off between hardware consumption and the robustness of the correlation can be made in order to understand the limitations of correlation implementations in reconfigurable and portable platforms. Experimental results obtained from HDL simulations and an FPGA prototype have demonstrated the advantages of the proposed scheme.
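The spatial-to-spectral round trip at the heart of the correlator can be modeled in software as FFT-based cross-correlation. This is a NumPy sketch of the data flow only, not the VLSI design; the scene and template are invented.

```python
import numpy as np

def correlate_fft(scene, ref):
    """Circular cross-correlation via the FFT (matched-filter style):
    IFFT(FFT(scene) * conj(FFT(ref))); the peak marks the target location."""
    F = np.fft.fft2(scene)
    G = np.fft.fft2(ref, s=scene.shape)   # zero-pad reference to scene size
    return np.real(np.fft.ifft2(F * np.conj(G)))

scene = np.zeros((32, 32))
scene[10:14, 20:24] = 1.0                 # target with top-left corner (10, 20)
ref = np.ones((4, 4))                     # reference template
corr = correlate_fft(scene, ref)
peak = tuple(int(i) for i in np.unravel_index(np.argmax(corr), corr.shape))
print(peak)  # → (10, 20)
```

The two forward transforms and one inverse transform here are exactly the operations the abstract identifies as the dominant cost, which motivates a streaming FFT pipeline in hardware.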
NASA Astrophysics Data System (ADS)
Qi, Bing; Lougovski, Pavel; Pooser, Raphael; Grice, Warren; Bobrek, Miljko
2015-10-01
Continuous-variable quantum key distribution (CV-QKD) protocols based on coherent detection have been studied extensively in both theory and experiment. In all the existing implementations of CV-QKD, both the quantum signal and the local oscillator (LO) are generated from the same laser and propagate through the insecure quantum channel. This arrangement may open security loopholes and limit the potential applications of CV-QKD. In this paper, we propose and demonstrate a pilot-aided feedforward data recovery scheme that enables reliable coherent detection using a "locally" generated LO. Using two independent commercial laser sources and a spool of 25-km optical fiber, we construct a coherent communication system. The variance of the phase noise introduced by the proposed scheme is measured to be 0.04 rad², which is small enough to enable secure key distribution. This technology also opens the door for other quantum communication protocols, such as the recently proposed measurement-device-independent CV-QKD, where independent light sources are employed by different users.
Exploring a new bilateral focal density asymmetry based image marker to predict breast cancer risk
NASA Astrophysics Data System (ADS)
Aghaei, Faranak; Mirniaharikandehei, Seyedehnafiseh; Hollingsworth, Alan B.; Wang, Yunzhi; Qiu, Yuchen; Liu, Hong; Zheng, Bin
2017-03-01
Although breast density has been widely considered an important breast cancer risk factor, it is not very effective for predicting the risk of developing breast cancer in the short term or of harboring cancer in mammograms. Based on our recent studies building short-term breast cancer risk stratification models on bilateral mammographic density asymmetry, in this study we explored a new quantitative image marker based on bilateral focal density asymmetry to predict the risk of harboring cancers in mammograms. For this purpose, we assembled a testing dataset involving 100 positive and 100 negative cases. In each positive case, no solid masses are visible on the mammograms. We developed a computer-aided detection (CAD) scheme to automatically detect focal dense regions depicted on the two bilateral mammograms of the left and right breasts. The CAD scheme selects the focal dense region with the maximum size on each image and computes its asymmetry ratio. We used this focal density asymmetry as a new imaging marker to divide the testing cases into two groups of higher and lower focal density asymmetry. The first group included 70 cases, of which 62.9% are positive, while the second group included 130 cases, of which 43.1% are positive. The odds ratio is 2.24. As a result, this preliminary study supported the feasibility of applying a new focal-density-asymmetry-based imaging marker to predict the risk of harboring mammography-occult cancers. The goal is to help radiologists more effectively and accurately detect subtle early cancers using mammography and/or other adjunctive imaging modalities in the future.
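The reported odds ratio of 2.24 can be reproduced directly from the group sizes and positive rates given above (the only assumption is rounding the rates to whole case counts):

```python
# Reproducing the reported odds ratio from the group sizes and positive rates.
high = {"n": 70, "pos": round(0.629 * 70)}    # 44 positive, 26 negative
low = {"n": 130, "pos": round(0.431 * 130)}   # 56 positive, 74 negative

odds_high = high["pos"] / (high["n"] - high["pos"])
odds_low = low["pos"] / (low["n"] - low["pos"])
print(round(odds_high / odds_low, 2))  # → 2.24
```

An odds ratio above 1 indicates that cases in the higher-focal-asymmetry group are more likely to harbor a mammography-occult cancer than those in the lower-asymmetry group.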
NASA Astrophysics Data System (ADS)
Mirniaharikandehei, Seyedehnafiseh; Patil, Omkar; Aghaei, Faranak; Wang, Yunzhi; Zheng, Bin
2017-03-01
Accurately assessing the potential benefit of chemotherapy to cancer patients is an important prerequisite to developing precision medicine in cancer treatment. A previous study showed that total psoas area (TPA) measured on preoperative cross-sectional CT images may be a good image marker for predicting the long-term outcome of pancreatic cancer patients after surgery. However, accurate and automated segmentation of the TPA from CT images is difficult due to its fuzzy boundary and its connection to other muscle areas. In this study, we developed a new interactive computer-aided detection (ICAD) scheme aiming to segment the TPA from abdominal CT images more accurately, and assessed the feasibility of using this new quantitative image marker to predict the benefit to ovarian cancer patients of receiving Bevacizumab-based chemotherapy. The ICAD scheme is applied to identify a CT image slice of interest located at the level of the third lumbar vertebra (L3). The cross-sections of the right and left psoas are segmented using a set of adaptively adjusted boundary conditions, and the TPA is then quantitatively measured. In addition, recent studies have suggested that muscle radiation attenuation, which reflects fat deposition in the tissue, may be a good image feature for predicting the survival rate of cancer patients. The scheme and the TPA measurement task were applied to a large national clinical trial database involving 1,247 ovarian cancer patients. By comparison with manual segmentation results, we found that the ICAD scheme yielded higher accuracy and consistency for this task. The new ICAD scheme provides clinical researchers a useful tool to more efficiently and accurately extract TPA and muscle radiation attenuation as new image markers, and allows them to investigate the discriminatory power of these markers in predicting progression-free survival and/or overall survival of cancer patients before and after chemotherapy.
Mergias, I; Moustakas, K; Papadopoulos, A; Loizidou, M
2007-08-25
Each alternative scheme for treating a vehicle at the end of its life has its own social, environmental, economic and technical consequences. Furthermore, the criteria used to assess these consequences are often contradictory and not equally important. In the presence of multiple conflicting criteria, no single optimal alternative exists. This paper presents a multiple-criteria decision aid (MCDA) method to help the Decision Maker (DM) select the best compromise scheme for the management of End-of-Life Vehicles (ELVs). The constitution of a set of alternative schemes, the selection of a list of relevant criteria to evaluate these alternatives, and the choice of an appropriate management system are also analyzed in this framework. The proposed procedure relies on the PROMETHEE method, which belongs to the well-known family of multiple-criteria outranking methods. For this purpose, level, linear and Gaussian functions are used as preference functions.
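As a rough illustration of the preference functions named above, here is a minimal sketch of the level, linear and Gaussian forms commonly used in PROMETHEE; `d` is the difference between two alternatives on one criterion, and the thresholds `q` (indifference), `p` (preference) and `s` (Gaussian spread) are illustrative parameters, not values from the paper:

```python
import math

def level_pref(d, q, p):
    """Level function: 0 below q, 0.5 between q and p, 1 above p."""
    if d <= q:
        return 0.0
    return 0.5 if d <= p else 1.0

def linear_pref(d, q, p):
    """Linear function: ramps from 0 at q to 1 at p."""
    if d <= q:
        return 0.0
    if d >= p:
        return 1.0
    return (d - q) / (p - q)

def gaussian_pref(d, s):
    """Gaussian function: smooth rise toward 1 with spread s."""
    if d <= 0:
        return 0.0
    return 1.0 - math.exp(-d * d / (2 * s * s))

print(level_pref(0.3, 0.2, 0.5))   # → 0.5
print(linear_pref(0.35, 0.2, 0.5)) # → 0.5
```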
A Computer-Aided Type-II Fuzzy Image Processing for Diagnosis of Meniscus Tear.
Zarandi, M H Fazel; Khadangi, A; Karimi, F; Turksen, I B
2016-12-01
Meniscal tear is one of the prevalent knee disorders among young athletes and the aging population, and requires correct diagnosis and surgical intervention, if necessary. Not only the errors followed by human intervention but also the obstacles of manual meniscal tear detection highlight the need for automatic detection techniques. This paper presents a type-2 fuzzy expert system for meniscal tear diagnosis using PD magnetic resonance images (MRI). The scheme of the proposed type-2 fuzzy image processing model is composed of three distinct modules: Pre-processing, Segmentation, and Classification. λ-nhancement algorithm is used to perform the pre-processing step. For the segmentation step, first, Interval Type-2 Fuzzy C-Means (IT2FCM) is applied to the images, outputs of which are then employed by Interval Type-2 Possibilistic C-Means (IT2PCM) to perform post-processes. Second stage concludes with re-estimation of "η" value to enhance IT2PCM. Finally, a Perceptron neural network with two hidden layers is used for Classification stage. The results of the proposed type-2 expert system have been compared with a well-known segmentation algorithm, approving the superiority of the proposed system in meniscal tear recognition.
Digital Image Processing Technique for Breast Cancer Detection
NASA Astrophysics Data System (ADS)
Guzmán-Cabrera, R.; Guzmán-Sepúlveda, J. R.; Torres-Cisneros, M.; May-Arrioja, D. A.; Ruiz-Pinales, J.; Ibarra-Manzano, O. G.; Aviña-Cervantes, G.; Parada, A. González
2013-09-01
Breast cancer is the most common cancer among women and the second leading cause of cancer deaths worldwide. Primary prevention in the early stages of the disease is complex, as the causes remain largely unknown. However, some typical signatures of the disease, such as masses and microcalcifications appearing on mammograms, can be used to improve early diagnostic techniques, which is critical for women's quality of life. X-ray mammography is the main test used for screening and early diagnosis, and its analysis and processing are key to improving breast cancer prognosis. Because masses and benign glandular tissue typically appear with low contrast and are often very blurred, several computer-aided diagnosis schemes have been developed to support radiologists and internists in their diagnosis. In this article, an approach based on texture segmentation is proposed for the effective analysis of digital mammograms and the detection of early-stage tumors. The proposed algorithm was tested on several images taken from the Digital Database for Screening Mammography for cancer research and diagnosis, and it was found to be well suited to distinguishing masses and microcalcifications from the background tissue using morphological operators and then extracting them through machine learning techniques and a clustering algorithm for intensity-based segmentation.
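As a loose illustration of the intensity-based clustering step mentioned above (not the authors' algorithm), a tiny k-means over pixel intensities can separate a bright synthetic "mass" from a dim background; the synthetic image and k=2 are assumptions for the demo:

```python
import numpy as np

def kmeans_intensity(img, k=2, iters=10):
    """Cluster pixel intensities into k groups; returns labels and centers."""
    x = img.ravel().astype(float)
    centers = np.linspace(x.min(), x.max(), k)  # deterministic init
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    return labels.reshape(img.shape), centers

# Synthetic image: dim background with one bright 10x10 "mass"
img = np.full((64, 64), 0.2)
img[20:30, 20:30] = 0.9
labels, centers = kmeans_intensity(img)
mass_mask = labels == int(np.argmax(centers))  # cluster with brightest center
print(int(mass_mask.sum()))  # → 100 (the 10x10 bright region)
```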
Computer-aided diagnosis (CAD) in the detection of breast cancer.
Dromain, C; Boyer, B; Ferré, R; Canale, S; Delaloge, S; Balleyguier, C
2013-03-01
Computer-aided detection (CAD) systems have been developed to improve the mammographic detection of breast cancer at screening by reducing the number of false-negative interpretations that can be caused by subtle findings, radiologist distraction, and complex architecture. They use a digitized mammographic image that can be obtained from both screen-film mammography and full-field digital mammography. CAD performance in breast cancer detection depends on the CAD system itself, the population to which it is applied, and the radiologists who use it. There is a clear benefit to the use of CAD by less experienced radiologists and in detecting breast carcinomas presenting as microcalcifications. This review gives a detailed description of the CAD systems used in mammography and their performance as a reading aid in screening mammography and as an alternative to double reading. Other CAD systems developed for MRI and ultrasound are also presented and discussed. Copyright © 2012. Published by Elsevier Ireland Ltd.
An improved anonymous authentication scheme for roaming in ubiquitous networks.
Lee, Hakjun; Lee, Donghoon; Moon, Jongho; Jung, Jaewook; Kang, Dongwoo; Kim, Hyoungshick; Won, Dongho
2018-01-01
With the evolution of communication technology and the exponential increase in mobile devices, ubiquitous networking allows people to access data and computing resources anytime and anywhere. However, numerous security concerns and complicated requirements arise as these ubiquitous networks are deployed throughout people's lives. To meet this challenge, user authentication schemes in ubiquitous networks should ensure the essential security properties for the preservation of privacy at low computational cost. In 2017, Chaudhry et al. proposed a password-based authentication scheme for roaming in ubiquitous networks to enhance security. Unfortunately, we found that their scheme remains insecure in its protection of user privacy. In this paper, we prove that Chaudhry et al.'s scheme is vulnerable to stolen-mobile-device and user impersonation attacks, and that its drawbacks include the absence of incorrect-login-input detection, an incorrect password change phase, and the absence of a revocation provision. Moreover, we suggest a possible way to fix the security flaw in Chaudhry et al.'s scheme by using biometric-based authentication, in which a bio-hash is applied to implement three-factor authentication. We prove the security of the proposed scheme in the random oracle model, formally verify its security properties using the ProVerif tool, and analyze its computational and communication cost. The analysis shows that the proposed scheme is suitable for resource-constrained ubiquitous environments.
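As a purely schematic illustration of the three-factor idea (password, device-stored secret, and a bio-hash of a biometric template), consider the toy sketch below. Real bio-hashing uses error-tolerant transforms of noisy biometric readings, and none of the paper's actual protocol messages are reproduced here; every function name is an illustrative assumption:

```python
import hashlib
import hmac
import os

def bio_hash(biometric: bytes, user_key: bytes) -> bytes:
    """Toy stand-in for a bio-hash: a keyed digest of the template."""
    return hmac.new(user_key, biometric, hashlib.sha256).digest()

def register(password: str, biometric: bytes) -> dict:
    """Create a verifier binding all three factors together."""
    salt = os.urandom(16)
    device_secret = os.urandom(16)  # factor 2: stored on the mobile device
    bh = bio_hash(biometric, device_secret)  # factor 3: biometric
    verifier = hashlib.sha256(salt + password.encode() + bh).digest()
    return {"salt": salt, "device_secret": device_secret, "verifier": verifier}

def login(record: dict, password: str, biometric: bytes) -> bool:
    """Succeeds only if password AND device secret AND biometric all match."""
    bh = bio_hash(biometric, record["device_secret"])
    cand = hashlib.sha256(record["salt"] + password.encode() + bh).digest()
    return hmac.compare_digest(cand, record["verifier"])

rec = register("correct horse", b"iris-template")
print(login(rec, "correct horse", b"iris-template"))  # → True
print(login(rec, "wrong password", b"iris-template"))  # → False
```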
Expanding Biosensing Abilities through Computer-Aided Design of Metabolic Pathways.
Libis, Vincent; Delépine, Baudoin; Faulon, Jean-Loup
2016-10-21
Detection of chemical signals is critical for cells in nature as well as in synthetic biology, where they serve as inputs for designer circuits. Important progress has been made in the design of signal processing circuits triggering complex biological behaviors, but the range of small molecules recognized by sensors as inputs is limited. The ability to detect new molecules will increase the number of synthetic biology applications, but direct engineering of tailor-made sensors takes time. Here we describe a way to immediately expand the range of biologically detectable molecules by systematically designing metabolic pathways that transform nondetectable molecules into molecules for which sensors already exist. We leveraged computer-aided design to predict such sensing-enabling metabolic pathways, and we built several new whole-cell biosensors for molecules such as cocaine, parathion, hippuric acid, and nitroglycerin.
Two particle tracking and detection in a single Gaussian beam optical trap.
Praveen, P; Yogesha; Iyengar, Shruthi S; Bhattacharya, Sarbari; Ananthamurthy, Sharath
2016-01-20
We have studied in detail the situation wherein two microbeads are trapped axially in a single-beam Gaussian intensity profile optical trap. We find that the corner frequency extracted from a power spectral density analysis of intensity fluctuations recorded on a quadrant photodetector (QPD) is dependent on the detection scheme. Using forward- and backscattering detection schemes with single and two laser wavelengths along with computer simulations, we conclude that fluctuations detected in backscattering bear true position information of the bead encountered first in the beam propagation direction. Forward scattering, on the other hand, carries position information of both beads with substantial contribution from the bead encountered first along the beam propagation direction. Mie scattering analysis further reveals that the interference term from the scattering of the two beads contributes significantly to the signal, precluding the ability to resolve the positions of the individual beads in forward scattering. In QPD-based detection schemes, detection through backscattering, thereby, is imperative to track the true displacements of axially trapped microbeads for possible studies on light-mediated interbead interactions.
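The corner frequency mentioned above comes from fitting the bead's position power spectrum to a Lorentzian, S(f) = A / (f_c² + f²), the standard model for an optically trapped bead; a minimal sketch of that fit on synthetic data (the amplitude and corner frequency below are illustrative, not the paper's values):

```python
import numpy as np

def corner_frequency(freqs, psd):
    """Fit 1/S(f) = fc^2/A + f^2/A, which is linear in f^2."""
    y = 1.0 / psd
    slope, intercept = np.polyfit(freqs**2, y, 1)  # slope = 1/A, intercept = fc^2/A
    return np.sqrt(intercept / slope)

f = np.linspace(1, 5000, 2000)        # Hz
fc_true, A = 800.0, 1e-3              # illustrative parameters
psd = A / (fc_true**2 + f**2)         # noiseless Lorentzian spectrum
print(round(float(corner_frequency(f, psd))))  # → 800
```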
A Computational Geometry Approach to Automated Pulmonary Fissure Segmentation in CT Examinations
Pu, Jiantao; Leader, Joseph K; Zheng, Bin; Knollmann, Friedrich; Fuhrman, Carl; Sciurba, Frank C; Gur, David
2010-01-01
Identification of pulmonary fissures, which form the boundaries between the lobes of the lungs, may be useful during clinical interpretation of CT examinations to assess the early presence and characterize the manifestation of several lung diseases. Motivated by the unique surface shape of pulmonary fissures in three-dimensional space, we developed a new automated scheme that uses computational geometry methods to detect and segment fissures depicted on CT images. After geometric modeling of the lung volume using the Marching Cubes algorithm, Laplacian smoothing is applied iteratively to enhance pulmonary fissures by suppressing non-fissure structures while smoothing the fissure surfaces. Next, an Extended Gaussian Image based procedure is used to locate the fissures in a statistical manner that approximates the fissures with a set of plane "patches." This approach has several advantages, such as independence from anatomic knowledge of the lung structure other than the surface shape of fissures, limited sensitivity to other lung structures, and ease of implementation. Scheme performance was evaluated by two experienced thoracic radiologists using a set of 100 images (slices) randomly selected from 10 screening CT examinations. In this preliminary evaluation, 98.7% and 94.9% of scheme-segmented fissure voxels were within 2 mm of the fissures marked independently by the two radiologists in the testing image dataset. Using the scheme-detected fissures as reference, 89.4% and 90.1% of manually marked fissure points had distances ≤ 2 mm to the reference, suggesting possible under-segmentation by the scheme. The case-based RMS (root-mean-square) distances ("errors") between our scheme and the radiologists ranged from 1.48±0.92 to 2.04±3.88 mm. The discrepancy in fissure detection between the automated scheme and either radiologist was smaller in this dataset than the inter-reader variability. PMID:19272987
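The evaluation described (fraction of segmented points within 2 mm of radiologist marks, plus an RMS distance) can be sketched as a nearest-neighbor distance computation; the 3-D point sets below are toy data, not the study's:

```python
import numpy as np

def eval_distances(scheme_pts, marked_pts, tol=2.0):
    """For each scheme point, distance (mm) to the nearest marked point;
    return (fraction within tol, RMS of the nearest distances)."""
    d = np.linalg.norm(scheme_pts[:, None, :] - marked_pts[None, :, :], axis=2)
    nearest = d.min(axis=1)
    within = float((nearest <= tol).mean())
    rms = float(np.sqrt((nearest**2).mean()))
    return within, rms

scheme = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [5.0, 0.0, 0.0]])
marked = np.array([[0.5, 0.0, 0.0], [1.2, 0.0, 0.0]])
within, rms = eval_distances(scheme, marked)
print(f"{within:.2f}")  # → 0.67 (2 of 3 scheme points within 2 mm)
```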
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru
2008-03-01
Mass screening based on multi-helical CT images requires a considerable number of images to be read. This time-consuming step currently makes the use of helical CT for mass screening impractical. To overcome this problem, we have provided diagnostic assistance to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification, and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis likelihood, all using a helical CT scanner for lung cancer mass screening. Functions for observing suspicious shadows in detail are provided in a computer-aided diagnosis workstation together with these screening algorithms. We have also developed a telemedicine network using a Web-based medical image conference system with improved security of image transmission, a biometric fingerprint authentication system, and a biometric face authentication system. Biometric face authentication used at the telemedicine site makes file encryption and login verification effective, so that patients' private information is protected. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new telemedicine network that can display suspected lesions three-dimensionally in a short time. The results of this study indicate that our film-less radiological information system using the computer-aided diagnosis workstation and our telemedicine network can increase diagnostic speed and accuracy and improve the security of medical information.
NASA Astrophysics Data System (ADS)
Wang, Xingwei; Zheng, Bin; Li, Shibo; Mulvihill, John J.; Chen, Xiaodong; Liu, Hong
2010-07-01
Karyotyping is an important process for classifying chromosomes into standard classes, and its results are routinely used by clinicians to diagnose cancers and genetic diseases. However, visual karyotyping using microscopic images is time-consuming and tedious, which reduces diagnostic efficiency and accuracy. Although many efforts have been made to develop computerized schemes for automated karyotyping, no scheme can be performed without substantial human intervention. Instead of developing a method to classify all chromosome classes, we developed an automatic scheme to detect abnormal metaphase cells by identifying a specific class of chromosomes (class 22) and to prescreen for suspected chronic myeloid leukemia (CML). The scheme includes three steps: (1) iteratively segment the randomly distributed individual chromosomes, (2) process the segmented chromosomes and compute image features to identify candidates, and (3) apply an adaptive matching template to identify chromosomes of class 22. An image dataset of 451 metaphase cells extracted from bone marrow specimens of 30 positive and 30 negative cases for CML was selected to test the scheme's performance. The overall case-based classification accuracy was 93.3% (100% sensitivity and 86.7% specificity). The results demonstrate the feasibility of applying an automated scheme to detect or prescreen suspected cancer cases.
Observer training for computer-aided detection of pulmonary nodules in chest radiography.
De Boo, Diederick W; van Hoorn, François; van Schuppen, Joost; Schijf, Laura; Scheerder, Maeke J; Freling, Nicole J; Mets, Onno; Weber, Michael; Schaefer-Prokop, Cornelia M
2012-08-01
To assess whether short-term feedback helps readers increase their performance using computer-aided detection (CAD) for nodule detection in chest radiography. The 140 chest radiographs (56 with a solitary CT-proven nodule and 84 negative controls) were divided into four subsets of 35; each was read in a different order by six readers. Lesion presence, location and diagnostic confidence were scored without and with CAD (IQQA-Chest, EDDA Technology) as second reader. Readers received individual feedback after each subset. Sensitivity, specificity and area under the receiver-operating characteristic curve (AUC) were calculated for readings with and without CAD with respect to change over time and the impact of CAD. CAD stand-alone sensitivity was 59% with 1.9 false positives per image. Mean AUC increased slightly over time with and without CAD (0.78 vs. 0.84 with and 0.76 vs. 0.82 without CAD), but the differences did not reach significance. Sensitivity increased (65% vs. 70% and 66% vs. 70%) and specificity decreased over time (79% vs. 74% and 80% vs. 77%), but no significant impact of CAD was found. Short-term feedback does not increase the ability of readers to differentiate true- from false-positive candidate lesions or to use CAD more effectively. • Computer-aided detection (CAD) is increasingly used as an adjunct for many radiological techniques. • Short-term feedback does not improve reader performance with CAD in chest radiography. • Differentiation between true- and false-positive CAD marks for low-conspicuity possible lesions proves difficult. • CAD can potentially increase reader performance for nodule detection in chest radiography.
Ding, Chao; Yang, Lijun; Wu, Meng
2017-01-01
Due to the unattended nature and poor security guarantees of wireless sensor networks (WSNs), adversaries can easily make replicas of compromised nodes and place them throughout the network to launch various types of attacks. Such an attack is dangerous because it enables the adversaries to control large numbers of nodes and extend the damage of attacks to most of the network at quite limited cost. To stop the node replica attack, we propose a location similarity-based detection scheme using deployment knowledge. Compared with prior solutions, our scheme provides extra functionality that prevents replicas from generating false location claims, without deploying resource-consuming localization techniques on the resource-constrained sensor nodes. We evaluate the security performance of our proposal under different attack strategies through heuristic analysis, and show that our scheme achieves secure and robust replica detection by increasing the cost of node replication. Additionally, we evaluate the impact of the network environment on the proposed scheme through theoretical analysis and simulation experiments, and show that our scheme achieves effectiveness and efficiency with substantially lower communication, computational, and storage overhead than prior works under different situations and attack strategies. PMID:28098846
Cebe, Fatma; Aktan, Ali Murat; Ozsevik, Abdul Semih; Ciftci, Mehmet Ertugrul; Surmelioglu, Hatice Derya
2017-03-01
The aim of this study was to investigate the influence of artifacts produced by different restorative materials on the detection of approximal caries in cone-beam computed tomography (CBCT) scans with and without the application of an artifact-reduction (AR) option. Ninety-eight noncavitated premolar and molar teeth were placed with approximal contacts consisting of 2 sound or carious teeth and 1 mesial-occlusal-distal restored tooth with resin-modified glass-ionomer cement (RMGIC), amalgam, composite, ceramic-based composite (CBC), or computer-aided design-computer-aided manufacturing (CAD-CAM) zirconia materials in between. The teeth were scanned with a CBCT system with and without the AR option. Images were evaluated by 2 observers. The teeth were histologically evaluated, and sensitivity, specificity, and areas under the receiver operating characteristic (ROC) curve were calculated according to the appropriate threshold. Specificity and sensitivity values for contact surfaces ranged from 0-48.39 and 82.93-98.40, respectively. The AR option affected (P < .05) approximal caries detection of the amalgam, composite, CAD-CAM, and CBC groups in contact surfaces and composite and RMGIC groups in noncontact surfaces. Artifacts produced by different restorative materials could affect approximal caries detection in CBCT scans. Use of the AR option with CBCT scans increases the accuracy of approximal caries detection. Copyright © 2016 Elsevier Inc. All rights reserved.
High-speed linear optics quantum computing using active feed-forward.
Prevedel, Robert; Walther, Philip; Tiefenbacher, Felix; Böhi, Pascal; Kaltenbaek, Rainer; Jennewein, Thomas; Zeilinger, Anton
2007-01-04
As information carriers in quantum computing, photonic qubits have the advantage of undergoing negligible decoherence. However, the absence of any significant photon-photon interaction is problematic for the realization of non-trivial two-qubit gates. One solution is to introduce an effective nonlinearity by measurements resulting in probabilistic gate operations. In one-way quantum computation, the random quantum measurement error can be overcome by applying a feed-forward technique, such that the future measurement basis depends on earlier measurement results. This technique is crucial for achieving deterministic quantum computation once a cluster state (the highly entangled multiparticle state on which one-way quantum computation is based) is prepared. Here we realize a concatenated scheme of measurement and active feed-forward in a one-way quantum computing experiment. We demonstrate that, for a perfect cluster state and no photon loss, our quantum computation scheme would operate with good fidelity and that our feed-forward components function with very high speed and low error for detected photons. With present technology, the individual computational step (in our case the individual feed-forward cycle) can be operated in less than 150 ns using electro-optical modulators. This is an important result for the future development of one-way quantum computers, whose large-scale implementation will depend on advances in the production and detection of the required highly entangled cluster states.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-10
... methods help solve imaging problems such as image "leakage," which causes distortion, overloads datasets... enhance detection. This is helpful to identify harmful features such as precancerous polyps or other anomalies. The field of use may be limited to "computer aided detection in colonography." The prospective...
Evaluation schemes for video and image anomaly detection algorithms
NASA Astrophysics Data System (ADS)
Parameswaran, Shibin; Harguess, Josh; Barngrover, Christopher; Shafer, Scott; Reese, Michael
2016-05-01
Video anomaly detection is a critical research area in computer vision and a natural first step before applying object recognition algorithms. Many algorithms that detect anomalies (outliers) in videos and images have been introduced in recent years. However, these algorithms behave and perform differently depending on the domains and tasks to which they are applied. In order to better understand the strengths and weaknesses of outlier algorithms and their applicability in a particular domain or task of interest, it is important to measure and quantify their performance using appropriate evaluation metrics. Many evaluation metrics have been used in the literature, such as precision curves, precision-recall curves, and receiver operating characteristic (ROC) curves. In order to construct these metrics, it is also important to choose an appropriate evaluation scheme that decides when a proposed detection is considered a true or a false detection. Choosing the right evaluation metric and the right scheme is critical, since the choice can introduce positive or negative bias in the measuring criterion and may favor (or work against) a particular algorithm or task. In this paper, we review evaluation metrics and popular evaluation schemes that are used to measure the performance of anomaly detection algorithms on videos and imagery with one or more anomalies. We analyze the biases introduced by these choices by measuring the performance of an existing anomaly detection algorithm.
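One common evaluation scheme of the kind the paper reviews can be sketched as intersection-over-union (IoU) matching: a proposed detection counts as a true positive only if its overlap with some ground-truth box exceeds a threshold. The 0.5 threshold and the box format below are conventions assumed for illustration, not values from the paper:

```python
# Boxes are (x1, y1, x2, y2) with x1 < x2 and y1 < y2.
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def classify_detections(detections, truths, thr=0.5):
    """Count true and false positives under an IoU-threshold scheme."""
    tp = sum(1 for d in detections if any(iou(d, t) >= thr for t in truths))
    return tp, len(detections) - tp

dets = [(0, 0, 10, 10), (50, 50, 60, 60)]
gts = [(1, 1, 11, 11)]
print(classify_detections(dets, gts))  # → (1, 1): one TP, one FP
```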
Modeling cation/anion-water interactions in functional aluminosilicate structures.
Richards, A J; Barnes, P; Collins, D R; Christodoulos, F; Clark, S M
1995-02-01
A need for the computer simulation of hydration/dehydration processes in functional aluminosilicate structures has been noted. Full and realistic simulations of these systems can be somewhat ambitious and require the aid of interactive computer graphics to identify key structural/chemical units, both in the devising of suitable water-ion simulation potentials and in the analysis of hydrogen-bonding schemes in the subsequent simulation studies. In this article, the former is demonstrated by the assembling of a range of essential water-ion potentials. These span the range of formal charges from +4e to -2e, and are evaluated in the context of three types of structure: a porous zeolite, calcium silicate cement, and layered clay. As an example of the latter, the computer graphics output from Monte Carlo computer simulation studies of hydration/dehydration in calcium-zeolite A is presented.
NASA Astrophysics Data System (ADS)
Li, Wei; Jin, Yuanbin; Yu, Xudong; Zhang, Jing
2017-08-01
We experimentally study a protocol that uses a broadband high-frequency squeezed vacuum to detect a low-frequency signal. In this scheme, the lower sideband field of the squeezed light carries the low-frequency modulation signal, and two strong coherent light fields are applied as the bichromatic local oscillator in the homodyne detection to measure the quantum entanglement of the upper and lower sidebands of the broadband squeezed light. The power of the local oscillator detecting the upper sideband can be adjusted to optimize the conditional variance in the low-frequency regime by subtracting the photocurrent of the upper sideband field of the squeezed light from that of the lower sideband field. By means of the quantum correlation of the upper and lower sidebands of the broadband squeezed light, a low-frequency signal beyond the standard quantum limit is measured. This scheme is suitable for enhancing the sensitivity of low-frequency signal detection with the aid of broadband squeezed light, as in gravitational-wave detection, and does not require directly producing low-frequency squeezing in an optical parametric process.
[Medical computer-aided detection method based on deep learning].
Tao, Pan; Fu, Zhongliang; Zhu, Kai; Wang, Lili
2018-03-01
This paper presents a comprehensive study of computer-aided detection for medical diagnosis with deep learning. Based on a region convolutional neural network and prior knowledge of the target, the algorithm uses a region proposal network and a region-of-interest pooling strategy, introduces a multi-task loss function (classification loss, bounding-box localization loss and object rotation loss), and is optimized end-to-end. For medical images, it locates the target automatically and provides the localization result for the subsequent segmentation task. For the detection of the left ventricle in echocardiography, additional landmarks such as the mitral annulus, endocardial pad and apical position were used to estimate the left ventricular posture effectively. To verify the robustness and effectiveness of the algorithm, experimental data from ultrasound and magnetic resonance images were selected. Experimental results show that the algorithm is fast, accurate and effective.
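The three-term loss the abstract names can be sketched as follows; the smooth-L1 form for the regression terms and the equal weighting are common conventions in region-CNN detectors, assumed here for illustration, not the paper's exact definitions:

```python
import numpy as np

def smooth_l1(x):
    """Smooth L1 (Huber-like) penalty, a common box-regression loss."""
    x = np.abs(x)
    return np.where(x < 1, 0.5 * x**2, x - 0.5)

def multitask_loss(cls_prob, bbox_pred, bbox_gt, angle_pred, angle_gt,
                   w_box=1.0, w_rot=1.0):
    """classification + bounding-box localization + object rotation loss."""
    l_cls = -np.log(cls_prob)                      # cross-entropy, true class
    l_box = smooth_l1(bbox_pred - bbox_gt).sum()   # localization term
    l_rot = smooth_l1(np.array([angle_pred - angle_gt])).sum()  # rotation term
    return l_cls + w_box * l_box + w_rot * l_rot

# A perfect prediction incurs zero loss
print(multitask_loss(1.0, np.zeros(4), np.zeros(4), 0.0, 0.0))  # → 0.0
```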
NASA Astrophysics Data System (ADS)
Qiu, Yuchen; Wang, Xingwei; Chen, Xiaodong; Li, Yuhua; Liu, Hong; Li, Shibo; Zheng, Bin
2010-02-01
Visually searching for analyzable metaphase chromosome cells under microscopes is quite time-consuming and difficult. To improve detection efficiency, consistency, and diagnostic accuracy, an automated microscopic image scanning system was developed and tested to directly acquire digital images with sufficient spatial resolution for clinical diagnosis. A computer-aided detection (CAD) scheme was also developed and integrated into the image scanning system to search for and detect the regions of interest (ROI) that contain analyzable metaphase chromosome cells in the large volume of scanned images acquired from one specimen. Thus, the cytogeneticists only need to observe and interpret a limited number of ROIs. In this study, the performance of high-resolution microscopic image scanning and CAD was investigated and evaluated using nine sets of images scanned from either bone marrow (three) or blood (six) specimens for the diagnosis of leukemia. The automated CAD selection results were compared with visual selection. In the experiment, the cytogeneticists first visually searched for analyzable metaphase chromosome cells from specimens under microscopes. The specimens were then automatically scanned, and the CAD scheme was applied to detect and save ROIs containing analyzable cells while discarding the others. The automatically selected ROIs were then examined by a panel of three cytogeneticists. From the scanned images, CAD selected more analyzable cells than the cytogeneticists' initial visual examination in both blood and bone marrow specimens. In general, CAD performed better on blood specimens. Even in the three bone marrow specimens, CAD selected 50, 22, and 9 ROIs, respectively. Besides matching the 9, 7, and 5 analyzable cells found by initial visual selection in these three specimens, the cytogeneticists also identified 41, 15, and 4 new analyzable cells that had been missed in the initial visual search.
This experiment showed the feasibility of applying this CAD-guided high-resolution microscopic image scanning system to prescreen and select ROIs that may contain analyzable metaphase chromosome cells. The success and further improvement of this automated scanning system may have a great impact on future clinical practice in genetic laboratories for detecting and diagnosing diseases.
Trellis-coded CPM for satellite-based mobile communications
NASA Technical Reports Server (NTRS)
Abrishamkar, Farrokh; Biglieri, Ezio
1988-01-01
Digital transmission for satellite-based land mobile communications is discussed. To satisfy the power and bandwidth limitations imposed on such systems, a combination of trellis coding and continuous-phase modulated signals is considered. Some schemes based on this idea are presented, and their performance is analyzed by computer simulation. The results show that a scheme based on directional detection and Viterbi decoding appears promising for practical applications.
Using Thermal Radiation in Detection of Negative Obstacles
NASA Technical Reports Server (NTRS)
Rankin, Arturo L.; Matthies, Larry H.
2009-01-01
A method of automated detection of negative obstacles (potholes, ditches, and the like) ahead of ground vehicles at night involves processing of imagery from thermal-infrared cameras aimed at the terrain ahead of the vehicles. The method is being developed as part of an overall obstacle-avoidance scheme for autonomous and semi-autonomous off-road robotic vehicles. The method could also be applied to help human drivers of cars and trucks avoid negative obstacles -- a development that may entail only modest additional cost, inasmuch as some commercially available passenger cars are already equipped with infrared cameras as aids for nighttime operation.
Computer-aided detection of renal calculi from noncontrast CT images using TV-flow and MSER features
Liu, Jianfei; Wang, Shijun; Turkbey, Evrim B.; Linguraru, Marius George; Yao, Jianhua; Summers, Ronald M.
2015-01-01
Purpose: Renal calculi are common extracolonic incidental findings on computed tomographic colonography (CTC). This work aims to develop a fully automated computer-aided diagnosis system to accurately detect renal calculi on CTC images. Methods: The authors developed a total variation (TV) flow method to reduce image noise within the kidneys while maintaining the characteristic appearance of renal calculi. Maximally stable extremal region (MSER) features were then calculated to robustly identify calculi candidates. Finally, the authors computed texture and shape features that were imported to support vector machines for calculus classification. The method was validated on a dataset of 192 patients and compared to a baseline approach that detects calculi by thresholding. The authors also compared their method with the detection approaches using anisotropic diffusion and nonsmoothing. Results: At a false positive rate of 8 per patient, the sensitivities of the new method and the baseline thresholding approach were 69% and 35% (p < 1e-3) on all calculi from 1 to 433 mm³ in the testing dataset. The sensitivities of the detection methods using anisotropic diffusion and nonsmoothing were 36% and 0%, respectively. The sensitivity of the new method increased to 90% if only larger and more clinically relevant calculi were considered. Conclusions: Experimental results demonstrated that TV-flow and MSER features are efficient means to robustly and accurately detect renal calculi on low-dose, high noise CTC images. Thus, the proposed method can potentially improve diagnosis. PMID:25563255
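The TV-flow denoising idea (edge-preserving smoothing by evolving u_t = div(grad u / |grad u|)) can be sketched with a simple explicit scheme; the step size, iteration count, and gradient regularization below are illustrative choices, not the authors' implementation.

```python
import numpy as np

def tv_flow(img, n_iter=20, dt=0.1, eps=1e-2):
    """Explicit total-variation flow u_t = div(grad u / |grad u|).
    eps regularizes the gradient norm; the flux is bounded by 1, which keeps
    the explicit updates from blowing up."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        ux = np.gradient(u, axis=1)
        uy = np.gradient(u, axis=0)
        mag = np.sqrt(ux**2 + uy**2 + eps**2)
        u += dt * (np.gradient(ux / mag, axis=1) + np.gradient(uy / mag, axis=0))
    return u

rng = np.random.default_rng(0)
noisy = rng.normal(size=(32, 32))     # synthetic noise stand-in for CT noise
smoothed = tv_flow(noisy)
```

Unlike linear diffusion, the normalized flux smooths oscillatory noise quickly while moving sharp, high-contrast structures (such as calculi edges) only slowly.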
Mazzoni, Simona; Marchetti, Claudio; Sgarzani, Rossella; Cipriani, Riccardo; Scotti, Roberto; Ciocca, Leonardo
2013-06-01
The aim of the present study was to evaluate the accuracy of prosthetically guided maxillofacial surgery in reconstructing the mandible with a free vascularized flap using custom-made bone plates and a surgical guide to cut the mandible and fibula. The surgical protocol was applied in a study group of seven consecutive mandibular-reconstructed patients who were compared with a control group treated using the standard preplating technique on stereolithographic models (indirect computer-aided design/computer-aided manufacturing method). The precision of both surgical techniques (prosthetically guided maxillofacial surgery and the indirect computer-aided design/computer-aided manufacturing procedure) was evaluated by comparing preoperative and postoperative computed tomographic data and assessing specific landmarks. With regard to midline deviation, no significant difference was documented between the test and control groups. With regard to mandibular angle shift, only one left angle shift on the lateral plane showed a statistically significant difference between the groups. With regard to angular deviation of the body axis, the data showed a significant difference in the arch deviation. All patients in the control group registered more than 8 degrees of deviation, producing a contracture of the external facial profile at the lower margin of the mandible. With regard to condylar position, the postoperative condylar position was better in the test group than in the control group, although no significant difference was detected. The new protocol for mandibular reconstruction using computer-aided design/computer-aided manufacturing prosthetically guided maxillofacial surgery to construct custom-made guides and plates may represent a viable method of reproducing the patient's anatomical contour, giving the surgeon better procedural control and reducing procedure time. Level of evidence: Therapeutic, III.
Giger, Maryellen L.; Chan, Heang-Ping; Boone, John
2008-01-01
The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists’ goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists—as opposed to a completely automatic computer interpretation—focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous—from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. 
CAD research by medical physicists includes many aspects—collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more—from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis. PMID:19175137
Jeong, Ji-Wook; Chae, Seung-Hoon; Chae, Eun Young; Kim, Hak Hee; Choi, Young-Wook; Lee, Sooyeul
2016-01-01
We propose a computer-aided detection (CADe) algorithm for microcalcification (MC) clusters in reconstructed digital breast tomosynthesis (DBT) images. The algorithm consists of prescreening, MC detection, clustering, and false-positive (FP) reduction steps. First, the DBT images containing MC-like objects were enhanced by a multiscale Hessian-based three-dimensional (3D) objectness response function, and a connected-component segmentation method was applied to extract cluster seed objects as potential clustering centers of MCs. Second, a signal-to-noise ratio (SNR) enhanced image was generated to detect the individual MC candidates and prescreen the MC-like objects. Each cluster seed candidate was prescreened by counting the neighboring individual MC candidates near the cluster seed object according to several microcalcification clustering criteria. Next, we introduced bounding boxes for the accepted seed candidates, clustered all overlapping cubes, and examined the resulting clusters. After the FP reduction step, the average number of FPs per case was estimated to be 2.47 per DBT volume at a sensitivity of 83.3%.
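The neighbor-counting prescreen can be sketched as follows; the radius and minimum-count thresholds are hypothetical stand-ins for the paper's clustering criteria, and positions are treated as 3D coordinates in the reconstructed volume.

```python
import numpy as np

def prescreen_seeds(seeds, mc_candidates, radius=5.0, min_neighbors=3):
    """Keep a cluster-seed position only if at least `min_neighbors` individual
    MC candidates lie within `radius` of it (units per the clustering criteria;
    both threshold values here are illustrative)."""
    seeds = np.asarray(seeds, dtype=float)
    mcs = np.asarray(mc_candidates, dtype=float)
    kept = []
    for s in seeds:
        d = np.linalg.norm(mcs - s, axis=1)       # distance to every candidate
        if np.count_nonzero(d <= radius) >= min_neighbors:
            kept.append(s)
    return np.array(kept)

# Toy volume: one seed surrounded by candidates, one isolated seed.
seeds = [[0, 0, 0], [50, 50, 50]]
mcs = [[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 0]]
kept = prescreen_seeds(seeds, mcs)
```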
Xing, Fuyong; Yang, Lin
2016-01-01
Digital pathology and microscopy image analysis is widely used for comprehensive studies of cell morphology or tissue structure. Manual assessment is labor intensive and prone to inter-observer variations. Computer-aided methods, which can significantly improve objectivity and reproducibility, have attracted a great deal of interest in the recent literature. Within the pipeline of building a computer-aided diagnosis system, nucleus or cell detection and segmentation play a very important role in describing molecular morphological information. In the past few decades, many efforts have been devoted to automated nucleus/cell detection and segmentation. In this review, we provide a comprehensive summary of the recent state-of-the-art nucleus/cell segmentation approaches on different types of microscopy images including bright-field, phase-contrast, differential interference contrast (DIC), fluorescence, and electron microscopies. In addition, we discuss the challenges for the current methods and the potential future work of nucleus/cell detection and segmentation. PMID:26742143
Tanaka, Toyohiko; Nitta, Norihisa; Ohta, Shinichi; Kobayashi, Tsuyoshi; Kano, Akiko; Tsuchiya, Keiko; Murakami, Yoko; Kitahara, Sawako; Wakamiya, Makoto; Furukawa, Akira; Takahashi, Masashi; Murata, Kiyoshi
2009-12-01
A computer-aided detection (CAD) system was evaluated for its ability to detect microcalcifications and masses on images obtained with a digital phase-contrast mammography (PCM) system, which is characterised by the sharp images provided by phase contrast and by the high resolution of 25-μm-pixel mammograms. Fifty abnormal and 50 normal mammograms were collected from about 3,500 mammograms and printed on film for reading on a light box. Seven qualified radiologists participated in an observer study based on receiver operating characteristic (ROC) analysis. The average area under the ROC curve (AUC) with and without CAD was 0.927 and 0.897, respectively (P = 0.015). The AUC values improved from 0.840 to 0.888 for microcalcifications (P = 0.034) and from 0.947 to 0.962 for masses (P = 0.025). The application of CAD to the PCM system is a promising approach for the detection of breast cancer in its early stages.
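The AUC figures quoted above are equivalent to the Mann-Whitney statistic. A minimal sketch of computing an empirical AUC from reader scores (scores and the pairwise-comparison form are illustrative):

```python
def auc(pos_scores, neg_scores):
    """Empirical AUC = P(score_pos > score_neg) + 0.5 * P(tie),
    i.e. the Mann-Whitney U statistic divided by n_pos * n_neg."""
    wins = ties = 0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(pos_scores) * len(neg_scores))
```

For large reader studies a rank-based computation is preferred over this O(n²) loop, but the value is identical.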
Computer aided detection of clusters of microcalcifications on full field digital mammograms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ge Jun; Sahiner, Berkman; Hadjiiski, Lubomir M.
2006-08-15
We are developing a computer-aided detection (CAD) system to identify microcalcification clusters (MCCs) automatically on full field digital mammograms (FFDMs). The CAD system includes six stages: preprocessing; image enhancement; segmentation of microcalcification candidates; false positive (FP) reduction for individual microcalcifications; regional clustering; and FP reduction for clustered microcalcifications. At the stage of FP reduction for individual microcalcifications, a truncated sum-of-squares error function was used to improve the efficiency and robustness of the training of an artificial neural network in our CAD system for FFDMs. At the stage of FP reduction for clustered microcalcifications, morphological features and features derived from the artificial neural network outputs were extracted from each cluster. Stepwise linear discriminant analysis (LDA) was used to select the features. An LDA classifier was then used to differentiate clustered microcalcifications from FPs. A data set of 96 cases with 192 images was collected at the University of Michigan. This data set contained 96 MCCs, of which 28 clusters were proven by biopsy to be malignant and 68 were proven to be benign. The data set was separated into two independent data sets for training and testing of the CAD system in a cross-validation scheme. When one data set was used to train and validate the convolution neural network (CNN) in our CAD system, the other data set was used to evaluate the detection performance. With the use of a truncated error metric, the training of CNN could be accelerated and the classification performance was improved. The CNN in combination with an LDA classifier could substantially reduce FPs with a small tradeoff in sensitivity. By using the free-response receiver operating characteristic methodology, it was found that our CAD system can achieve a cluster-based sensitivity of 70, 80, and 90 % at 0.21, 0.61, and 1.49 FPs/image, respectively. 
For case-based performance evaluation, a sensitivity of 70, 80, and 90 % can be achieved at 0.07, 0.17, and 0.65 FPs/image, respectively. We also used a data set of 216 mammograms negative for clustered microcalcifications to further estimate the FP rate of our CAD system. The corresponding FP rates were 0.15, 0.31, and 0.86 FPs/image for cluster-based detection when negative mammograms were used for estimation of FP rates.
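The truncated sum-of-squares error can be sketched as below: each per-sample squared error is clipped at τ², so gross outliers or mislabeled training samples contribute a bounded penalty and cannot dominate the network's weight updates. The threshold τ is illustrative.

```python
import numpy as np

def truncated_sse(output, target, tau=1.0):
    """Sum-of-squares error with each squared term clipped at tau**2.
    Errors beyond tau add a constant tau**2, limiting outlier influence."""
    err2 = (np.asarray(output, dtype=float) - np.asarray(target, dtype=float)) ** 2
    return np.minimum(err2, tau**2).sum()
```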
Symbol Synchronization for Diffusion-Based Molecular Communications.
Jamali, Vahid; Ahmadzadeh, Arman; Schober, Robert
2017-12-01
Symbol synchronization refers to the estimation of the start of a symbol interval and is needed for reliable detection. In this paper, we develop several symbol synchronization schemes for molecular communication (MC) systems where we consider some practical challenges, which have not been addressed in the literature yet. In particular, we take into account that in MC systems, the transmitter may not be equipped with an internal clock and may not be able to emit molecules with a fixed release frequency. Such restrictions hold for practical nanotransmitters, e.g., modified cells, where the lengths of the symbol intervals may vary due to the inherent randomness in the availability of food and energy for molecule generation, the process for molecule production, and the release process. To address this issue, we develop two synchronization-detection frameworks which both employ two types of molecule. In the first framework, one type of molecule is used for symbol synchronization and the other one is used for data detection, whereas in the second framework, both types of molecule are used for joint symbol synchronization and data detection. For both frameworks, we first derive the optimal maximum likelihood (ML) symbol synchronization schemes as performance upper bounds. Since ML synchronization entails high complexity, for each framework, we also propose three low-complexity suboptimal schemes, namely a linear filter-based scheme, a peak observation-based scheme, and a threshold-trigger scheme, which are suitable for MC systems with limited computational capabilities. Furthermore, we study the relative complexity and the constraints associated with the proposed schemes and the impact of the insertion and deletion errors that arise due to imperfect synchronization. 
Our simulation results reveal the effectiveness of the proposed synchronization schemes and suggest that the end-to-end performance of MC systems significantly depends on the accuracy of the symbol synchronization.
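Among the low-complexity alternatives, the peak-observation idea can be sketched as follows, assuming the channel's impulse-response peak delay is known at the receiver; the Gaussian pulse below is an illustrative stand-in for the actual diffusion response.

```python
import numpy as np

def estimate_symbol_start(samples, t, peak_delay):
    """Peak-observation synchronization sketch: the symbol interval is assumed
    to start `peak_delay` before the observed concentration maximum.
    `peak_delay` (the channel impulse-response peak time) is treated as known."""
    return t[np.argmax(samples)] - peak_delay

t = np.linspace(0.0, 10.0, 1001)
start_true, peak_delay = 2.0, 1.5
# Illustrative received concentration: a pulse peaking peak_delay after release.
conc = np.where(t >= start_true,
                np.exp(-((t - start_true - peak_delay) ** 2)), 0.0)
est = estimate_symbol_start(conc, t, peak_delay)
```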
Communication scheme based on evolutionary spatial 2×2 games
NASA Astrophysics Data System (ADS)
Ziaukas, Pranas; Ragulskis, Tautvydas; Ragulskis, Minvydas
2014-06-01
A visual communication scheme based on evolutionary spatial 2×2 games is proposed in this paper. Self-organizing patterns induced by complex interactions between competing individuals are exploited for hiding and transmitting secret visual information. Properties of the proposed communication scheme are discussed in detail. It is shown that the hiding capacity of the system (the minimum size of detectable primitives and the minimum distance between two primitives) is sufficient for the effective transmission of digital dichotomous images. It is also demonstrated that the proposed communication scheme is resilient to time-backwards and plain-image attacks and is highly sensitive to perturbations of the private and public keys. Several computational experiments demonstrate the effectiveness of the proposed communication scheme.
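The self-organizing patterns arise from update rules of this general kind. Below is a toy spatial Prisoner's Dilemma, one member of the 2×2 game family, in which each lattice cell imitates its best-scoring von Neumann neighbor; the payoff values and imitation rule are illustrative, not the paper's exact dynamics.

```python
import numpy as np

def step(s, b=1.8):
    """One synchronous round on a torus. s: 1 = cooperate, 0 = defect.
    Payoffs: a cooperator earns 1 per cooperating neighbor; a defector earns
    the temptation b per cooperating neighbor. Each cell then copies the
    strategy of its highest-scoring neighbor (keeping its own on ties)."""
    shifts = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    coop = sum(np.roll(s, sh, axis=(0, 1)) for sh in shifts)
    payoff = np.where(s == 1, coop, b * coop)
    best, best_pay = s.copy(), payoff.copy()
    for sh in shifts:
        n_s = np.roll(s, sh, axis=(0, 1))
        n_p = np.roll(payoff, sh, axis=(0, 1))
        better = n_p > best_pay
        best = np.where(better, n_s, best)
        best_pay = np.where(better, n_p, best_pay)
    return best

grid = np.ones((8, 8), dtype=int)
grid[4, 4] = 0                     # a single defector seeds a growing pattern
next_grid = step(grid)
```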
ROS-based ground stereo vision detection: implementation and experiments.
Hu, Tianjiang; Zhao, Boxin; Tang, Dengqing; Zhang, Daibing; Kong, Weiwei; Shen, Lincheng
This article concentrates on an open-source implementation of flying object detection in cluttered scenes, which is of significance for ground stereo-aided autonomous landing of unmanned aerial vehicles. The ground stereo vision guidance system is presented with details of the system architecture and workflow. The Chan-Vese detection algorithm is then considered and implemented in the Robot Operating System (ROS) environment. A data-driven interactive scheme is developed to collect datasets for parameter tuning and performance evaluation. Outdoor flying-vehicle experiments captured the stereo sequential image dataset and recorded simultaneous data from the pan-and-tilt unit, onboard sensors, and differential GPS. Experimental results using the collected dataset validate the effectiveness of the published ROS-based detection algorithm.
Low-Complexity Noncoherent Signal Detection for Nanoscale Molecular Communications.
Li, Bin; Sun, Mengwei; Wang, Siyi; Guo, Weisi; Zhao, Chenglin
2016-01-01
Nanoscale molecular communication is a viable way of exchanging information between nanomachines. In this investigation, a low-complexity noncoherent signal detection technique is proposed to mitigate inter-symbol interference (ISI) and additive noise. In contrast to existing coherent detection methods of high complexity, the proposed noncoherent detector is more practical when the channel conditions are hard to acquire accurately or are hidden from the receiver. The proposed scheme employs the molecular concentration difference to detect ISI-corrupted signals, and we demonstrate that it suppresses the ISI effectively. The difference in molecular concentration is a stable characteristic, irrespective of the diffusion channel conditions. In terms of complexity, by excluding matrix operations and likelihood calculations, the new detection scheme is particularly suitable for nanoscale molecular communication systems with a small energy budget or limited computational resources.
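A deliberately simplified sketch of the concentration-difference idea: decisions use only the rise or fall of the received concentration between successive symbol intervals, so no channel state information is required. The encoding convention (rise = 1, fall = 0) is hypothetical, chosen only to illustrate the principle.

```python
def detect_bits(concentrations):
    """Noncoherent detection sketch: compare each symbol-interval concentration
    with the previous one; a rise decodes as 1, a drop (or no rise) as 0.
    The first interval is compared against zero. No channel model is needed,
    since only concentration *differences* enter the decision."""
    bits, prev = [], 0.0
    for c in concentrations:
        bits.append(1 if c > prev else 0)
        prev = c
    return bits
```

Because residual molecules from earlier symbols raise the baseline for all intervals roughly equally, differencing cancels much of the ISI, which is the intuition the paper develops rigorously.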
Computer-aided detection of basal cell carcinoma through blood content analysis in dermoscopy images
NASA Astrophysics Data System (ADS)
Kharazmi, Pegah; Kalia, Sunil; Lui, Harvey; Wang, Z. Jane; Lee, Tim K.
2018-02-01
Basal cell carcinoma (BCC) is the most common type of skin cancer; at advanced stages it is highly damaging to the skin and imposes large costs on the healthcare system. However, most BCCs are easily curable if detected at an early stage. Given limited access to dermatologists and expert physicians, non-invasive computer-aided diagnosis is a viable option for skin cancer screening. A clinical biomarker of cancerous tumors is increased vascularization and excess blood flow. In this paper, we present a computer-aided technique to differentiate cancerous skin tumors from benign lesions based on their vascular characteristics. The dermoscopy image of the lesion is first decomposed using independent component analysis of the RGB channels to derive melanin and hemoglobin maps. A novel set of clinically inspired features and ratiometric measurements is then extracted from each map to characterize the vascular properties and blood content of the lesion, and the feature set is fed into a random forest classifier. Over a dataset of 664 skin lesions, the proposed method achieved an area under the ROC curve of 0.832 in 10-fold cross-validation for differentiating basal cell carcinomas from benign lesions.
Numerical Simulation of Vitiation Effects on a Hydrogen-Fueled Dual-Mode Scramjet
NASA Technical Reports Server (NTRS)
Vyas, Manan A.; Engblom, William A.; Georgiadis, Nicholas J.; Trefny, Charles J.; Bhagwandin, Vishal A.
2010-01-01
The Wind-US computational fluid dynamics (CFD) flow solver was used to simulate dual-mode direct-connect ramjet/scramjet engine flowpath tests conducted in the University of Virginia (UVa) Supersonic Combustion Facility (SCF). The objective was to develop a computational capability within Wind-US to aid current hypersonic research and provide insight into flow and chemistry details that are not resolved by the available instruments. Computational results are compared with experimental data to validate the accuracy of the numerical modeling. These results include two fuel-off non-reacting cases and eight fuel-on reacting cases with different equivalence ratios, split between one set with a clean (non-vitiated) air supply and the other with a vitiated air supply (12 percent H2O vapor). The Peters and Rogg hydrogen-air chemical kinetics model was selected for the scramjet simulations. A limited sensitivity study on the choice of turbulence model and inviscid flux scheme led to the selection of the k-epsilon model and the Harten, Lax and van Leer (for contact waves) (HLLC) scheme for general use. Simulation results show reasonably good agreement with experimental data, and the overall vitiation effects were captured.
A Kirchhoff approach to seismic modeling and prestack depth migration
NASA Astrophysics Data System (ADS)
Liu, Zhen-Yue
1993-05-01
The Kirchhoff integral provides a robust method for implementing seismic modeling and prestack depth migration that can handle lateral velocity variation and turning waves. With little extra computational cost, Kirchhoff-type migration can produce multiple outputs that have the same phase but different amplitudes, unlike other migration methods; the ratio of these amplitudes is helpful in computing quantities such as the reflection angle. I develop a seismic modeling and prestack depth migration method based on the Kirchhoff integral that handles both laterally variant velocity and dips beyond 90 degrees. The method uses a finite-difference algorithm to calculate travel times and WKBJ amplitudes for the Kirchhoff integral. Compared to ray-tracing algorithms, the finite-difference algorithm gives an efficient implementation and single-valued quantities (first arrivals) on output. In my finite-difference algorithm, an upwind scheme is used to calculate travel times, and the Crank-Nicolson scheme is used to calculate amplitudes; interpolation is applied to reduce computational cost. The modeling and migration algorithms require a smooth velocity function, so I develop a velocity-smoothing technique based on damped least-squares to aid in obtaining a successful migration.
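The upwind travel-time computation can be sketched with a Godunov upwind update and Gauss-Seidel sweeps (a fast-sweeping-style first-arrival solver for |grad T| = s, shown as an illustration rather than the paper's exact finite-difference scheme):

```python
import numpy as np

def eikonal_sweep(slowness, src, n_sweeps=4, h=1.0):
    """First-arrival travel times T solving |grad T| = s on a grid with
    spacing h, via Godunov upwind differences and alternating-direction
    Gauss-Seidel sweeps. Single-valued first arrivals, as in the text."""
    ny, nx = slowness.shape
    T = np.full((ny, nx), np.inf)
    T[src] = 0.0
    orders = [(range(ny), range(nx)),
              (range(ny), range(nx - 1, -1, -1)),
              (range(ny - 1, -1, -1), range(nx)),
              (range(ny - 1, -1, -1), range(nx - 1, -1, -1))]
    for _ in range(n_sweeps):
        for ys, xs in orders:
            for i in ys:
                for j in xs:
                    if (i, j) == src:
                        continue
                    a = min(T[i - 1, j] if i > 0 else np.inf,
                            T[i + 1, j] if i < ny - 1 else np.inf)
                    b = min(T[i, j - 1] if j > 0 else np.inf,
                            T[i, j + 1] if j < nx - 1 else np.inf)
                    if min(a, b) == np.inf:
                        continue               # no upwind information yet
                    f = slowness[i, j] * h
                    if abs(a - b) >= f:        # one-sided Godunov update
                        t = min(a, b) + f
                    else:                      # two-sided quadratic update
                        t = 0.5 * (a + b + np.sqrt(2 * f**2 - (a - b) ** 2))
                    T[i, j] = min(T[i, j], t)
    return T

T = eikonal_sweep(np.ones((5, 5)), src=(0, 0))   # constant slowness test
```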
Hardware accelerator design for change detection in smart camera
NASA Astrophysics Data System (ADS)
Singh, Sanjay; Dunga, Srinivasa Murali; Saini, Ravi; Mandal, A. S.; Shekhar, Chandra; Chaudhury, Santanu; Vohra, Anil
2011-10-01
Smart cameras are important components in human-computer interaction. In any remote surveillance scenario, smart cameras have to take intelligent decisions and select frames of significant change to minimize communication and processing overhead. Among the many algorithms for change detection, a clustering-based scheme was proposed for smart camera systems. However, on the general-purpose processors (such as the PowerPC) available on FPGAs, that algorithm achieves frame rates far below real-time requirements. This paper proposes a hardware accelerator capable of detecting changes in a scene in real time using the clustering-based change detection scheme. The system was designed and simulated in VHDL and implemented on a Xilinx XUP Virtex-II Pro FPGA board. The resulting frame rate is 30 frames per second for QVGA resolution in gray scale.
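A clustering-based change detector maintains a small set of background clusters per pixel; a software sketch of the idea follows (the cluster count, match threshold, and update rate are illustrative, and the hardware pipeline parallelizes exactly this per-pixel work).

```python
import numpy as np

def detect_changes(frame, centroids, thresh=20.0, lr=0.05):
    """Per-pixel clustering sketch: a pixel is 'changed' if its gray value is
    farther than `thresh` from its nearest background cluster centroid;
    matched (background) centroids are updated with learning rate `lr`.
    centroids: (K, H, W) array of per-pixel background clusters, mutated
    in place."""
    dist = np.abs(centroids - frame)             # (K, H, W) distances
    changed = dist.min(axis=0) > thresh          # no cluster matches
    k = dist.argmin(axis=0)                      # nearest cluster index
    match = np.take_along_axis(centroids, k[None], axis=0)[0]
    updated = match + lr * (frame - match)       # drift toward current frame
    new_vals = np.where(changed, match, updated) # changed pixels: keep model
    np.put_along_axis(centroids, k[None], new_vals[None], axis=0)
    return changed

centroids = np.full((2, 4, 4), 100.0)            # background learned as gray 100
frame = np.full((4, 4), 100.0)
frame[0, 0] = 200.0                              # an intruding bright object
mask = detect_changes(frame, centroids)
```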
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gur, Sourav; Frantziskonis, George N.; Univ. of Arizona, Tucson, AZ
Here, we report results from a numerical study of multi-time-scale bistable dynamics for CO oxidation on a catalytic surface in a flowing, well-mixed gas stream. The problem is posed in terms of surface and gas-phase submodels that dynamically interact in the presence of stochastic perturbations, reflecting the impact of molecular-scale fluctuations on the surface and turbulence in the gas. Wavelet-based methods are used to encode and characterize the temporal dynamics produced by each submodel and detect the onset of sudden state shifts (bifurcations) caused by nonlinear kinetics. When impending state shifts are detected, a more accurate but computationally expensive integration scheme can be used. This appears to make it possible, at least in some cases, to decrease the net computational burden associated with simulating multi-time-scale, nonlinear reacting systems by limiting the amount of time in which the more expensive integration schemes are required. Critical to achieving this is being able to detect unstable temporal transitions such as the bistable shifts in the example problem considered here. Lastly, our results indicate that a unique wavelet-based algorithm based on the Lipschitz exponent is capable of making such detections, even under noisy conditions, and may find applications in critical transition detection problems beyond catalysis.
Road sign recognition with fuzzy adaptive pre-processing models.
Lin, Chien-Chuan; Wang, Ming-Shi
2012-01-01
A road sign recognition system based on adaptive image pre-processing models using two fuzzy inference schemes is proposed. The first fuzzy inference scheme checks changes in light illumination and in the rich red color of a frame image within designated checking areas. The other checks the vehicle's speed and steering-wheel angle to select an adaptive size and position for the detection area. An Adaboost classifier was employed to detect road sign candidates in an image, and the support vector machine technique was employed to recognize the content of the candidates. Prohibitory and warning road traffic signs are the processing targets of this research. The detection rate in the detection phase is 97.42%; in the recognition phase, the recognition rate is 93.04%, giving a total system accuracy of 92.47%. For video sequences, the best accuracy rate is 90.54% and the average accuracy rate is 80.17%, with an average computing time of 51.86 milliseconds per frame. The proposed system can not only overcome problems of low illumination and rich red color around the road sign but also offer high detection rates and high computing performance.
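The second fuzzy inference scheme, which adapts the detection area from speed and steering angle, can be sketched with triangular memberships and weighted-average (Sugeno-style) defuzzification. The rule base and membership breakpoints below are hypothetical illustrations, not the paper's tuned values.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def detection_area_scale(speed, steer):
    """Sugeno-style fuzzy rules (hypothetical rule base): higher speed
    shrinks the search window, larger steering angle widens it.
    Returns a scale factor for the detection-area size."""
    slow, fast = tri(speed, -1, 0, 60), tri(speed, 40, 100, 161)
    straight, turning = tri(steer, -1, 0, 20), tri(steer, 10, 45, 91)
    rules = [(min(slow, straight), 1.0),   # slow, straight: full-size area
             (min(slow, turning), 1.2),    # slow, turning: widen the area
             (min(fast, straight), 0.6),   # fast, straight: shrink the area
             (min(fast, turning), 0.9)]    # fast, turning: shrink slightly
    num = sum(w * z for w, z in rules)
    den = sum(w for w, z in rules)
    return num / den if den else 1.0
```

Defuzzifying by the weighted average of rule consequents keeps the output continuous as the inputs move between fuzzy sets.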
Tan, Maxine; Pu, Jiantao; Zheng, Bin
2014-01-01
Purpose: Improving radiologists' performance in classification between malignant and benign breast lesions is important to increase cancer detection sensitivity and reduce false-positive recalls. For this purpose, developing computer-aided diagnosis (CAD) schemes has been attracting research interest in recent years. In this study, we investigated a new feature selection method for the task of breast mass classification. Methods: We initially computed 181 image features based on mass shape, spiculation, contrast, presence of fat or calcifications, texture, isodensity, and other morphological features. From this large image feature pool, we used a sequential forward floating selection (SFFS)-based feature selection method to select relevant features, and analyzed their performance using a support vector machine (SVM) model trained for the classification task. On a database of 600 benign and 600 malignant mass regions of interest (ROIs), we performed the study using a ten-fold cross-validation method. Feature selection and optimization of the SVM parameters were conducted on the training subsets only. Results: An area under the receiver operating characteristic curve (AUC) of 0.805±0.012 was obtained for the classification task. The results also showed that the features most frequently selected by the SFFS-based algorithm over the 10-fold iterations were those related to mass shape, isodensity, and presence of fat, which is consistent with the image features frequently used by radiologists in the clinical environment for mass classification. The study also indicated that accurately computing mass spiculation features from the projection mammograms was difficult; these features performed poorly in the mass classification task due to tissue overlap within the benign mass regions.
Conclusions: In conclusion, this comprehensive feature analysis study provided new and valuable information for optimizing computerized mass classification schemes that may have potential to be useful as a “second reader” in future clinical practice. PMID:24664267
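SFFS, the feature selection method named above, alternates a greedy forward step with a conditional backward ("floating") step. The sketch below is a generic SFFS implementation wrapped around a toy nearest-centroid scoring criterion; the paper used an SVM-based criterion, so `centroid_score` is a stand-in assumption.

```python
import numpy as np

def sffs(X, y, k, score):
    """Sequential forward floating selection: greedily add the best
    feature, then conditionally drop an earlier feature while doing so
    strictly improves the wrapper score."""
    selected = []
    while len(selected) < k:
        rest = [j for j in range(X.shape[1]) if j not in selected]
        selected.append(max(rest, key=lambda j: score(X[:, selected + [j]], y)))
        while len(selected) > 2:                    # floating (backward) step
            base = score(X[:, selected], y)
            drop = max(selected[:-1],               # never drop the newest feature
                       key=lambda j: score(X[:, [i for i in selected if i != j]], y))
            if score(X[:, [i for i in selected if i != drop]], y) > base:
                selected.remove(drop)
            else:
                break
    return selected

def centroid_score(Xs, y):
    """Toy wrapper criterion: training accuracy of a nearest-centroid rule."""
    m0, m1 = Xs[y == 0].mean(0), Xs[y == 1].mean(0)
    pred = np.linalg.norm(Xs - m1, axis=1) < np.linalg.norm(Xs - m0, axis=1)
    return (pred == y).mean()
```

The backward step is what distinguishes SFFS from plain forward selection: a feature that looked good early can be discarded once a better-complementing subset emerges.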
Approximate maximum likelihood decoding of block codes
NASA Technical Reports Server (NTRS)
Greenberger, H. J.
1979-01-01
Approximate maximum likelihood decoding algorithms, based upon selecting a small set of candidate code words with the aid of the estimated probability of error of each received symbol, can give performance close to optimum with a reasonable amount of computation. By combining the best features of various algorithms and taking care to perform each step as efficiently as possible, a decoding scheme was developed that can decode codes with better performance than those presently in use, without requiring an unreasonable amount of computation. The discussion of the details and tradeoffs of presently known efficient optimum and near-optimum decoding algorithms leads naturally to the one that embodies the best features of all of them.
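The candidate-codeword idea can be illustrated with a Chase-style decoder for the systematic Hamming(7,4) code: the hard decision is perturbed in the least reliable positions, each trial is re-encoded into a valid codeword, and the candidate closest to the received vector wins. This is a generic sketch of the technique, not the specific scheme developed in the report.

```python
import numpy as np
from itertools import product

# Generator matrix of the systematic Hamming(7,4) code.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def chase_decode(r, t=3):
    """Approximate soft-decision ML decoding (Chase-style): form candidate
    codewords from the hard decision perturbed in the t least reliable
    positions, then pick the candidate closest to r in Euclidean distance.
    Candidates are re-encoded from their information part, so every
    candidate is a valid codeword of this systematic code."""
    hard = (r < 0).astype(int)               # BPSK mapping: bit 0 -> +1, bit 1 -> -1
    weak = np.argsort(np.abs(r))[:t]         # least reliable symbol positions
    best, best_d = None, np.inf
    for flips in product([0, 1], repeat=t):
        trial = hard.copy()
        trial[weak] ^= np.array(flips)
        cand = (trial[:4] @ G) % 2           # re-encode the information bits
        d = np.sum((r - (1 - 2 * cand)) ** 2)
        if d < best_d:
            best, best_d = cand, d
    return best
```

Only 2^t candidates are scored instead of all 2^k codewords, which is the computational saving the abstract describes.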
NASA Technical Reports Server (NTRS)
Davis, Robert N.; Polites, Michael E.; Trevino, Luis C.
2004-01-01
This paper details a novel scheme for autonomous component health management (ACHM) with failed-actuator detection and failed-sensor detection, identification, and avoidance. This new scheme has features that far exceed the performance of systems with triple-redundant sensing and voting, yet requires fewer sensors and could be applied to any system with redundant sensing. Relevant background to the ACHM scheme is provided, and simulation results are presented for the application of the scheme to a single-axis spacecraft attitude control system with a 3rd-order plant and dual-redundant measurement of system states. ACHM fulfills key functions needed by an integrated vehicle health monitoring (IVHM) system. It is autonomous and adaptive; works in real time; provides optimal state estimation; identifies failed components; avoids failed components; reconfigures for multiple failures; reconfigures for intermittent failures; works for hard-over, soft, and zero-output failures; and works for both open- and closed-loop systems. The ACHM scheme combines a prefilter that generates preliminary state estimates, detects and identifies failed sensors and actuators, and avoids the use of failed sensors in state estimation with a fixed-gain Kalman filter that generates optimal state estimates and provides model-based state estimates that form an integral part of the failure detection logic. The results show that ACHM successfully isolates multiple persistent and intermittent hard-over, soft, and zero-output failures. It is now ready to be tested on a computer model of an actual system.
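The core mechanism, comparing each sensor's innovation against a model-based prediction and excluding sensors that fail the gate, can be sketched with a scalar Kalman filter over dual-redundant sensors. This is a deliberately simplified illustration of innovation gating, not the paper's full ACHM prefilter; the gate value and random-walk plant model are assumptions.

```python
import numpy as np

def run_achm_like_filter(meas, q=1e-4, r=0.01, gate=1.0):
    """Scalar Kalman filter over dual-redundant sensors with innovation
    gating: a sensor whose innovation exceeds the gate is declared failed
    for that step and excluded from the state update (a simplified sketch
    of failure detection and avoidance, not the paper's full ACHM scheme)."""
    x, p = meas[0, 0], 1.0
    flags = np.zeros(meas.shape, dtype=bool)
    for k in range(1, meas.shape[0]):
        p += q                               # predict (random-walk state model)
        for i in range(meas.shape[1]):
            nu = meas[k, i] - x              # innovation against the prediction
            if abs(nu) > gate:               # model-based failure detection
                flags[k, i] = True
                continue
            kgain = p / (p + r)              # update using healthy sensors only
            x += kgain * nu
            p *= (1 - kgain)
    return x, flags
```

Because the gate compares against the filter's own prediction rather than a voted value, detection still works with only two sensors, which is the advantage over triple-redundant voting noted in the abstract.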
Terrestrial implications of mathematical modeling developed for space biomedical research
NASA Technical Reports Server (NTRS)
Lujan, Barbara F.; White, Ronald J.; Leonard, Joel I.; Srinivasan, R. Srini
1988-01-01
This paper summarizes several related research projects supported by NASA which seek to apply computer models to space medicine and physiology. These efforts span a wide range of activities, including mathematical models used for computer simulations of physiological control systems; power spectral analysis of physiological signals; pattern recognition models for detection of disease processes; and computer-aided diagnosis programs.
The analysis of HIV/AIDS drug-resistant on networks
NASA Astrophysics Data System (ADS)
Liu, Maoxing
2014-01-01
In this paper, we present a Human Immunodeficiency Virus (HIV)/Acquired Immune Deficiency Syndrome (AIDS) drug-resistance model using an ordinary differential equation (ODE) formulation on scale-free networks. We derive the threshold for the epidemic to die out on an infinite scale-free network. We also prove the stability of the disease-free equilibrium (DFE) and the persistence of HIV/AIDS infection. The effects of two immunization schemes, a proportional scheme and targeted vaccination, are studied and compared. We find that the targeted strategy compares favorably to proportional condom use, having a prominent effect in controlling the spread of HIV/AIDS on scale-free networks.
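The style of model described here, ODE dynamics stratified by node degree on a scale-free network, can be sketched with the classic degree-based mean-field SIS equations, whose epidemic threshold is lam_c = <k>/<k^2>. This is a generic illustration of the modeling framework, with parameters chosen for the demo, not the paper's drug-resistance model.

```python
import numpy as np

def epidemic_prevalence(lam, gamma=2.5, kmax=100, t_end=200.0, dt=0.01):
    """Degree-based mean-field SIS dynamics on a scale-free network with
    P(k) ~ k^-gamma:  dI_k/dt = -I_k + lam*k*(1 - I_k)*Theta(t), where
    Theta = sum_k k P(k) I_k / <k>.  Returns the final mean prevalence."""
    k = np.arange(1, kmax + 1, dtype=float)
    Pk = k ** -gamma
    Pk /= Pk.sum()
    mean_k = (k * Pk).sum()
    I = np.full(kmax, 0.01)                  # small initial seed in every degree class
    for _ in range(int(t_end / dt)):
        theta = (k * Pk * I).sum() / mean_k  # prob. a random edge reaches an infected node
        I += dt * (-I + lam * k * (1.0 - I) * theta)
    return (Pk * I).sum()

def epidemic_threshold(gamma=2.5, kmax=100):
    """Classic mean-field threshold lam_c = <k>/<k^2>, which shrinks as
    kmax grows for gamma <= 3 (the well-known scale-free result)."""
    k = np.arange(1, kmax + 1, dtype=float)
    Pk = k ** -gamma
    Pk /= Pk.sum()
    return (k * Pk).sum() / (k * k * Pk).sum()
```

Below the threshold the infection dies out; above it, a positive endemic prevalence persists, which is the dichotomy (DFE stability versus persistence) the abstract proves.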
Investigation on improved Gabor order tracking technique and its applications
NASA Astrophysics Data System (ADS)
Pan, Min-Chun; Chiu, Chun-Ching
2006-08-01
The study proposes an improved Gabor order tracking (GOT) technique to cope with crossing-order/spectral components that cannot be effectively separated using the original GOT scheme. The improvement aids both the reconstruction and the interpretation of two crossing orders/spectra, such as a transmission-element-related order and a structural resonance. The dual function of the Gabor elementary function can affect the precision of tracked orders; in the paper, its influence on the computed Gabor expansion coefficients is investigated. To apply the improved scheme in practice, the separation and extraction of close-order components of vibration signals measured from a transmission-element test bench are illustrated using both the GOT and Vold-Kalman filtering OT methods, and comparisons between the two schemes are summarized from the processing results. A further experiment demonstrates the ranking of noise components from a riding electric scooter; the singled-out dominant noise sources can inform subsequent design-remodeling tasks.
Vamsy, Mohana; Dattatreya, PS; Parakh, Megha; Dayal, Monal; Rao, VVS Prabhakar
2013-01-01
Primary testicular lymphoma (PTL), a relatively rare form of non-Hodgkin's lymphoma with an incidence of 1-2%, has a propensity to occur at later ages, above 50 years. PTL spreads to extranodal sites due to a deficiency of extracellular adhesion molecules. We present the detection of multiple sites of extranodal involvement of PTL by an F-18 positron emission tomography/computed tomography study, aiding early detection of dissemination and thus staging and management. PMID:24019676
Computer-aided diagnostics of screening mammography using content-based image retrieval
NASA Astrophysics Data System (ADS)
Deserno, Thomas M.; Soiron, Michael; de Oliveira, Júlia E. E.; de A. Araújo, Arnaldo
2012-03-01
Breast cancer is one of the main causes of death among women in occidental countries. In recent years, screening mammography has been established worldwide for early detection of breast cancer, and computer-aided diagnostics (CAD) is being developed to assist physicians reading mammograms. A promising method for CAD is content-based image retrieval (CBIR). Recently, we have developed a classification scheme for suspicious tissue patterns based on the support vector machine (SVM). In this paper, we continue moving towards automatic CAD of screening mammography. The experiments are based on a total of 10,509 radiographs collected from different sources. Of these, 3,375 images have one chain-code annotation of cancerous regions and 430 radiographs have more than one. In different experiments, this data is divided into 12 and 20 classes, distinguishing between four categories of tissue density, three categories of pathology, and, in the 20-class problem, two categories of lesion type. Balancing the number of images in each class leaves 233 and 45 images in each of the 12 and 20 classes, respectively. Using a two-dimensional principal component analysis, features are extracted from small patches of 128 x 128 pixels and classified by means of an SVM. Overall, the accuracy of the raw classification was 61.6% and 52.1% for the 12- and 20-class problems, respectively. The confusion matrices are assessed for detailed analysis. Furthermore, an implementation of an SVM-based CBIR system for CADx in screening mammography is presented. In conclusion, with smarter patch extraction, the CBIR approach might reach precision rates that are helpful for physicians. This, however, needs more comprehensive evaluation on clinical data.
Computer-aided marginal artery detection on computed tomographic colonography
NASA Astrophysics Data System (ADS)
Wei, Zhuoshi; Yao, Jianhua; Wang, Shijun; Liu, Jiamin; Summers, Ronald M.
2012-03-01
Computed tomographic colonography (CTC) is a minimally invasive technique for colonic polyp and cancer screening. The marginal artery of the colon, also known as the marginal artery of Drummond, is the blood vessel that connects the inferior mesenteric artery with the superior mesenteric artery. The marginal artery runs parallel to the colon for its entire length, providing the blood supply to the colon. Detecting the marginal artery may benefit computer-aided detection (CAD) of colonic polyps. It can be used to identify the teniae coli based on their anatomic spatial relationship. It can also serve as an alternative marker for colon localization in cases of colon collapse, when the endoluminal centerline cannot be computed directly. This paper proposes an automatic method for marginal artery detection on CTC. To the best of our knowledge, this is the first work presented for this purpose. Our method includes two stages. The first stage extracts the blood vessels in the abdominal region; the eigenvalues of the Hessian matrix are used to detect line-like structures in the images. The second stage reduces the false positives from the first stage. We used two different masks to exclude false-positive vessel regions: a dilated colon mask obtained by colon segmentation, and an eroded visceral fat mask obtained by fat segmentation in the abdominal region. We tested our method on a CTC dataset with 6 cases. Using ratio-of-overlap with manual labeling of the marginal artery as the standard of reference, our method yielded true-positive, false-positive, and false-negative fractions of 89%, 33%, and 11%, respectively.
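Hessian-eigenvalue line detection works because a bright tube gives one strongly negative eigenvalue (across the tube) and one near zero (along it). The sketch below shows the idea in 2-D with a simple ridge response; the paper works in 3-D and its exact response function is not specified, so this formulation is an illustrative assumption.

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian smoothing with a sampled, normalized kernel."""
    half = int(3 * sigma)
    x = np.arange(-half, half + 1)
    g = np.exp(-x * x / (2.0 * sigma * sigma))
    g /= g.sum()
    out = np.apply_along_axis(lambda row: np.convolve(row, g, mode="same"), 1, img)
    return np.apply_along_axis(lambda col: np.convolve(col, g, mode="same"), 0, out)

def line_filter(img, sigma=1.5):
    """2-D sketch of Hessian-eigenvalue line detection: bright tubular
    structures give one strongly negative eigenvalue and one near zero."""
    sm = gaussian_blur(img.astype(float), sigma)
    gy, gx = np.gradient(sm)
    hyy, hyx = np.gradient(gy)
    hxy, hxx = np.gradient(gx)
    # closed-form eigenvalues of the symmetric 2x2 Hessian at every pixel
    tr, det = hxx + hyy, hxx * hyy - hxy * hyx
    disc = np.sqrt(np.maximum(tr * tr / 4.0 - det, 0.0))
    l1, l2 = tr / 2.0 - disc, tr / 2.0 + disc      # l1 <= l2
    response = np.where(l1 < 0, -l1 - np.abs(l2), 0.0)  # ridge: l1 << 0, |l2| small
    return np.maximum(response, 0.0)
```

Subtracting |l2| suppresses blob-like responses, where both eigenvalues are large, so the filter favors elongated structures such as vessels.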
Rooijakkers, Michiel; Rabotti, Chiara; Bennebroek, Martijn; van Meerbergen, Jef; Mischi, Massimo
2011-01-01
Non-invasive fetal health monitoring during pregnancy has become increasingly important. Recent advances in signal processing technology have enabled fetal monitoring during pregnancy using abdominal ECG recordings. Ubiquitous ambulatory monitoring for continuous fetal health measurement is, however, still infeasible due to the computational complexity of noise-robust solutions. In this paper, an ECG R-peak detection algorithm for ambulatory R-peak detection is proposed as part of a fetal ECG detection algorithm. The proposed algorithm is optimized to reduce computational complexity while increasing R-peak detection quality compared to existing R-peak detection schemes. Validation of the algorithm is performed on two manually annotated datasets, the MIT/BIH Arrhythmia database and an in-house abdominal database. Both R-peak detection quality and computational complexity are compared to state-of-the-art algorithms described in the literature. With detection error rates of 0.22% and 0.12% on the MIT/BIH Arrhythmia and in-house databases, respectively, the quality of the proposed algorithm is comparable to the best state-of-the-art algorithms, at a reduced computational complexity.
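A typical low-complexity R-peak detection pipeline, in the spirit of Pan-Tompkins rather than the paper's proprietary algorithm, is: derivative, squaring, moving-window integration, then an adaptive threshold with a refractory period. The window lengths and threshold fraction below are illustrative assumptions.

```python
import numpy as np

def detect_r_peaks(ecg, fs, window=0.15, refractory=0.25):
    """Low-complexity R-peak detector sketch: derivative -> squaring ->
    moving-window integration -> threshold crossings with a refractory
    period, refined to the strongest raw sample in the window."""
    wlen = int(window * fs)
    d = np.diff(ecg, prepend=ecg[0])                       # emphasize steep slopes
    energy = np.convolve(d * d, np.ones(wlen) / wlen, mode="same")
    thresh = 0.3 * energy.max()
    above = energy > thresh
    peaks, last = [], -int(refractory * fs)
    for n in range(1, len(ecg)):
        if above[n] and not above[n - 1] and n - last >= int(refractory * fs):
            seg = slice(n, min(n + wlen, len(ecg)))        # refine within the window
            peaks.append(n + int(np.argmax(ecg[seg])))
            last = n
    return peaks
```

Every stage is a streaming operation with O(1) work per sample, which is what makes this family of detectors attractive for ambulatory, battery-powered use.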
Heidari, Morteza; Khuzani, Abolfazl Zargari; Hollingsworth, Alan B; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qiu, Yuchen; Liu, Hong; Zheng, Bin
2018-01-30
In order to automatically identify a set of effective mammographic image features and build an optimal breast cancer risk stratification model, this study aims to investigate the advantages of applying a machine learning approach embedded with a locality preserving projection (LPP) based feature combination and regeneration algorithm to predict short-term breast cancer risk. A dataset involving negative mammograms acquired from 500 women was assembled. This dataset was divided into two age-matched classes: 250 high-risk cases in which cancer was detected in the next subsequent mammography screening and 250 low-risk cases, which remained negative. First, a computer-aided image processing scheme was applied to segment fibro-glandular tissue depicted on mammograms and initially compute 44 features related to the bilateral asymmetry of mammographic tissue density distribution between the left and right breasts. Next, a multi-feature-fusion-based machine learning classifier was built to predict the risk of cancer detection in the next mammography screening. A leave-one-case-out (LOCO) cross-validation method was applied to train and test the machine learning classifier embedded with the LPP algorithm, which generated a new operational vector with 4 features using a maximal variance approach in each LOCO process. Results showed a 9.7% increase in risk prediction accuracy when using this LPP-embedded machine learning approach. An increasing trend of adjusted odds ratios was also detected, with odds ratios rising from 1.0 to 11.2. This study demonstrated that applying the LPP algorithm effectively reduced feature dimensionality and yielded higher and potentially more robust performance in predicting short-term breast cancer risk.
A new fractional order derivative based active contour model for colon wall segmentation
NASA Astrophysics Data System (ADS)
Chen, Bo; Li, Lihong C.; Wang, Huafeng; Wei, Xinzhou; Huang, Shan; Chen, Wensheng; Liang, Zhengrong
2018-02-01
Segmentation of the colon wall plays an important role in advancing computed tomographic colonography (CTC) toward a screening modality. Due to the low contrast of CT attenuation around the colon wall, accurate segmentation of the boundary of both the inner and outer wall is very challenging. In this paper, based on the geodesic active contour model, we develop a new model for colon wall segmentation. First, tagged materials in CTC images were automatically removed via a partial volume (PV) based electronic colon cleansing (ECC) strategy. We then present a new fractional-order derivative based active contour model to segment the volumetric colon wall from the cleansed CTC images. In this model, the region-based Chan-Vese model is incorporated as an energy term so that not only edge/gradient information but also region/volume information is taken into account in the segmentation process. Furthermore, a fractional-order derivative energy term is developed to preserve low-frequency information and improve the noise immunity of the new segmentation model. The proposed colon wall segmentation approach was validated on 16 patient CTC scans. Experimental results indicate that the present scheme is very promising for automatically segmenting the colon wall, thus facilitating computer-aided detection of initial colonic polyp candidates via CTC.
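The fractional-order derivative underlying such an energy term is commonly discretized with the Grünwald-Letnikov formula, whose recursively generated weights interpolate smoothly between the identity (alpha = 0) and the ordinary backward difference (alpha = 1). This is a sketch of the standard discretization, not the paper's specific contour energy.

```python
import numpy as np

def gl_fractional_derivative(f, alpha, h=1.0):
    """Grunwald-Letnikov fractional derivative of a sampled signal f:
    D^alpha f(x_n) ~ h^(-alpha) * sum_k w_k * f(x_{n-k}), with
    w_0 = 1 and w_k = w_{k-1} * (1 - (alpha + 1)/k)."""
    n = len(f)
    w = np.ones(n)
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    out = np.zeros(n)
    for i in range(n):
        # pair w_k with f[i - k]: f[i::-1] is f[i], f[i-1], ..., f[0]
        out[i] = (w[:i + 1] * f[i::-1]).sum() / h ** alpha
    return out
```

For 0 < alpha < 1 the weights decay slowly, so the operator keeps a long memory of low-frequency content, which is the noise-immunity property the abstract exploits.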
Shin, Hoo-Chang; Roth, Holger R; Gao, Mingchen; Lu, Le; Xu, Ziyue; Nogues, Isabella; Yao, Jianhua; Mollura, Daniel; Summers, Ronald M
2016-05-01
Remarkable progress has been made in image recognition, primarily due to the availability of large-scale annotated datasets and deep convolutional neural networks (CNNs). CNNs enable learning data-driven, highly representative, hierarchical image features from sufficient training data. However, obtaining datasets as comprehensively annotated as ImageNet in the medical imaging domain remains a challenge. There are currently three major techniques that successfully apply CNNs to medical image classification: training the CNN from scratch, using off-the-shelf pre-trained CNN features, and conducting unsupervised CNN pre-training with supervised fine-tuning. Another effective method is transfer learning, i.e., fine-tuning CNN models pre-trained on a natural image dataset for medical image tasks. In this paper, we exploit three important, but previously understudied, factors in applying deep convolutional neural networks to computer-aided detection problems. We first explore and evaluate different CNN architectures. The studied models contain 5 thousand to 160 million parameters and vary in number of layers. We then evaluate the influence of dataset scale and spatial image context on performance. Finally, we examine when and why transfer learning from pre-trained ImageNet models (via fine-tuning) can be useful. We study two specific computer-aided detection (CADe) problems, namely thoraco-abdominal lymph node (LN) detection and interstitial lung disease (ILD) classification. We achieve state-of-the-art performance on mediastinal LN detection and report the first five-fold cross-validation classification results on predicting axial CT slices with ILD categories. Our extensive empirical evaluation, CNN model analysis, and valuable insights can be extended to the design of high-performance CAD systems for other medical imaging tasks.
Qi, Bing; Lougovski, Pavel; Pooser, Raphael C.; ...
2015-10-21
Continuous-variable quantum key distribution (CV-QKD) protocols based on coherent detection have been studied extensively in both theory and experiment. In all the existing implementations of CV-QKD, both the quantum signal and the local oscillator (LO) are generated from the same laser and propagate through the insecure quantum channel. This arrangement may open security loopholes and limit the potential applications of CV-QKD. In our paper, we propose and demonstrate a pilot-aided feedforward data recovery scheme that enables reliable coherent detection using a “locally” generated LO. Using two independent commercial laser sources and a spool of 25-km optical fiber, we construct a coherent communication system. The variance of the phase noise introduced by the proposed scheme is measured to be 0.04 rad^2, which is small enough to enable secure key distribution. This technology opens the door for other quantum communication protocols, such as the recently proposed measurement-device-independent CV-QKD, where independent light sources are employed by different users.
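The pilot-aided feedforward idea, estimating the drifting phase between two independent lasers from known pilot symbols and derotating the data symbols accordingly, can be sketched for a classical coherent link. This is a generic illustration of pilot-based phase recovery, not the paper's optical implementation.

```python
import numpy as np

def pilot_aided_recovery(rx, pilot_idx, pilots):
    """Estimate the slowly drifting carrier phase at known pilot symbols,
    interpolate it across the block, and derotate every received symbol."""
    ph = np.unwrap(np.angle(rx[pilot_idx] * np.conj(pilots)))   # phase at pilots
    ph_all = np.interp(np.arange(len(rx)), pilot_idx, ph)       # interpolate between pilots
    return rx * np.exp(-1j * ph_all)                            # feedforward derotation
```

Because the correction is feedforward (computed from pilots, then applied), no phase-locked loop between the two lasers is needed, which is what permits a "locally" generated oscillator.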
Hirose, Tomohiro; Nitta, Norihisa; Shiraishi, Junji; Nagatani, Yukihiro; Takahashi, Masashi; Murata, Kiyoshi
2008-12-01
The aim of this study was to evaluate the usefulness of computer-aided diagnosis (CAD) software for the detection of lung nodules on multidetector-row computed tomography (MDCT) in terms of improvement in radiologists' diagnostic accuracy in detecting lung nodules, using jackknife free-response receiver-operating characteristic (JAFROC) analysis. Twenty-one patients (6 without and 15 with lung nodules) were selected randomly from 120 consecutive thoracic computed tomographic examinations. The gold standard for the presence or absence of nodules in the observer study was determined by consensus of two radiologists. Six expert radiologists participated in a free-response receiver operating characteristic study for the detection of lung nodules on MDCT, in which cases were interpreted first without and then with the output of CAD software. Radiologists were asked to indicate the locations of lung nodule candidates on the monitor with their confidence ratings for the presence of lung nodules. The performance of the CAD software indicated that the sensitivity in detecting lung nodules was 71.4%, with 0.95 false-positive results per case. When radiologists used the CAD software, the average sensitivity improved from 39.5% to 81.0%, with an increase in the average number of false-positive results from 0.14 to 0.89 per case. The average figure-of-merit values for the six radiologists were 0.390 without and 0.845 with the output of the CAD software, and there was a statistically significant difference (P < .0001) using the JAFROC analysis. The CAD software for the detection of lung nodules on MDCT has the potential to assist radiologists by increasing their accuracy.
NASA Astrophysics Data System (ADS)
Lee, Joon K.; Chan, Tao; Liu, Brent J.; Huang, H. K.
2009-02-01
Detection of acute intracranial hemorrhage (AIH) is a primary task in the interpretation of computed tomography (CT) brain scans of patients suffering from acute neurological disturbances or after head trauma. Interpretation can be difficult, especially when the lesion is inconspicuous or the reader is inexperienced. We have previously developed a computer-aided detection (CAD) algorithm to detect small AIH. One hundred and thirty-five small AIH CT studies from the Los Angeles County (LAC) + USC Hospital were identified and matched by age and sex with one hundred and thirty-five normal studies. These cases were then processed using our AIH CAD system to evaluate the efficacy and constraints of the algorithm.
Saliency detection using mutual consistency-guided spatial cues combination
NASA Astrophysics Data System (ADS)
Wang, Xin; Ning, Chen; Xu, Lizhong
2015-09-01
Saliency detection has received extensive interest due to its remarkable contribution to a wide range of computer vision and pattern recognition applications. However, most existing computational models are designed for detecting saliency in visible images or videos. When applied to infrared images, they may suffer from limitations in saliency detection accuracy and robustness. In this paper, we propose a novel algorithm to detect visual saliency in infrared images by mutual consistency-guided spatial cues combination. First, based on the luminance contrast and contour characteristics of infrared images, two effective saliency maps, i.e., the luminance contrast saliency map and the contour saliency map, are constructed. Afterwards, an adaptive combination scheme guided by mutual consistency is exploited to integrate these two maps into the spatial saliency map. This idea is motivated by the observation that different maps are actually related to each other, and the fusion scheme should present a logically consistent view of them. Finally, an enhancement technique is adopted to incorporate spatial saliency maps at various scales into a unified multi-scale framework to improve the reliability of the final saliency map. Comprehensive evaluations on real-life infrared images and comparisons with many state-of-the-art saliency models demonstrate the effectiveness and superiority of the proposed method for saliency detection in infrared images.
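A consistency-guided combination of two saliency maps can be sketched as a pixelwise blend whose weight depends on how much the maps agree. The specific blend below (trust the average where the maps agree, fall back toward the conservative maximum where they disagree) is an illustrative assumption, not the paper's exact scheme.

```python
import numpy as np

def fuse_saliency(m1, m2, sigma=0.3):
    """Mutual consistency-guided combination of two saliency maps in [0, 1]:
    where the maps agree the average is trusted; where they disagree the
    fusion falls back toward the more conservative pixelwise maximum."""
    c = np.exp(-((m1 - m2) ** 2) / (2.0 * sigma ** 2))  # consistency in (0, 1]
    return c * 0.5 * (m1 + m2) + (1.0 - c) * np.maximum(m1, m2)
```

Since the result is a convex combination of the average and the maximum, the fused value never drops below the mean of the two cues nor exceeds the stronger one.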
Automatic breast tissue density estimation scheme in digital mammography images
NASA Astrophysics Data System (ADS)
Menechelli, Renan C.; Pacheco, Ana Luisa V.; Schiabel, Homero
2017-03-01
Cases of breast cancer have increased substantially each year, yet mammographic interpretation remains subject to subjectivity and reading failures that may affect the final diagnosis; high breast tissue density is an important factor in these failures. Thus, among their many functions, some CADx (Computer-Aided Diagnosis) schemes classify breasts according to predominant density. To aid in this procedure, this work describes automated software for classification and statistical reporting of the percentage change in breast tissue density through analysis of subregions (ROIs) of the whole mammography image. Once the breast is segmented, the image is divided into regions from which texture features are extracted, and an MLP artificial neural network categorizes the ROIs. Experienced radiologists previously determined the density classification of the ROIs, which served as the reference for evaluating the software. In tests on a set of 400 images, average accuracy was 88.7% for ROI classification and 83.25% for whole-breast density classification into the 4 BI-RADS density classes. Furthermore, when considering only a simplified two-class division (high and low density), classifier accuracy reached 93.5%, with AUC = 0.95.
Development and analysis of the Software Implemented Fault-Tolerance (SIFT) computer
NASA Technical Reports Server (NTRS)
Goldberg, J.; Kautz, W. H.; Melliar-Smith, P. M.; Green, M. W.; Levitt, K. N.; Schwartz, R. L.; Weinstock, C. B.
1984-01-01
SIFT (Software Implemented Fault Tolerance) is an experimental, fault-tolerant computer system designed to meet the extreme reliability requirements for safety-critical functions in advanced aircraft. Errors are masked by performing a majority voting operation over the results of identical computations, and faulty processors are removed from service by reassigning computations to the nonfaulty processors. This scheme has been implemented in a special architecture using a set of standard Bendix BDX930 processors, augmented by a special asynchronous-broadcast communication interface that provides direct, processor-to-processor communication among all processors. Fault isolation is accomplished in hardware; all other fault-tolerance functions, together with scheduling and synchronization, are implemented exclusively by executive system software. The system reliability is predicted by a Markov model. Mathematical consistency of the system software with respect to the reliability model has been partially verified, using recently developed tools for machine-aided proof of program correctness.
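The error-masking step described above amounts to a majority vote over replica outputs. A minimal sketch, assuming the results are hashable values returned by identical computations on different processors:

```python
from collections import Counter


def majority_vote(results):
    """Mask faults by majority vote over replicated computation results.

    Returns the value produced by a strict majority of replicas; raises
    if no value reaches a majority (too many faulty replicas)."""
    value, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise ValueError("no majority: too many faulty replicas")
    return value
```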
Capacity of a direct detection optical communication channel
NASA Technical Reports Server (NTRS)
Tan, H. H.
1980-01-01
The capacity of a free space optical channel using a direct detection receiver is derived under both peak and average signal power constraints and without a signal bandwidth constraint. The addition of instantaneous noiseless feedback from the receiver to the transmitter does not increase the channel capacity. In the absence of received background noise, an optimally coded PPM system is shown to achieve capacity in the limit as signal bandwidth approaches infinity. In the case of large peak to average signal power ratios, an interleaved coding scheme with PPM modulation is shown to have a computational cutoff rate far greater than ordinary coding schemes.
Fault Tolerance for VLSI Multicomputers
1985-08-01
that consists of hundreds or thousands of VLSI computation nodes interconnected by dedicated links. Some important applications of high-end computers... technology, and intended applications. A proposed fault tolerance scheme combines hardware that performs error detection and system-level protocols for... In order to recover from the error and resume correct operation, a valid system state must be restored. A low-overhead, application-transparent error
Hybrid threshold adaptable quantum secret sharing scheme with reverse Huffman-Fibonacci-tree coding.
Lai, Hong; Zhang, Jun; Luo, Ming-Xing; Pan, Lei; Pieprzyk, Josef; Xiao, Fuyuan; Orgun, Mehmet A
2016-08-12
With prevalent attacks in communication, sharing a secret between communicating parties is an ongoing challenge. Moreover, it is important to integrate quantum solutions with classical secret sharing schemes with low computational cost for real-world use. This paper proposes a novel hybrid threshold adaptable quantum secret sharing scheme, using an m-bonacci orbital angular momentum (OAM) pump, Lagrange interpolation polynomials, and reverse Huffman-Fibonacci-tree coding. To be exact, we employ entangled states prepared by m-bonacci sequences to detect eavesdropping. Meanwhile, we encode m-bonacci sequences in Lagrange interpolation polynomials to generate the shares of a secret with reverse Huffman-Fibonacci-tree coding. The advantages of the proposed scheme are that it can detect eavesdropping without joint quantum operations, and that it permits secret sharing for an arbitrary (but no fewer than the threshold-value) number of classical participants with much lower bandwidth. Also, in comparison with existing quantum secret sharing schemes, it still works when there are dynamic changes, such as the unavailability of a quantum channel, the arrival of new participants and the departure of participants. Finally, we provide security analysis of the new hybrid quantum secret sharing scheme and discuss its useful features for modern applications.
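The Lagrange-interpolation component of such a scheme is, at its core, classical Shamir (t, n) threshold sharing over a prime field. The sketch below shows that classical building block only (the quantum layer, OAM pump, and Huffman-Fibonacci coding are not modeled); the field prime and the fixed polynomial coefficients are illustrative choices, not values from the paper.

```python
P = 2**31 - 1  # a Mersenne prime; the shared secret lives in GF(P)


def make_shares(secret, t, n, coeffs):
    """Split `secret` into n shares; any t of them recover it.

    `coeffs` are the t-1 higher-order polynomial coefficients (random
    in practice, fixed here for a deterministic illustration)."""
    assert len(coeffs) == t - 1
    poly = [secret] + list(coeffs)

    def f(x):  # Horner evaluation of the sharing polynomial mod P
        acc = 0
        for a in reversed(poly):
            acc = (acc * x + a) % P
        return acc

    return [(x, f(x)) for x in range(1, n + 1)]


def recover(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P-2, P) is the modular inverse (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

Any three of the five shares reconstruct the secret; fewer than three reveal nothing about it.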
Low-complexity R-peak detection for ambulatory fetal monitoring.
Rooijakkers, Michael J; Rabotti, Chiara; Oei, S Guid; Mischi, Massimo
2012-07-01
Non-invasive fetal health monitoring during pregnancy is becoming increasingly important because of the increasing number of high-risk pregnancies. Despite recent advances in signal-processing technology, which have enabled fetal monitoring during pregnancy using abdominal electrocardiogram (ECG) recordings, ubiquitous fetal health monitoring is still infeasible due to the computational complexity of noise-robust solutions. In this paper, an ECG R-peak detection algorithm for ambulatory R-peak detection is proposed as part of a fetal ECG detection algorithm. The proposed algorithm is optimized to reduce computational complexity without reducing R-peak detection performance compared to existing R-peak detection schemes. Validation of the algorithm is performed on three manually annotated datasets. With detection error rates of 0.23%, 1.32% and 9.42% on the MIT/BIH arrhythmia database and the in-house maternal and fetal databases, respectively, the detection rate of the proposed algorithm is comparable to the best state-of-the-art algorithms, at a reduced computational complexity.
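A deliberately low-complexity R-peak picker can be sketched as squaring, fixed-ratio thresholding, and a refractory period. The threshold ratio and refractory constant below are illustrative assumptions, not the paper's optimized algorithm:

```python
def detect_r_peaks(ecg, fs, thresh_ratio=0.6, refractory=0.2):
    """Threshold-based local-maximum R-peak picking (sketch).

    Square the signal, keep local maxima above a fixed fraction of the
    global maximum, and enforce a refractory period so at most one peak
    is reported per heartbeat. `fs` is the sampling rate in Hz."""
    sq = [x * x for x in ecg]          # squaring emphasizes the QRS complex
    thresh = thresh_ratio * max(sq)
    min_gap = int(refractory * fs)     # refractory period in samples
    peaks, last = [], -min_gap
    for i in range(1, len(sq) - 1):
        if sq[i] >= thresh and sq[i] >= sq[i - 1] and sq[i] > sq[i + 1]:
            if i - last >= min_gap:
                peaks.append(i)
                last = i
    return peaks
```

On a clean synthetic trace with two spikes, both sample indices are returned; real abdominal recordings would additionally need the noise-robust preprocessing the paper focuses on.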
Novel MDM-PON scheme utilizing self-homodyne detection for high-speed/capacity access networks.
Chen, Yuanxiang; Li, Juhao; Zhu, Paikun; Wu, Zhongying; Zhou, Peng; Tian, Yu; Ren, Fang; Yu, Jinyi; Ge, Dawei; Chen, Jingbiao; He, Yongqi; Chen, Zhangyuan
2015-12-14
In this paper, we propose a cost-effective, energy-saving mode-division-multiplexing passive optical network (MDM-PON) scheme utilizing self-homodyne detection for high-speed/capacity access networks, based on low modal-crosstalk few-mode fiber (FMF) and an all-fiber mode multiplexer/demultiplexer (MUX/DEMUX). In the proposed scheme, one of the spatial modes is used to transmit a portion of the signal carrier (namely a pilot tone) as the local oscillator (LO), while the others are used for signal-bearing channels. At the receiver, the pilot tone and the signal can be separated without strong crosstalk and sent to the receiver for coherent detection. The spectral efficiency (SE) is significantly enhanced when multiple spatial channels are used. Meanwhile, the self-homodyne detection scheme can effectively suppress laser phase noise, which relaxes the linewidth requirement for the lasers at the optical line terminal or optical network units (OLT/ONUs). The digital signal processing (DSP) at the receiver is also simplified, since it removes the need for frequency offset compensation and complex phase correction, which reduces the computational complexity and energy consumption. Polarization division multiplexing (PDM), which doubles the SE, is also supported by the scheme. The proposed scheme is scalable to multi-wavelength operation when a wavelength MUX/DEMUX is utilized. Using the proposed scheme, we demonstrate a proof-of-concept 4 × 40-Gb/s orthogonal frequency division multiplexing (OFDM) transmission over 55-km FMF using low modal-crosstalk two-mode FMF and MUX/DEMUX with error-free operation. Compared with the back-to-back case, less than 1-dB Q-factor penalty is observed after 55-km FMF transmission for the four channels. Signal power and pilot-tone power are also optimized to achieve the optimal transmission performance.
75 FR 16123 - Dave & Buster’s, Inc.; Analysis of Proposed Consent Order to Aid Public Comment
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-31
... computer networks or to conduct security investigations, such as by employing an intrusion detection system and monitoring system logs; (b) failed to adequately restrict third-party access to its networks, such... reasonable and appropriate security for personal information on its computer networks. Among other things...
Automatic Detection of Frontal Face Midline by Chain-coded Merlin-Farber Hough Transform
NASA Astrophysics Data System (ADS)
Okamoto, Daichi; Ohyama, Wataru; Wakabayashi, Tetsushi; Kimura, Fumitaka
We propose a novel approach for detecting the facial midline (facial symmetry axis) from a frontal face image. The facial midline has several applications, for instance reducing the computational cost of facial feature extraction (FFE) and postoperative assessment for cosmetic or dental surgery. The proposed method detects the facial midline of a frontal face from an edge image as the symmetry axis, using the Merlin-Farber Hough transform (MFHT). A new performance improvement scheme for midline detection by MFHT is also presented. The main concept of the proposed scheme is suppression of redundant votes in the Hough parameter space by introducing a chain code representation for the binary edge image. Experimental results on an image dataset containing 2409 images from the FERET database indicate that the proposed algorithm improves the accuracy of midline detection from 89.9% to 95.1% for face images with different scales and rotations.
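The symmetry-voting idea behind a Hough-based midline detector can be sketched as follows: every pair of edge points on the same row votes for the x-midpoint between them, and the most-voted position is taken as the vertical midline. This is a strong simplification of the Merlin-Farber transform (no chain coding, vertical axis only):

```python
from collections import Counter


def detect_midline(edge_points):
    """Estimate the vertical symmetry axis of a set of (x, y) edge points.

    Each pair of edge points sharing a row votes for the sum of their
    x-coordinates (twice the midpoint, kept integer); the winning bin,
    halved, is the midline x-position. Illustrative sketch only."""
    by_row = {}
    for x, y in edge_points:
        by_row.setdefault(y, []).append(x)
    votes = Counter()
    for xs in by_row.values():
        for i in range(len(xs)):
            for j in range(i + 1, len(xs)):
                votes[xs[i] + xs[j]] += 1  # 2 * midpoint of the pair
    axis2, _ = votes.most_common(1)[0]
    return axis2 / 2
```

For a point set mirror-symmetric about x = 5, every symmetric pair votes for the same bin and the axis is recovered exactly.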
NASA Astrophysics Data System (ADS)
Xiao, Fei; Liu, Bo; Zhang, Lijia; Xin, Xiangjun; Zhang, Qi; Tian, Qinghua; Tian, Feng; Wang, Yongjun; Rao, Lan; Ullah, Rahat; Zhao, Feng; Li, Deng'ao
2018-02-01
A rate-adaptive multilevel coded modulation (RA-MLC) scheme based on a fixed code length, together with a corresponding decoding scheme, is proposed. The RA-MLC scheme combines multilevel coded modulation with a binary linear block code at the transmitter. Bit division, coding, optional interleaving, and modulation are carried out according to a preset rule, and the signal is then transmitted through a standard single-mode fiber span of 100 km. The receiver improves decoding accuracy by passing soft information between layers, which enhances performance. Simulations are carried out in an intensity-modulation direct-detection optical communication system using MATLAB®. Results show that the RA-MLC scheme can achieve a bit error rate of 1E-5 when the optical signal-to-noise ratio is 20.7 dB. It also reduced the number of decoders by 72% and realized 22 rate adaptations without significantly increasing the computing time. The coding gain is increased by 7.3 dB at BER = 1E-3.
Free-Space Quantum Signatures Using Heterodyne Measurements
NASA Astrophysics Data System (ADS)
Croal, Callum; Peuntinger, Christian; Heim, Bettina; Khan, Imran; Marquardt, Christoph; Leuchs, Gerd; Wallden, Petros; Andersson, Erika; Korolkova, Natalia
2016-09-01
Digital signatures guarantee the authorship of electronic communications. Currently used "classical" signature schemes rely on unproven computational assumptions for security, while quantum signatures rely only on the laws of quantum mechanics to sign a classical message. Previous quantum signature schemes have used unambiguous quantum measurements. Such measurements, however, sometimes give no result, reducing the efficiency of the protocol. Here, we instead use heterodyne detection, which always gives a result, although there is always some uncertainty. We experimentally demonstrate feasibility in a real environment by distributing signature states through a noisy 1.6 km free-space channel. Our results show that continuous-variable heterodyne detection improves the signature rate for this type of scheme and therefore represents an interesting direction in the search for practical quantum signature schemes. For transmission values ranging from 100% to 10%, but otherwise assuming an ideal implementation with no other imperfections, the signature length is shorter by a factor of 2 to 10. As compared with previous relevant experimental realizations, the signature length in this implementation is several orders of magnitude shorter.
Längkvist, Martin; Jendeberg, Johan; Thunberg, Per; Loutfi, Amy; Lidén, Mats
2018-06-01
Computed tomography (CT) is the method of choice for diagnosing ureteral stones - kidney stones that obstruct the ureter. The purpose of this study is to develop a computer-aided detection (CAD) algorithm for identifying a ureteral stone in thin-slice CT volumes. The challenge in CAD for urinary stones lies in the similarity in shape and intensity between stones and non-stone structures, and in how to efficiently deal with large high-resolution CT volumes. We address these challenges by using a Convolutional Neural Network (CNN) that works directly on the high-resolution CT volumes. The method is evaluated on a large database of 465 clinically acquired high-resolution CT volumes of the urinary tract, with labeling of ureteral stones performed by a radiologist. The best model, using 2.5D input data and anatomical information, achieved a sensitivity of 100% and an average of 2.68 false positives per patient on a test set of 88 scans. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
B. Shokouhi, Shahriar; Fooladivanda, Aida; Ahmadinejad, Nasrin
2017-12-01
A computer-aided detection (CAD) system is introduced in this paper for the detection of breast lesions in dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). The proposed CAD system first compensates for motion artifacts and segments the breast region. Then, potential lesion voxels are detected and used as the initial seed points for the seeded region-growing algorithm. A new and robust region-growing algorithm incorporating Fuzzy C-means (FCM) clustering and a vesselness filter is proposed to segment any potential lesion regions. Subsequently, false-positive detections are reduced by applying a discrimination step based on 3D morphological characteristics of the potential lesion regions and kinetic features, which are fed to a support vector machine (SVM) classifier. The performance of the proposed CAD system is evaluated using the free-response operating characteristic (FROC) curve. We introduce our collected dataset, which includes 76 DCE-MRI studies with 63 malignant and 107 benign lesions. The prepared dataset has been used to verify the accuracy of the proposed CAD system. At 5.29 false positives per case, the CAD system accurately detects 94% of the breast lesions.
NASA Astrophysics Data System (ADS)
Boscheri, Walter; Dumbser, Michael
2017-10-01
We present a new family of high order accurate fully discrete one-step Discontinuous Galerkin (DG) finite element schemes on moving unstructured meshes for the solution of nonlinear hyperbolic PDE in multiple space dimensions, which may also include parabolic terms in order to model dissipative transport processes, like molecular viscosity or heat conduction. High order piecewise polynomials of degree N are adopted to represent the discrete solution at each time level and within each spatial control volume of the computational grid, while high order of accuracy in time is achieved by the ADER approach, making use of an element-local space-time Galerkin finite element predictor. A novel nodal solver algorithm based on the HLL flux is derived to compute the velocity for each nodal degree of freedom that describes the current mesh geometry. In our algorithm the spatial mesh configuration can be defined in two different ways: either by an isoparametric approach that generates curved control volumes, or by a piecewise linear decomposition of each spatial control volume into simplex sub-elements. Each technique generates a corresponding number of geometrical degrees of freedom needed to describe the current mesh configuration and which must be considered by the nodal solver for determining the grid velocity. The connection of the old mesh configuration at time t^n with the new one at time t^(n+1) provides the space-time control volumes on which the governing equations have to be integrated in order to obtain the time evolution of the discrete solution. Our numerical method belongs to the category of so-called direct Arbitrary-Lagrangian-Eulerian (ALE) schemes, where a space-time conservation formulation of the governing PDE system is considered and which already takes into account the new grid geometry (including a possible rezoning step) directly during the computation of the numerical fluxes.
We emphasize that our method is a moving mesh method, as opposed to total Lagrangian formulations that are based on a fixed computational grid and which instead evolve the mapping of the reference configuration to the current one. Our new Lagrangian-type DG scheme adopts the novel a posteriori sub-cell finite volume limiter method recently developed in [62] for fixed unstructured grids. In this approach, the validity of the candidate solution produced in each cell by an unlimited ADER-DG scheme is verified against a set of physical and numerical detection criteria, such as the positivity of pressure and density, the absence of floating point errors (NaN) and the satisfaction of a relaxed discrete maximum principle (DMP) in the sense of polynomials. Those cells which do not satisfy all of the above criteria are flagged as troubled cells and are recomputed with the aid of a more robust second order TVD finite volume scheme. To preserve the subcell resolution capability of the original DG scheme, the FV limiter is run on a sub-grid that is 2N + 1 times finer compared to the mesh of the original unlimited DG scheme. The new subcell averages are then gathered back into a high order DG polynomial by the usual conservative finite volume reconstruction operator. The numerical convergence rates of the new ALE ADER-DG schemes are studied up to fourth order in space and time and several test problems are simulated in order to check the accuracy and the robustness of the proposed numerical method in the context of the Euler and Navier-Stokes equations for compressible gas dynamics, considering both inviscid and viscous fluids. Finally, an application inspired by Inertial Confinement Fusion (ICF) type flows is considered by solving the Euler equations and the PDE of viscous and resistive magnetohydrodynamics (VRMHD).
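The relaxed discrete maximum principle used to flag troubled cells can be sketched as a bounds check of the candidate cell value against the previous-step neighborhood extrema. The tolerance rule below is a common choice and only illustrative of the criterion described above:

```python
def is_troubled(candidate, neighbor_min, neighbor_max, delta=1e-4):
    """Relaxed discrete maximum principle (DMP) check, sketched.

    The candidate cell average must lie within the min/max of the
    previous-step neighborhood, widened by a tolerance that scales
    with the local data range. Returns True if the cell should be
    flagged as troubled and recomputed with the robust FV scheme."""
    eps = max(delta, delta * (neighbor_max - neighbor_min))
    return not (neighbor_min - eps <= candidate <= neighbor_max + eps)
```

A candidate far outside the neighborhood range is flagged; one inside the (relaxed) bounds passes and keeps its unlimited DG solution.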
Musiimenta, Angella
2012-01-01
Background: Although Uganda recorded declines in HIV infection rates around the 1990s, it is argued that risky sexual behaviour related to HIV/AIDS, especially among the youth, started increasing again from the early 2000s. School-based computer-assisted HIV interventions can provide interactive ways of improving the youth's HIV knowledge, attitudes and skills. However, these interventions have long been reported to have limited success in improving the youth's sexual behaviours, which is always the major aim of implementing such interventions. This could be because the commonly used health promotion theories employed by these interventions have limited application in HIV prevention. These theories tend to lack sufficient attention to the contextual mediators that influence one's sexual behaviours. Moreover, the literature increasingly expresses dissatisfaction with the dominant prevailing descriptive survey-type HIV/AIDS-related research. Objective and Methods: The objective of this research was to identify contextual mediators that influence the youth's decision to adopt and maintain the HIV/AIDS preventive behaviour advocated by a computer-assisted intervention. To achieve this objective, this research employed a qualitative method, which provided in-depth understanding of how different contexts interact to influence the effectiveness of HIV/AIDS interventions. The research question was: What contextual mediators are influencing the youth's decision to adopt and maintain the HIV/AIDS preventive behaviour advocated by a computer-assisted intervention? To answer this research question, 20 youth who had previously completed the WSWM intervention while still in secondary school were interviewed by telephone between September 2008 and December 2008. The collected data were then analysed based on grounded theory's coding scheme.
Results: Findings demonstrate that, although often ignored by HIV interventionists and researchers, a variety of contextual mediators influence individual uptake of HIV preventives. These include relationship characteristics, familial mediators, peer influence, gender-based social norms, economic factors and religious beliefs. Conclusion: To generate concomitant mutual efforts, rather than exclusively focusing on individual-level mediators, there is an urgent need to shift to integrative approaches, which combine individual-level change strategies with contextual-level change approaches in the design and implementation of interventional strategies to fight against HIV/AIDS. PMID:23569636
1997-08-22
A former Pinellas County, FL public health worker, [name removed], is charged with using a government AIDS surveillance database for his own personal dating scheme. He kept the county health department records on his own laptop computer and used the information to screen potential dates for himself and his friends. [Name removed] filed a pretrial free speech argument contending that his First Amendment rights were being violated. The Pinellas County judge dismissed that argument, clearing the way for a September trial. [Name removed] could face a year in prison on a first-degree misdemeanor charge.
A wide bandwidth electrostatic field sensor for lightning research
NASA Technical Reports Server (NTRS)
Zaepfel, K. P.
1986-01-01
Data obtained from UHF Radar observation of direct-lightning strikes to the NASA F-106B airplane have indicated that most of the 690 strikes acquired during direct-strike lightning tests were triggered by the aircraft. As an aid in understanding the triggered lightning process, a wide bandwidth electric field measuring system was designed for the F-106B by implementing a clamped-detection signal processing concept originated at the Air Force Cambridge Research Lab in 1953. The detection scheme combines the signals from complementary stator pairs clamped to zero volts at the exact moment when each stator pair is maximally shielded by the rotor, a process that restores the dc level lost by the charge amplifier. The new system was implemented with four shutter-type field mills located at strategic points on the airplane. The bandwidth of the new system was determined in the laboratory to be from dc to over 100 Hz, whereas past designs had upper limits of 10 Hz to 100 Hz. To obtain the undisturbed electric field vector and total aircraft charge, the airborne field mill system is calibrated by using techniques involving results from ground and flight calibrations of the F-106B, laboratory tests of a metallized model, and a finite-difference time-domain electromagnetic computer code.
A wide bandwidth electrostatic field sensor for lightning research
NASA Technical Reports Server (NTRS)
Zaepfel, Klaus P.
1989-01-01
Data obtained from UHF radar observation of direct-lightning strikes to the NASA F-106B aircraft have indicated that most of the 690 strikes acquired during direct-strike lightning tests were triggered by the aircraft. As an aid in understanding the triggered lightning process, a wide bandwidth electric field measuring system was designed for the F-106B by implementing a clamped-detection signal processing concept originated at the Air Force Cambridge Research Lab in 1953. The detection scheme combines the signals from complementary stator pairs clamped to zero volts at the exact moment when each stator pair is maximally shielded by the rotor, a process that restores the dc level lost by the charge amplifier. The system was implemented with four shutter-type field mills located at strategic points on the aircraft. The bandwidth of the system was determined in the laboratory to be from dc to over 100 Hz, whereas past designs had upper limits of 10 to 100 Hz. To obtain the undisturbed electric field vector and total aircraft charge, the airborne field mill system is calibrated by using techniques involving results from ground and flight calibrations of the F-106B, laboratory tests of a metallized model, and a finite difference time-domain electromagnetic computer code.
Klauser, A S; Franz, M; Bellmann Weiler, R; Gruber, J; Hartig, F; Mur, E; Wick, M C; Jaschke, W
2011-12-01
To compare joint inflammation assessment using subjective grading of power Doppler ultrasonography (PDUS) and contrast-enhanced ultrasonography (CEUS) versus computer-aided objective CEUS quantification. 37 joints of 28 patients with arthritis of different etiologies underwent B-mode ultrasonography, PDUS, and CEUS using a second-generation contrast agent. Synovial thickness, extent of vascularized pannus and intensity of vascularization were included in a 4-point PDUS and CEUS grading system. Subjective CEUS and PDUS scores were compared to computer-aided objective CEUS quantification using Qontrast® software for the calculation of the signal intensity (SI) and the SI ratio for contrast enhancement. The interobserver agreement for subjective scoring was good to excellent (κ = 0.8-1.0; P < 0.0001). Computer-aided objective CEUS quantification correlated statistically significantly with subjective CEUS (P < 0.001) and PDUS grading (P < 0.05). The Qontrast® SI ratio correlated with subjective CEUS (P < 0.02) and PDUS grading (P < 0.03). Clinical activity did not correlate with vascularity or synovial thickening (P = n.s.), and no correlation between synovial thickening and vascularity extent was found with either PDUS or CEUS (P = n.s.). Both subjective CEUS grading and objective CEUS quantification are valuable for assessing joint vascularity in arthritis, and computer-aided CEUS quantification may be a suitable objective tool for therapy follow-up in arthritis. © Georg Thieme Verlag KG Stuttgart · New York.
Loan, Nazir A; Parah, Shabir A; Sheikh, Javaid A; Akhoon, Jahangir A; Bhat, Ghulam M
2017-09-01
A high-capacity and semi-reversible data hiding scheme based on the Pixel Repetition Method (PRM) and hybrid edge detection for scalable medical images is proposed in this paper. PRM is used to scale up the small-sized image (seed image), and hybrid edge detection ensures that no important edge information is missed. The scaled-up version of the seed image is divided into 2×2 non-overlapping blocks. In each block there is one seed pixel whose status decides the number of bits to be embedded in the remaining three pixels of that block. The Electronic Patient Record (EPR) data are embedded using Least Significant and Intermediate Significant Bit Substitution (ISBS). RC4 encryption is used to add an additional security layer for the embedded EPR data. The proposed scheme has been tested on various medical and general images and compared with some state-of-the-art techniques in the field. The experimental results reveal that the proposed scheme, besides being semi-reversible and computationally efficient, is capable of handling a high payload and as such can be used effectively for electronic healthcare applications. Copyright © 2017. Published by Elsevier Inc.
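A minimal sketch of the two building blocks named above: PRM upscaling and LSB substitution in the three non-seed pixels of a 2×2 block. The fixed one-bit-per-pixel rule is an assumption for illustration, whereas the paper varies the payload with the seed pixel's edge status:

```python
def prm_upscale(seed):
    """Pixel Repetition Method: scale a seed image 2x by repeating
    each pixel into a 2x2 block (sketch of the scheme's first stage)."""
    out = []
    for row in seed:
        wide = [p for p in row for _ in (0, 1)]  # repeat horizontally
        out.extend([wide, list(wide)])           # repeat vertically
    return out


def embed_bits(block, bits):
    """Embed one bit in the LSB of each of the three non-seed pixels
    of a flattened 2x2 block [seed, p1, p2, p3]. Illustrative only:
    the paper's embedding rule is richer (intermediate bits too)."""
    out = list(block)
    for k in range(1, 4):
        out[k] = (out[k] & ~1) | bits[k - 1]
    return out
```

Because the repeated pixels are redundant, the seed image can be recovered after embedding, which is what makes the scheme semi-reversible.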
Design of Provider-Provisioned Website Protection Scheme against Malware Distribution
NASA Astrophysics Data System (ADS)
Yagi, Takeshi; Tanimoto, Naoto; Hariu, Takeo; Itoh, Mitsutaka
Vulnerabilities in web applications expose computer networks to security threats, and many websites are used by attackers as hopping sites to attack other websites and user terminals. These incidents prevent service providers from constructing secure networking environments. To protect websites from attacks exploiting vulnerabilities in web applications, service providers use web application firewalls (WAFs). WAFs filter accesses from attackers by using signatures, which are generated based on the exploit codes of previous attacks. However, WAFs cannot filter unknown attacks because the signatures cannot reflect new types of attacks. In service provider environments, the number of exploit codes has recently increased rapidly because of the spread of vulnerable web applications that have been developed through cloud computing. Thus, generating signatures for all exploit codes is difficult. To solve these problems, our proposed scheme detects and filters malware downloads that are sent from websites which have already received exploit codes. In addition, to collect information for detecting malware downloads, web honeypots, which automatically extract the communication records of exploit codes, are used. According to the results of experiments using a prototype, our scheme can filter attacks automatically so that service providers can provide secure and cost-effective network environments.
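The filtering idea, blocking downloads served by hosts that honeypots have already observed receiving exploit codes, can be sketched as a simple lookup combined with payload signature matching. The data structures and signature strings below are hypothetical:

```python
def filter_downloads(requests, exploited_hosts, malware_sigs):
    """Drop downloads from hosts previously seen receiving exploit codes
    (as logged by web honeypots), or whose payload matches a known
    malware signature. Illustrative sketch of the filtering policy.

    `requests` is a list of (host, payload) pairs; the function returns
    the pairs that are allowed through."""
    allowed = []
    for host, payload in requests:
        if host in exploited_hosts:
            continue  # host already compromised: block its downloads
        if any(sig in payload for sig in malware_sigs):
            continue  # payload matches a known malware signature
        allowed.append((host, payload))
    return allowed
```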
Fractal analysis of bone structure with applications to osteoporosis and microgravity effects
NASA Astrophysics Data System (ADS)
Acharya, Raj S.; LeBlanc, Adrian; Shackelford, Linda; Swarnakar, Vivek; Krishnamurthy, Ram; Hausman, E.; Lin, Chin-Shoou
1995-05-01
We characterize the trabecular structure with the aid of fractal dimension. We use alternating sequential filters (ASF) to generate a nonlinear pyramid for fractal dimension computations. We do not make any assumptions of the statistical distributions of the underlying fractal bone structure. The only assumption of our scheme is the rudimentary definition of self-similarity. This allows us the freedom of not being constrained by statistical estimation schemes. With mathematical simulations, we have shown that the ASF methods outperform other existing methods for fractal dimension estimation. We have shown that the fractal dimension remains the same when computed with both the x-ray images and the MRI images of the patella. We have shown that the fractal dimension of osteoporotic subjects is lower than that of the normal subjects. In animal models, we have shown that the fractal dimension of osteoporotic rats was lower than that of the normal rats. In a 17-week bedrest study, we have shown that the subjects' prebedrest fractal dimension is higher than the postbedrest fractal dimension.
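For reference, the standard box-counting estimator of fractal dimension (the slope of log box count versus log inverse box size) can be sketched as below. The paper's ASF-pyramid estimator differs; this is only a baseline illustration of the quantity being estimated:

```python
from math import log


def box_count_dimension(points, sizes):
    """Estimate fractal dimension of a 2-D point set by box counting.

    For each box size, count the occupied boxes; the dimension is the
    least-squares slope of log(count) against log(1/size). Illustrative
    baseline, not the ASF pyramid method of the paper."""
    xs, ys = [], []
    for s in sizes:
        boxes = {(int(x // s), int(y // s)) for x, y in points}
        xs.append(log(1.0 / s))
        ys.append(log(len(boxes)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

A completely filled planar region recovers dimension 2; sparser, more porous structures (such as osteoporotic trabecular bone) yield lower values.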
Fractal analysis of bone structure with applications to osteoporosis and microgravity effects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Acharya, R.S.; Swarnarkar, V.; Krishnamurthy, R.
1995-12-31
The authors characterize the trabecular structure with the aid of fractal dimension. The authors use alternating sequential filters to generate a nonlinear pyramid for fractal dimension computations. The authors do not make any assumptions of the statistical distributions of the underlying fractal bone structure. The only assumption of the scheme is the rudimentary definition of self-similarity. This allows them the freedom of not being constrained by statistical estimation schemes. With mathematical simulations, the authors have shown that the ASF methods outperform other existing methods for fractal dimension estimation. They have shown that the fractal dimension remains the same when computed with both the X-ray images and the MRI images of the patella. They have shown that the fractal dimension of osteoporotic subjects is lower than that of the normal subjects. In animal models, the authors have shown that the fractal dimension of osteoporotic rats was lower than that of the normal rats. In a 17-week bedrest study, they have shown that the subjects' prebedrest fractal dimension is higher than the postbedrest fractal dimension.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qi, Bing; Lougovski, Pavel; Pooser, Raphael C.
Continuous-variable quantum key distribution (CV-QKD) protocols based on coherent detection have been studied extensively in both theory and experiment. In all the existing implementations of CV-QKD, both the quantum signal and the local oscillator (LO) are generated from the same laser and propagate through the insecure quantum channel. This arrangement may open security loopholes and limit the potential applications of CV-QKD. In this paper, we propose and demonstrate a pilot-aided feedforward data recovery scheme that enables reliable coherent detection using a "locally" generated LO. Using two independent commercial laser sources and a spool of 25-km optical fiber, we construct a coherent communication system. The variance of the phase noise introduced by the proposed scheme is measured to be 0.04 rad², which is small enough to enable secure key distribution. This technology opens the door for other quantum communication protocols, such as the recently proposed measurement-device-independent CV-QKD, where independent light sources are employed by different users.
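The core idea of pilot-aided feedforward recovery can be sketched numerically: a pilot tone experiences the same phase drift between the two free-running lasers as the signal, so the pilot's measured phase can derotate the signal after the fact. Everything here (random-walk drift model, QPSK stand-in symbols, noise levels) is an illustrative assumption, not the paper's experimental setup:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
# Wiener-process model of the phase drift between two free-running lasers (assumption)
drift = np.cumsum(rng.normal(0.0, 0.05, n))
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], n)  # stand-in QPSK signal
received = symbols * np.exp(1j * drift)                      # signal corrupted by drift
pilot = np.exp(1j * drift)                                   # pilot sees the same drift
# feedforward recovery: estimate the drift from the pilot, then derotate the signal
phase_est = np.unwrap(np.angle(pilot)) + rng.normal(0.0, 0.02, n)  # noisy pilot phase
recovered = received * np.exp(-1j * phase_est)
residual = np.angle(recovered * np.conj(symbols))            # leftover phase error
phase_noise_var = np.var(residual)
```

With these assumed noise levels the residual phase variance is far below 1 rad², so the derotated symbols fall cleanly into their original quadrants.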
The "EyeCane", a new electronic travel aid for the blind: Technology, behavior & swift learning.
Maidenbaum, Shachar; Hanassy, Shlomi; Abboud, Sami; Buchs, Galit; Chebat, Daniel-Robert; Levy-Tzedek, Shelly; Amedi, Amir
2014-01-01
Independent mobility is one of the most pressing problems facing people who are blind. We present the EyeCane, a new mobility aid aimed at increasing perception of the environment beyond what is provided by the traditional White Cane for tasks such as distance estimation, navigation and obstacle detection. The EyeCane enhances the traditional White Cane by using tactile and auditory output to increase detectable distances and angles. It circumvents the technical pitfalls of other devices, such as weight, short battery life, complex interface schemes, and slow learning curves. It implements multiple beams to enable detection of obstacles at different heights, and narrow beams to provide active sensing that can potentially increase the user's spatial perception of the environment. Participants were tasked with using the EyeCane for several basic tasks with minimal training. Blind and blindfolded-sighted participants were able to use the EyeCane successfully for distance estimation, simple navigation and simple obstacle detection after only several minutes of training. These results demonstrate the EyeCane's potential for mobility rehabilitation. The short training time is especially important since available mobility training resources are limited, not always available, and can be quite expensive and/or entail long waiting periods.
Analog Computer-Aided Detection (CAD) information can be more effective than binary marks.
Cunningham, Corbin A; Drew, Trafton; Wolfe, Jeremy M
2017-02-01
In socially important visual search tasks, such as baggage screening and diagnostic radiology, experts miss more targets than is desirable. Computer-aided detection (CAD) programs have been developed specifically to improve performance in these professional search tasks. For example, in breast cancer screening, many CAD systems are capable of detecting approximately 90% of breast cancers, with approximately 0.5 false-positive detections per image. Nevertheless, benefits of CAD in clinical settings tend to be small (Birdwell, 2009) or even absent (Meziane et al., 2011; Philpotts, 2009). The marks made by a CAD system can be "binary," giving the same signal to any location where the signal is above some threshold. Alternatively, a CAD system can present an analog signal that reflects the strength of the signal at each location. In the experiments reported, we compare analog and binary CAD presentations using nonexpert observers and artificial stimuli defined by two noisy signals: a visible color signal and an "invisible" signal that informed our simulated CAD system. We found that analog CAD generally yielded better overall performance than binary CAD. The analog benefit is similar at high and low target prevalence. Our data suggest that the form of the CAD signal can directly influence performance. Analog CAD may allow the computer to be more helpful to the searcher.
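One statistical reason analog marks can beat binary ones is that thresholding a CAD score before showing it discards rank information. A minimal simulation, with illustrative Gaussian score distributions and threshold (not the study's stimuli), makes this concrete by comparing the AUC available from graded versus binarized cues:

```python
import numpy as np

def auc(pos, neg):
    """Mann-Whitney AUC: probability a positive outscores a negative (ties count half)."""
    gt = (pos[:, None] > neg[None, :]).mean()
    eq = (pos[:, None] == neg[None, :]).mean()
    return gt + 0.5 * eq

rng = np.random.default_rng(1)
neg = rng.normal(0.0, 1.0, 4000)   # CAD scores at target-absent locations (assumption)
pos = rng.normal(1.5, 1.0, 4000)   # CAD scores at target-present locations (assumption)
analog_auc = auc(pos, neg)                                   # graded cue shown as-is
binary_auc = auc((pos > 1.0).astype(float), (neg > 1.0).astype(float))  # thresholded mark
```

The binarized cue's AUC is strictly lower here, mirroring the behavioral advantage the study reports for analog CAD.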
Khadke, Piyush; Patne, Nita; Singh, Arvind; Shinde, Gulab
2016-01-01
In this article, a novel and accurate scheme for fault detection, classification and fault-distance estimation for a fixed-series-compensated transmission line is proposed. The proposed scheme is based on an artificial neural network (ANN) and metal oxide varistor (MOV) energy, employing the Levenberg-Marquardt training algorithm. The novelty of this scheme is the use of the MOV energy signals of the fixed series capacitors (FSC) as input to train the ANN; such an approach has not been used in earlier fault analysis algorithms. The proposed scheme uses only single-end measurements of the MOV energy signals in all three phases over one cycle from the occurrence of a fault. Thereafter, these MOV energy signals are fed as input to the ANN for fault-distance estimation. The feasibility and reliability of the proposed scheme have been evaluated for all ten fault types in a test power system model at different fault inception angles over numerous fault locations. Real transmission system parameters of the 3-phase 400 kV Wardha-Aurangabad transmission line (400 km) with 40% FSC at the Power Grid Wardha Substation, India, are considered for this research. Extensive simulation experiments show that the proposed scheme provides accurate results, demonstrating a complete protection scheme with high accuracy, simplicity and robustness.
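The ANN's input feature is the one-cycle MOV energy, i.e. the integral of instantaneous MOV power over one cycle from fault inception. A sketch of that feature extraction, with illustrative waveforms, amplitudes, and sampling rate (not measurements from the Wardha-Aurangabad line):

```python
import numpy as np

f = 50.0                              # system frequency, Hz
fs = 10_000.0                         # sampling rate, Hz (assumption)
t = np.arange(0.0, 1.0 / f, 1.0 / fs)  # one cycle from fault inception
v = 150e3 * np.sin(2 * np.pi * f * t)  # stand-in MOV voltage during conduction
i = 2e3 * np.sin(2 * np.pi * f * t)    # stand-in MOV current
energy = np.sum(v * i) / fs            # ~ integral of v(t)*i(t) dt over the cycle, joules
```

For these in-phase sinusoids the one-cycle energy is V·I·T/2 = 3 MJ; in the proposed scheme, the corresponding per-phase energies would form the ANN's input vector.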
NDE: A key to engine rotor life prediction
NASA Technical Reports Server (NTRS)
Doherty, J. E.
1977-01-01
A key ingredient in the establishment of safe life times for critical components is the means of reliably detecting flaws which may potentially exist. Currently used nondestructive evaluation procedures are successful in detecting life limiting defects; however, the development of automated and computer aided NDE technology permits even greater assurance of flight safety.
Wein, Wolfgang; Karamalis, Athanasios; Baumgartner, Adrian; Navab, Nassir
2015-06-01
The transfer of preoperative CT data into the tracking system coordinates within an operating room is of high interest for computer-aided orthopedic surgery. In this work, we introduce a solution for intra-operative ultrasound-CT registration of bones. We have developed methods for fully automatic real-time bone detection in ultrasound images and global automatic registration to CT. The bone detection algorithm uses a novel bone-specific feature descriptor and was thoroughly evaluated on both in-vivo and ex-vivo data. A global optimization strategy aligns the bone surface, followed by a soft tissue aware intensity-based registration to provide higher local registration accuracy. We evaluated the system on femur, tibia and fibula anatomy in a cadaver study with human legs, where magnetically tracked bone markers were implanted to yield ground truth information. An overall median system error of 3.7 mm was achieved on 11 datasets. Global and fully automatic registration of bones acquired with ultrasound to CT is feasible, with bone detection and tracking operating in real time for immediate feedback to the surgeon.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De, Arijit K., E-mail: akde@lbl.gov; Fleming, Graham R., E-mail: grfleming@lbl.gov; Department of Chemistry, University of California at Berkeley, Berkeley, California 94702
2014-05-21
We present a novel experimental scheme for two-dimensional fluorescence-detected coherent spectroscopy (2D-FDCS) using a non-collinear beam geometry with the aid of "confocal imaging" of a dynamic (population) grating and 27-step phase-cycling to extract the signal. This arrangement obviates the need for distinct experimental designs for previously developed transmission-detected non-collinear two-dimensional coherent spectroscopy (2D-CS) and collinear 2D-FDCS. We also describe a novel method for absolute phasing of the 2D spectrum. We apply this method to record 2D spectra of a fluorescent dye in solution at room temperature and observe "spectral diffusion".
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Mori, Kiyoshi; Eguchi, Kenji; Kaneko, Masahiro; Kakinuma, Ryutarou; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru; Sasagawa, Michizou
2006-03-01
Multi-helical CT scanners have remarkably increased the speed at which chest CT images are acquired for mass screening. Mass screening based on multi-helical CT images requires a considerable number of images to be read, and it is this time-consuming step that makes the use of helical CT for mass screening impractical at present. To overcome this problem, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images and a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification. We have also developed an electronic medical recording system and a prototype internet system for community health in two or more regions by using Virtual Private Network routers and biometric fingerprint and face authentication systems for the safety of medical information. Based on these diagnostic assistance methods, we have now developed a new computer-aided workstation and database that can display suspected lesions three-dimensionally in a short time. This paper describes basic studies that have been conducted to evaluate this new system. The results of this study indicate that our computer-aided diagnosis workstation and network system can increase diagnostic speed, diagnostic accuracy and the safety of medical information.
Phase-locked-loop interferometry applied to aspheric testing with a computer-stored compensator.
Servin, M; Malacara, D; Rodriguez-Vera, R
1994-05-01
A recently developed technique for continuous-phase determination of interferograms with a digital phase-locked loop (PLL) is applied to the null testing of aspheres. Although this PLL demodulating scheme is also a synchronous or direct interferometric technique, the separate unwrapping process is not explicitly required. The unwrapping and the phase-detection processes are achieved simultaneously within the PLL. The proposed method uses a computer-generated holographic compensator. The holographic compensator does not need to be printed out by any means; it is calculated and used from the computer. This computer-stored compensator is used as the reference signal to phase demodulate a sample interferogram obtained from the asphere being tested. Consequently the demodulated phase contains information about the wave-front departures from the ideal computer-stored aspheric interferogram. Wave-front differences of ~1 λ are handled easily by the proposed PLL scheme. The maximum recorded frequencies in the template's interferogram and in the sampled interferogram are assumed to be below the Nyquist frequency.
Expert systems for automated maintenance of a Mars oxygen production system
NASA Technical Reports Server (NTRS)
Ash, Robert L.; Huang, Jen-Kuang; Ho, Ming-Tsang
1989-01-01
A prototype expert system was developed for maintaining autonomous operation of a Mars oxygen production system. Normal operating conditions and failure modes were tested and identified according to certain desired criteria. Several schemes for failure detection and isolation, using forward chaining, backward chaining, and knowledge-based and rule-based reasoning, were devised to perform several housekeeping functions. These functions include self-health checkout, an emergency shutdown program, fault detection and conventional control activities. An effort was made to derive the dynamic model of the system using the bond-graph technique in order to develop a model-based failure detection and isolation scheme by estimation methods. Finally, computer simulations and experimental results demonstrated the feasibility of the expert system, and a preliminary reliability analysis of the oxygen production system is also provided.
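The forward-chaining style of rule firing mentioned above can be sketched as a tiny fixed-point engine. The facts and rules below are illustrative assumptions, not the actual Mars oxygen-plant knowledge base:

```python
# Minimal forward-chaining rule engine: each rule is (set of conditions, conclusion).
# Rules fire whenever all their conditions are known facts, until nothing changes.
RULES = [
    ({"o2_flow_low", "compressor_on"}, "possible_leak"),       # hypothetical rules
    ({"possible_leak", "pressure_drop"}, "leak_confirmed"),
    ({"leak_confirmed"}, "emergency_shutdown"),
]

def forward_chain(facts, rules):
    """Repeatedly fire any satisfied rule until a fixed point is reached."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain({"o2_flow_low", "compressor_on", "pressure_drop"}, RULES)
```

Backward chaining would instead start from a goal (e.g. "emergency_shutdown") and recursively seek rules that could establish it; the fixed-point loop above is the forward direction only.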
Gur, Sourav; Frantziskonis, George N.; Univ. of Arizona, Tucson, AZ; ...
2017-02-16
Here, we report results from a numerical study of multi-time-scale bistable dynamics for CO oxidation on a catalytic surface in a flowing, well-mixed gas stream. The problem is posed in terms of surface and gas-phase submodels that dynamically interact in the presence of stochastic perturbations, reflecting the impact of molecular-scale fluctuations on the surface and turbulence in the gas. Wavelet-based methods are used to encode and characterize the temporal dynamics produced by each submodel and detect the onset of sudden state shifts (bifurcations) caused by nonlinear kinetics. When impending state shifts are detected, a more accurate but computationally expensive integration scheme can be used. This appears to make it possible, at least in some cases, to decrease the net computational burden associated with simulating multi-time-scale, nonlinear reacting systems by limiting the amount of time in which the more expensive integration schemes are required. Critical to achieving this is being able to detect unstable temporal transitions such as the bistable shifts in the example problem considered here. Lastly, our results indicate that a unique wavelet-based algorithm based on the Lipschitz exponent is capable of making such detections, even under noisy conditions, and may find applications in critical transition detection problems beyond catalysis.
NMRPipe: a multidimensional spectral processing system based on UNIX pipes.
Delaglio, F; Grzesiek, S; Vuister, G W; Zhu, G; Pfeifer, J; Bax, A
1995-11-01
The NMRPipe system is a UNIX software environment of processing, graphics, and analysis tools designed to meet current routine and research-oriented multidimensional processing requirements, and to anticipate and accommodate future demands and developments. The system is based on UNIX pipes, which allow programs running simultaneously to exchange streams of data under user control. In an NMRPipe processing scheme, a stream of spectral data flows through a pipeline of processing programs, each of which performs one component of the overall scheme, such as Fourier transformation or linear prediction. Complete multidimensional processing schemes are constructed as simple UNIX shell scripts. The processing modules themselves maintain and exploit accurate records of data sizes, detection modes, and calibration information in all dimensions, so that schemes can be constructed without the need to explicitly define or anticipate data sizes or storage details of real and imaginary channels during processing. The asynchronous pipeline scheme provides other substantial advantages, including high flexibility, favorable processing speeds, choice of both all-in-memory and disk-bound processing, easy adaptation to different data formats, simpler software development and maintenance, and the ability to distribute processing tasks on multi-CPU computers and computer networks.
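The pipe-and-filter dataflow NMRPipe builds from UNIX pipes can be sketched with Python generators: each stage consumes a stream and yields a processed stream, and stages compose freely. The stage names and toy data below are illustrative, not NMRPipe programs:

```python
import numpy as np

def fid_source(n=64):
    """Toy decaying FID: damped cosine, 1/8 cycle per sample."""
    for k in range(n):
        yield np.exp(-k / 32.0) * np.cos(2 * np.pi * k / 8.0)

def apodize(stream, w=0.95):
    """Window-function stage: multiply the stream by a decaying window."""
    for k, x in enumerate(stream):
        yield x * (w ** k)

def zero_fill(stream, total=128):
    """Pad the stream with zeros before the Fourier-transform stage."""
    count = 0
    for x in stream:
        yield x
        count += 1
    for _ in range(total - count):
        yield 0.0

# Compose the pipeline, then apply the FT at the end (one "scheme" = one chain).
spectrum = np.abs(np.fft.rfft(list(zero_fill(apodize(fid_source())))))
```

As in an NMRPipe shell script, adding or reordering a processing step means editing one link in the chain; the stream flows through without intermediate files.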
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Ohmatsu, Hironobu; Kakinuma, Ryutaru; Moriyama, Noriyuki
2009-02-01
Mass screening based on multi-helical CT images requires a considerable number of images to be read. It is this time-consuming step that makes the use of helical CT for mass screening impractical at present. Moreover, there is a shortage of doctors who can read these medical images in Japan. To overcome these problems, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification, and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis likelihood, all using the helical CT scanner employed for lung cancer mass screening. Functions to observe suspicious shadows in detail are provided in a computer-aided diagnosis workstation with these screening algorithms. We have also developed a telemedicine network using a Web medical image conference system with improved security of image transmission, a biometric fingerprint authentication system and a biometric face authentication system. Biometric face authentication used on site in telemedicine supports file encryption and login verification; as a result, patients' private information is protected. The screen of the Web medical image conference system can be shared by two or more conference terminals at the same time, and opinions can be exchanged using a camera and a microphone connected to the workstation. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new telemedicine network that can display suspected lesions three-dimensionally in a short time.
The results of this study indicate that our filmless radiological information system, based on the computer-aided diagnosis workstation and telemedicine network, can increase diagnostic speed and diagnostic accuracy and improve the security of medical information.
Recent development on computer aided tissue engineering--a review.
Sun, Wei; Lal, Pallavi
2002-02-01
The utilization of computer-aided technologies in tissue engineering has evolved in the development of a new field of computer-aided tissue engineering (CATE). This article reviews recent development and application of enabling computer technology, imaging technology, computer-aided design and computer-aided manufacturing (CAD and CAM), and rapid prototyping (RP) technology in tissue engineering, particularly, in computer-aided tissue anatomical modeling, three-dimensional (3-D) anatomy visualization and 3-D reconstruction, CAD-based anatomical modeling, computer-aided tissue classification, computer-aided tissue implantation and prototype modeling assisted surgical planning and reconstruction.
Space shuttle post-entry and landing analysis. Volume 1: Candidate system evaluations
NASA Technical Reports Server (NTRS)
Crawford, B. S.; Duiven, E. M.
1973-01-01
The general purpose of this study is to aid in the evaluation and design of multi-sensor navigation schemes proposed for the orbiter. The scope of the effort is limited to the post-entry, energy management, and approach and landing mission phases. One candidate system based on conventional navigation aids is illustrated including two DME (Distance Measuring Equipment) stations and ILS (Instrument Landing System) glide slope and localizer antennas. Some key elements of the system not shown are the onboard IMUs (Inertial Measurement Units), altimeters, and a computer. The latter is programmed to mix together (filter) the IMU data and the externally-derived data. A completely automatic, all-weather landing capability is required. Since no air-breathing engines will be carried on orbital flights, there will be no chance to go around and try again following a missed approach.
NASA Astrophysics Data System (ADS)
Semplice, Matteo; Loubère, Raphaël
2018-02-01
In this paper we propose a third order accurate finite volume scheme based on a posteriori limiting of polynomial reconstructions within an Adaptive-Mesh-Refinement (AMR) simulation code for hydrodynamics equations in 2D. The a posteriori limiting is based on the detection of problematic cells on a so-called candidate solution computed at each stage of a third order Runge-Kutta scheme. Such detection may include different properties, derived from physics, such as positivity, from numerics, such as a non-oscillatory behavior, or from computer requirements such as the absence of NaNs. Troubled cell values are discarded and re-computed starting again from the previous time-step using a more dissipative scheme but only locally, close to these cells. By locally decrementing the degree of the polynomial reconstructions from 2 to 0 we switch from a third-order to a first-order accurate but more stable scheme. The entropy indicator sensor is used to refine/coarsen the mesh. This sensor is also employed in an a posteriori manner because if some refinement is needed at the end of a time step, then the current time-step is recomputed with the refined mesh, but only locally, close to the new cells. We show on a large set of numerical tests that this a posteriori limiting procedure coupled with the entropy-based AMR technology can maintain not only optimal accuracy on smooth flows but also stability on discontinuous profiles such as shock waves, contacts, interfaces, etc. Moreover, numerical evidence shows that this approach is at least comparable in terms of accuracy and cost to a more classical CWENO approach within the same AMR context.
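The detection step described above (physics, numerics, and computer-requirement checks on the candidate solution) can be sketched as a per-cell flagging function. The thresholds, the periodic stencil, and the oscillation criterion are assumptions for illustration, not the paper's exact admissibility criteria:

```python
import numpy as np

def troubled_cells(candidate, floor=0.0, osc_tol=2.0):
    """Flag cells of a candidate solution that fail admissibility checks:
    NaNs (computer requirement), loss of positivity (physics), or spurious
    new extrema relative to the two neighbours (numerics, assumed criterion)."""
    u = np.asarray(candidate, dtype=float)
    bad = np.isnan(u) | (u <= floor)
    left, right = np.roll(u, 1), np.roll(u, -1)          # periodic neighbours
    lo, hi = np.minimum(left, right), np.maximum(left, right)
    span = hi - lo
    bad |= (u > hi + osc_tol * span) | (u < lo - osc_tol * span)
    return bad

flags = troubled_cells([1.0, 1.0, 1.0, 10.0, 1.0, 1.0])   # spike at index 3
```

Flagged cells would then be recomputed from the previous time step with the lower-order, more dissipative reconstruction, as the abstract describes.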
Arabaci, Murat; Djordjevic, Ivan B; Saunders, Ross; Marcoccia, Roberto M
2010-02-01
In order to achieve high-speed transmission over optical transport networks (OTNs) and maximize their throughput, we propose using a rate-adaptive polarization-multiplexed coded multilevel modulation with coherent detection based on component non-binary quasi-cyclic (QC) LDPC codes. Compared to the prior-art bit-interleaved LDPC-coded modulation (BI-LDPC-CM) scheme, the proposed non-binary LDPC-coded modulation (NB-LDPC-CM) scheme not only reduces latency due to symbol-level instead of bit-level processing but also provides either an impressive reduction in computational complexity or striking improvements in coding gain, depending on the constellation size. As the paper presents, compared to its prior-art binary counterpart, the proposed NB-LDPC-CM scheme better addresses the needs of future OTNs, namely achieving the target BER performance and providing the maximum possible throughput over the entire lifetime of the OTN.
A ROC-based feature selection method for computer-aided detection and diagnosis
NASA Astrophysics Data System (ADS)
Wang, Songyuan; Zhang, Guopeng; Liao, Qimei; Zhang, Junying; Jiao, Chun; Lu, Hongbing
2014-03-01
Image-based computer-aided detection and diagnosis (CAD) has been a very active research topic aiming to assist physicians in detecting lesions and distinguishing benign from malignant ones. However, the datasets fed into a classifier usually suffer from a small number of samples, with significantly fewer samples available in one class (with disease) than in the other, resulting in suboptimal classifier performance. Identifying the most characterizing features of the observed data for lesion detection is critical to improving the sensitivity and minimizing the false positives of a CAD system. In this study, we propose a novel feature selection method, mR-FAST, that combines the minimal-redundancy-maximal-relevance (mRMR) framework with a selection metric, FAST (feature assessment by sliding thresholds), based on the area under the ROC curve (AUC) generated on optimal simple linear discriminants. With three feature datasets extracted from CAD systems for colon polyps and bladder cancer, we show that the space of candidate features selected by mR-FAST is more characterizing for lesion detection, with higher AUC, enabling a compact subset of superior features to be found at low cost.
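The combination of an AUC-based relevance score with a redundancy penalty can be sketched greedily. This is an illustrative stand-in, not the authors' mR-FAST: single-feature AUC plays the FAST-like relevance role, and absolute correlation substitutes for mRMR's redundancy term; the synthetic features are assumptions:

```python
import numpy as np

def feature_auc(x, y):
    """FAST-like relevance: AUC of a single feature used as a 1-D discriminant."""
    pos, neg = x[y == 1], x[y == 0]
    gt = (pos[:, None] > neg[None, :]).mean()
    eq = (pos[:, None] == neg[None, :]).mean()
    a = gt + 0.5 * eq
    return max(a, 1.0 - a)          # orientation of the discriminant is free

def greedy_select(X, y, k=2, redundancy_weight=0.5):
    """Greedy relevance-minus-redundancy selection (mRMR-style); absolute
    correlation with already-selected features stands in for redundancy."""
    relevance = np.array([feature_auc(X[:, j], y) for j in range(X.shape[1])])
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        scores = {}
        for j in range(X.shape[1]):
            if j in selected:
                continue
            red = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) for s in selected])
            scores[j] = relevance[j] - redundancy_weight * red
        selected.append(max(scores, key=scores.get))
    return selected

rng = np.random.default_rng(2)
y = rng.integers(0, 2, 500)
f0 = y + 0.1 * rng.normal(size=500)    # strong feature
f1 = f0 + 0.01 * rng.normal(size=500)  # near-duplicate of f0 (redundant)
f2 = y + 0.5 * rng.normal(size=500)    # weaker but non-redundant feature
f3 = rng.normal(size=500)              # pure noise
chosen = greedy_select(np.column_stack([f0, f1, f2, f3]), y)
```

The redundancy term steers the second pick away from the near-duplicate feature toward the weaker but complementary one, which is the behavior mR-FAST is designed to exploit.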
NASA Astrophysics Data System (ADS)
Tian, Yuexin; Gao, Kun; Liu, Ying; Han, Lu
2015-08-01
Aiming at the nonlinear and non-Gaussian features of real infrared scenes, an optimal nonlinear filtering based algorithm for infrared dim-target track-before-detect applications is proposed. It uses nonlinear theory to construct the state and observation models and uses the spectral-separation-scheme-based Wiener chaos expansion method to solve the stochastic differential equation of the constructed models. In order to improve computational efficiency, the most time-consuming operations, which are independent of the observation data, are carried out before the observation stage; the remaining observation-dependent rapid computations are implemented subsequently. Simulation results show that the algorithm possesses excellent detection performance and is more suitable for real-time processing.
NASA Astrophysics Data System (ADS)
Huang, Jia-Yann; Kao, Pan-Fu; Chen, Yung-Sheng
2007-06-01
Adjustment of brightness and contrast in nuclear medicine whole-body bone scan images may confuse nuclear medicine physicians when they identify small bone lesions, and may make it difficult to identify subtle bone lesion changes in sequential studies. In this study, we developed a computer-aided diagnosis system, based on the fuzzy-sets histogram thresholding method and an anatomical knowledge-based image segmentation method, that was able to analyze and quantify raw image data and identify the possible location of a lesion. To locate anatomical reference points, the fuzzy-sets histogram thresholding method was adopted as a first processing stage to suppress the soft tissue in the bone images. The anatomical knowledge-based image segmentation method was then applied to segment the skeletal frame into different regions of homogeneous bones. For the different segmented bone regions, the lesion thresholds were set at different cut-offs. To obtain the lesion thresholds in the different segmented regions, the ranges and standard deviations of the images' gray-level distributions were obtained from 100 normal patients' whole-body bone images; another 62 patients' images were then used for testing. The two groups of images were independent. The sensitivity and the mean number of false lesions detected were used as performance indices to evaluate the proposed system. The overall sensitivity of the system is 92.1% (222 of 241), with 7.58 false detections per patient scan image. With a high sensitivity and an acceptable false lesion detection rate, this computer-aided automatic lesion detection system is demonstrated to be useful and should help nuclear medicine physicians identify possible bone lesions.
NASA Technical Reports Server (NTRS)
Sjoegreen, B.; Yee, H. C.
2001-01-01
The recently developed essentially fourth-order or higher low-dissipative shock-capturing scheme of Yee, Sandham and Djomehri (1999) aimed at minimizing numerical dissipation for high-speed compressible viscous flows containing shocks, shears and turbulence. To detect non-smooth behavior and control the amount of numerical dissipation to be added, Yee et al. employed the artificial compression method (ACM) of Harten (1978), but utilized it in an entirely different context than Harten originally intended. The ACM sensor consists of two tuning parameters and is highly physical-problem dependent. To minimize the tuning of parameters and the physical-problem dependence, new sensors with improved detection properties are proposed. The new sensors are derived from appropriate non-orthogonal wavelet basis functions and can be used to completely switch off the extra numerical dissipation outside shock layers. The non-dissipative spatial base scheme of arbitrarily high order of accuracy can be maintained without compromising its stability at all parts of the domain where the solution is smooth. Two types of redundant non-orthogonal wavelet basis functions are considered. One is the B-spline wavelet (Mallat & Zhong 1992) used by Gerritsen and Olsson (1996) in an adaptive mesh refinement method to determine regions where refinement should be done. The other is a modification of the multiresolution method of Harten (1995), converting it to a new, redundant, non-orthogonal wavelet. The wavelet sensor is then obtained by computing the estimated Lipschitz exponent of a chosen physical quantity (or vector) to be sensed on a chosen wavelet basis function. Both wavelet sensors can be viewed as dual-purpose adaptive methods leading to dynamic numerical dissipation control and improved grid adaptation indicators. Consequently, they are useful not only for shock-turbulence computations but also for computational aeroacoustics and numerical combustion. In addition, these sensors are scheme independent and can be stand-alone options for numerical algorithms other than the Yee et al. scheme.
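The quantity both wavelet sensors estimate, the local Lipschitz (Hölder) exponent, governs how a function's variation shrinks with scale: |f(x0+h) - f(x0)| ~ h^α, with α near 1 at smooth points and α near 0 at jumps. A simplified stand-in for the wavelet-based estimate fits that log-log decay directly on increments (the full sensor measures the same decay on wavelet coefficients across scales):

```python
import numpy as np

def lipschitz_exponent(f, x0, scales=(1e-4, 1e-3, 1e-2, 1e-1)):
    """Estimate the local Lipschitz exponent at x0 from the scaling of
    increments |f(x0+h)-f(x0)| ~ h^alpha across a range of scales h."""
    h = np.asarray(scales)
    d = np.abs([f(x0 + s) - f(x0) for s in h])
    slope, _ = np.polyfit(np.log(h), np.log(d), 1)   # log-log slope = alpha estimate
    return slope
```

A small exponent signals non-smooth behavior, which is where the sensor would switch the extra numerical dissipation on; near-unit exponents mark smooth regions where the non-dissipative base scheme can run untouched.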
Sugimoto, Katsutoshi; Shiraishi, Junji; Moriyasu, Fuminori; Doi, Kunio
2009-04-01
To develop a computer-aided diagnostic (CAD) scheme for classifying focal liver lesions (FLLs) by use of physicians' subjective classification of echogenic patterns of FLLs on baseline and contrast-enhanced ultrasonography (US). A total of 137 hepatic lesions in 137 patients were evaluated with B-mode and NC100100 (Sonazoid)-enhanced pulse-inversion US; lesions included 74 hepatocellular carcinomas (HCCs) (23: well-differentiated, 36: moderately differentiated, 15: poorly differentiated HCCs), 33 liver metastases, and 30 liver hemangiomas. Three physicians evaluated single images at B-mode and arterial phases with a cine mode. Physicians were asked to classify each lesion into one of eight B-mode and one of eight enhancement patterns, but did not make a diagnosis. To classify five types of FLLs, we employed a decision tree model with four decision nodes and four artificial neural networks (ANNs). The results of the physicians' pattern classifications were used successively for four different ANNs in making decisions at each of the decision nodes in the decision tree model. The classification accuracies for the 137 FLLs were 84.8% for metastasis, 93.3% for hemangioma, and 98.6% for all HCCs. In addition, the classification accuracies for histological differentiation types of HCCs were 65.2% for well-differentiated HCC, 41.7% for moderately differentiated HCC, and 80.0% for poorly differentiated HCC. This CAD scheme has the potential to improve the diagnostic accuracy of liver lesions. However, the accuracy in the histologic differential diagnosis of HCC based on baseline and contrast-enhanced US is still limited.
Computer-aided head film analysis: the University of California San Francisco method.
Baumrind, S; Miller, D M
1980-07-01
Computer technology is already assuming an important role in the management of orthodontic practices. The next 10 years are likely to see expansion in computer usage into the areas of diagnosis, treatment planning, and treatment-record keeping. In the areas of diagnosis and treatment planning, one of the first problems to be attacked will be the automation of head film analysis. The problems of constructing computer-aided systems for this purpose are considered herein in the light of the authors' 10 years of experience in developing a similar system for research purposes. The need for building in methods for automatic detection and correction of gross errors is discussed and the authors' method for doing so is presented. The construction of a rudimentary machine-readable data base for research and clinical purposes is described.
Annular tautomerism: experimental observations and quantum mechanics calculations.
Cruz-Cabeza, Aurora J; Schreyer, Adrian; Pitt, William R
2010-06-01
The use of MP2 level quantum mechanical (QM) calculations on isolated heteroaromatic ring systems for the prediction of the tautomeric propensities of whole molecules in a crystalline environment was examined. A Polarisable Continuum Model was used in the calculations to account for environment effects on the tautomeric relative stabilities. The calculated relative energies of tautomers were compared to relative abundances within the Cambridge Structural Database (CSD) and the Protein Data Bank (PDB). The work was focussed on 84 annular tautomeric forms of 34 common ring systems. Good agreement was found between the calculations and the experimental data even if the quantity of these data was limited in many cases. The QM results were compared to those produced by much faster semiempirical calculations. In a search for other sources of the useful experimental data, the relative numbers of known compounds in which prototropic positions were often substituted by heavy atoms were also analysed. A scheme which groups all annular tautomeric transformations into 10 classes was developed. The scheme was designed to encompass a comprehensive set of known and theoretically possible tautomeric ring systems generated as part of a previous study. General trends across analogous ring systems were detected as a result. The calculations and statistics collected on crystallographic data as well as the general trends observed should be useful for the better modelling of annular tautomerism in the applications such as computer-aided drug design, small molecule crystal structure prediction, the naming of compounds and the interpretation of protein-small molecule crystal structures.
Cardiac arrhythmia beat classification using DOST and PSO tuned SVM.
Raj, Sandeep; Ray, Kailash Chandra; Shankar, Om
2016-11-01
The increase in the number of deaths due to cardiovascular diseases (CVDs) has drawn significant attention to the study of electrocardiogram (ECG) signals. These ECG signals are examined by experienced cardiologists for accurate and proper diagnosis, but this becomes difficult and time-consuming for long-term recordings. Various signal processing techniques have been studied for analyzing the ECG signal, but they bear limitations due to its non-stationary behavior. Hence, this study aims to improve the classification accuracy rate and provide an automated diagnostic solution for the detection of cardiac arrhythmias. The proposed methodology consists of four stages, i.e. filtering, R-peak detection, feature extraction and classification. In this study, a wavelet-based approach is used to filter the raw ECG signal, whereas the Pan-Tompkins algorithm is used for detecting the R-peaks in the ECG signal. In the feature extraction stage, the discrete orthogonal Stockwell transform (DOST) is presented for an efficient time-frequency representation (i.e. morphological descriptors) of a time-domain signal; it retains the absolute phase information needed to distinguish the various non-stationary ECG signals. These morphological descriptors are further reduced to a lower-dimensional space using principal component analysis and combined with the dynamic features (i.e. based on the RR-intervals of the ECG signals) of the input signal. This combination of two different kinds of descriptors represents the feature set of an input signal, which is classified into the subsequent categories by employing PSO-tuned support vector machines (SVM).
The proposed methodology is validated on the benchmark MIT-BIH arrhythmia database and evaluated under two assessment schemes, yielding an improved overall accuracy of 99.18% for sixteen classes in the category-based scheme and 89.10% for five classes (mapped according to the AAMI standard) in the patient-based assessment scheme. The results are further compared to existing methodologies in the literature. The proposed feature representation of cardiac signals based on symmetrical features, along with the PSO-based optimization technique for the SVM classifier, reported improved classification accuracy in both assessment schemes evaluated on the benchmark MIT-BIH arrhythmia database, and hence can be utilized for automated computer-aided diagnosis of cardiac arrhythmia beats. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
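The R-peak detection stage can be illustrated with a toy, Pan-Tompkins-inspired detector (derivative, squaring, moving-window integration, thresholding). This is a simplified sketch on a synthetic spike train, omitting the band-pass stage and the adaptive thresholds of the full published algorithm:

```python
import numpy as np

def detect_r_peaks(ecg, fs, window_ms=150, thresh_frac=0.5):
    """Toy Pan-Tompkins-style detector: derivative -> squaring ->
    moving-window integration -> fixed-fraction thresholding."""
    deriv = np.diff(ecg, prepend=ecg[0])        # emphasise steep QRS slopes
    squared = deriv ** 2                        # rectify and amplify
    win = max(1, int(fs * window_ms / 1000))
    integrated = np.convolve(squared, np.ones(win) / win, mode="same")
    thresh = thresh_frac * integrated.max()
    above = integrated > thresh
    peaks = []                                  # one peak per above-threshold run
    i = 0
    while i < len(above):
        if above[i]:
            j = i
            while j < len(above) and above[j]:
                j += 1
            peaks.append(i + int(np.argmax(ecg[i:j])))
            i = j
        else:
            i += 1
    return peaks

# Synthetic "ECG": flat baseline with impulses at known sample positions
fs = 250
sig = np.zeros(fs * 4)
for p in (200, 450, 700):
    sig[p] = 1.0
peaks = detect_r_peaks(sig, fs)
print(peaks)
```

On this clean signal the detector recovers the three impulse positions exactly; real recordings additionally need the filtering stage described above.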
NASA Astrophysics Data System (ADS)
Liu, Jiamin; Wang, Shijun; Kabadi, Suraj; Summers, Ronald M.
2009-02-01
CT colonography (CTC) is a feasible and minimally invasive method for the detection of colorectal polyps and cancer screening. Computer-aided detection (CAD) of polyps has improved the consistency and sensitivity of virtual colonoscopy interpretation and reduced the interpretation burden. A CAD system typically consists of four stages: (1) image preprocessing including colon segmentation; (2) initial detection generation; (3) feature selection; and (4) detection classification. In our experience, three existing problems limit the performance of our current CAD system. First, high-density orally administered contrast agents in fecal-tagging CTC have scatter effects on neighboring tissues. The scattering manifests itself as an artificial elevation in the observed CT attenuation values of the neighboring tissues. This pseudo-enhancement phenomenon presents a problem for computer-aided polyp detection, especially when polyps are submerged in the contrast agents. Second, the general kernel approach for surface curvature computation in the second stage of our CAD system can yield erroneous results for thin structures such as small (6-9 mm) polyps and for touching structures such as polyps that lie on haustral folds. These erroneous curvatures reduce the sensitivity of polyp detection. The third problem is that more than 150 features are selected from each polyp candidate in the third stage of our CAD system. These high-dimensional features make it difficult to learn a good decision boundary for detection classification and reduce the accuracy of predictions. Therefore, an improved CAD system for polyp detection in CTC data is proposed by introducing three new techniques. First, a scale-based scatter correction algorithm is applied to reduce pseudo-enhancement effects in the image pre-processing stage. Second, a cubic spline interpolation method is utilized to accurately estimate curvatures for initial detection generation.
Third, a new dimensionality reduction classifier combining diffusion maps and local linear embedding (DMLLE) is developed for classification and false positive (FP) reduction. The performance of the improved CAD system is evaluated and compared with our existing CAD system (without those techniques) using CT scans of 1186 patients, divided into a training set and a test set. The sensitivity of the improved CAD system increased by 18% on the training data at a rate of 5 FPs per patient and by 15% on the test data at the same FP rate. Our results indicate that the improved CAD system achieves significantly better performance on medium-sized colonic adenomas, with higher sensitivity and a lower FP rate in CTC.
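The curvature computation that the cubic-spline stage improves can be sketched with the standard parametric-curvature formula; here finite differences stand in for the spline derivatives, and the circle test case (whose true curvature is the reciprocal of its radius) is illustrative only:

```python
import numpy as np

def curve_curvature(x, y):
    """Curvature of a sampled planar curve via finite-difference derivatives:
    kappa = |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2)."""
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    return np.abs(dx * ddy - dy * ddx) / (dx ** 2 + dy ** 2) ** 1.5

# Sanity check on a circle of radius 5 (true curvature 0.2 everywhere)
t = np.linspace(0, 2 * np.pi, 2000)
kappa = curve_curvature(5 * np.cos(t), 5 * np.sin(t))
print(kappa[1000])
```

The formula is invariant to the curve parametrization, which is why the unit-spaced index can serve as the parameter; for noisy CT surfaces, a fitted spline gives the derivatives far more robustly than raw differences.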
NASA Astrophysics Data System (ADS)
Mori, Shintaro; Hara, Takeshi; Tagami, Motoki; Muramatsu, Chicako; Kaneda, Takashi; Katsumata, Akitoshi; Fujita, Hiroshi
2013-02-01
Inflammation of the paranasal sinuses sometimes becomes chronic, requiring long-term treatment. The finding is important for early treatment, but general dentists may not recognize it because they focus on the teeth. The purpose of this study was to develop a computer-aided detection (CAD) system for inflammation of the paranasal sinuses on dental panoramic radiographs (DPRs) by using the mandible contour, and to demonstrate the potential usefulness of the CAD system by means of receiver operating characteristic analysis. The detection scheme consists of 3 steps: 1) contour extraction of the mandible, 2) contralateral subtraction, and 3) automated detection. The Canny operator and an active contour model were applied to extract the edge in the first step. In the subtraction step, the right region of the extracted contour image was flipped for comparison with the left region. Mutual information between the two selected regions was computed to estimate the shift parameters for image registration, and the subtraction images were generated based on these parameters. Rectangular regions of the left and right paranasal sinuses on the subtraction image were determined based on the size of the mandible, and the abnormal side was determined by taking the difference between the averages of each region. Thirteen readers responded to all cases without and with the automated results. The average AUC of all readers increased from 0.69 to 0.73 with statistical significance (p=0.032) when the automated detection results were provided. In conclusion, the automated detection method based on the contralateral subtraction technique improves readers' interpretation performance for inflammation of the paranasal sinuses on DPRs.
Bass, Sarah Bauerle; Gordon, Thomas F.; Ruzek, Sheryl Burt; Wolak, Caitlin; Ruggieri, Dominique; Mora, Gabriella; Rovito, Michael J.; Britto, Johnson; Parameswaran, Lalitha; Abedin, Zainab; Ward, Stephanie; Paranjape, Anuradha; Lin, Karen; Meyer, Brian; Pitts, Khaliah
2017-01-01
African Americans have higher colorectal cancer (CRC) mortality than White Americans and yet have lower rates of CRC screening. Increased screening aids early detection and improves survival rates. Coupled with low literacy rates, the burden of CRC morbidity and mortality is exacerbated in this population, making it important to develop culturally and literacy-appropriate aids to help low-literacy African Americans make informed decisions about CRC screening. This article outlines the development of a low-literacy computer touch-screen colonoscopy decision aid using an innovative marketing method called perceptual mapping and message vector modeling. This method was used to mathematically model key messages for the decision aid, which were then used to modify an existing CRC screening tutorial with different messages. The final tutorial was delivered through computer touch-screen technology to increase access and ease of use for participants. Testing showed users were not only more comfortable with the touch-screen technology but were also significantly more willing to have a colonoscopy compared with a "usual care" group. Results confirm the importance of including participants in planning and show that the use of these innovative mapping and message design methods can lead to significant change in attitudes toward CRC screening. PMID:23132838
Data acquisition and path selection decision making for an autonomous roving vehicle
NASA Technical Reports Server (NTRS)
Frederick, D. K.; Shen, C. N.; Yerazunis, S. W.
1976-01-01
Problems related to the guidance of an autonomous rover for unmanned planetary exploration were investigated. Topics included in these studies were: simulation on an interactive graphics computer system of the Rapid Estimation Technique for detection of discrete obstacles; incorporation of a simultaneous Bayesian estimate of states and inputs in the Rapid Estimation Scheme; development of methods for estimating actual laser rangefinder errors and their application to data provided by the Jet Propulsion Laboratory; and modification of a path selection system simulation computer code for evaluation of a hazard detection system based on laser rangefinder data.
Drew, Mark S.
2016-01-01
Cutaneous melanoma is the most life-threatening form of skin cancer. Although advanced melanoma is often considered as incurable, if detected and excised early, the prognosis is promising. Today, clinicians use computer vision in an increasing number of applications to aid early detection of melanoma through dermatological image analysis (dermoscopy images, in particular). Colour assessment is essential for the clinical diagnosis of skin cancers. Due to this diagnostic importance, many studies have either focused on or employed colour features as a constituent part of their skin lesion analysis systems. These studies range from using low-level colour features, such as simple statistical measures of colours occurring in the lesion, to availing themselves of high-level semantic features such as the presence of blue-white veil, globules, or colour variegation in the lesion. This paper provides a retrospective survey and critical analysis of contributions in this research direction. PMID:28096807
NASA Astrophysics Data System (ADS)
Wormanns, Dag; Fiebich, Martin; Wietholt, Christian; Diederich, Stefan; Heindel, Walter
2000-06-01
We evaluated the practical application of a computer-aided diagnosis (CAD) system, including automatic detection of pulmonary nodules, for viewing low-dose screening spiral computed tomography (CT) examinations of the chest. A UNIX-based CAD system was developed comprising a detection algorithm for pulmonary nodules and a user interface providing the original axial image, the same image with nodules highlighted, a thin-slab MIP, and a cine mode. So far, 26 CT examinations with 1625 images have been reviewed in a clinical setting and reported by an experienced radiologist using both the CAD system and hardcopies. The CT studies exhibited 19 nodules found on the hardcopies in consensus reporting by 2 experienced radiologists. Viewing with the CAD system was more time-consuming than using hardcopies (4.16 vs. 2.92 min) due to analyzing the MIP and cine mode. The algorithm detected 49% (18/37) of pulmonary nodules larger than 5 mm and 30% (21/70) of all nodules, producing an average of 6.3 false positive findings per CT study. Most of the missed nodules were adjacent to the pleura. However, the program detected 6 nodules missed by the radiologists. Automatic nodule detection increases the radiologists' awareness of pulmonary lesions, and simultaneous display of the axial image and a thin-slab MIP makes the radiologist more confident in the diagnosis of smaller pulmonary nodules. The CAD system improves the detection of pulmonary nodules on spiral CT. Lack of sensitivity and specificity is still an issue to be addressed but does not prevent practical use.
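The thin-slab MIP shown in the viewing interface projects the maximum voxel value over a few neighboring slices, making small nodules stand out against vessels. A minimal sketch on a synthetic volume (slab width and geometry are illustrative assumptions):

```python
import numpy as np

def thin_slab_mip(volume, center, slab=5, axis=0):
    """Thin-slab maximum intensity projection: take the maximum voxel value
    over a small stack of slices around `center` along `axis`."""
    half = slab // 2
    lo = max(0, center - half)
    hi = min(volume.shape[axis], center + half + 1)
    sub = np.take(volume, range(lo, hi), axis=axis)
    return sub.max(axis=axis)

# Synthetic volume: a bright "nodule" two slices away from the viewed slice
vol = np.zeros((10, 16, 16))
vol[6, 8, 8] = 100.0
mip = thin_slab_mip(vol, center=5, slab=5)
print(mip[8, 8])
```

A nodule two slices off the currently viewed slice would be invisible in the plain axial image but appears in the 5-slice MIP, which is the diagnostic benefit noted above.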
FINDS: A fault inferring nonlinear detection system programmers manual, version 3.0
NASA Technical Reports Server (NTRS)
Lancraft, R. E.
1985-01-01
Detailed software documentation of the digital computer program FINDS (Fault Inferring Nonlinear Detection System) Version 3.0 is provided. FINDS is a highly modular and extensible computer program designed to monitor and detect sensor failures, while at the same time providing reliable state estimates. In this version of the program the FINDS methodology is used to detect, isolate, and compensate for failures in simulated avionics sensors used by the Advanced Transport Operating Systems (ATOPS) Transport System Research Vehicle (TSRV) in a Microwave Landing System (MLS) environment. It is intended that this report serve as a programmers guide to aid in the maintenance, modification, and revision of the FINDS software.
Computer aided lung cancer diagnosis with deep learning algorithms
NASA Astrophysics Data System (ADS)
Sun, Wenqing; Zheng, Bin; Qian, Wei
2016-03-01
Deep learning is considered a popular and powerful method in pattern recognition and classification. However, there are not many deep structured applications in the medical imaging diagnosis area, because large datasets are not always available for medical images. In this study we tested the feasibility of using deep learning algorithms for lung cancer diagnosis with cases from the Lung Image Database Consortium (LIDC) database. The nodules on each computed tomography (CT) slice were segmented according to marks provided by the radiologists. After down-sampling and rotating we acquired 174,412 samples of 52 × 52 pixels each and the corresponding truth files. Three deep learning algorithms were designed and implemented: Convolutional Neural Network (CNN), Deep Belief Networks (DBNs), and Stacked Denoising Autoencoder (SDAE). To compare the performance of deep learning algorithms with a traditional computer-aided diagnosis (CADx) system, we designed a scheme with 28 image features and a support vector machine. The accuracies of CNN, DBNs, and SDAE are 0.7976, 0.8119, and 0.7929, respectively; the accuracy of our designed traditional CADx is 0.7940, which is slightly lower than CNN and DBNs. We also noticed that the nodules mislabeled by DBNs are 4% larger than those mislabeled by the traditional CADx; this might result from the down-sampling process losing some size information about the nodules.
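The basic building block of the CNN tested above is a convolutional layer followed by a nonlinearity. A minimal single-channel sketch in NumPy; the edge-detecting kernel and the toy patch are illustrative, not taken from the study:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Single-channel 'valid' 2-D convolution (cross-correlation, as used in
    CNN layers), followed by a ReLU nonlinearity."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return np.maximum(out, 0.0)  # ReLU

# A horizontal-gradient kernel responding to the boundary in a toy 6x6 patch
patch = np.zeros((6, 6))
patch[:, 3:] = 1.0
edge_kernel = np.array([[-1.0, 1.0]])
response = conv2d_valid(patch, edge_kernel)
print(response.max())
```

A trained CNN stacks many such layers with learned kernels; here the hand-picked kernel simply fires on the dark-to-bright transition in the patch.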
NASA Astrophysics Data System (ADS)
Tian, Fuyang; Cao, Dong; Dong, Xiaoning; Zhao, Xinqiang; Li, Fade; Wang, Zhonghua
2017-06-01
Recognition of behavioural features is an important means of detecting oestrus and sickness in dairy herds, and there is a need for heat-detection aids. In this paper, the detection method was based on measuring the individual behavioural activity, standing time, and temperature of dairy cows using a vibration sensor and a temperature sensor. The data on behavioural activity index, standing time, lying time and walking time were sent to a computer by a low-power wireless communication system. A fast approximate K-means algorithm (FAKM) was proposed to process the sensor data for behavioural feature recognition. As a result of technical progress in monitoring cows using computers, automatic oestrus detection has become possible.
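FAKM is an approximate variant of K-means; the baseline Lloyd iteration it accelerates can be sketched as follows. The deterministic farthest-point initialization is an assumption made here for reproducibility, not part of the paper:

```python
import numpy as np

def kmeans(data, k, iters=50):
    """Plain Lloyd's K-means (the baseline that fast approximate variants
    such as FAKM speed up): alternate assignment and centroid update."""
    # farthest-point initialisation (deterministic)
    centroids = [data[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(data - c, axis=1) for c in centroids], axis=0)
        centroids.append(data[d.argmax()])
    centroids = np.array(centroids)
    for _ in range(iters):
        # assign each point to its nearest centroid
        d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its assigned points
        for c in range(k):
            if np.any(labels == c):
                centroids[c] = data[labels == c].mean(axis=0)
    return labels, centroids

# Two well-separated synthetic "behaviour" clusters (e.g. resting vs active)
rng = np.random.default_rng(1)
a = rng.normal(0.0, 0.1, (30, 2))
b = rng.normal(5.0, 0.1, (30, 2))
labels, centroids = kmeans(np.vstack([a, b]), k=2)
print(set(labels[:30].tolist()), set(labels[30:].tolist()))
```

On separable activity data like this, the two clusters are recovered exactly; the "fast approximate" refinement concerns the nearest-centroid search, not the overall alternation.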
Foundation and methodologies in computer-aided diagnosis systems for breast cancer detection.
Jalalian, Afsaneh; Mashohor, Syamsiah; Mahmud, Rozi; Karasfi, Babak; Saripan, M Iqbal B; Ramli, Abdul Rahman B
2017-01-01
Breast cancer is the most prevalent cancer affecting women all over the world. Early detection and treatment of breast cancer could reduce the mortality rate. Issues such as technical limitations related to imaging quality, as well as human error, increase the misdiagnosis of breast cancer by radiologists. Computer-aided detection (CAD) systems have been developed to overcome these restrictions and have been studied in many imaging modalities for breast cancer detection in recent years. CAD systems improve radiologists' performance in finding and discriminating between normal and abnormal tissues. These systems act only as a second reader; the final decisions are still made by the radiologist. In this study, recent CAD systems for breast cancer detection on different modalities such as mammography, ultrasound, MRI, and biopsy histopathological images are introduced. The foundation of CAD systems generally consists of four stages: pre-processing, segmentation, feature extraction, and classification. The approaches applied to designing the different stages of a CAD system are summarised, and the advantages and disadvantages of different segmentation, feature extraction and classification techniques are listed. In addition, the impact of imbalanced datasets on classification outcomes and appropriate methods to address this issue are discussed. Finally, performance evaluation metrics for the various stages of breast cancer detection CAD systems are reviewed.
A computer-aided detection (CAD) system with a 3D algorithm for small acute intracranial hemorrhage
NASA Astrophysics Data System (ADS)
Wang, Ximing; Fernandez, James; Deshpande, Ruchi; Lee, Joon K.; Chan, Tao; Liu, Brent
2012-02-01
Acute intracranial hemorrhage (AIH) requires urgent diagnosis in the emergency setting to mitigate eventual sequelae. However, experienced radiologists may not always be available to make a timely diagnosis. This is especially true for small AIH, defined as lesions smaller than 10 mm in size. A computer-aided detection (CAD) system for the detection of small AIH would facilitate timely diagnosis. A previously developed 2D algorithm showed high false positive rates in an evaluation based on LAC/USC cases, due to the difficulty of setting up a correct coordinate system for the knowledge-based classification. To achieve higher sensitivity and specificity, a new 3D algorithm was developed. The algorithm utilizes a top-hat transformation and a dynamic threshold map to detect small AIH lesions. Several key structures of the brain are detected and used to set up a 3D anatomical coordinate system, and a rule-based classification of the detected lesions is applied based on this coordinate system. For convenient evaluation in the clinical environment, the CAD module is integrated with a stand-alone system. The CAD is evaluated on small AIH cases and matched normal cases collected at LAC/USC, and the results of the 3D CAD and the previous 2D CAD are compared.
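The white top-hat transformation at the core of the detection step subtracts a morphological opening from the image, leaving small bright structures such as candidate hyperdense lesions while removing the smooth background. A 2D grayscale sketch (the window size and test image are illustrative):

```python
import numpy as np

def gray_filter(img, size, func):
    """Apply min (erosion) or max (dilation) over a size x size window,
    with edge replication at the borders."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = func(padded[i:i + size, j:j + size])
    return out

def white_top_hat(img, size=3):
    """White top-hat: image minus its morphological opening
    (erosion followed by dilation)."""
    opening = gray_filter(gray_filter(img, size, np.min), size, np.max)
    return img - opening

# Smooth background plus one small bright blob
img = np.full((9, 9), 10.0)
img[4, 4] = 25.0
tophat = white_top_hat(img)
print(tophat[4, 4], tophat[0, 0])
```

Structures narrower than the window cannot survive the opening, so they appear in the top-hat image at their height above the local background, ready for the dynamic thresholding that follows.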
Bakht, Mohamadreza K; Pouladian, Majid; Mofrad, Farshid B; Honarpisheh, Hamid
2014-02-01
Quantitative analysis based on digital skin images has been proven helpful in dermatology. However, the borders of basal cell carcinoma (BCC) lesions have been challenging for automatic detection methods. In this work, a computer-aided dermatoscopy system was proposed to enhance the clinical detection of BCC lesion borders. Fifty cases of BCC were selected and 2000 pictures were taken. The lesion image data were obtained with eight colors of flashlights and at five different lighting source-to-skin distances (SSDs). Image-processing techniques were then used for automatic detection of lesion borders, and the dermatologists marked the lesions on the obtained photos. Considerable differences between the values obtained from photographs taken under super blue and aqua green lighting were observed for most of the BCC borders. It was observed that by changing the SSD, an optimum distance could be found at which the detection accuracy reaches a maximum. This study clearly indicates that by changing the SSD and lighting color, manual and automatic detection of BCC lesion borders can be enhanced. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Song, Bowen; Zhang, Guopeng; Wang, Huafeng; Zhu, Wei; Liang, Zhengrong
2013-02-01
Various types of features, e.g., geometric features, texture features, projection features, etc., have been introduced for polyp detection and differentiation tasks via computer-aided detection and diagnosis (CAD) for computed tomography colonography (CTC). Although these features together cover more information of the data, some of them are statistically highly related to others, which makes the feature set redundant and burdens the computation task of CAD. In this paper, we propose a new dimension reduction method which combines hierarchical clustering and principal component analysis (PCA) for the false positive (FP) reduction task. First, we group all the features based on their similarity using hierarchical clustering, and then PCA is employed within each group. Different numbers of principal components are selected from each group to form the final feature set, and a support vector machine is used to perform the classification. The results show that when three principal components were chosen from each group, we achieved an area under the receiver operating characteristic curve of 0.905, as high as with the original feature set, while the computation time was reduced by 70% and the feature set size by 77%. It can be concluded that the proposed method captures the most important information of the feature set and that the classification accuracy is not affected by the dimension reduction. The result is promising, and further investigation, such as automatic threshold setting, is worthwhile and in progress.
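The "PCA within each group" step can be sketched directly; in this toy version the feature groups are supplied by hand rather than produced by hierarchical clustering, and one component is kept per group:

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project centred data onto its top principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:n_components].T

def grouped_pca(X, groups, n_components=1):
    """Apply PCA separately inside each feature group and concatenate the
    resulting components (the paper forms the groups by hierarchical
    clustering; here they are given explicitly)."""
    parts = [pca_reduce(X[:, idx], n_components) for idx in groups]
    return np.hstack(parts)

# 100 samples, 4 features: features {0,1} correlated, {2,3} correlated
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 2))
X = np.column_stack([base[:, 0],
                     base[:, 0] + 0.01 * rng.normal(size=100),
                     base[:, 1],
                     base[:, 1] + 0.01 * rng.normal(size=100)])
reduced = grouped_pca(X, groups=[[0, 1], [2, 3]], n_components=1)
print(reduced.shape)
```

Each group of near-duplicate features collapses to a single component that carries almost all of its variance, which is exactly the redundancy-removal effect reported above.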
Mishra, Raghavendra; Barnwal, Amit Kumar
2015-05-01
The Telecare Medical Information System (TMIS) delivers effective healthcare services by employing information and communication technologies. Privacy and security are always matters of great concern in TMIS. Recently, Chen et al. presented a password-based authentication scheme to address privacy and security; it was later proved insecure against various active and passive attacks. To erase the drawbacks of Chen et al.'s anonymous authentication scheme, several password-based authentication schemes have been proposed using public key cryptosystems. However, most of them do not provide pre-smart-card authentication, which leads to inefficient login and password change phases. To present an authentication scheme with pre-smart-card authentication, we propose an improved anonymous smart-card-based authentication scheme for TMIS. The proposed scheme protects user anonymity and satisfies all the desirable security attributes. Moreover, it provides efficient login and password change phases, in which incorrect input can be quickly detected and a user can freely change his password without server assistance. We demonstrate the validity of the proposed scheme by utilizing the widely accepted BAN (Burrows, Abadi, and Needham) logic. The proposed scheme is also comparable in terms of computational overhead with related schemes.
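The practical benefit of pre-smart-card authentication is that a wrong identity or password is caught locally, before any protocol message reaches the server. A toy illustration with a stored hash verifier; this is not the paper's actual scheme, and the unsalted hash is used purely for brevity:

```python
import hashlib

def h(*parts):
    """Toy hash of concatenated byte strings."""
    return hashlib.sha256(b"|".join(parts)).hexdigest()

class ToyCard:
    """Illustrative 'smart card' storing a local verifier so that incorrect
    input is rejected before contacting the server (pre-smart-card
    authentication). Not the protocol proposed in the paper."""
    def __init__(self, identity, password):
        self.verifier = h(identity.encode(), password.encode())

    def local_check(self, identity, password):
        return h(identity.encode(), password.encode()) == self.verifier

    def change_password(self, identity, old_pw, new_pw):
        # wrong old password is detected locally, without server assistance
        if not self.local_check(identity, old_pw):
            return False
        self.verifier = h(identity.encode(), new_pw.encode())
        return True

card = ToyCard("alice", "s3cret")
print(card.local_check("alice", "wrong"),
      card.change_password("alice", "s3cret", "n3w"),
      card.local_check("alice", "n3w"))
```

Because the check and the update both run on the card, the password change phase needs no round trip to the server, which is the efficiency property the scheme claims.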
Program on application of communications satellites to educational development
NASA Technical Reports Server (NTRS)
Morgan, R. P.; Singh, J. P.
1971-01-01
Interdisciplinary research in needs analysis, communications technology studies, and systems synthesis is reported. Existing and planned educational telecommunications services are studied and library utilization of telecommunications is described. Preliminary estimates are presented of ranges of utilization of educational telecommunications services for 1975 and 1985; instructional and public television, computer-aided instruction, computing resources, and information resource sharing for various educational levels and purposes. Communications technology studies include transmission schemes for still-picture television, use of Gunn effect devices, and TV receiver front ends for direct satellite reception at 12 GHz. Two major studies in the systems synthesis project concern (1) organizational and administrative aspects of a large-scale instructional satellite system to be used with schools and (2) an analysis of future development of instructional television, with emphasis on the use of video tape recorders and cable television. A communications satellite system synthesis program developed for NASA is now operational on the university IBM 360-50 computer.
Investigations into the shape-preserving interpolants using symbolic computation
NASA Technical Reports Server (NTRS)
Lam, Maria
1988-01-01
Shape representation is a central issue in computer graphics and computer-aided geometric design. Many physical phenomena involve curves and surfaces that are monotone (in some directions) or convex. The corresponding representation problem is: given monotone or convex data, find a monotone or convex interpolant. Standard interpolants need not be monotone or convex even though they match monotone or convex data. Most methods of investigating this problem involve quadratic splines or Hermite polynomials, and a similar approach is adopted in this investigation. These methods require derivative information at the given data points, and the key to the problem is the selection of the derivative values to be assigned to them. Schemes for choosing derivatives were examined. Along the way, fitting the given data points by a conic section was also investigated as part of the effort to study shape-preserving quadratic splines.
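One standard scheme of the kind examined is Fritsch-Carlson-style tangent limiting for monotone cubic Hermite interpolation: start from secant slopes and scale the assigned derivatives so that the monotonicity condition holds on every interval. A sketch (the limiting rule here is a simplified variant, not the scheme from the report):

```python
import numpy as np

def monotone_derivatives(x, y):
    """Fritsch-Carlson-style derivative choice for a monotone cubic Hermite
    interpolant: start from secant slopes, then limit the tangents so the
    monotonicity of the data is preserved."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    delta = np.diff(y) / np.diff(x)             # secant slopes
    d = np.empty_like(y)
    d[0], d[-1] = delta[0], delta[-1]
    d[1:-1] = (delta[:-1] + delta[1:]) / 2.0    # average of neighbour slopes
    for i in range(len(delta)):
        if delta[i] == 0.0:                     # flat segment: flat tangents
            d[i] = d[i + 1] = 0.0
    # limit tangents: alpha^2 + beta^2 <= 9 guarantees monotonicity
    for i in range(len(delta)):
        if delta[i] != 0.0:
            a, b = d[i] / delta[i], d[i + 1] / delta[i]
            r = np.hypot(a, b)
            if r > 3.0:
                d[i] *= 3.0 / r
                d[i + 1] *= 3.0 / r
    return d

# Monotone data with an abrupt step, where unlimited tangents would overshoot
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [0.0, 0.1, 0.2, 5.0, 5.1]
d = monotone_derivatives(x, y)
print(d)
```

Without the limiting step, the averaged tangents at the shoulders of the jump would make the cubic pieces overshoot and lose monotonicity; after scaling, every interval satisfies the sufficient condition.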
Particle systems for adaptive, isotropic meshing of CAD models
Levine, Joshua A.; Whitaker, Ross T.
2012-01-01
We present a particle-based approach for generating adaptive triangular surface and tetrahedral volume meshes from computer-aided design models. Input shapes are treated as a collection of smooth, parametric surface patches that can meet non-smoothly on boundaries. Our approach uses a hierarchical sampling scheme that places particles on features in order of increasing dimensionality. These particles reach a good distribution by minimizing an energy computed in 3D world space, with movements occurring in the parametric space of each surface patch. Rather than using a pre-computed measure of feature size, our system automatically adapts to both curvature as well as a notion of topological separation. It also enforces a measure of smoothness on these constraints to construct a sizing field that acts as a proxy to piecewise-smooth feature size. We evaluate our technique with comparisons against other popular triangular meshing techniques for this domain. PMID:23162181
Applicability of mathematical modeling to problems of environmental physiology
NASA Technical Reports Server (NTRS)
White, Ronald J.; Lujan, Barbara F.; Leonard, Joel I.; Srinivasan, R. Srini
1988-01-01
The paper traces the evolution of mathematical modeling and systems analysis from terrestrial research to research related to space biomedicine and back again to terrestrial research. Topics covered include: power spectral analysis of physiological signals; pattern recognition models for detection of disease processes; and, computer-aided diagnosis programs used in conjunction with a special on-line biomedical computer library.
An, Quanzhi; Pan, Zongxu; You, Hongjian
2018-01-24
Target detection is one of the important applications in the field of remote sensing. The Gaofen-3 (GF-3) Synthetic Aperture Radar (SAR) satellite launched by China is a powerful tool for maritime monitoring. This work aims at detecting ships in GF-3 SAR images using a new land masking strategy, the appropriate model for sea clutter and a neural network as the discrimination scheme. Firstly, the fully convolutional network (FCN) is applied to separate the sea from the land. Then, by analyzing the sea clutter distribution in GF-3 SAR images, we choose the probability distribution model of Constant False Alarm Rate (CFAR) detector from K-distribution, Gamma distribution and Rayleigh distribution based on a tradeoff between the sea clutter modeling accuracy and the computational complexity. Furthermore, in order to better implement CFAR detection, we also use truncated statistic (TS) as a preprocessing scheme and iterative censoring scheme (ICS) for boosting the performance of detector. Finally, we employ a neural network to re-examine the results as the discrimination stage. Experiment results on three GF-3 SAR images verify the effectiveness and efficiency of this approach.
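The abstract names a CFAR detector with clutter modeled by a K, Gamma, or Rayleigh distribution, but gives no implementation details. As a rough illustration of the general idea (not the authors' GF-3 pipeline), the sketch below implements a basic cell-averaging CFAR on a 2D intensity image; the threshold multiplier is derived under a simple exponential (single-look intensity) clutter assumption, and the window sizes and false-alarm rate are arbitrary choices:

```python
import numpy as np

def ca_cfar_2d(img, guard=2, train=8, pfa=1e-4):
    """Cell-averaging CFAR on a 2D intensity image.

    For each pixel, the clutter level is estimated from a ring of
    training cells surrounding a guard window; the pixel is declared
    a target if it exceeds alpha * (local clutter mean).  The scale
    factor alpha follows from the desired false-alarm rate under an
    exponentially distributed clutter-power model.
    """
    n_train_side = 2 * (guard + train) + 1
    n_guard_side = 2 * guard + 1
    n_train = n_train_side**2 - n_guard_side**2  # training cells per window
    # Threshold multiplier for exponential clutter power.
    alpha = n_train * (pfa ** (-1.0 / n_train) - 1.0)

    det = np.zeros_like(img, dtype=bool)
    r = guard + train
    for i in range(r, img.shape[0] - r):
        for j in range(r, img.shape[1] - r):
            window = img[i - r:i + r + 1, j - r:j + r + 1].copy()
            # Exclude the cell under test and its guard cells.
            window[train:train + n_guard_side, train:train + n_guard_side] = 0.0
            clutter = window.sum() / n_train
            det[i, j] = img[i, j] > alpha * clutter
    return det
```

The truncated statistics and iterative censoring used in the paper would replace the plain training-cell mean with a censored estimate that excludes target-contaminated cells.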
A Weak Quantum Blind Signature with Entanglement Permutation
NASA Astrophysics Data System (ADS)
Lou, Xiaoping; Chen, Zhigang; Guo, Ying
2015-09-01
Motivated by the permutation encryption algorithm, a weak quantum blind signature (QBS) scheme is proposed. It involves three participants, including the sender Alice, the signatory Bob and the trusted entity Charlie, in four phases, i.e., the initializing phase, blinding phase, signing phase and verifying phase. In a small-scale quantum computation network, Alice blinds the message based on a quantum entanglement permutation encryption algorithm that embraces the chaotic position string. Bob signs the blinded message with private parameters shared beforehand, while Charlie verifies the signature's validity and recovers the original message. Analysis shows that the proposed scheme achieves secure blindness for the signer and traceability for the message owner with the aid of the authentic arbitrator, who plays a crucial role when a dispute arises. In addition, the signature can neither be forged nor disavowed by malicious attackers. It has wide applications in e-voting, e-payment, and similar systems.
Free-Space Quantum Signatures Using Heterodyne Measurements.
Croal, Callum; Peuntinger, Christian; Heim, Bettina; Khan, Imran; Marquardt, Christoph; Leuchs, Gerd; Wallden, Petros; Andersson, Erika; Korolkova, Natalia
2016-09-02
Digital signatures guarantee the authorship of electronic communications. Currently used "classical" signature schemes rely on unproven computational assumptions for security, while quantum signatures rely only on the laws of quantum mechanics to sign a classical message. Previous quantum signature schemes have used unambiguous quantum measurements. Such measurements, however, sometimes give no result, reducing the efficiency of the protocol. Here, we instead use heterodyne detection, which always gives a result, although there is always some uncertainty. We experimentally demonstrate feasibility in a real environment by distributing signature states through a noisy 1.6 km free-space channel. Our results show that continuous-variable heterodyne detection improves the signature rate for this type of scheme and therefore represents an interesting direction in the search for practical quantum signature schemes. For transmission values ranging from 100% to 10%, but otherwise assuming an ideal implementation with no other imperfections, the signature length is shorter by a factor of 2 to 10. As compared with previous relevant experimental realizations, the signature length in this implementation is several orders of magnitude shorter.
The Personal Hearing System—A Software Hearing Aid for a Personal Communication System
NASA Astrophysics Data System (ADS)
Grimm, Giso; Guilmin, Gwénaël; Poppen, Frank; Vlaming, Marcel S. M. G.; Hohmann, Volker
2009-12-01
A concept and architecture of a personal communication system (PCS) is introduced that integrates audio communication and hearing support for the elderly and hearing-impaired through a personal hearing system (PHS). The concept envisions a central processor connected to audio headsets via a wireless body area network (WBAN). To demonstrate the concept, a prototype PCS is presented that is implemented on a netbook computer with a dedicated audio interface in combination with a mobile phone. The prototype can be used for field-testing possible applications and for revealing the possibilities and limitations of integrating hearing support in consumer audio communication devices. It is shown that the prototype PCS can integrate hearing aid functionality, telephony, public announcement systems, and home entertainment. An exemplary binaural speech enhancement scheme that represents a large class of possible PHS processing schemes is shown to be compatible with the general concept. However, an analysis of hardware and software architectures shows that the implementation of a PCS on future advanced cell phone-like devices is challenging. Because of limitations in processing power, recoding of prototype implementations into fixed point arithmetic will be required, and WBAN performance is still a limiting factor in terms of data rate and delay.
Detection of antipersonnel (AP) mines using mechatronics approach
NASA Astrophysics Data System (ADS)
Shahri, Ali M.; Naghdy, Fazel
1998-09-01
At present there are approximately 110 million land-mines scattered around the world in 64 countries. The clearance of these mines takes place manually. Unfortunately, on average, for every 5000 mines cleared one mine clearer is killed. A Mine Detector Arm (MDA) using a mechatronics approach is under development in this work. The robot arm imitates the manual hand-prodding technique for mine detection. It inserts a bayonet into the soil and models the dynamics of the manipulator and environment parameters, such as stiffness variation in the soil, to control the impact caused by contacting a stiff object. An explicit impact control scheme is applied as the main control scheme, while two different intelligent control methods are designed to deal with uncertainties and varying environmental parameters. Firstly, a neuro-fuzzy adaptive gain controller (NFAGC) is designed to adapt the force control gain according to the estimated environment stiffness. Then, an adaptive neuro-fuzzy plus PID controller is employed to switch from a conventional PID controller to neuro-fuzzy impact control (NFIC) when an impact is detected. The developed control schemes are validated through computer simulation and experimental work.
CCS_WHMS: A Congestion Control Scheme for Wearable Health Management System.
Kafi, Mohamed Amine; Ben Othman, Jalel; Bagaa, Miloud; Badache, Nadjib
2015-12-01
Wearable computing has become an increasingly attractive field in recent years thanks to the miniaturisation of electronic devices. Wearable healthcare monitoring systems (WHMS), an important application of wearable computing technology, have gained considerably from this trend. Indeed, wearable sensors and their surrounding healthcare applications bring many benefits to patients, elderly people and medical staff, improving their quality of daily life. From a research point of view, however, there is still work to be done to close the gap between the hardware and software parts. In this paper, we target the problem of congestion control when all these sensed healthcare data must reach the destination reliably, avoiding both repetitive transmissions that waste precious energy and the loss of important information in emergency cases. We propose a congestion control scheme, CCS_WHMS, that ensures efficient and fair data delivery, whether used within the body-worn part of the system or across multi-hop, multi-body networks on the way to the destination. As the congestion detection paradigm is central to the control process, we performed experimental tests comparing state-of-the-art congestion detection methods, using MICAz motes, in order to choose the most appropriate one for our scheme.
A malware detection scheme based on mining format information.
Bai, Jinrong; Wang, Junfeng; Zou, Guozhong
2014-01-01
Malware has become one of the most serious threats to computer information systems, and current malware detection technology still has very significant limitations. In this paper, we propose a malware detection approach that mines format information of PE (portable executable) files. Based on an in-depth analysis of the static format information of PE files, we extracted 197 features from the format information and applied feature selection methods to reduce the dimensionality of the features while achieving acceptably high performance. When the selected features were trained using classification algorithms, the results of our experiments indicate that the accuracy of the top classification algorithm is 99.1% and the value of the AUC is 0.998. We designed three experiments to evaluate the performance of our detection scheme and its ability to detect unknown and new malware. Although the experimental results of identifying new malware are not perfect, our method is still able to identify 97.6% of new malware with a 1.3% false positive rate.
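The abstract mentions applying feature selection to the 197 PE-format features but does not say which methods were used. As a hypothetical, minimal stand-in, the sketch below ranks features by the univariate Fisher score and keeps the top k; `fisher_scores` and `select_top_k` are illustrative names, not from the paper:

```python
import numpy as np

def fisher_scores(X, y):
    """Score each feature by between-class separation over
    within-class spread, computed independently per column of X."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(axis=0) - mean_all) ** 2
        den += len(Xc) * Xc.var(axis=0)
    return num / (den + 1e-12)  # avoid division by zero

def select_top_k(X, y, k):
    """Indices of the k highest-scoring features."""
    return np.argsort(fisher_scores(X, y))[::-1][:k]
```

The selected feature indices would then be used to slice the training matrix before fitting a classifier.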
Crowdsourcing lung nodules detection and annotation
NASA Astrophysics Data System (ADS)
Boorboor, Saeed; Nadeem, Saad; Park, Ji Hwan; Baker, Kevin; Kaufman, Arie
2018-03-01
We present crowdsourcing as an additional modality to aid radiologists in the diagnosis of lung cancer from clinical chest computed tomography (CT) scans. More specifically, a complete workflow is introduced which can help maximize the sensitivity of lung nodule detection by utilizing the collective intelligence of the crowd. We combine the concept of overlapping thin-slab maximum intensity projections (TS-MIPs) and cine viewing to render short videos that can be outsourced as an annotation task to the crowd. These videos are generated by linearly interpolating overlapping TS-MIPs of CT slices through the depth of each quadrant of a patient's lung. The resultant videos are outsourced to an online community of non-expert users who, after a brief tutorial, annotate suspected nodules in these video segments. Using our crowdsourcing workflow, we achieved a lung nodule detection sensitivity of over 90% for 20 patient CT datasets (containing 178 lung nodules with sizes between 1-30 mm), and only 47 false positives from a total of 1021 annotations on nodules of all sizes (96% sensitivity for nodules > 4 mm). These results show that crowdsourcing can be a robust and scalable modality to aid radiologists in screening for lung cancer, directly or in combination with computer-aided detection (CAD) algorithms. For CAD algorithms, the presented workflow can provide highly accurate training data to overcome the high false-positive rate (per scan) problem. We also provide, for the first time, analysis on nodule size and position which can help improve CAD algorithms.
Pulmonary lobar volumetry using novel volumetric computer-aided diagnosis and computed tomography
Iwano, Shingo; Kitano, Mariko; Matsuo, Keiji; Kawakami, Kenichi; Koike, Wataru; Kishimoto, Mariko; Inoue, Tsutomu; Li, Yuanzhong; Naganawa, Shinji
2013-01-01
OBJECTIVES: To compare the accuracy of pulmonary lobar volumetry using the conventional number-of-segments method and novel volumetric computer-aided diagnosis using 3D computed tomography images. METHODS: We acquired 50 consecutive preoperative 3D computed tomography examinations for lung tumours reconstructed at 1-mm slice thicknesses. We calculated the lobar volume and the emphysematous lobar volume < −950 HU of each lobe using (i) the slice-by-slice method (reference standard), (ii) the number-of-segments method, and (iii) semi-automatic and (iv) automatic computer-aided diagnosis. We determined Pearson correlation coefficients between the reference standard and the three other methods for lobar volumes and emphysematous lobar volumes. We also compared the relative errors among the three measurement methods. RESULTS: Both semi-automatic and automatic computer-aided diagnosis results were more strongly correlated with the reference standard than the number-of-segments method. The correlation coefficients for automatic computer-aided diagnosis were slightly lower than those for semi-automatic computer-aided diagnosis because there was one outlier among 50 cases (2%) in the right upper lobe and two outliers among 50 cases (4%) in the other lobes. The relative error of the number-of-segments method was significantly greater than those of semi-automatic and automatic computer-aided diagnosis (P < 0.001). The computational time for automatic computer-aided diagnosis was one-half to two-thirds that of semi-automatic computer-aided diagnosis. CONCLUSIONS: A novel lobar volumetry computer-aided diagnosis system could measure lobar volumes more precisely than the conventional number-of-segments method.
Because semi-automatic computer-aided diagnosis and automatic computer-aided diagnosis were complementary, in clinical use, it would be more practical to first measure volumes by automatic computer-aided diagnosis, and then use semi-automatic measurements if automatic computer-aided diagnosis failed. PMID:23526418
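The core measurement in this study is straightforward once a lobe mask is available: count the voxels in the lobe, count those below the −950 HU emphysema threshold, and scale by the voxel volume. A minimal sketch (the function name and interface are assumptions, not the authors' software):

```python
import numpy as np

def lobe_volumes(hu, mask, voxel_mm3, thresh=-950.0):
    """Lobar volume and emphysematous volume for one lobe.

    hu        -- CT volume in Hounsfield units
    mask      -- boolean array marking voxels belonging to the lobe
    voxel_mm3 -- volume of a single voxel in cubic millimetres
    thresh    -- HU cutoff below which a voxel counts as emphysematous
    """
    n_lobe = int(mask.sum())
    n_emph = int(((hu < thresh) & mask).sum())
    return n_lobe * voxel_mm3, n_emph * voxel_mm3
```

The hard part in practice is producing the lobe mask itself, which is what the semi-automatic and automatic computer-aided diagnosis methods differ on.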
Dachman, Abraham H.; Wroblewski, Kristen; Vannier, Michael W.; Horne, John M.
2014-01-01
Computed tomography (CT) colonography is a screening modality used to detect colonic polyps before they progress to colorectal cancer. Computer-aided detection (CAD) is designed to decrease errors of detection by finding and displaying polyp candidates for evaluation by the reader. CT colonography CAD false-positive results are common and have numerous causes. The relative frequency of CAD false-positive results and their effect on reader performance on the basis of a 19-reader, 100-case trial shows that the vast majority of CAD false-positive results were dismissed by readers. Many CAD false-positive results are easily disregarded, including those that result from coarse mucosa, reconstruction, peristalsis, motion, streak artifacts, diverticulum, rectal tubes, and lipomas. CAD false-positive results caused by haustral folds, extracolonic candidates, diminutive lesions (<6 mm), anal papillae, internal hemorrhoids, varices, extrinsic compression, and flexural pseudotumors are almost always recognized and disregarded. The ileocecal valve and tagged stool are common sources of CAD false-positive results associated with reader false-positive results. Nondismissable CAD soft-tissue polyp candidates larger than 6 mm are another common cause of reader false-positive results that may lead to further evaluation with follow-up CT colonography or optical colonoscopy. Strategies for correctly evaluating CAD polyp candidates are important to avoid pitfalls from common sources of CAD false-positive results. ©RSNA, 2014 PMID:25384290
Analog Computer-Aided Detection (CAD) information can be more effective than binary marks
Cunningham, Corbin A.; Drew, Trafton; Wolfe, Jeremy M.
2017-01-01
In socially important visual search tasks such as baggage screening and diagnostic radiology, experts miss more targets than is desirable. Computer Aided Detection (CAD) programs have been developed specifically to help improve performance in these professional search tasks. For example, in breast cancer screening, many CAD systems can detect approximately 90% of breast cancers, with approximately 0.5 false positive detections per image. Nevertheless, benefits of CAD in clinical settings tend to be small (Birdwell, 2009) or even absent (Meziane et al., 2011; Philpotts, 2009). The marks made by a CAD system can be "binary", giving the same signal to any location where the signal is above some threshold. Alternatively, a CAD system can present an "analog" signal that reflects the strength of the signal at each location. In the experiments reported here, we compare analog and binary CAD presentations using non-expert observers and artificial stimuli defined by two noisy signals: a visible color signal and an "invisible" signal that informed our simulated CAD system. We found that analog CAD generally yielded better overall performance than binary CAD. The analog benefit is similar at high and low target prevalence. Our data suggest that the form of the CAD signal can directly influence performance. Analog CAD may allow the computer to be more helpful to the searcher. PMID:27928658
A Fast Approach to Automatic Detection of Brain Lesions
Koley, Subhranil; Chakraborty, Chandan; Mainero, Caterina; Fischl, Bruce; Aganj, Iman
2017-01-01
Template matching is a popular approach to computer-aided detection of brain lesions from magnetic resonance (MR) images. The outcomes are often sufficient for localizing lesions and assisting clinicians in diagnosis. However, processing large MR volumes with three-dimensional (3D) templates is demanding in terms of computational resources, hence the importance of reducing the computational complexity of template matching, particularly in situations in which time is crucial (e.g., emergent stroke). In view of this, we make use of 3D Gaussian templates with varying radii and propose a new method to compute the normalized cross-correlation coefficient as a similarity metric between the MR volume and the template to detect brain lesions. Contrary to the conventional fast Fourier transform (FFT) based approach, whose runtime grows as O(N log N) with the number of voxels, the proposed method computes the cross-correlation in O(N). We show through our experiments that the proposed method outperforms the FFT approach in terms of computational time, and retains comparable accuracy. PMID:29082383
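The abstract does not spell out how the O(N) cross-correlation is achieved. One standard ingredient for linear-time normalized cross-correlation, shown here purely as an illustration of the general trick rather than the paper's algorithm, is computing the local sums needed for per-window means and variances in O(N) via cumulative sums:

```python
import numpy as np

def local_sums_1d(x, w):
    """Sum of x over every length-w sliding window, in O(N) total,
    using a prefix-sum (cumulative sum) array."""
    c = np.concatenate(([0.0], np.cumsum(x)))
    return c[w:] - c[:-w]

def local_mean_var_1d(x, w):
    """Per-window mean and variance, each computed in O(N) total."""
    s1 = local_sums_1d(x, w)        # sum of values per window
    s2 = local_sums_1d(x * x, w)    # sum of squares per window
    mean = s1 / w
    var = s2 / w - mean**2
    return mean, var
```

With the local statistics available in O(N), only the correlation numerator remains, which is where structure such as the separability of a Gaussian template can be exploited; the same prefix-sum idea extends to 3D volumes via summed-volume tables.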
The effect of feature selection methods on computer-aided detection of masses in mammograms
NASA Astrophysics Data System (ADS)
Hupse, Rianne; Karssemeijer, Nico
2010-05-01
In computer-aided diagnosis (CAD) research, feature selection methods are often used to improve generalization performance of classifiers and shorten computation times. In an application that detects malignant masses in mammograms, we investigated the effect of using a selection criterion that is similar to the final performance measure we are optimizing, namely the mean sensitivity of the system in a predefined range of the free-response receiver operating characteristics (FROC). To obtain the generalization performance of the selected feature subsets, a cross validation procedure was performed on a dataset containing 351 abnormal and 7879 normal regions, each region providing a set of 71 mass features. The same number of noise features, not containing any information, were added to investigate the ability of the feature selection algorithms to distinguish between useful and non-useful features. It was found that significantly higher performances were obtained using feature sets selected by the general test statistic Wilks' lambda than using feature sets selected by the more specific FROC measure. Feature selection leads to better performance when compared to a system in which all features were used.
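For a single feature and grouped data, the Wilks' lambda criterion used above reduces to the ratio of the within-group sum of squares to the total sum of squares; smaller values indicate a more discriminative feature. A minimal sketch of that univariate statistic (not the study's full selection pipeline):

```python
import numpy as np

def wilks_lambda(x, y):
    """Univariate Wilks' lambda: within-group sum of squares divided
    by total sum of squares.  Values lie in (0, 1]; smaller means
    the group means are further apart relative to the group spread."""
    ss_total = ((x - x.mean()) ** 2).sum()
    ss_within = sum(((x[y == c] - x[y == c].mean()) ** 2).sum()
                    for c in np.unique(y))
    return ss_within / ss_total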
Neher, Tobias; Wagener, Kirsten C; Latzel, Matthias
2017-09-01
Hearing aid (HA) users can differ markedly in their benefit from directional processing (or beamforming) algorithms. The current study therefore investigated candidacy for different bilateral directional processing schemes. Groups of elderly listeners with symmetric (N = 20) or asymmetric (N = 19) hearing thresholds for frequencies below 2 kHz, a large spread in the binaural intelligibility level difference (BILD), and no difference in age, overall degree of hearing loss, or performance on a measure of selective attention took part. Aided speech reception was measured using virtual acoustics together with a simulation of a linked pair of completely occluding behind-the-ear HAs. Five processing schemes and three acoustic scenarios were used. The processing schemes differed in the tradeoff between signal-to-noise ratio (SNR) improvement and binaural cue preservation. The acoustic scenarios consisted of a frontal target talker presented against two speech maskers from ±60° azimuth or spatially diffuse cafeteria noise. For both groups, a significant interaction between BILD, processing scheme, and acoustic scenario was found. This interaction implied that, in situations with lateral speech maskers, HA users with BILDs larger than about 2 dB profited more from preserved low-frequency binaural cues than from greater SNR improvement, whereas for smaller BILDs the opposite was true. Audiometric asymmetry reduced the influence of binaural hearing. In spatially diffuse noise, the maximal SNR improvement was generally beneficial. N0Sπ detection performance at 500 Hz predicted the benefit from low-frequency binaural cues. Together, these findings provide a basis for adapting bilateral directional processing to individual and situational influences. Further research is needed to investigate their generalizability to more realistic HA conditions (e.g., with low-frequency vent-transmitted sound). Copyright © 2017 Elsevier B.V. All rights reserved.
Integration of a CAD System Into an MDO Framework
NASA Technical Reports Server (NTRS)
Townsend, J. C.; Samareh, J. A.; Weston, R. P.; Zorumski, W. E.
1998-01-01
NASA Langley has developed a heterogeneous distributed computing environment, called the Framework for Inter-disciplinary Design Optimization, or FIDO. Its purpose has been to demonstrate framework technical feasibility and usefulness for optimizing the preliminary design of complex systems and to provide a working environment for testing optimization schemes. Its initial implementation has been for a simplified model of preliminary design of a high-speed civil transport. Upgrades being considered for the FIDO system include a more complete geometry description, required by high-fidelity aerodynamics and structures codes and based on a commercial Computer Aided Design (CAD) system. This report presents the philosophy behind some of the decisions that have shaped the FIDO system and gives a brief case study of the problems and successes encountered in integrating a CAD system into the FIDO framework.
Electron-correlated fragment-molecular-orbital calculations for biomolecular and nano systems.
Tanaka, Shigenori; Mochizuki, Yuji; Komeiji, Yuto; Okiyama, Yoshio; Fukuzawa, Kaori
2014-06-14
Recent developments in the fragment molecular orbital (FMO) method for theoretical formulation, implementation, and application to nano and biomolecular systems are reviewed. The FMO method has enabled ab initio quantum-mechanical calculations for large molecular systems, such as protein-ligand complexes, at a reasonable computational cost in a parallelized way. There has been a wealth of application outcomes from the FMO method in the fields of biochemistry, medicinal chemistry and nanotechnology, in which the electron correlation effects play vital roles. With the aid of the advances in high-performance computing, the FMO method promises larger, faster, and more accurate simulations of biomolecular and related systems, including the descriptions of dynamical behaviors in solvent environments. The current status and future prospects of the FMO scheme are addressed in these contexts.
Use of agents to implement an integrated computing environment
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.
1995-01-01
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. Agents are used to implement the overall infrastructure on the computer. Successful agent utilization requires that they be made of three components: the resource, the model, and the wrap. Current work is focused on the development of generalized agent schemes and associated demonstration projects. When in place, the technology-independent computing infrastructure will aid the designer in systematically generating knowledge used to facilitate decision-making.
NASA Astrophysics Data System (ADS)
Tan, Shanjuan; Feng, Feifei; Wu, Yongjun; Wu, Yiming
To develop a computer-aided diagnostic scheme using an artificial neural network (ANN) combined with tumor markers for the diagnosis of hepatic carcinoma (HCC) as a clinical assistant method, 140 serum samples (50 malignant, 40 benign and 50 normal) were analyzed for α-fetoprotein (AFP), carbohydrate antigen 125 (CA125), carcinoembryonic antigen (CEA), sialic acid (SA) and calcium (Ca). The five tumor marker values were then used as ANN input data. The result of the ANN was compared with that of discriminant analysis by receiver operating characteristic (ROC) curve and area-under-the-curve (AUC) analysis. The diagnostic accuracy of the ANN and of discriminant analysis among all samples of the test group was 95.5% and 79.3%, respectively. Analysis of multiple tumor markers based on an ANN may be a better choice than traditional statistical methods for differentiating HCC from benign or normal cases.
Miller, C; Tsoka, M G
2012-02-01
The Malawian Social Cash Transfer Scheme (SCT) is a social protection programme for ultra poor and labour-constrained households, including people living with HIV/AIDS (PLWHA). We aimed to gain insight into respondents' circumstances prior to becoming transfer beneficiaries and to examine how PLWHA used transfers to support themselves and their families. We conducted 24 semi-structured qualitative interviews with PLWHA who were also SCT beneficiaries and living in villages where the scheme was operational in 2008. Respondents were destitute and lacked food and basic necessities prior to the transfer. As cash recipients, the majority of respondents reported positive impacts on health, food security and economic well-being, as well as an improved ability to care for their families. Important unanswered programmatic questions persist, such as 'What is the appropriate transfer level?' and 'Should recipients graduate from the scheme?' Moreover, the scheme's long-term sustainability is still unclear. Nevertheless, this analysis presents evidence describing how PLWHA used cash transfers to improve their situation and mitigate the impact of HIV/AIDS on families. © 2011 Blackwell Publishing Ltd.
Computer-aided diagnosis software for vulvovaginal candidiasis detection from Pap smear images.
Momenzadeh, Mohammadreza; Vard, Alireza; Talebi, Ardeshir; Mehri Dehnavi, Alireza; Rabbani, Hossein
2018-01-01
Vulvovaginal candidiasis (VVC) is a common gynecologic infection that occurs when there is overgrowth of the yeast called Candida. VVC diagnosis is usually done by observing a Pap smear sample under a microscope and searching for the conidium and mycelium components of Candida. This manual method is time consuming, subjective and tedious. Any diagnostic tool that detects VVC, semi- or fully automatically, can be very helpful to pathologists. This article presents computer-aided diagnosis (CAD) software to improve human diagnosis of VVC from Pap smear samples. The proposed software is designed based on phenotypic and morphology features of Candida in Pap smear sample images. This software provides a user-friendly interface consisting of a set of image processing tools and analytical results that help to detect Candida and determine the severity of illness. The software was evaluated on 200 Pap smear sample images and obtained a specificity of 91.04% and a sensitivity of 92.48% in detecting VVC. As a result, the use of the proposed software reduces diagnostic time and can be employed as a second objective opinion for pathologists. © 2017 Wiley Periodicals, Inc.
Research on Quantum Algorithms at the Institute for Quantum Information
2009-10-17
accuracy threshold theorem for the one-way quantum computer. Their proof is based on a novel scheme, in which a noisy cluster state in three spatial...detected. The proof applies to independent stochastic noise but (in contrast to proofs of the quantum accuracy threshold theorem based on concatenated...proved quantum threshold theorems for long-range correlated non-Markovian noise, for leakage faults, for the one-way quantum computer, for postselected
Performance analysis of a cascaded coding scheme with interleaved outer code
NASA Technical Reports Server (NTRS)
Lin, S.
1986-01-01
A cascaded coding scheme for a random error channel with a given bit-error rate is analyzed. In this scheme, the inner code C1 is an (n1, m1·l) binary linear block code which is designed for simultaneous error correction and detection. The outer code C2 is a linear block code with symbols from the Galois field GF(2^l) which is designed for correcting both symbol errors and erasures, and is interleaved with degree m1. A procedure for computing the probability of a correct decoding is presented and an upper bound on the probability of a decoding error is derived. The bound provides much better results than the previous bound for a cascaded coding scheme with an interleaved outer code. Example schemes with inner codes ranging from high rates to very low rates are evaluated. Several schemes provide extremely high reliability even for very high bit-error rates, say 10^-1 to 10^-2.
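A basic building block of an analysis like the one above is the probability that the inner code decodes a block correctly over a binary symmetric channel: if the inner code corrects up to t bit errors in a block of n bits, that probability is a partial binomial sum. A sketch of this single-block term (the full cascaded-scheme analysis also accounts for error detection, erasures, and the interleaved outer code):

```python
from math import comb

def p_correct_block(n, t, p):
    """Probability that a length-n block sent over a binary symmetric
    channel with bit-error rate p suffers at most t bit errors, so
    that a t-error-correcting code decodes it correctly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(t + 1))
```

For example, a single-error-correcting length-7 block over a channel with p = 0.01 decodes correctly with probability `p_correct_block(7, 1, 0.01)`, roughly 0.998; allowing more correctable errors only increases this probability.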
Wavelet method for CT colonography computer-aided polyp detection.
Li, Jiang; Van Uitert, Robert; Yao, Jianhua; Petrick, Nicholas; Franaszek, Marek; Huang, Adam; Summers, Ronald M
2008-08-01
Computed tomographic colonography (CTC) computer-aided detection (CAD) is a new method to detect colon polyps. Colonic polyps are abnormal growths that may become cancerous. Detection and removal of colonic polyps, particularly larger ones, has been shown to reduce the incidence of colorectal cancer. While high sensitivities and low false positive rates are consistently achieved for the detection of polyps sized 1 cm or larger, lower sensitivities and higher false positive rates occur when the goal of CAD is to identify "medium"-sized polyps, 6-9 mm in diameter. Such medium-sized polyps may be important for clinical patient management. We have developed a wavelet-based postprocessor to reduce false positives for this polyp size range. We applied the wavelet-based postprocessor to CTC CAD findings from 44 patients in whom 45 polyps with sizes of 6-9 mm were found at segmentally unblinded optical colonoscopy and visible on retrospective review of the CT colonography images. Prior to the application of the wavelet-based postprocessor, the CTC CAD system detected 33 of the polyps (sensitivity 73.33%) with 12.4 false positives per patient, a sensitivity comparable to that of expert radiologists. Fourfold cross validation with 5000 bootstraps showed that the wavelet-based postprocessor could reduce the false positives by 56.61% (p < 0.001), to 5.38 per patient (95% confidence interval [4.41, 6.34]), without significant sensitivity degradation (32/45, 71.11%, 95% confidence interval [66.39%, 75.74%], p = 0.1713). We conclude that this wavelet-based postprocessor can substantially reduce the false positive rate of our CTC CAD for this important polyp size range.
A survey on computer aided diagnosis for ocular diseases
2014-01-01
Background: Computer-aided diagnosis (CAD), which can automate the detection process for ocular diseases, has attracted extensive attention from clinicians and researchers alike. It not only alleviates the burden on clinicians by providing objective opinion with valuable insights, but also offers early detection and easy access for patients. Method: We review ocular CAD methodologies for various data types. For each data type, we investigate the databases and the algorithms used to detect different ocular diseases. Their advantages and shortcomings are analyzed and discussed. Result: We have studied three types of data (i.e., clinical, genetic and imaging) that have been commonly used in existing CAD methods. Recent developments in methods used in CAD of ocular diseases (such as diabetic retinopathy, glaucoma, age-related macular degeneration and pathological myopia) are investigated and summarized comprehensively. Conclusion: While CAD for ocular diseases has shown considerable progress over the past years, fully automatic CAD systems that embed clinical knowledge and integrate heterogeneous data sources still hold great potential for future breakthroughs. PMID:25175552
Second CLIPS Conference Proceedings, volume 1
NASA Technical Reports Server (NTRS)
Giarratano, Joseph (Editor); Culbert, Christopher J. (Editor)
1991-01-01
Topics covered at the 2nd CLIPS Conference held at the Johnson Space Center, September 23-25, 1991 are given. Topics include rule groupings, fault detection using expert systems, decision making using expert systems, knowledge representation, computer aided design and debugging expert systems.
Nemoto, Mitsutaka; Hayashi, Naoto; Hanaoka, Shouhei; Nomura, Yukihiro; Miki, Soichiro; Yoshikawa, Takeharu
2017-10-01
We propose a generalized framework for developing computer-aided detection (CADe) systems whose characteristics depend only on those of the training dataset. The purpose of this study is to show the feasibility of the framework. Two different CADe systems were experimentally developed by a prototype of the framework, but with different training datasets. The CADe systems include four components: preprocessing, candidate area extraction, candidate detection, and candidate classification. Four pretrained algorithms with dedicated optimization/setting methods corresponding to the respective components were prepared in advance. The pretrained algorithms were sequentially trained in the order of processing of the components. In this study, two different datasets, brain MRA with cerebral aneurysms and chest CT with lung nodules, were collected to develop two different types of CADe systems in the framework. The performances of the developed CADe systems were evaluated by threefold cross-validation. The CADe systems for detecting cerebral aneurysms in brain MRAs and for detecting lung nodules in chest CTs were successfully developed using the respective datasets. The framework was shown to be feasible by the successful development of the two different types of CADe systems. The feasibility of this framework shows promise for a new paradigm in the development of CADe systems: development of CADe systems without any lesion-specific algorithm design.
Miyazaki, Yoshiaki; Tabata, Nobuyuki; Taroura, Tomomi; Shinozaki, Kenji; Kubo, Yuichiro; Tokunaga, Eriko; Taguchi, Kenichi
We propose a computer-aided diagnostic (CAD) system that uses time-intensity curves to distinguish between benign and malignant mammary tumors. Many malignant tumors show a washout pattern in time-intensity curves. Therefore, we designed a program that automatically detects the position with the strongest washout effect using techniques such as subtraction, which extracts only the washout area in the tumor, and by scanning the data with a 2×2-pixel region of interest (ROI). Operation of this independently developed program was verified using a phantom system that simulated tumors. In three cases of malignant tumors, the washout-pattern detection rate in images with a manually set ROI was ≤6%, whereas the detection rate with our novel method was 100%. In one case of a benign tumor, the same method confirmed that there was no washout effect and detected the persistent pattern. Thus, the distinction between benign and malignant tumors using our method was completely consistent with the pathological diagnoses. Our novel method is therefore effective for differentiating between benign and malignant mammary tumors in dynamic magnetic resonance images.
Computer-aided diagnosis of early knee osteoarthritis based on MRI T2 mapping.
Wu, Yixiao; Yang, Ran; Jia, Sen; Li, Zhanjun; Zhou, Zhiyang; Lou, Ting
2014-01-01
This work was aimed at studying a method for computer-aided diagnosis of early knee osteoarthritis (OA). Based on the technique of MRI (Magnetic Resonance Imaging) T2 mapping, through computer image processing, feature extraction, and calculation and analysis via a classifier, an effective computer-aided diagnosis method for knee OA was created to assist doctors in their accurate, timely and convenient detection of potential risk of OA. In order to evaluate this method, a total of 1380 data points from the MRI images of 46 knee-joint samples were collected. These data were then modeled through linear regression on an offline general platform using the ImageJ software, and a map of the physical parameter T2 was reconstructed. After the image processing, the T2 values of ten regions in the WORMS (Whole-Organ Magnetic Resonance Imaging Score) areas of the articular cartilage were extracted to be used as the eigenvalues in data mining. Then, an RBF (Radial Basis Function) network classifier was built to classify and identify the collected data. The classifier exhibited a final identification accuracy of 75%, indicating a good result for assisting diagnosis. Since the knee OA classifier constituted by a weights-directly-determined RBF neural network does not require any iteration, our results demonstrated that the optimal weights and appropriate centers and variances could be obtained through simple procedures. Furthermore, the accuracy for both the training samples and the testing samples from the normal group could reach 100%. Finally, the classifier was superior both in time efficiency and in classification performance to frequently used classifiers based on iterative learning. Thus it is suitable to be used as an aid to computer-aided diagnosis of early knee OA.
Bharti, Puja; Mittal, Deepti; Ananthasivan, Rupa
2016-04-19
Diffuse liver diseases, such as hepatitis, fatty liver, and cirrhosis, are becoming a leading cause of fatality and disability all over the world. Early detection and diagnosis of these diseases is extremely important to save lives and improve effectiveness of treatment. Ultrasound imaging, a noninvasive diagnostic technique, is the most commonly used modality for examining liver abnormalities. However, the accuracy of ultrasound-based diagnosis depends highly on expertise of radiologists. Computer-aided diagnosis systems based on ultrasound imaging assist in fast diagnosis, provide a reliable "second opinion" for experts, and act as an effective tool to measure response of treatment on patients undergoing clinical trials. In this review, we first describe appearance of liver abnormalities in ultrasound images and state the practical issues encountered in characterization of diffuse liver diseases that can be addressed by software algorithms. We then discuss computer-aided diagnosis in general with features and classifiers relevant to diffuse liver diseases. In later sections of this paper, we review the published studies and describe the key findings of those studies. A concise tabular summary comparing image database, features extraction, feature selection, and classification algorithms presented in the published studies is also exhibited. Finally, we conclude with a summary of key findings and directions for further improvements in the areas of accuracy and objectiveness of computer-aided diagnosis. © The Author(s) 2016.
Zhang, Zhi-Hui; Yang, Guang-Hong
2017-05-01
This paper provides a novel event-triggered fault detection (FD) scheme for discrete-time linear systems. First, an event-triggered interval observer is proposed to generate the upper and lower residuals by taking into account the influence of the disturbances and the event error. Second, the robustness of the residual interval against the disturbances and the fault sensitivity are improved by introducing l1 and H∞ performance indices. Third, dilated linear matrix inequalities are used to decouple the Lyapunov matrices from the system matrices. The nonnegativity conditions for the estimation error variables are established with the aid of slack matrix variables. This technique allows a more general Lyapunov function to be considered. Furthermore, the FD decision scheme is proposed by monitoring whether the zero value belongs to the residual interval. It is shown that the information communication burden is reduced by designing the event-triggering mechanism, while the FD performance can still be guaranteed. Finally, simulation results demonstrate the effectiveness of the proposed method. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
Update schemes of multi-velocity floor field cellular automaton for pedestrian dynamics
NASA Astrophysics Data System (ADS)
Luo, Lin; Fu, Zhijian; Cheng, Han; Yang, Lizhong
2018-02-01
Modeling pedestrian movement is an interesting problem both in statistical physics and in computational physics. Update schemes of cellular automaton (CA) models for pedestrian dynamics govern the schedule of pedestrian movement. Usually, different update schemes make the models behave in different ways, which should be carefully recalibrated. Thus, in this paper, we investigated the influence of four different update schemes, namely the parallel/synchronous scheme, random scheme, ordered sequential scheme and shuffled scheme, on pedestrian dynamics. A multi-velocity floor field cellular automaton (FFCA) that accounts for changes in pedestrians' moving properties along walking paths and for heterogeneity in pedestrians' walking abilities was used. The parallel scheme alone requires collision detection and resolution, which makes it behave quite differently from the other update schemes. For pedestrian evacuation, the evacuation time is enlarged, and the difference in pedestrians' walking abilities is better reflected, under the parallel scheme. In the face of a bottleneck, for example an exit, using the parallel scheme leads to a longer congestion period and a more dispersive density distribution. The exit flow and the space-time distributions of density and velocity show significant discrepancies across the four update schemes when we simulate pedestrian flow with high desired velocity. Update schemes may have no influence on the simulated pedestrians' tendency to follow others, but the sequential and shuffled update schemes may enhance the effect of pedestrians' familiarity with their environment.
Barchuk, A A; Podolsky, M D; Tarakanov, S A; Kotsyuba, I Yu; Gaidukov, V S; Kuznetsov, V I; Merabishvili, V M; Barchuk, A S; Levchenko, E V; Filochkina, A V; Arseniev, A I
2015-01-01
This review article analyzes the literature devoted to the description, interpretation and classification of focal (nodal) changes in the lungs detected by computed tomography of the chest cavity. Possible criteria are discussed for determining their most likely character: primary and metastatic tumor processes, inflammation, scarring, autoimmune changes, tuberculosis and others. Identification of the most characteristic, reliable and statistically significant signs of the various pathological processes in the lungs, including through the use of modern computer-aided detection and diagnosis, will optimize diagnostic measures and ensure the processing of a large volume of medical data in a short time.
Vertebra identification using a template matching model and K-means clustering.
Larhmam, Mohamed Amine; Benjelloun, Mohammed; Mahmoudi, Saïd
2014-03-01
Accurate vertebra detection and segmentation are essential steps in automating the diagnosis of spinal disorders. This study is dedicated to vertebra alignment measurement, the first step in a computer-aided diagnosis tool for cervical spine trauma. Automated vertebral segment alignment determination is a challenging task due to low-contrast imaging and noise. A software tool for segmenting vertebrae and detecting subluxations has clinical significance. A robust method was developed and tested for cervical vertebra identification and segmentation that extracts parameters used for vertebra alignment measurement. Our contribution involves a novel combination of a template matching method and an unsupervised clustering algorithm. In this method, we build a geometric vertebra mean model. To achieve vertebra detection, manual selection of the region of interest is performed initially on the input image. Subsequent preprocessing is done to enhance image contrast and detect edges. Candidate vertebra localization is then carried out by using a modified generalized Hough transform (GHT). Next, an adapted cost function is used to compute local voted centers and filter boundary data. Thereafter, a K-means clustering algorithm is applied to obtain the cluster distribution corresponding to the targeted vertebrae. These clusters are combined with the vote parameters to detect vertebra centers. Rigid segmentation is then carried out by using the GHT parameters. Finally, cervical spine curves are extracted to measure vertebra alignment. The proposed approach was successfully applied to a set of 66 high-resolution X-ray images. Robust detection was achieved in 97.5% of the 330 tested cervical vertebrae. An automated vertebral identification method was developed and demonstrated to be robust to noise and occlusion. This work presents a first step toward an automated computer-aided diagnosis system for cervical spine trauma detection.
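The clustering step that groups GHT-voted centers into per-vertebra clusters can be sketched with plain Lloyd's K-means. The 2-D points below are toy data; the preprocessing, modified GHT and adapted cost function of the paper are not reproduced.

```python
import math
import random

def kmeans(points, k, iters=50, seed=1):
    """Plain Lloyd's K-means: alternate assigning each point to its
    nearest center and moving each center to its cluster mean."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[j].append(p)
        centers = [
            tuple(sum(x) / len(cl) for x in zip(*cl)) if cl else centers[j]
            for j, cl in enumerate(clusters)
        ]
    return centers, clusters

# Two well-separated blobs of "voted centers" (toy data).
pts = [(10 + dx, 10 + dy) for dx in (0, 1, 2) for dy in (0, 1)] + \
      [(50 + dx, 52 + dy) for dx in (0, 1, 2) for dy in (0, 1)]
centers, clusters = kmeans(pts, k=2)
print(sorted(round(c[0]) for c in centers))   # -> [11, 51]
```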
Aiken, L H; Smith, H L; Lake, E T
1997-01-01
Chile is a country with a relatively low prevalence of HIV infection, where successful prevention has the potential to change the future course of the epidemic. A controversial national prevention strategy based upon public education has emerged in response to characterizations of the epidemic as well-dispersed with a growing involvement of heterosexuals. This characterization is not consistent with the observed facts. There is a comparatively well-organized health care system in Santiago that is doing a good job of detecting HIV infection and already has in place the elements of a targeted intervention scheme. Chile should place priority on the use of the existing health care infrastructure for implementing both the traditional public health interventions for sexually transmitted diseases (contact tracing and partner notification) and the AIDS-necessitated strategy of focused counseling and education.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aghaei, Faranak; Tan, Maxine; Liu, Hong
Purpose: To identify a new clinical marker based on quantitative kinetic image features analysis and assess its feasibility to predict tumor response to neoadjuvant chemotherapy. Methods: The authors assembled a dataset involving breast MR images acquired from 68 cancer patients before undergoing neoadjuvant chemotherapy. Among them, 25 patients had complete response (CR) and 43 had partial and nonresponse (NR) to chemotherapy based on the response evaluation criteria in solid tumors. The authors developed a computer-aided detection scheme to segment breast areas and tumors depicted on the breast MR images and computed a total of 39 kinetic image features from both tumor and background parenchymal enhancement regions. The authors then applied and tested two approaches to classify between CR and NR cases. The first one analyzed each individual feature and applied a simple feature fusion method that combines classification results from multiple features. The second approach tested an attribute selected classifier that integrates an artificial neural network (ANN) with a wrapper subset evaluator, which was optimized using a leave-one-case-out validation method. Results: In the pool of 39 features, 10 yielded relatively higher classification performance with the areas under receiver operating characteristic curves (AUCs) ranging from 0.61 to 0.78 to classify between CR and NR cases. Using a feature fusion method, the maximum AUC = 0.85 ± 0.05. Using the ANN-based classifier, AUC value significantly increased to 0.96 ± 0.03 (p < 0.01). Conclusions: This study demonstrated that quantitative analysis of kinetic image features computed from breast MR images acquired prechemotherapy has potential to generate a useful clinical marker in predicting tumor response to chemotherapy.
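AUC values like those reported above can be computed without an explicit ROC curve via the Mann-Whitney statistic: the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. A minimal sketch with made-up classifier scores:

```python
def auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney U statistic: fraction of
    (positive, negative) pairs where the positive scores higher,
    with ties counted as 0.5."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Made-up CR vs NR classifier scores, not the study's data.
print(auc([0.9, 0.8, 0.7], [0.6, 0.5, 0.4]))   # perfectly separated -> 1.0
```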
Azadmanjir, Zahra; Safdari, Reza; Ghazisaeedi, Marjan; Mokhtaran, Mehrshad; Kameli, Mohammad Esmail
2017-06-01
Accurate coded data in healthcare are critical. Computer-assisted coding (CAC) is an effective tool for improving clinical coding, in particular when a new classification is developed and implemented. But determining the appropriate development method requires considering the specifications of existing CAC systems, the requirements for each type, our infrastructure and also the classification scheme. The aim of the study was the development of a decision model for determining the accurate code of each medical intervention in the Iranian Classification of Health Interventions (IRCHI) that can be implemented as a suitable CAC system. First, a sample of existing CAC systems was reviewed. Then the feasibility of each type of CAC was examined with regard to the prerequisites for its implementation. In the next step, a proper model was proposed according to the structure of the classification scheme and was implemented as an interactive system. There is a significant relationship between the level of assistance of a CAC system and its integration with electronic medical documents. Implementation of fully automated CAC systems is impossible due to the immature development of electronic medical records and problems in the language used for medical documentation. So a model was proposed to develop a semi-automated CAC system based on hierarchical relationships between entities in the classification scheme, and on decision-making logic, to specify the characters of a code step by step through a web-based interactive user interface. It is composed of three phases that select the Target, Action and Means, respectively, for an intervention. The proposed model suited the current status of clinical documentation and coding in Iran as well as the structure of the new classification scheme. Our results show it is practical. However, the model needs to be evaluated in the next stage of the research.
Muenchberger, Heidi; Ehrlich, Carolyn; Parekh, Sanjoti; Crozier, Michelle
2016-01-01
To investigate the role of philanthropic micro-grants (maximum of $10,000) in the provision of aids and equipment for adults (aged 18-65 years of age) with complex disabilities and examine key trends in aids and equipment requests. This study examined, through quantitative and qualitative analysis, aids and equipment requests (n = 371 individual applications as represented by 136 service organisations in three Australian states) received by a not-for-profit (NFP) organisation across five consecutive years of an innovative micro-grants scheme. Findings highlight that living situation (living with family or living independently) significantly influences the nature of requests for respite, aids, equipment and home modifications. Specifically, people with complex disabilities living with their families require greater combined service provision (higher equipment need, respite support, home modifications) than those living independently (equipment need only). Type of disability did not influence request type. Qualitative data further indicated the "last resort" nature of respite requests, particularly for younger applicants (under 45 years of age) indicating critical unmet needs in the community. Results demonstrate the vital role of NFP organisations and philanthropic funds in supporting daily lifestyle aids and equipment (including respite) that might otherwise not be funded for people with complex disabilities. Although preliminary in its scope and prior to implementation of a National Disability Insurance Scheme (NDIS) in Australia, findings suggest both opportunity and risk to the uptake of community-based micro-grant funding: opportunity for users through the provision of essential aids and lifestyle supports, and risk through over-subscription and devolving of responsibility for critical support resources from public sector. The aids and equipment needs of adults under the age of 65 appear to have been underestimated, poorly defined and under-serviced. 
Service users need more assistance for their carers (i.e. equipment to facilitate safe lifting, urgent breaks from care routines) as well as aids, equipment and modifications to help them to live a more normal life (e.g. going to the beach). Living situation (i.e. independently or with family) significantly influences the nature and extent of aids and equipment requested. Supporting adults up to the age of 65 to live more independently would positively influence carers and family, while at the same time providing opportunities for more targeted personal care supports. Philanthropic and not-for-profit schemes are helping to address these needs through micro-grant schemes for purchases under $10 000, but sustainability is questioned. The introduction of Australia's National Disability Insurance Scheme (NDIS) presents an opportunity to consider the lifestyle needs of service users and carers, and determine who is best placed to address them.
Yao, Jianhua; Burns, Joseph E.; Sanoria, Vic; Summers, Ronald M.
2017-01-01
Bone metastases are a frequent occurrence with cancer, and early detection can guide the patient's treatment regimen. Metastatic bone disease can present in density extremes as sclerotic (high density) and lytic (low density), or in a continuum with an admixture of both sclerotic and lytic components. We design a framework to detect and characterize this varying spectrum of presentation of spine metastasis on positron emission tomography/computed tomography (PET/CT) data. A technique is proposed to synthesize CT and PET images to enhance the lesion appearance for computer detection. A combination of watershed, graph cut, and level set algorithms is first run to obtain the initial detections. Detections are then sent to multiple classifiers for sclerotic, lytic, and mixed lesions. The system was tested on 44 cases with 225 sclerotic, 139 lytic, and 92 mixed lesions. The results showed that the sensitivity (false positives per patient) was 0.81 (2.1), 0.81 (1.3), and 0.76 (2.1) for sclerotic, lytic, and mixed lesions, respectively. The results also demonstrate that using PET/CT data significantly improves computer-aided detection performance over using CT alone. PMID:28612036
A citizen science approach to optimising computer aided detection (CAD) in mammography
NASA Astrophysics Data System (ADS)
Ionescu, Georgia V.; Harkness, Elaine F.; Hulleman, Johan; Astley, Susan M.
2018-03-01
Computer-aided detection (CAD) systems assist medical experts during image interpretation. In mammography, CAD systems prompt suspicious regions, which helps medical experts detect early signs of cancer. This is a challenging task, and prompts may appear in regions that are actually normal, whilst genuine cancers may be missed. The effect prompting has on readers' performance is not fully known. In order to explore the effects of prompting errors, we have created an online game (Bat Hunt), designed for non-experts, that mirrors mammographic CAD. This allows us to explore a wider parameter space. Users are required to detect bats in images of flocks of birds, with image difficulty matched to the proportions of screening mammograms in different BI-RADS density categories. Twelve prompted conditions were investigated, along with unprompted detection. On average, players achieved a sensitivity of 0.33 for unprompted detection, and sensitivities of 0.75, 0.83, and 0.92 respectively for 70%, 80%, and 90% of targets prompted, regardless of CAD specificity. False prompts distract players from finding unprompted targets if they appear in the same image. Player performance decreases when the number of false prompts increases, and increases proportionally with prompting sensitivity. The median d' was lowest for the unprompted condition (1.08) and highest for 90% sensitivity with 0.5 false prompts per image (d' = 4.48).
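The sensitivity index d' reported above follows the standard signal-detection definition d' = z(hit rate) - z(false-alarm rate). A minimal sketch; the rate clipping is a common convention for extreme rates, not necessarily the study's exact scoring.

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Signal-detection sensitivity index d' = z(H) - z(F), where z is
    the inverse standard-normal CDF. Rates are clipped away from 0 and
    1 so z stays finite."""
    z = NormalDist().inv_cdf

    def clip(r: float) -> float:
        return min(max(r, 1e-3), 1 - 1e-3)

    return z(clip(hit_rate)) - z(clip(fa_rate))

# Illustrative rates, not the game's measured values.
print(f"{d_prime(0.92, 0.05):.2f}")
```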
A deep-learning based automatic pulmonary nodule detection system
NASA Astrophysics Data System (ADS)
Zhao, Yiyuan; Zhao, Liang; Yan, Zhennan; Wolf, Matthias; Zhan, Yiqiang
2018-02-01
Lung cancer is the deadliest cancer worldwide. Early detection of lung cancer is a promising way to lower the risk of dying. Accurate pulmonary nodule detection in computed tomography (CT) images is crucial for early diagnosis of lung cancer. The development of a computer-aided detection (CAD) system for pulmonary nodules contributes to making CT analysis more accurate and more efficient. Recent studies from other groups have focused on lung cancer diagnosis CAD systems that detect medium to large nodules. However, to fully investigate the relevance between nodule features and cancer diagnosis, a CAD system that is capable of detecting nodules of all sizes is needed. In this paper, we present a deep-learning based automatic all-size pulmonary nodule detection system built by cascading two artificial neural networks. We first use a U-net-like 3D network to generate nodule candidates from CT images. Then, we use another 3D neural network to refine the locations of the nodule candidates generated by the previous subsystem. With the second subsystem, we bring the nodule candidates closer to the centers of the ground-truth nodule locations. We evaluate our system on a public CT dataset provided by the Lung Nodule Analysis (LUNA) 2016 grand challenge. The performance on the testing dataset shows that our system achieves 90% sensitivity with an average of 4 false positives per scan. This indicates that our system can be an aid for automatic nodule detection, which is beneficial for lung cancer diagnosis.
Okada, Tohru; Iwano, Shingo; Ishigaki, Takeo; Kitasaka, Takayuki; Hirano, Yasushi; Mori, Kensaku; Suenaga, Yasuhito; Naganawa, Shinji
2009-02-01
The ground-glass opacity (GGO) of lung cancer is identified only subjectively on computed tomography (CT) images, as no quantitative characteristic has been defined for GGOs. We sought to define GGOs quantitatively and to differentiate between GGO- and solid-type lung cancers semiautomatically with computer-aided diagnosis (CAD). High-resolution CT images of 100 pulmonary nodules (all peripheral lung cancers) were collected from our clinical records. Two radiologists traced the contours of nodules and distinguished GGOs from solid areas. The CT attenuation value of each area was measured. Differentiation between cancer types was assessed by a receiver operating characteristic (ROC) analysis. The mean CT attenuation of the GGO areas was -618.4 ± 212.2 HU, whereas that of solid areas was -68.1 ± 230.3 HU. CAD differentiated between solid- and GGO-type lung cancers with a sensitivity of 86.0% and specificity of 96.5% when the threshold value was -370 HU. Four mixed-GGO nodules were incorrectly classified as the solid type. CAD detected 96.3% of GGO areas when the threshold between GGO and solid areas was 194 HU. Objective definition of the GGO area by CT attenuation is feasible. This method is useful for semiautomatic differentiation between GGO and solid types of lung cancer.
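The single-threshold rule described above (calling a cancer GGO type when its mean attenuation falls below -370 HU) can be sketched as follows; the HU values below are synthetic, not the study's measurements.

```python
def sens_spec(values_pos, values_neg, threshold):
    """Sensitivity and specificity of a one-threshold rule that calls a
    nodule GGO type when its mean CT attenuation is below the threshold
    (GGO areas are darker, i.e. more negative in HU)."""
    tp = sum(v < threshold for v in values_pos)    # GGO called GGO
    tn = sum(v >= threshold for v in values_neg)   # solid called solid
    return tp / len(values_pos), tn / len(values_neg)

ggo = [-620, -540, -700, -480]      # synthetic GGO-type mean HU
solid = [-70, -120, 30, -300]       # synthetic solid-type mean HU
print(sens_spec(ggo, solid, -370))  # -> (1.0, 1.0)
```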
Computer aided detection of brain micro-bleeds in traumatic brain injury
NASA Astrophysics Data System (ADS)
van den Heuvel, T. L. A.; Ghafoorian, M.; van der Eerden, A. W.; Goraj, B. M.; Andriessen, T. M. J. C.; ter Haar Romeny, B. M.; Platel, B.
2015-03-01
Brain micro-bleeds (BMBs) are used as surrogate markers for detecting diffuse axonal injury in traumatic brain injury (TBI) patients. The location and number of BMBs have been shown to influence the long-term outcome of TBI. To further study the importance of BMBs for prognosis, accurate localization and quantification are required. The task of annotating BMBs is laborious, complex and prone to error, resulting in high inter- and intra-reader variability. In this paper we propose a computer-aided detection (CAD) system to automatically detect BMBs in MRI scans of moderate to severe neuro-trauma patients. Our method consists of four steps. Step one: preprocessing of the data. Both susceptibility-weighted (SWI) and T1-weighted MRI scans are used. The images are co-registered, a brain mask is generated, the bias field is corrected, and the image intensities are normalized. Step two: initial candidates for BMBs are selected as local minima in the processed SWI scans. Step three: feature extraction. BMBs appear as round or ovoid signal hypo-intensities on SWI. Twelve features are computed to capture these properties of a BMB. Step four: classification. To identify BMBs from the set of local minima using their features, different classifiers are trained on a database of 33 expert-annotated scans and 18 healthy subjects with no BMBs. Our system uses a leave-one-out strategy to analyze its performance. With a sensitivity of 90% and 1.3 false positives per BMB, our CAD system shows superior results compared to state-of-the-art BMB detection algorithms (developed for non-trauma patients).
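Step two's candidate selection, taking local minima of the processed SWI volume, can be sketched in 2-D on a toy grid (strict 8-neighbour minima; the masking and normalization of step one are omitted).

```python
def local_minima(img):
    """Return (row, col) of interior pixels that are strictly smaller
    than all eight neighbours, mimicking the selection of BMB
    candidates as local intensity minima."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = img[y][x]
            neigh = [img[y + dy][x + dx]
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dy, dx) != (0, 0)]
            if all(v < n for n in neigh):
                out.append((y, x))
    return out

# Toy intensity grid with two dark spots (hypo-intensities).
grid = [[9, 9, 9, 9, 9],
        [9, 1, 9, 9, 9],
        [9, 9, 9, 2, 9],
        [9, 9, 9, 9, 9]]
print(local_minima(grid))   # -> [(1, 1), (2, 3)]
```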
Chai, Zhenhua; Zhao, T S
2014-07-01
In this paper, we propose a local nonequilibrium scheme for computing the flux of the convection-diffusion equation with a source term in the framework of the multiple-relaxation-time (MRT) lattice Boltzmann method (LBM). Both the Chapman-Enskog analysis and the numerical results show that, at the diffusive scaling, the present nonequilibrium scheme has a second-order convergence rate in space. A comparison between the nonequilibrium scheme and the conventional second-order central-difference scheme indicates that, although both schemes have a second-order convergence rate in space, the present nonequilibrium scheme is more accurate than the central-difference scheme. In addition, the flux computation rendered by the present scheme also preserves the parallel computation feature of the LBM, making the scheme more efficient than conventional finite-difference schemes in the study of large-scale problems. Finally, a comparison between the single-relaxation-time model and the MRT model is also conducted, and the results show that the MRT model is more accurate than the single-relaxation-time model, both in solving the convection-diffusion equation and in computing the flux.
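The second-order spatial convergence discussed above can be checked numerically for the conventional central-difference flux: halving the grid spacing should cut the error by about a factor of four. A minimal sketch on a smooth test function (the nonequilibrium MRT scheme itself is not implemented here).

```python
import math

def central_flux(f, x, h):
    """Second-order central-difference approximation of f'(x), the
    conventional scheme the nonequilibrium flux scheme is compared
    against."""
    return (f(x + h) - f(x - h)) / (2 * h)

def err(h):
    # Error against the exact derivative cos(x) of sin(x).
    return abs(central_flux(math.sin, 0.7, h) - math.cos(0.7))

ratio = err(1e-2) / err(5e-3)
print(f"error ratio when h is halved: {ratio:.2f}")   # close to 4
```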
Information hiding techniques for infrared images: exploring the state-of-the art and challenges
NASA Astrophysics Data System (ADS)
Pomponiu, Victor; Cavagnino, Davide; Botta, Marco; Nejati, Hossein
2015-10-01
The proliferation of infrared technology and imaging systems enables a different perspective for tackling many computer vision problems in defense and security applications. Infrared images are widely used by law enforcement, Homeland Security and military organizations to achieve a significant advantage in situational awareness, and thus it is vital to protect these data against malicious attacks. Concurrently, sophisticated malware is being developed that can disrupt the security and integrity of these digital media. For instance, illegal distribution and manipulation are possible malicious attacks on digital objects. In this paper we explore the use of a new layer of defense for the integrity of infrared images through the aid of information hiding techniques such as watermarking. In this context, we analyze the efficiency of several optimal decoding schemes for a watermark inserted into the singular value decomposition (SVD) domain of IR images using an additive spread spectrum (SS) embedding framework. In order to use the singular values (SVs) of the IR images with SS embedding, we adopt several restrictions that ensure that the values of the SVs maintain their statistics. For both the optimal maximum-likelihood decoder and the sub-optimal decoders, we assume that the PDF of the SVs can be modeled by the Weibull distribution. Furthermore, we investigate the challenges involved in protecting and assuring the integrity of IR images, such as data complexity and the error probability behavior, i.e., the probability of detection and the probability of false detection, for the applied optimal decoders. Taking into account the efficiency and the auxiliary information necessary for decoding the watermark, we discuss the suitable decoder for various operating situations. Experimental results are carried out on a large dataset of IR images to show the imperceptibility and efficiency of the proposed scheme against various attack scenarios.
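Additive spread-spectrum embedding with a simple (non-optimal) correlation decoder can be sketched as follows. The host here is a toy zero vector standing in for the image SVs, and the Weibull-based maximum-likelihood decoding of the paper is not reproduced.

```python
import random

def ss_embed(host, bits, strength=1.0, seed=7):
    """Additive SS embedding sketch: each watermark bit modulates a
    pseudo-random +/-1 chip sequence added to the host coefficients."""
    rng = random.Random(seed)
    chips = [[rng.choice((-1.0, 1.0)) for _ in host] for _ in bits]
    marked = list(host)
    for b, chip in zip(bits, chips):
        s = strength if b else -strength
        marked = [m + s * c for m, c in zip(marked, chip)]
    return marked, chips

def ss_decode(marked, chips):
    """Correlation decoder: the sign of the correlation with each chip
    sequence recovers the corresponding bit."""
    return [int(sum(m * c for m, c in zip(marked, chip)) > 0.0)
            for chip in chips]

host = [0.0] * 512      # toy coefficient vector, not real image SVs
bits = [1, 0, 1, 1]
marked, chips = ss_embed(host, bits)
print(ss_decode(marked, chips))
```

With long, near-orthogonal chip sequences the correlation term for the embedded bit dominates the cross-talk from the other bits, which is why the sign test recovers the payload.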
Levrini, G; Sghedoni, R; Mori, C; Botti, A; Vacondio, R; Nitrosi, A; Iori, M; Nicoli, F
2011-10-01
The aim of this study was to investigate the efficacy of a dedicated software tool for automated volume measurement of breast lesions in contrast-enhanced (CE) magnetic resonance mammography (MRM). The size of 52 breast lesions with a known histopathological diagnosis (three benign, 49 malignant) was automatically evaluated using different techniques. The volume of all lesions was measured automatically (AVM) from CE 3D MRM examinations by means of a computer-aided detection (CAD) system and compared with the size estimates based on maximum diameter measurement (MDM) on MRM, ultrasonography (US), mammography and histopathology. Compared with histopathology as the reference method, AVM underestimated lesion size by 4% on average. This result was similar to MDM (3% underestimation, not significantly different) but significantly better than US and mammographic lesion measurements (24% and 33% size underestimation, respectively). AVM is as accurate as MDM but faster. Both methods are more accurate for size assessment of breast lesions compared with US and mammography.
Computer-aided detection in musculoskeletal projection radiography: A systematic review.
Gundry, M; Knapp, K; Meertens, R; Meakin, J R
2018-05-01
To investigate the accuracy of computer-aided detection (CAD) software in musculoskeletal projection radiography via a systematic review. Following selection screening, eligible studies were assessed for bias and had their study characteristics extracted, resulting in 22 studies being included. Of these 22, three studies had tested their CAD software in a clinical setting: the first study investigated vertebral fractures, reporting a sensitivity score of 69.3% with CAD, compared to 59.8% sensitivity without CAD. The second study tested dental caries diagnosis, producing a sensitivity score of 68.8% and specificity of 94.1% with CAD, compared to sensitivity of 39.3% and specificity of 96.7% without CAD. The third indicated osteoporotic cases based on CAD, resulting in 100% sensitivity and 81.3% specificity. The current evidence shows a lack of development into the clinical testing phase; however, the research does show future promise across the variety of different CAD systems. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
Acharya, U Rajendra; Sree, S Vinitha; Krishnan, M Muthu Rama; Molinari, Filippo; Zieleźnik, Witold; Bardales, Ricardo H; Witkowska, Agnieszka; Suri, Jasjit S
2014-02-01
Computer-aided diagnostic (CAD) techniques aid physicians in better diagnosis of diseases by extracting objective and accurate diagnostic information from medical data. Hashimoto thyroiditis is the most common type of inflammation of the thyroid gland. The inflammation changes the structure of the thyroid tissue, and these changes are reflected as echogenic changes on ultrasound images. In this work, we propose a novel CAD system (a class of systems called ThyroScan) that extracts textural features from a thyroid sonogram and uses them to aid in the detection of Hashimoto thyroiditis. In this paradigm, we extracted grayscale features based on stationary wavelet transform from 232 normal and 294 Hashimoto thyroiditis-affected thyroid ultrasound images obtained from a Polish population. Significant features were selected using a Student t test. The resulting feature vectors were used to build and evaluate the following 4 classifiers using a 10-fold stratified cross-validation technique: support vector machine, decision tree, fuzzy classifier, and K-nearest neighbor. Using 7 significant features that characterized the textural changes in the images, the fuzzy classifier had the highest classification accuracy of 84.6%, sensitivity of 82.8%, specificity of 87.0%, and a positive predictive value of 88.9%. The proposed ThyroScan CAD system uses novel features to noninvasively detect the presence of Hashimoto thyroiditis on ultrasound images. Compared to manual interpretations of ultrasound images, the CAD system offers a more objective interpretation of the nature of the thyroid. The preliminary results presented in this work indicate the possibility of using such a CAD system in a clinical setting after evaluating it with larger databases in multicenter clinical trials.
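The feature-selection step above can be sketched as follows. This is not the ThyroScan code: it is a minimal Welch t-statistic ranking on synthetic "features", and a fixed |t| cut-off stands in for the proper Student t-test p-values used in the paper.

```python
import math, random

def welch_t(a, b):
    # Welch's t-statistic for two independent samples
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

random.seed(0)
# synthetic "texture features": column 0 separates the groups, column 1 is noise
normal   = [[random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)] for _ in range(200)]
affected = [[random.gauss(1.5, 1.0), random.gauss(0.0, 1.0)] for _ in range(200)]

scores = [abs(welch_t([r[j] for r in normal], [r[j] for r in affected]))
          for j in range(2)]
significant = [j for j in range(2) if scores[j] > 5.0]  # crude |t| cut-off
```

Only the discriminative feature survives the cut; the selected columns would then feed a classifier evaluated with 10-fold cross-validation, as in the paper.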
A floor-map-aided WiFi/pseudo-odometry integration algorithm for an indoor positioning system.
Wang, Jian; Hu, Andong; Liu, Chunyan; Li, Xin
2015-03-24
This paper proposes a scheme for indoor positioning by fusing floor map, WiFi and smartphone sensor data to provide meter-level positioning without additional infrastructure. A topology-constrained K nearest neighbor (KNN) algorithm based on a floor map layout provides the coordinates required to integrate WiFi data with pseudo-odometry (P-O) measurements simulated using a pedestrian dead reckoning (PDR) approach. One method of further improving the positioning accuracy is to use a more effective multi-threshold step detection algorithm, as proposed by the authors. The "go and back" phenomenon caused by incorrect matching of the reference points (RPs) of a WiFi algorithm is eliminated using an adaptive fading-factor-based extended Kalman filter (EKF), taking WiFi positioning coordinates, P-O measurements and fused heading angles as observations. The "cross-wall" problem is solved based on the development of a floor-map-aided particle filter algorithm by weighting the particles, thereby also eliminating the gross-error effects originating from WiFi or P-O measurements. The performance observed in a field experiment performed on the fourth floor of the School of Environmental Science and Spatial Informatics (SESSI) building on the China University of Mining and Technology (CUMT) campus confirms that the proposed scheme can reliably achieve meter-level positioning.
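A minimal version of the WiFi fingerprinting component, plain weighted K nearest neighbors without the topology constraint or the EKF/particle-filter fusion, might look like the sketch below. The fingerprint coordinates and RSSI values are invented for illustration.

```python
import math

# toy fingerprint database: reference point (x, y) -> RSSI vector (dBm)
# from three access points, surveyed offline
fingerprints = {
    (0.0, 0.0): [-40, -70, -80],
    (5.0, 0.0): [-70, -40, -80],
    (0.0, 5.0): [-70, -80, -40],
    (5.0, 5.0): [-80, -70, -45],
}

def wknn_position(rssi, k=3):
    # rank reference points by Euclidean distance in signal space
    ranked = sorted(fingerprints.items(),
                    key=lambda kv: math.dist(kv[1], rssi))[:k]
    # inverse-distance weighting of the k nearest fingerprints
    weights = [1.0 / (math.dist(fp, rssi) + 1e-9) for _, fp in ranked]
    wsum = sum(weights)
    x = sum(w * p[0] for w, (p, _) in zip(weights, ranked)) / wsum
    y = sum(w * p[1] for w, (p, _) in zip(weights, ranked)) / wsum
    return x, y
```

`wknn_position([-42, -69, -79])` lands near the (0, 0) reference point because that fingerprint dominates in signal space; the paper's topology constraint would additionally exclude reference points unreachable on the floor map.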
NASA Astrophysics Data System (ADS)
Balsara, Dinshaw S.
2017-12-01
As computational astrophysics comes under pressure to become a precision science, there is an increasing need to move to high accuracy schemes for computational astrophysics. The algorithmic needs of computational astrophysics are indeed very special. The methods need to be robust and preserve the positivity of density and pressure. Relativistic flows should remain sub-luminal. These requirements place additional pressures on a computational astrophysics code, which are usually not felt by a traditional fluid dynamics code. Hence the need for a specialized review. The focus here is on weighted essentially non-oscillatory (WENO) schemes, discontinuous Galerkin (DG) schemes and PNPM schemes. WENO schemes are higher order extensions of traditional second order finite volume schemes. At third order, they are most similar to piecewise parabolic method schemes, which are also included. DG schemes evolve all the moments of the solution, with the result that they are more accurate than WENO schemes. PNPM schemes occupy a compromise position between WENO and DG schemes. They evolve an Nth order spatial polynomial, while reconstructing higher order terms up to Mth order. As a result, the timestep can be larger. Time-dependent astrophysical codes need to be accurate in space and time with the result that the spatial and temporal accuracies must be matched. This is realized with the help of strong stability preserving Runge-Kutta schemes and ADER (Arbitrary DERivative in space and time) schemes, both of which are also described. The emphasis of this review is on computer-implementable ideas, not necessarily on the underlying theory.
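As a concrete example of the finite-volume machinery reviewed here, a classic fifth-order WENO reconstruction with Jiang-Shu smoothness indicators can be written in a few lines. This sketch reconstructs one left-biased interface value from five cell averages.

```python
def weno5_left(v):
    # left-biased 5th-order WENO reconstruction of the value at x_{i+1/2}
    # from the five cell averages v = [v_{i-2}, v_{i-1}, v_i, v_{i+1}, v_{i+2}]
    eps = 1e-6
    # the three 3rd-order candidate stencils
    q = [(2 * v[0] - 7 * v[1] + 11 * v[2]) / 6.0,
         (   -v[1] + 5 * v[2] +  2 * v[3]) / 6.0,
         (2 * v[2] + 5 * v[3] -      v[4]) / 6.0]
    # Jiang-Shu smoothness indicators
    b = [13/12 * (v[0] - 2*v[1] + v[2])**2 + 1/4 * (v[0] - 4*v[1] + 3*v[2])**2,
         13/12 * (v[1] - 2*v[2] + v[3])**2 + 1/4 * (v[1] - v[3])**2,
         13/12 * (v[2] - 2*v[3] + v[4])**2 + 1/4 * (3*v[2] - 4*v[3] + v[4])**2]
    d = (0.1, 0.6, 0.3)                       # ideal (linear) weights
    a = [dk / (eps + bk)**2 for dk, bk in zip(d, b)]
    return sum(ak * qk for ak, qk in zip(a, q)) / sum(a)
```

On smooth data the nonlinear weights approach the ideal weights and the reconstruction is fifth-order accurate; near a discontinuity the smoothness indicators push essentially all the weight onto the least oscillatory sub-stencil, which is what keeps the scheme non-oscillatory.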
Is computer-aided interpretation of 99Tcm-HMPAO leukocyte scans better than the naked eye?
Almer, S; Peters, A M; Ekberg, S; Franzén, L; Granerus, G; Ström, M
1995-04-01
In order to compare visual interpretation of inflammation detected by leukocyte scintigraphy with that of different computer-aided quantification methods, 34 patients (25 with ulcerative colitis and 9 with endoscopically verified non-inflamed colonic mucosa), were investigated using 99Tcm-hexamethylpropyleneamine oxime (99Tcm-HMPAO) leukocyte scintigraphy and colonoscopy with biopsies. Scintigrams were obtained 45 min and 4 h after the injection of labelled cells. Computer-generated grading of seven colon segments using four different methods was performed on each scintigram for each patient. The same segments were graded independently using a 4-point visual scale. Endoscopic and histological inflammation were scored on 4-point scales. At 45 min, a positive correlation was found between endoscopic and scan gradings in individual colon segments when using visual grading and three of the four computer-aided methods (Spearman's rs = 0.30-0.64, P < 0.001). Histological grading correlated with visual grading and with two of the four computer-aided methods at 45 min (rs = 0.42-0.54, P < 0.001). At 4 h, all grading methods correlated positively with both endoscopic and histological assessment. The correlation coefficients were, in all but one instance, highest for the visual grading. As an inter-observer comparison to assess agreement between the visual gradings of two nuclear physicians, 14 additional patients (9 ulcerative colitis, 5 infectious enterocolitis) underwent leukocyte scintigraphy. Agreement assessed using kappa statistics was 0.54 at 45 min (P < 0.001). Separate data concerning the presence/absence of active inflammation showed a high kappa value (0.74, P < 0.001). Our results showed that a simple scintigraphic scoring system based on assessment using the human eye reflects colonic inflammation at least as well as computer-aided grading, and that highly correlated results can be achieved between different investigators.
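The inter-observer agreement statistic used above is Cohen's kappa; a minimal implementation for two raters is:

```python
def cohens_kappa(r1, r2):
    # observed agreement between the two raters
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n
    # chance agreement from each rater's marginal category frequencies
    cats = set(r1) | set(r2)
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)
```

For example, ratings [1, 1, 0, 0] versus [1, 0, 0, 0] agree on 3 of 4 segments (po = 0.75) with chance agreement pe = 0.5, giving kappa = 0.5, close to the 45-min value reported above.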
Modeling Political Populations with Bacteria
NASA Astrophysics Data System (ADS)
Cleveland, Chris; Liao, David
2011-03-01
Results from lattice-based simulations of micro-environments with heterogeneous nutrient resources reveal that competition between wild-type and GASP rpoS819 strains of E. coli offers mutual benefit, particularly in nutrient-deprived regions. Our computational model spatially maps bacteria populations and energy sources onto a set of 3D lattices that collectively resemble the topology of North America. By implementing Wright-Fisher reproduction into a probabilistic leap-frog scheme, we observe populations of wild-type and GASP rpoS819 cells compete for resources and, yet, aid each other's long-term survival. The connection to how spatial political ideologies map in a similar way is discussed.
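One generation of Wright-Fisher reproduction, the resampling step inside such a lattice simulation, can be sketched as below. The fitness parameter and strain labels are illustrative, not the study's actual model.

```python
import random

def wright_fisher_step(n_wild, n_gasp, gasp_fitness=1.0, rng=random):
    # resample a fixed-size next generation; each offspring descends from a
    # GASP parent with probability proportional to abundance times fitness
    n = n_wild + n_gasp
    p = gasp_fitness * n_gasp / (n_wild + gasp_fitness * n_gasp)
    n_gasp_next = sum(1 for _ in range(n) if rng.random() < p)
    return n - n_gasp_next, n_gasp_next

random.seed(42)
w, g = wright_fisher_step(50, 50)   # neutral drift: population size conserved
```

Under neutral fitness the strain frequencies drift; a large fitness advantage drives the favored strain to fixation, which is the mechanism behind GASP takeover in starved lattice sites.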
Quantitative assessment of commercial filter 'aids' for red-green colour defectives.
Moreland, Jack D; Westland, Steven; Cheung, Vien; Dain, Steven J
2010-09-01
The claims made for 43 commercial filter 'aids', that they improve the colour discrimination of red-green colour defectives, are assessed for protanomaly and deuteranomaly by changes in the colour spacing of traffic signals (European Standard EN 1836:2005) and of the Farnsworth D15 test. Spectral transmittances of the 'aids' are measured and tristimulus values with and without 'aids' are computed using cone fundamentals and the spectral power distributions of either the D15 chips illuminated by CIE Illuminant C or of traffic signals. Chromaticities (l,s) are presented in cone excitation diagrams for protanomaly and deuteranomaly in terms of the relative excitation of their long (L), medium (M) and short (S) wavelength-sensitive cones. After correcting for non-uniform colour spacing in these diagrams, standard deviations parallel to the l and s axes are computed and enhancement factors E(l) and E(s) are derived as the ratio of 'aided' to 'unaided' standard deviations. Values of E(l) for traffic signals with most 'aids' are <1 and many do not meet the European signal detection standard. A few 'aids' have expansive E(l) factors but with inadequate utility: the largest being 1.2 for traffic signals and 1.3 for the D15 colours. Analyses, replicated for 19 'aids' from one manufacturer using 658 Munsell colours inside the D15 locus, yield E(l) factors within 1% of those found for the 16 D15 colours. © 2010 The Authors, Ophthalmic and Physiological Optics © 2010 The College of Optometrists.
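The enhancement factor is simply the ratio of "aided" to "unaided" colour-spacing spread; with hypothetical l-chromaticities for five chips (invented numbers, not the paper's data):

```python
from statistics import pstdev

# hypothetical l-chromaticities of five chips seen without and with an 'aid'
unaided = [0.66, 0.67, 0.68, 0.69, 0.70]
aided   = [0.65, 0.67, 0.68, 0.70, 0.72]

# enhancement factor: ratio of 'aided' to 'unaided' standard deviation;
# E_l > 1 means the filter expands the colour spacing along the l axis
E_l = pstdev(aided) / pstdev(unaided)
```

Most commercial 'aids' tested in the paper gave E(l) < 1 for traffic signals, i.e. they compressed rather than expanded the relevant colour spacing.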
NASA Technical Reports Server (NTRS)
Erb, R. B.
1974-01-01
The results of the ERTS-1 investigations conducted by the Earth Observations Division at the NASA Lyndon B. Johnson Space Center are summarized in this report, which is an overview of documents detailing individual investigations. Conventional image interpretation and computer-aided classification procedures were the two basic techniques used in analyzing the data for detecting, identifying, locating, and measuring surface features related to earth resources. Data from the ERTS-1 multispectral scanner system were useful for all applications studied, which included agriculture, coastal and estuarine analysis, forestry, range, land use and urban land use, and signature extension. Percentage classification accuracies are cited for the conventional and computer-aided techniques.
A dental vision system for accurate 3D tooth modeling.
Zhang, Li; Alemzadeh, K
2006-01-01
This paper describes an active vision system based reverse engineering approach to extract the three-dimensional (3D) geometric information from dental teeth and transfer this information into Computer-Aided Design/Computer-Aided Manufacture (CAD/CAM) systems to improve the accuracy of 3D teeth models and at the same time improve the quality of the construction units to help patient care. The vision system involves the development of a dental vision rig, edge detection, boundary tracing and fast & accurate 3D modeling from a sequence of sliced silhouettes of physical models. The rig is designed using engineering design methods such as a concept selection matrix and weighted objectives evaluation chart. Reconstruction results and accuracy evaluation are presented on digitizing different teeth models.
NASA Technical Reports Server (NTRS)
Erb, R. B.
1974-01-01
The Coastal Analysis Team of the Johnson Space Center conducted a 1-year investigation of ERTS-1 MSS data to determine its usefulness in coastal zone management. Galveston Bay, Texas, was the study area for evaluating both conventional image interpretation and computer-aided techniques. There was limited success in detecting, identifying and measuring the areal extent of water bodies, turbidity zones, phytoplankton blooms, salt marshes, grasslands, swamps, and low wetlands using image interpretation techniques. Computer-aided techniques were generally successful in identifying these features. Areal measurement accuracies for salt marshes ranged from 89 to 99 percent. Overall classification accuracy of all study sites was 89 percent for Level 1 and 75 percent for Level 2.
NASA Astrophysics Data System (ADS)
Oda, Masahiro; Kitasaka, Takayuki; Furukawa, Kazuhiro; Watanabe, Osamu; Ando, Takafumi; Goto, Hidemi; Mori, Kensaku
2011-03-01
The purpose of this paper is to present a new method to detect ulcers, one of the symptoms of Crohn's disease, from CT images. Crohn's disease is an inflammatory disease of the digestive tract that commonly affects the small intestine. An optical or a capsule endoscope is used for small intestine examinations. However, these endoscopes cannot pass through intestinal stenosis parts in some cases. A CT image based diagnosis allows a physician to observe the whole intestine even if intestinal stenosis exists. However, because of the complicated shape of the small and large intestines, understanding the shapes of the intestines and the lesion positions is difficult in CT image based diagnosis. A computer-aided diagnosis system for Crohn's disease with automated lesion detection is required for efficient diagnosis. We propose an automated method to detect ulcers from CT images. Longitudinal ulcers roughen the surface of the small and large intestinal wall; the rough surface consists of a combination of convex and concave parts on the wall. We detect convex and concave parts on the intestinal wall with blob and inverse-blob structure enhancement filters. Many convex and concave parts concentrate on the roughened regions. We introduce a roughness value to differentiate the convex and concave parts concentrated on the roughened regions from the others on the intestinal wall. The roughness value effectively reduces false positives in ulcer detection. Experimental results showed that the proposed method can detect convex and concave parts on the ulcers.
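The roughness idea, many alternating convex and concave parts close together, can be illustrated in one dimension with a discrete second derivative. This toy sketch is not the paper's 3D blob/inverse-blob filter; the profile values and threshold are invented.

```python
def curvature(profile):
    # discrete second derivative along a wall profile:
    # negative values mark convex bumps, positive values mark concave pits
    return [profile[i-1] - 2 * profile[i] + profile[i+1]
            for i in range(1, len(profile) - 1)]

def roughness(profile, thresh=0.5):
    # count sign alternations of significant curvature; rough (ulcerated)
    # walls alternate bumps and pits, smooth walls score zero
    c = [v for v in curvature(profile) if abs(v) > thresh]
    return sum(1 for a, b in zip(c, c[1:]) if a * b < 0)
```

A linear ramp scores 0, while a sawtooth profile such as [0, 2, 0, 2, 0, 2, 0] scores 4, mimicking how concentrated convex/concave detections flag a roughened wall region while isolated ones do not.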
Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Alewine, Neal Jon
1993-01-01
Multiple instruction rollback (MIR) is a technique for providing rapid recovery from transient processor failures and has been implemented in hardware by researchers and in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs were also developed which remove rollback data hazards directly with data flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy, while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations were conducted which indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.
NASA Astrophysics Data System (ADS)
Melendez, Jaime; Sánchez, Clara I.; Philipsen, Rick H. H. M.; Maduskar, Pragnya; Dawson, Rodney; Theron, Grant; Dheda, Keertan; van Ginneken, Bram
2016-04-01
Lack of human resources and radiological interpretation expertise impair tuberculosis (TB) screening programmes in TB-endemic countries. Computer-aided detection (CAD) constitutes a viable alternative for chest radiograph (CXR) reading. However, no automated techniques that exploit the additional clinical information typically available during screening exist. To address this issue and optimally exploit this information, a machine learning-based combination framework is introduced. We have evaluated this framework on a database containing 392 patient records from suspected TB subjects prospectively recruited in Cape Town, South Africa. Each record comprised a CAD score, automatically computed from a CXR, and 12 clinical features. Comparisons with strategies relying on either CAD scores or clinical information alone were performed. Our results indicate that the combination framework outperforms the individual strategies in terms of the area under the receiver operating characteristic curve (0.84 versus 0.78 and 0.72), specificity at 95% sensitivity (49% versus 24% and 31%) and negative predictive value (98% versus 95% and 96%). Thus, it is believed that combining CAD and clinical information to estimate the risk of active disease is a promising tool for TB screening.
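The headline comparison rests on the area under the ROC curve, which has a simple rank-based form (the Wilcoxon-Mann-Whitney statistic). The scores below are invented merely to illustrate a combined CAD+clinical score separating cases better than CAD alone.

```python
def auc(pos, neg):
    # Wilcoxon-Mann-Whitney form of the area under the ROC curve:
    # probability that a random positive outscores a random negative (ties = 0.5)
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# hypothetical risk scores for TB-positive and TB-negative subjects
cad_only = auc([0.8, 0.6, 0.4], [0.7, 0.3, 0.2])   # one positive is outranked
combined = auc([0.8, 0.7, 0.6], [0.5, 0.3, 0.2])   # perfect separation
```

Here `combined` reaches 1.0 while `cad_only` is 7/9, the same qualitative gap (0.84 versus 0.78) the study reports at scale.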
Optimization of Breast Tomosynthesis Imaging Systems for Computer-Aided Detection
2011-05-01
This is the final report for this body of research. Screen-film mammography and digital mammography have been used for over 30 years in the early detection of cancer. The combination of screening and adjuvant therapies has led to ... (Related publication: R. Saunders, E. Samei, C. Badea, H. Yuan, K. Ghaghada, Y. Qi, L. Hedlund, and S. Mukundan, "Optimization of dual energy contrast enhanced breast ...")
Applying a new mammographic imaging marker to predict breast cancer risk
NASA Astrophysics Data System (ADS)
Aghaei, Faranak; Danala, Gopichandh; Hollingsworth, Alan B.; Stoug, Rebecca G.; Pearce, Melanie; Liu, Hong; Zheng, Bin
2018-02-01
Identifying and developing new mammographic imaging markers to assist prediction of breast cancer risk has been attracting extensive research interest recently. Although mammographic density is considered an important breast cancer risk factor, its discriminatory power is lower for predicting short-term breast cancer risk, which is a prerequisite for establishing a more effective personalized breast cancer screening paradigm. In this study, we presented a new interactive computer-aided detection (CAD) scheme to generate a new quantitative mammographic imaging marker based on bilateral mammographic tissue density asymmetry to predict the risk of cancer detection in the next subsequent mammography screening. An image database involving 1,397 women was retrospectively assembled and tested. Each woman had two digital mammography screenings, namely the "current" and "prior" screenings, with a time interval from 365 to 600 days. All "prior" images were originally interpreted as negative. In the "current" screenings, these cases were divided into 3 groups: 402 positive, 643 negative, and 352 biopsy-proven benign cases. There was no significant difference in BIRADS-based mammographic density ratings between the 3 case groups (p < 0.6). When applying the CAD-generated imaging marker or risk model to classify between the 402 positive and 643 negative cases using "prior" negative mammograms, the area under the ROC curve was 0.70 ± 0.02, and the adjusted odds ratios showed an increasing trend from 1.0 to 8.13 for the risk of cancer detection in the "current" screening. The study demonstrated that this new imaging marker has the potential to yield significantly higher discriminatory power for predicting short-term breast cancer risk.
NASA Astrophysics Data System (ADS)
Harkness, E. F.; Lim, Y. Y.; Wilson, M. W.; Haq, R.; Zhou, J.; Tate, C.; Maxwell, A. J.; Astley, S. M.; Gilbert, F. J.
2015-03-01
Digital breast tomosynthesis (DBT) addresses limitations of 2-D projection imaging for detection of masses. Microcalcification clusters may be more difficult to appreciate in DBT as individual calcifications within clusters may appear on different slices. This research aims to evaluate the performance of ImageChecker 3D Calc CAD v1.0. Women were recruited as part of the TOMMY trial. From the trial, 169 were included in this study. The DBT images were processed with the computer aided detection (CAD) algorithm. Three consultant radiologists reviewed the images and recorded whether CAD prompts were on or off target. 79/80 (98.8%) malignant cases had a prompt on the area of microcalcification. In these cases, there were 1-15 marks (median 5) with the majority of false prompts (n=326/431) due to benign (68%) and vascular (24%) calcifications. Of 89 normal/benign cases, there were 1-13 prompts (median 3), 27 (30%) had no prompts and the majority of false prompts (n=238) were benign (77%) calcifications. CAD is effective in prompting malignant microcalcification clusters and may overcome the difficulty of detecting clusters in slice images. Although there was a high rate of false prompts, further advances in the software may improve specificity.
Anyonic braiding in optical lattices
Zhang, Chuanwei; Scarola, V. W.; Tewari, Sumanta; Das Sarma, S.
2007-01-01
Topological quantum states of matter, both Abelian and non-Abelian, are characterized by excitations whose wavefunctions undergo nontrivial statistical transformations as one excitation is moved (braided) around another. Topological quantum computation proposes to use the topological protection and the braiding statistics of a non-Abelian topological state to perform quantum computation. The enormous technological prospect of topological quantum computation provides new motivation for experimentally observing a topological state. Here, we explicitly work out a realistic experimental scheme to create and braid the Abelian topological excitations in the Kitaev model built on a tunable robust system, a cold atom optical lattice. We also demonstrate how to detect the key feature of these excitations: their braiding statistics. Observation of this statistics would directly establish the existence of anyons, quantum particles that are neither fermions nor bosons. In addition to establishing topological matter, the experimental scheme we develop here can also be adapted to a non-Abelian topological state, supported by the same Kitaev model but in a different parameter regime, to eventually build topologically protected quantum gates. PMID:18000038
NASA Technical Reports Server (NTRS)
1991-01-01
This video discusses how the technology of computer modeling can improve the design and durability of artificial joints for human joint replacement surgery. Also, ultrasound, originally used to detect structural flaws in aircraft, can also be used to quickly assess the severity of a burn patient's injuries, thus aiding the healing process.
Computer-aided assessment of pulmonary disease in novel swine-origin H1N1 influenza on CT
NASA Astrophysics Data System (ADS)
Yao, Jianhua; Dwyer, Andrew J.; Summers, Ronald M.; Mollura, Daniel J.
2011-03-01
The 2009 pandemic is a global outbreak of novel H1N1 influenza. Radiologic images can be used to assess the presence and severity of pulmonary infection. We develop a computer-aided assessment system to analyze the CT images from Swine-Origin Influenza A virus (S-OIV) novel H1N1 cases. The technique is based on the analysis of lung texture patterns and classification using a support vector machine (SVM). Pixel-wise tissue classification is computed from the SVM value. The method was validated on four H1N1 cases and ten normal cases. We demonstrated that the technique can detect regions of pulmonary abnormality in novel H1N1 patients and differentiate these regions from visually normal lung (area under the ROC curve is 0.993). This technique can also be applied to differentiate regions infected by different pulmonary diseases.
Yang, Chao-Yang; Wu, Cheng-Tse
2017-03-01
This research investigated the risks involved in bicycle riding while using various sensory modalities to deliver training information. To understand the risks associated with using bike computers, this study evaluated hazard perception performance through lab-based simulations of authentic riding conditions. Analysing hazard sensitivity (d') of signal detection theory, the rider's response time, and eye glances provided insights into the risks of using bike computers. In this study, 30 participants were tested with eight hazard perception tasks while they maintained a cadence of 60 ± 5 RPM and used bike computers with different sensory displays, namely visual, auditory, and tactile feedback signals. The results indicated that synchronously using different sense organs to receive cadence feedback significantly affects hazard perception performance; direct visual information leads to the worst rider distraction, with a mean sensitivity to hazards (d') of -1.03. For systems with multiple interacting sensory aids, auditory aids were found to result in the greatest reduction in sensitivity to hazards (d' mean = -0.57), whereas tactile sensory aids reduced the degree of rider distraction (d' mean = -0.23). Our work complements existing work in this domain by advancing the understanding of how to design devices that deliver information subtly, thereby preventing disruption of a rider's perception of road hazards. Copyright © 2016 Elsevier Ltd. All rights reserved.
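The sensitivity index d' from signal detection theory is computed from hit and false-alarm rates; a minimal helper using the inverse normal CDF:

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    # signal detection theory sensitivity index:
    # d' = z(hit rate) - z(false-alarm rate)
    z = NormalDist().inv_cdf
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return z(hit_rate) - z(fa_rate)
```

With an 84% hit rate and a 16% false-alarm rate this gives d' of about 1.99; the negative values reported above are reductions in d' relative to an undistracted baseline, i.e. feedback displays made riders less able to discriminate hazards.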
Seruya, Mitchel; Fisher, Mark; Rodriguez, Eduardo D
2013-11-01
There has been rising interest in computer-aided design/computer-aided manufacturing for preoperative planning and execution of osseous free flap reconstruction. The purpose of this study was to compare outcomes between computer-assisted and conventional fibula free flap techniques for craniofacial reconstruction. A two-center, retrospective review was carried out on patients who underwent fibula free flap surgery for craniofacial reconstruction from 2003 to 2012. Patients were categorized by the type of reconstructive technique: conventional (between 2003 and 2009) or computer-aided design/computer-aided manufacturing (from 2010 to 2012). Demographics, surgical factors, and perioperative and long-term outcomes were compared. A total of 68 patients underwent microsurgical craniofacial reconstruction: 58 conventional and 10 computer-aided design and manufacturing fibula free flaps. By demographics, patients undergoing the computer-aided design/computer-aided manufacturing method were significantly older and had a higher rate of radiotherapy exposure compared with conventional patients. Intraoperatively, the median number of osteotomies was significantly higher (2.0 versus 1.0, p=0.002) and the median ischemia time was significantly shorter (120 minutes versus 170 minutes, p=0.004) for the computer-aided design/computer-aided manufacturing technique compared with conventional techniques; operative times were shorter for patients undergoing the computer-aided design/computer-aided manufacturing technique, although this did not reach statistical significance. Perioperative and long-term outcomes were equivalent for the two groups, notably, hospital length of stay, recipient-site infection, partial and total flap loss, and rate of soft-tissue and bony tissue revisions. 
Microsurgical craniofacial reconstruction using a computer-assisted fibula flap technique yielded significantly shorter ischemia times amidst a higher number of osteotomies compared with conventional techniques. Therapeutic, III.
Vectorized schemes for conical potential flow using the artificial density method
NASA Technical Reports Server (NTRS)
Bradley, P. F.; Dwoyer, D. L.; South, J. C., Jr.; Keen, J. M.
1984-01-01
A method is developed to determine solutions to the full-potential equation for steady supersonic conical flow using the artificial density method. Various update schemes used generally for transonic potential solutions are investigated. The schemes are compared for speed and robustness. All versions of the computer code have been vectorized and are currently running on the CYBER-203 computer. The update schemes are vectorized, where possible, either fully (explicit schemes) or partially (implicit schemes). Since each version of the code differs only by the update scheme and elements other than the update scheme are completely vectorizable, comparisons of computational effort and convergence rate among schemes are a measure of the specific scheme's performance. Results are presented for circular and elliptical cones at angle of attack for subcritical and supercritical crossflows.
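The kind of update-scheme comparison described, measuring convergence rate of different relaxation updates on the same problem, can be illustrated on a toy 1D Laplace problem. This is a generic sketch, not the conical-flow code, and the grid size and relaxation factor are illustrative.

```python
def relax(n, omega, tol=1e-8, max_iter=200000):
    # relax u'' = 0 on n interior points with u(0)=0, u(1)=1;
    # omega=1 is Gauss-Seidel, 1<omega<2 is successive over-relaxation (SOR)
    u = [0.0] * (n + 2)
    u[n + 1] = 1.0
    for it in range(1, max_iter + 1):
        delta = 0.0
        for i in range(1, n + 1):
            new = (1 - omega) * u[i] + omega * 0.5 * (u[i-1] + u[i+1])
            delta = max(delta, abs(new - u[i]))
            u[i] = new
        if delta < tol:
            return it, u
    return max_iter, u

it_gs, _ = relax(50, 1.0)
it_sor, u_sor = relax(50, 1.88)  # omega near the optimal 2/(1+sin(pi/(n+1)))
```

At this tolerance Gauss-Seidel needs thousands of sweeps while SOR near its optimal relaxation factor converges in a few hundred; explicit sweeps like these are also the kind of update that vectorizes fully, as the abstract notes for the explicit schemes.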
Freyer, Marcus; Ale, Angelique; Schulz, Ralf B; Zientkowska, Marta; Ntziachristos, Vasilis; Englmeier, Karl-Hans
2010-01-01
The recent development of hybrid imaging scanners that integrate fluorescence molecular tomography (FMT) and x-ray computed tomography (XCT) allows the utilization of x-ray information as image priors for improving optical tomography reconstruction. To fully capitalize on this capacity, we consider a framework for the automatic and fast detection of different anatomic structures in murine XCT images. To accurately differentiate between different structures such as bone, lung, and heart, a combination of image processing steps including thresholding, seed growing, and signal detection are found to offer optimal segmentation performance. The algorithm and its utilization in an inverse FMT scheme that uses priors is demonstrated on mouse images.
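The seed-growing step of such a segmentation pipeline can be sketched as a breadth-first region grow over a thresholded intensity band; the toy "image" and intensity band below are invented for illustration.

```python
from collections import deque

def region_grow(img, seed, lo, hi):
    # BFS flood fill keeping 4-connected pixels whose intensity lies in [lo, hi]
    h, w = len(img), len(img[0])
    seen, region, q = {seed}, [], deque([seed])
    while q:
        r, c = q.popleft()
        if lo <= img[r][c] <= hi:
            region.append((r, c))
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < h and 0 <= nc < w and (nr, nc) not in seen:
                    seen.add((nr, nc))
                    q.append((nr, nc))
    return region

# growing from seed (1, 1) in a 3x3 toy slice with a bright "bone" band
region = region_grow([[0, 0, 0],
                      [0, 200, 200],
                      [0, 0, 200]], seed=(1, 1), lo=150, hi=255)
```

The grow returns exactly the three connected bright pixels; in the hybrid FMT/XCT setting, structure-specific bands and seeds would separate bone, lung and heart before the segments are passed to the FMT reconstruction as priors.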
NASA Astrophysics Data System (ADS)
Wormanns, Dag; Fiebich, Martin; Saidi, Mustafa; Diederich, Stefan; Heindel, Walter
2001-05-01
The purpose of the study was to evaluate a computer-aided diagnosis (CAD) workstation with automatic detection of pulmonary nodules at low-dose spiral CT in a clinical setting for early detection of lung cancer. Two radiologists in consensus reported 88 consecutive spiral CT examinations. All examinations were reviewed using a UNIX-based CAD workstation with a self-developed algorithm for automatic detection of pulmonary nodules. The algorithm was designed to detect nodules with at least 5 mm diameter. The results of automatic nodule detection were compared to the consensus reporting of the two radiologists as the gold standard. Additional CAD findings were regarded as nodules initially missed by the radiologists or as false-positive results. A total of 153 nodules were detected with all modalities (diameter: 85 nodules <5 mm, 63 nodules 5-9 mm, 5 nodules >=10 mm). Reasons for failure of automatic nodule detection were assessed. Sensitivity of the radiologists for nodules >=5 mm was 85%; sensitivity of CAD was 38%. For nodules >=5 mm without pleural contact, sensitivity was 84% for the radiologists and 45% for CAD. CAD detected 15 (10%) nodules not mentioned in the radiologists' report but representing real nodules, among them 10 (15%) nodules with a diameter >=5 mm. Reasons for nodules missed by CAD included: exclusion because of morphological features during region analysis (33%), nodule density below the detection threshold (26%), pleural contact (33%), segmentation errors (5%) and other reasons (2%). CAD improves detection of pulmonary nodules at spiral CT significantly and is a valuable second opinion in a clinical setting for lung cancer screening. Optimization of region analysis and an appropriate density threshold have a potential for further improvement of automatic nodule detection.
A Mobile Decision Aid for Determining Detection Probabilities for Acoustic Targets
2002-08-01
propagation mobile application. Personal Computer Memory Card International Association, an organization of some 500 companies that has developed a...SENSOR: Human and possible outputs, it was felt that for a mobile application, the interface and number of output parameters should be kept simple...value could be computed on the server and transmitted back to the mobile application for display. FUTURE CAPABILITIES 2-D/3-D Displays The full ABFA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hale, M.A.; Craig, J.I.
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both systems engineering and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. Agents are used to implement the overall infrastructure on the computer. Successful agent utilization requires that they be made of three components: the resource, the model, and the wrap. Current work is focused on the development of generalized agent schemes and associated demonstration projects. When in place, the technology-independent computing infrastructure will aid the designer in systematically generating knowledge used to facilitate decision-making.
Building a medical image processing algorithm verification database
NASA Astrophysics Data System (ADS)
Brown, C. Wayne
2000-06-01
The design of a database containing head Computed Tomography (CT) studies is presented, along with a justification for the database's composition. The database will be used to validate software algorithms that screen normal head CT studies from studies that contain pathology. The database is designed to have the following major properties: (1) a size sufficient for statistical viability, (2) inclusion of both normal (no pathology) and abnormal scans, (3) inclusion of scans due to equipment malfunction, technologist error, and uncooperative patients, (4) inclusion of data sets from multiple scanner manufacturers, (5) inclusion of data sets from different gender and age groups, and (6) three independent diagnoses of each data set. Designed correctly, the database will provide a partial basis for FDA (United States Food and Drug Administration) approval of image processing algorithms for clinical use. Our goal for the database is the proof of viability of screening head CTs for normal anatomy using computer algorithms. To put this work into context, a classification scheme for 'computer aided diagnosis' systems is proposed.
NASA Technical Reports Server (NTRS)
Krasteva, Denitza T.
1998-01-01
Multidisciplinary design optimization (MDO) for large-scale engineering problems poses many challenges (e.g., the design of an efficient concurrent paradigm for global optimization based on disciplinary analyses, expensive computations over vast data sets, etc.). This work focuses on the application of distributed schemes for massively parallel architectures to MDO problems, as a tool for reducing computation time and solving larger problems. The specific problem considered here is configuration optimization of a high speed civil transport (HSCT), and the efficient parallelization of the embedded paradigm for reasonable design space identification. Two distributed dynamic load balancing techniques (random polling and global round robin with message combining) and two necessary termination detection schemes (global task count and token passing) were implemented and evaluated in terms of effectiveness and scalability to large problem sizes and a thousand processors. The effect of certain parameters on execution time was also inspected. Empirical results demonstrated stable performance and effectiveness for all schemes, and the parametric study showed that the selected algorithmic parameters have a negligible effect on performance.
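The random-polling strategy with global-task-count termination can be illustrated with a toy sequential simulation. The worker loop, the `random_polling` name, and the steal-half rule below are illustrative assumptions for a sketch, not the paper's message-passing implementation:

```python
import random

def random_polling(queues, seed=0):
    """Toy, sequential simulation of random-polling dynamic load
    balancing: an idle worker asks a randomly chosen donor for half
    of its remaining tasks. Termination is detected with a global
    task count, as in the first termination scheme named above.
    Returns per-worker counts of tasks actually executed."""
    rng = random.Random(seed)
    n = len(queues)
    done = [0] * n
    while sum(queues) > 0:          # global-task-count termination test
        for w in range(n):
            if queues[w] > 0:       # execute one local task
                queues[w] -= 1
                done[w] += 1
            else:                   # idle: poll a random donor
                donor = rng.randrange(n)
                if donor != w and queues[donor] > 1:
                    moved = queues[donor] // 2
                    queues[donor] -= moved
                    queues[w] += moved
    return done

# One heavily loaded worker, three initially idle ones.
print(random_polling([12, 0, 0, 0]))
```

In a real distributed setting the global sum is not free, which is exactly why the paper also evaluates token-passing termination detection.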
Classical command of quantum systems.
Reichardt, Ben W; Unger, Falk; Vazirani, Umesh
2013-04-25
Quantum computation and cryptography both involve scenarios in which a user interacts with an imperfectly modelled or 'untrusted' system. It is therefore of fundamental and practical interest to devise tests that reveal whether the system is behaving as instructed. In 1969, Clauser, Horne, Shimony and Holt proposed an experimental test that can be passed by a quantum-mechanical system but not by a system restricted to classical physics. Here we extend this test to enable the characterization of a large quantum system. We describe a scheme that can be used to determine the initial state and to classically command the system to evolve according to desired dynamics. The bipartite system is treated as two black boxes, with no assumptions about their inner workings except that they obey quantum physics. The scheme works even if the system is explicitly designed to undermine it; any misbehaviour is detected. Among its applications, our scheme makes it possible to test whether a claimed quantum computer is truly quantum. It also advances towards a goal of quantum cryptography: namely, the use of 'untrusted' devices to establish a shared random key, with security based on the validity of quantum physics.
Computer-Aided Facilities Management Systems (CAFM).
ERIC Educational Resources Information Center
Cyros, Kreon L.
Computer-aided facilities management (CAFM) refers to a collection of software used with increasing frequency by facilities managers. The six major CAFM components are discussed with respect to their usefulness and popularity in facilities management applications: (1) computer-aided design; (2) computer-aided engineering; (3) decision support…
An Indoor Positioning Method for Smartphones Using Landmarks and PDR.
Wang, Xi; Jiang, Mingxing; Guo, Zhongwen; Hu, Naijun; Sun, Zhongwei; Liu, Jing
2016-12-15
Recently, location-based services (LBS) have become increasingly popular in indoor environments. Among the indoor positioning techniques providing LBS, a fusion approach combining WiFi-based and pedestrian dead reckoning (PDR) techniques is attracting increasing attention from researchers. Although this fusion method performs well in some cases, it still has some limitations, such as heavy computation and inconvenience for real-time use. In this work, we study map information of a given indoor environment, analyze variations of WiFi received signal strength (RSS), define several kinds of indoor landmarks, and then utilize these landmarks to correct the accumulated errors derived from PDR. This fusion scheme, called Landmark-aided PDR (LaP), is shown to be lightweight and suitable for real-time implementation by running an Android application designed for the experiment. We compared LaP with other PDR-based fusion approaches. Experimental results show that the proposed scheme can achieve a significant improvement, with an average accuracy of 2.17 m.
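A minimal sketch of the landmark-correction idea: dead-reckon from per-step estimates and reset the accumulated error whenever the track passes a known landmark. The (length, heading) step model and the snap-to-landmark rule are our illustrative assumptions, not the LaP algorithm itself:

```python
import math

def pdr_track(steps, landmarks, snap_radius=1.5):
    """Minimal PDR sketch: dead-reckon from (0, 0) using per-step
    (length, heading) pairs, and snap to a known landmark whenever
    the estimate drifts within snap_radius of one, discarding the
    accumulated error in the spirit of the LaP scheme."""
    x, y, track = 0.0, 0.0, []
    for length, heading in steps:
        x += length * math.cos(heading)
        y += length * math.sin(heading)
        for lx, ly in landmarks:
            if math.hypot(x - lx, y - ly) <= snap_radius:
                x, y = lx, ly        # landmark correction
                break
        track.append((x, y))
    return track

# Walk roughly east with a small heading error; a landmark at (5, 0)
# absorbs the drift once the walker reaches it.
steps = [(1.0, 0.1)] * 5
track = pdr_track(steps, landmarks=[(5.0, 0.0)])
print(track[-1])  # ends pinned at the landmark: (5.0, 0.0)
```

Without the landmark reset, the 0.1 rad heading bias would leave the final estimate about half a metre off after only five steps.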
Compensation of Gaussian curvature in developable cones is local
NASA Astrophysics Data System (ADS)
Wang, Jin W.; Witten, Thomas A.
2009-10-01
We use the angular deficit scheme [V. Borrelli, F. Cazals, and J.-M. Morvan, Comput. Aided Geom. Des. 20, 319 (2003)] to determine the distribution of Gaussian curvature in developable cones (d-cones) [E. Cerda, S. Chaieb, F. Melo, and L. Mahadevan, Nature (London) 401, 46 (1999)] numerically. These d-cones are formed by pushing a thin elastic sheet into a circular container. Negative Gaussian curvature is identified at the rim where the sheet touches the container. Around the rim there are two narrow bands with positive Gaussian curvature. The integral of the (negative) Gaussian curvature near the rim is almost completely compensated by that of the two adjacent bands. This suggests that the Gauss-Bonnet theorem, which constrains the integral of Gaussian curvature globally, does not explain the spontaneous curvature cancellation phenomenon [T. Liang and T. A. Witten, Phys. Rev. E 73, 046604 (2006)]. The locality of the compensation seems to increase with decreasing d-cone thickness. The angular deficit scheme also provides a way to confirm the curvature cancellation phenomenon.
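The angular deficit at a mesh vertex is 2*pi minus the sum of the incident triangle angles, and it vanishes exactly where the surface is developable. The toy one-ring example below is ours, not the cited scheme's actual mesh code:

```python
import math

def angle(p, q, r):
    """Interior angle at vertex p of triangle (p, q, r) in 3-D."""
    ax, ay, az = (q[i] - p[i] for i in range(3))
    bx, by, bz = (r[i] - p[i] for i in range(3))
    dot = ax * bx + ay * by + az * bz
    na = math.sqrt(ax * ax + ay * ay + az * az)
    nb = math.sqrt(bx * bx + by * by + bz * bz)
    return math.acos(dot / (na * nb))

def angular_deficit(vertex, ring):
    """2*pi minus the sum of incident triangle angles at vertex;
    ring lists the one-ring neighbours in cyclic order. Zero for a
    flat (developable) neighbourhood, positive on a cap."""
    total = sum(angle(vertex, ring[i], ring[(i + 1) % len(ring)])
                for i in range(len(ring)))
    return 2.0 * math.pi - total

# A flat hexagonal one-ring has no deficit; lifting the centre creates one.
flat = [(math.cos(t), math.sin(t), 0.0)
        for t in [k * math.pi / 3 for k in range(6)]]
print(round(angular_deficit((0.0, 0.0, 0.0), flat), 6))   # ~0.0
print(angular_deficit((0.0, 0.0, 0.5), flat) > 0.0)       # True
```

Summing such per-vertex deficits over a band of the mesh gives the integrated Gaussian curvature of that band, which is how the compensation near the rim can be tested region by region.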
The compensation of Gaussian curvature in developable cones is local
NASA Astrophysics Data System (ADS)
Wang, Jin; Witten, Thomas
2009-03-01
We use the angular deficit scheme [1] to determine numerically the distribution of Gaussian curvature in developable cones (d-cones) [2] formed by forcing a flat elastic sheet into a circular container so that the sheet buckles. This provides a new way to confirm the vanishing of mean curvature [3] at the rim where the sheet touches the container. The angular deficit scheme also allows us to explore the potential role of the Gauss-Bonnet theorem in explaining the mean-curvature vanishing phenomenon. The theorem's global constraint on curvature resembles the global conditions observed to be relevant for vanishing mean curvature. However, our result suggests that the Gauss-Bonnet theorem does not explain the vanishing of mean curvature. [1] V. Borrelli, F. Cazals, and J.-M. Morvan, Computer Aided Geometric Design 20, 319 (2003). [2] E. Cerda, S. Chaieb, F. Melo, and L. Mahadevan, Nature 401, 46 (1999). [3] T. Liang and T. A. Witten, Phys. Rev. E 73, 046604 (2006).
Volumetric brain tumour detection from MRI using visual saliency.
Mitra, Somosmita; Banerjee, Subhashis; Hayashi, Yoichi
2017-01-01
Medical image processing has become a major player in the world of automatic tumour region detection and is tantamount to the incipient stages of computer-aided design. Saliency detection is a crucial application of medical image processing, and serves in its potential aid to medical practitioners by making the affected area stand out in the foreground from the rest of the background image. The algorithm developed here is a new approach to the detection of saliency in a three-dimensional multi-channel MR image sequence for the glioblastoma multiforme (a form of malignant brain tumour). First we enhance the three channels, FLAIR (Fluid Attenuated Inversion Recovery), T2 and T1C (contrast enhanced with gadolinium), to generate a pseudo-coloured RGB image. This is then converted to the CIE L*a*b* colour space. Processing on cubes of sizes k = 4, 8, 16, the L*a*b* 3D image is then compressed into volumetric units, each representing the neighbourhood information of the surrounding 64 voxels for k = 4, 512 voxels for k = 8 and 4096 voxels for k = 16, respectively. The spatial distances of these voxels are then compared along the three major axes to generate the novel 3D saliency map of a 3D image, which unambiguously highlights the tumour region. The algorithm operates along the three major axes to maximise the computation efficiency while minimising loss of valuable 3D information. Thus the 3D multi-channel MR image saliency detection algorithm is useful in generating a uniform and logistically correct 3D saliency map with pragmatic applicability in Computer Aided Detection (CADe). Assignment of uniform importance to all three axes proves to be an important factor in volumetric processing, which helps in noise reduction and reduces the possibility of compromising essential information. The effectiveness of the algorithm was evaluated over the BRATS MICCAI 2015 dataset of 274 glioma cases, consisting of both high-grade and low-grade GBM.
The results were compared with those of the 2D saliency detection algorithm taken over the entire sequence of brain data. For all comparisons, the area under the receiver operating characteristic (ROC) curve (AUC) was found to be more than 0.99 ± 0.01 across various tumour types, structures and locations.
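A much-simplified sketch of the block-compression and axis-wise contrast idea: compress the volume into k-cubed block means, then score each block against the blocks sharing an axis with it. The single-channel volume and the mean-absolute-contrast rule are illustrative stand-ins for the full L*a*b* pipeline:

```python
def block_means(vol, k):
    """Compress a (z, y, x) scalar volume into k-cubed block means,
    mimicking the volumetric units above (one channel for brevity)."""
    nz, ny, nx = len(vol), len(vol[0]), len(vol[0][0])
    means = {}
    for bz in range(nz // k):
        for by in range(ny // k):
            for bx in range(nx // k):
                vals = [vol[bz * k + i][by * k + j][bx * k + l]
                        for i in range(k) for j in range(k) for l in range(k)]
                means[(bz, by, bx)] = sum(vals) / len(vals)
    return means

def saliency(means):
    """Block saliency: mean absolute contrast against all blocks
    sharing exactly two of three block coordinates (i.e. lying on a
    common axis), a simplified stand-in for the axis-wise comparison."""
    out = {}
    for key, m in means.items():
        peers = [v for other, v in means.items()
                 if other != key and sum(a == b for a, b in zip(other, key)) == 2]
        out[key] = sum(abs(m - v) for v in peers) / len(peers) if peers else 0.0
    return out

# A 4x4x4 volume with one bright 2x2x2 corner "lesion".
vol = [[[0] * 4 for _ in range(4)] for _ in range(4)]
for z in range(2):
    for y in range(2):
        for x in range(2):
            vol[z][y][x] = 10
sal = saliency(block_means(vol, 2))
print(max(sal, key=sal.get))  # the bright block stands out: (0, 0, 0)
```

Restricting comparisons to the three axes keeps the cost linear in the number of blocks per axis rather than quadratic in all block pairs, echoing the efficiency argument above.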
Security enhanced multi-factor biometric authentication scheme using bio-hash function.
Choi, Younsung; Lee, Youngsook; Moon, Jongho; Won, Dongho
2017-01-01
With the rapid development of personal information and wireless communication technology, user authentication schemes have been crucial to ensure that wireless communications are secure. As such, various authentication schemes with multi-factor authentication have been proposed to improve the security of electronic communications. Multi-factor authentication involves the use of passwords, smart cards, and various biometrics to provide users with the utmost privacy and data protection. Cao and Ge analyzed various authentication schemes and found that Younghwa An's scheme was susceptible to a replay attack where an adversary masquerades as a legal server and a user masquerading attack where user anonymity is not provided, allowing an adversary to execute a password change process by intercepting the user's ID during login. Cao and Ge improved upon Younghwa An's scheme, but various security problems remained. This study demonstrates that Cao and Ge's scheme is susceptible to a biometric recognition error, slow wrong password detection, off-line password attack, user impersonation attack, ID guessing attack, a DoS attack, and that their scheme cannot provide session key agreement. Then, to address all weaknesses identified in Cao and Ge's scheme, this study proposes a security enhanced multi-factor biometric authentication scheme and provides a security analysis and formal analysis using Burrows-Abadi-Needham logic. Finally, the efficiency analysis reveals that the proposed scheme can protect against several possible types of attacks with only a slightly high computational cost.
An effective and secure key-management scheme for hierarchical access control in E-medicine system.
Odelu, Vanga; Das, Ashok Kumar; Goswami, Adrijit
2013-04-01
Recently several hierarchical access control schemes are proposed in the literature to provide security of e-medicine systems. However, most of them are either insecure against 'man-in-the-middle attack' or they require high storage and computational overheads. Wu and Chen proposed a key management method to solve dynamic access control problems in a user hierarchy based on hybrid cryptosystem. Though their scheme improves computational efficiency over Nikooghadam et al.'s approach, it suffers from large storage space for public parameters in public domain and computational inefficiency due to costly elliptic curve point multiplication. Recently, Nikooghadam and Zakerolhosseini showed that Wu-Chen's scheme is vulnerable to man-in-the-middle attack. In order to remedy this security weakness in Wu-Chen's scheme, they proposed a secure scheme which is again based on ECC (elliptic curve cryptography) and efficient one-way hash function. However, their scheme incurs huge computational cost for providing verification of public information in the public domain as their scheme uses ECC digital signature which is costly when compared to symmetric-key cryptosystem. In this paper, we propose an effective access control scheme in user hierarchy which is only based on symmetric-key cryptosystem and efficient one-way hash function. We show that our scheme reduces significantly the storage space for both public and private domains, and computational complexity when compared to Wu-Chen's scheme, Nikooghadam-Zakerolhosseini's scheme, and other related schemes. Through the informal and formal security analysis, we further show that our scheme is secure against different attacks and also man-in-the-middle attack. Moreover, dynamic access control problems in our scheme are also solved efficiently compared to other related schemes, making our scheme is much suitable for practical applications of e-medicine systems.
Fused man-machine classification schemes to enhance diagnosis of breast microcalcifications
NASA Astrophysics Data System (ADS)
Andreadis, Ioannis; Chatzistergos, Sevastianos; Spyrou, George; Nikita, Konstantina
2017-11-01
Computer-aided diagnosis (CADx) approaches are developed towards the effective discrimination between benign and malignant clusters of microcalcifications. Different sources of information are exploited, such as features extracted from the image analysis of the region of interest, features related to the location of the cluster inside the breast, age of the patient and descriptors provided by the radiologists while performing their diagnostic task. A series of different CADx schemes are implemented, each of which uses a different category of features and adopts a variety of machine learning algorithms and alternative image processing techniques. A novel framework is introduced where these independent diagnostic components are properly combined according to features critical to a radiologist, in an attempt to identify the most appropriate CADx schemes for the case under consideration. An open access database (the Digital Database for Screening Mammography (DDSM)) has been elaborated to construct a large dataset with cases of varying subtlety, in order to ensure the development of schemes with high generalization ability, as well as extensive evaluation of their performance. The obtained results indicate that the proposed framework succeeds in improving the diagnostic procedure, as the achieved overall classification performance outperforms all the independent single diagnostic components, as well as the radiologists who assessed the same cases, in terms of accuracy, sensitivity, specificity and area under the curve following receiver operating characteristic analysis.
Computer-Aided Construction at Designing Reinforced Concrete Columns as per EC
NASA Astrophysics Data System (ADS)
Zielińska, M.; Grębowski, K.
2015-02-01
The article presents the authors' computer program for designing and dimensioning columns in reinforced concrete structures, taking into account phenomena affecting their behaviour and information referring to design as per EC. The computer program was developed with the use of the C++ programming language. The program guides the user through the particular dimensioning stages: from introducing basic data such as dimensions, concrete class, reinforcing steel class and forces affecting the column, through calculating the creep coefficient taking into account the impact of imperfection depending on the support scheme and also the number of mating members at load shift, and the buckling length, to generating the interaction curve graph. The final result of the calculations provides two dependence points calculated as per the methods of nominal stiffness and nominal curvature. The location of those points relative to the limit curve determines whether the column load capacity is assured or has been exceeded. The study describes in detail the operation of the computer program and the methodology and phenomena which are indispensable when designing axially and eccentrically compressed members of reinforced concrete structures as per the European standards.
Near-IR photon number resolving detector design
NASA Astrophysics Data System (ADS)
Bogdanski, Jan; Huntington, Elanor H.
2013-05-01
Photon-Number-Resolving-Detection (PNRD) capability is crucial for many Quantum Information (QI) applications, e.g. coherent-state quantum computing and linear-optics quantum computing. In quantum key distribution and quantum secret sharing over 1310/1550 nm fiber, two other important, defense- and information-security-related QI applications, it is crucial for the security of the information transmission to guarantee that the information carriers (photons) are single. Thus a PNRD can provide an additional security level against eavesdropping. Currently, there are at least a couple of promising PNRD technologies in the near-infrared, but all of them require cryogenic cooling. Thus a compact, portable PNRD based on commercial Avalanche Photodiodes (APDs) could be a very useful instrument for many QI experiments. For an APD-based PNRD, it is crucial to measure the APD current at the beginning of the avalanche. Thus efficient cancellation of the APD capacitive spikes is a necessary condition for measuring the very weak APD current. The detector's principle is based on two commercial, pair-matched InGaAs/InP APDs connected in series. This leads to strong cancellation of the capacitive spikes caused by the narrow (300 ps) differential gate pulses of maximum 4 V amplitude, assuming that both pulses are perfectly matched with regard to their phases, amplitudes, and shapes. The cancellation scheme could be used for other APD technologies, e.g. silicon, extending the detection spectrum from the visible to the NIR. The design distinguishes itself from other APD-based schemes by its scalability and its computer-controlled cancellation of the capacitive spikes. Furthermore, both APDs can be used equally for detection, which opens a possibility for odd-even photon number parity detection.
ERIC Educational Resources Information Center
Hagge, John
1986-01-01
Focuses on problems encountered with computer-aided writing instruction. Discusses conflicts caused by the computer classroom concept, some general paradoxes and ethical implications of computer-aided instruction. (EL)
Parahydrogen-enhanced zero-field nuclear magnetic resonance
NASA Astrophysics Data System (ADS)
Theis, T.; Ganssle, P.; Kervern, G.; Knappe, S.; Kitching, J.; Ledbetter, M. P.; Budker, D.; Pines, A.
2011-07-01
Nuclear magnetic resonance, conventionally detected in magnetic fields of several tesla, is a powerful analytical tool for the determination of molecular identity, structure and function. With the advent of prepolarization methods and detection schemes using atomic magnetometers or superconducting quantum interference devices, interest in NMR in fields comparable to the Earth's magnetic field and below (down to zero field) has been revived. Despite the use of superconducting quantum interference devices or atomic magnetometers, low-field NMR typically suffers from low sensitivity compared with conventional high-field NMR. Here we demonstrate direct detection of zero-field NMR signals generated through parahydrogen-induced polarization, enabling high-resolution NMR without the use of any magnets. The sensitivity is sufficient to observe spectra exhibiting 13C-1H scalar nuclear spin-spin couplings (known as J couplings) in compounds with 13C in natural abundance, without the need for signal averaging. The resulting spectra show distinct features that aid chemical fingerprinting.
In Vivo Fluorescence Imaging and Tracking of Circulating Cells and Therapeutic Nanoparticles
NASA Astrophysics Data System (ADS)
Markovic, Stacey
Noninvasive enumeration of rare circulating cells in small animals is of great importance in many areas of biomedical research, but most existing enumeration techniques involve drawing and enriching blood, which is known to be problematic. Recently, small animal "in vivo flow cytometry" (IVFC) techniques have been developed, where cells flowing through small arterioles are counted continuously and noninvasively in vivo. However, higher sensitivity IVFC techniques are needed for studying low-abundance (<100/mL) circulating cells. To this end, we developed a macroscopic fluorescence imaging system and automated computer vision algorithm that allows in vivo detection, enumeration and tracking of circulating fluorescently labeled cells from multiple large blood vessels in the ear of a mouse. This technique, "computer vision IVFC" (CV-IVFC), allows cell detection and enumeration at concentrations of 20 cells/mL. Performance of CV-IVFC was also characterized for low-contrast imaging scenarios, representing conditions of weak cell fluorescent labeling or high background tissue autofluorescence, and showed efficient tracking and enumeration of circulating cells with 50% sensitivity in contrast conditions degraded by 2 orders of magnitude compared to in vivo testing, supporting the potential utility of CV-IVFC in a range of biological models. Refinement of prior work in our lab on a separate rare-cell detection platform, "diffuse fluorescence flow cytometry" (DFFC), implemented a "frequency encoding" scheme by modulating two excitation lasers. Fluorescent light from both lasers can be simultaneously detected and split by frequency, allowing for better discrimination of noise, sensitivity, and cell localization. The system design is described in detail and preliminary data are shown. Last, we developed a broad-field transmission fluorescence imaging system to observe nanoparticle (NP) diffusion in bulk biological tissue.
Novel, implantable NP spacers allow controlled, long-term release of drugs. However, kinetics of NP (drug) diffusion over time is still poorly understood. Our imaging system allowed us to quantify diffusion of free dye and NPs of different sizes in vitro and in vivo. Subsequent analysis verified that there was continuous diffusion which could be controlled based on particle size. Continued use of this imaging system will aid optimization of NP spacers.
Mordang, Jan-Jurre; Gubern-Mérida, Albert; Bria, Alessandro; Tortorella, Francesco; den Heeten, Gerard; Karssemeijer, Nico
2017-04-01
Computer-aided detection (CADe) systems for mammography screening still mark many false positives. This can cause radiologists to lose confidence in CADe, especially when many false positives are obviously not suspicious to them. In this study, we focus on obvious false positives generated by microcalcification detection algorithms. We aim at reducing the number of obvious false-positive findings by adding an additional step in the detection method. In this step, a multiclass machine learning method is implemented in which dedicated classifiers learn to recognize the patterns of obvious false-positive subtypes that occur most frequently. The method is compared to a conventional two-class approach, where all false-positive subtypes are grouped together in one class, and to the baseline CADe system without the new false-positive removal step. The methods are evaluated on an independent dataset containing 1,542 screening examinations of which 80 examinations contain malignant microcalcifications. Analysis showed that the multiclass approach yielded a significantly higher sensitivity compared to the other two methods (P < 0.0002). At one obvious false positive per 100 images, the baseline CADe system detected 61% of the malignant examinations, while the systems with the two-class and multiclass false-positive reduction step detected 73% and 83%, respectively. Our study showed that by adding the proposed method to a CADe system, the number of obvious false positives can decrease significantly (P < 0.0002). © 2017 American Association of Physicists in Medicine.
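The multiclass idea above, with dedicated classes for obvious false-positive subtypes collapsed to a binary keep/dismiss decision, can be sketched as follows. The nearest-centroid classifier, the two-feature vectors, and the 'fp_vessel'/'fp_artifact' subtype names are illustrative assumptions, not the study's trained models:

```python
import math

def centroids(samples):
    """Per-class mean feature vectors. Subtype labels such as
    'fp_vessel' / 'fp_artifact' (hypothetical names) give each obvious
    false-positive pattern its own dedicated class, as in the
    multiclass step described above."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, f in enumerate(features):
            acc[i] += f
        counts[label] = counts.get(label, 0) + 1
    return {c: [s / counts[c] for s in acc] for c, acc in sums.items()}

def classify(cents, features):
    """Nearest-centroid label, then collapsed to a binary decision:
    anything assigned to an fp_* subtype is dismissed."""
    label = min(cents, key=lambda c: math.dist(cents[c], features))
    return label, (label == "calc")

train = [
    ([1.0, 0.1], "calc"),        # true microcalcification pattern
    ([0.1, 1.0], "fp_vessel"),   # obvious FP subtype 1
    ([0.9, 0.9], "fp_artifact"), # obvious FP subtype 2
]
cents = centroids(train)
print(classify(cents, [0.95, 0.15]))  # kept: ('calc', True)
print(classify(cents, [0.15, 0.95]))  # dismissed: ('fp_vessel', False)
```

The point of the per-subtype classes is that each decision boundary only has to separate true lesions from one recurring FP pattern, which is an easier problem than the pooled two-class version.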
Project-Based Teaching-Learning Computer-Aided Engineering Tools
ERIC Educational Resources Information Center
Simoes, J. A.; Relvas, C.; Moreira, R.
2004-01-01
Computer-aided design, computer-aided manufacturing, computer-aided analysis, reverse engineering and rapid prototyping are tools that play an important key role within product design. These are areas of technical knowledge that must be part of engineering and industrial design courses' curricula. This paper describes our teaching experience of…
NASA Astrophysics Data System (ADS)
Wantuch, Andrew C.; Vita, Joshua A.; Jimenez, Edward S.; Bray, Iliana E.
2016-10-01
Despite object detection, recognition, and identification being very active areas of computer vision research, many of the available tools to aid in these processes are designed with only photographs in mind. Although some algorithms used specifically for feature detection and identification may not take explicit advantage of the colors available in the image, they still under-perform on radiographs, which are grayscale images. We are especially interested in the robustness of these algorithms, specifically their performance on a preexisting database of X-ray radiographs in compressed JPEG form, with multiple ways of describing pixel information. We will review various aspects of the performance of available feature detection and identification systems, including MATLAB's Computer Vision Toolbox, VLFeat, and OpenCV, on our non-ideal database. In the process, we will explore possible reasons for the algorithms' lessened ability to detect and identify features from the X-ray radiographs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tourassi, Georgia D.; Harrawood, Brian; Singh, Swatee
2007-08-15
We have previously presented a knowledge-based computer-assisted detection (KB-CADe) system for the detection of mammographic masses. The system is designed to compare a query mammographic region with mammographic templates of known ground truth. The templates are stored in an adaptive knowledge database. Image similarity is assessed with information theoretic measures (e.g., mutual information) derived directly from the image histograms. A previous study suggested that the diagnostic performance of the system steadily improves as the knowledge database is initially enriched with more templates. However, as the database increases in size, an exhaustive comparison of the query case with each stored template becomes computationally burdensome. Furthermore, blind storing of new templates may result in redundancies that do not necessarily improve diagnostic performance. To address these concerns we investigated an entropy-based indexing scheme for improving the speed of analysis and for satisfying database storage restrictions without compromising the overall diagnostic performance of our KB-CADe system. The indexing scheme was evaluated on two different datasets as (i) a search mechanism to sort through the knowledge database, and (ii) a selection mechanism to build a smaller, concise knowledge database that is easier to maintain but still effective. There were two important findings in the study. First, entropy-based indexing is an effective strategy to quickly identify a subset of templates that are most relevant to a given query. Only this subset could then be analyzed in more detail using mutual information for optimized decision making regarding the query. Second, a selective entropy-based deposit strategy may be preferable, where only high entropy cases are maintained in the knowledge database.
Overall, the proposed entropy-based indexing scheme was shown to reduce the computational cost of our KB-CADe system by 55% to 80% while maintaining the system's diagnostic performance.« less
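The indexing idea described above (rank stored templates by how close their histogram entropy is to the query's, then reserve the expensive mutual-information comparison for the short list) can be sketched in a few lines. The bin count, the `select_candidates` helper, and the flat pixel-list representation are illustrative assumptions, not the authors' implementation:

```python
import math
from collections import Counter

def entropy(pixels, bins=32):
    """Shannon entropy (bits) of a grayscale image histogram.

    `pixels` is a flat list of 8-bit intensities; 32 bins is an
    illustrative choice, not the authors' setting.
    """
    counts = Counter(p * bins // 256 for p in pixels)
    n = len(pixels)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def select_candidates(query, templates, k=10):
    # Index step: keep only the k stored templates whose histogram
    # entropy is closest to the query's; the costly mutual-information
    # comparison is then run on this short list only.
    h_query = entropy(query)
    return sorted(templates, key=lambda t: abs(entropy(t) - h_query))[:k]
```

Because each template is reduced to a single scalar, the index can be precomputed once and searched cheaply, instead of running a full histogram comparison against every stored case per query.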
Osman, Onur; Ucan, Osman N.
2008-01-01
Objective The purpose of this study was to develop a new method for automated lung nodule detection in serial-section CT images, using characteristics of the nodules' 3D appearance that distinguish them from vessels. Materials and Methods Lung nodules were detected in four steps. First, to reduce the number of regions of interest (ROIs) and the computation time, the lung regions of the CTs were segmented using genetic cellular neural networks (G-CNN). Then, for each lung region, ROIs were specified using an 8-directional search; +1 or -1 values were assigned to each voxel. The 3D ROI image was obtained by combining all the two-dimensional (2D) ROI images. A 3D template was created to find nodule-like structures in the 3D ROI image. Convolution of the 3D ROI image with the proposed template strengthens shapes that are similar to the template and weakens the other ones. Finally, fuzzy rule-based thresholding was applied and the ROIs were found. To test the system's efficiency, we used 16 cases with a total of 425 slices, taken from the Lung Image Database Consortium (LIDC) dataset. Results The computer-aided diagnosis (CAD) system achieved 100% sensitivity with 13.375 FPs per case when the nodule thickness was greater than or equal to 5.625 mm. Conclusion Our results indicate that the detection performance of our algorithm is satisfactory, and this may well improve the performance of computer-aided detection of lung nodules. PMID:18253070
Topology-independent shape modeling scheme
NASA Astrophysics Data System (ADS)
Malladi, Ravikanth; Sethian, James A.; Vemuri, Baba C.
1993-06-01
Developing shape models is an important aspect of computer vision research. Geometric and differential properties of the surface can be computed from shape models. They also aid the tasks of object representation and recognition. In this paper we present a new approach to shape modeling which, while retaining important features of the existing methods, overcomes most of their limitations. Our technique can be applied to model arbitrarily complex shapes, shapes with protrusions, and situations where no a priori assumption about the object's topology can be made. A single instance of our model, when presented with an image having more than one object of interest, has the ability to split freely to represent each object. Our method is based on the level set ideas developed by Osher & Sethian to follow propagating solid/liquid interfaces with curvature-dependent speeds. The interface is a closed, nonintersecting hypersurface flowing along its gradient field with constant speed or a speed that depends on the curvature. We move the interface by solving a `Hamilton-Jacobi' type equation written for a function in which the interface is a particular level set. A speed function synthesized from the image is used to stop the interface in the vicinity of object boundaries. The resulting equations of motion are solved by numerical techniques borrowed from the technology of hyperbolic conservation laws. An added advantage of this scheme is that it can easily be extended to any number of space dimensions. The efficacy of the scheme is demonstrated with numerical experiments on synthesized images and noisy medical images.
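A minimal one-dimensional sketch of the level set machinery described above, assuming a constant speed and a first-order Godunov upwind discretization (the paper works in higher dimensions, with curvature-dependent speeds and an image-derived stopping term):

```python
import math

def evolve_level_set(phi, speed, dt, dx, steps):
    """Evolve a 1-D level set function under d(phi)/dt + F*|d(phi)/dx| = 0.

    First-order Godunov upwind scheme; the interface is the zero
    crossing of phi and moves outward for F > 0.
    """
    for _ in range(steps):
        new = phi[:]
        for i in range(1, len(phi) - 1):
            d_minus = (phi[i] - phi[i - 1]) / dx   # backward difference
            d_plus = (phi[i + 1] - phi[i]) / dx    # forward difference
            if speed > 0:  # upwind gradient choice for outward motion
                grad = math.hypot(max(d_minus, 0.0), min(d_plus, 0.0))
            else:
                grad = math.hypot(min(d_minus, 0.0), max(d_plus, 0.0))
            new[i] = phi[i] - dt * speed * grad
        phi = new
    return phi

# phi0 = signed distance to the interval [-0.5, 0.5] on a grid over [-2, 2];
# with F = 1 the zero crossings (the "front") propagate outward
phi0 = [abs(-2.0 + 0.1 * i) - 0.5 for i in range(41)]
phi1 = evolve_level_set(phi0, speed=1.0, dt=0.05, dx=0.1, steps=10)
```

Because the interface is carried implicitly as a level set of phi, topological changes (a front splitting into several pieces) require no special handling, which is the property the paper exploits for multi-object images.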
Leng, Shuang; Tan, Ru San; Chai, Kevin Tshun Chuan; Wang, Chao; Ghista, Dhanjoo; Zhong, Liang
2015-07-10
Most heart diseases are associated with and reflected by the sounds that the heart produces. Heart auscultation, defined as listening to the heart sound, has been a very important method for the early diagnosis of cardiac dysfunction. Traditional auscultation requires substantial clinical experience and good listening skills. The emergence of the electronic stethoscope has paved the way for a new field of computer-aided auscultation. This article provides an in-depth study of (1) the electronic stethoscope technology, and (2) the methodology for diagnosis of cardiac disorders based on computer-aided auscultation. The paper is based on a comprehensive review of (1) literature articles, (2) market (state-of-the-art) products, and (3) smartphone stethoscope apps. It covers in depth every key component of the computer-aided system with electronic stethoscope, from sensor design, front-end circuitry, denoising algorithm, heart sound segmentation, to the final machine learning techniques. Our intent is to provide an informative and illustrative presentation of the electronic stethoscope, which is valuable and beneficial to academics, researchers and engineers in the technical field, as well as to medical professionals to facilitate its use clinically. The paper provides the technological and medical basis for the development and commercialization of a real-time integrated heart sound detection, acquisition and quantification system.
Brain CT image similarity retrieval method based on uncertain location graph.
Pan, Haiwei; Li, Pengyuan; Li, Qing; Han, Qilong; Feng, Xiaoning; Gao, Linlin
2014-03-01
Brain computed tomography (CT) images stored in hospitals contain valuable information that should be shared to support computer-aided diagnosis systems. Finding similar brain CT images in a brain CT image database can effectively help doctors diagnose on the basis of earlier cases. However, similarity retrieval for brain CT images requires much higher accuracy than for general images. In this paper, a new model of uncertain location graph (ULG) is presented for brain CT image modeling and similarity retrieval. According to the characteristics of brain CT images, we propose a novel method to model a brain CT image as a ULG based on image texture. Then, a scheme for ULG similarity retrieval is introduced. Furthermore, an effective index structure is applied to reduce the search time. Experimental results reveal that our method performs well on brain CT image similarity retrieval, with higher accuracy and efficiency.
Jung, Jaewook; Kim, Jiye; Choi, Younsung; Won, Dongho
2016-08-16
In wireless sensor networks (WSNs), a registered user can log in to the network and use a user authentication protocol to access data collected from the sensor nodes. Since WSNs are typically deployed in unattended environments and sensor nodes have limited resources, many researchers have made considerable efforts to design a secure and efficient user authentication process. Recently, Chen et al. proposed a secure user authentication scheme using symmetric key techniques for WSNs. They claim that their scheme assures high efficiency and security against different types of attacks. After careful analysis, however, we find that Chen et al.'s scheme is still vulnerable to smart card loss attack and is susceptible to denial of service attack, since simply comparing an entered ID with the ID stored in the smart card is not a valid verification step. In addition, we also observe that their scheme cannot preserve user anonymity. Furthermore, their scheme cannot quickly detect an incorrect password during the login phase, and this flaw wastes both communication and computation resources. In this paper, we describe how these attacks work, and propose an enhanced anonymous user authentication and key agreement scheme based on a symmetric cryptosystem in WSNs to address all of the aforementioned vulnerabilities in Chen et al.'s scheme. Our analysis shows that the proposed scheme improves the level of security, and is also more efficient than other related schemes.
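The quick local password check whose absence the authors criticize can be sketched as follows, assuming the smart card stores a salted hash verifier at registration; the function names and construction are illustrative, not the proposed protocol's exact messages:

```python
import hashlib
import hmac
import os

def register(password: str):
    # At registration, the smart card stores a random salt and a salted
    # verifier of the password (an assumed construction for illustration).
    salt = os.urandom(16)
    verifier = hashlib.sha256(salt + password.encode()).digest()
    return salt, verifier

def local_password_check(password: str, salt: bytes, verifier: bytes) -> bool:
    # Login phase, step 0: catch a mistyped password locally, before any
    # authentication message is sent, so no communication or gateway
    # computation is wasted on a doomed session.
    candidate = hashlib.sha256(salt + password.encode()).digest()
    return hmac.compare_digest(candidate, verifier)
```

A constant-time comparison (`hmac.compare_digest`) avoids leaking information through timing, and the salt ties the verifier to this card so a stolen verifier cannot be matched against precomputed hash tables.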
NASA Technical Reports Server (NTRS)
Narasimhan, Sriram; Dearden, Richard; Benazera, Emmanuel
2004-01-01
Fault detection and isolation are critical tasks to ensure correct operation of systems. When we consider stochastic hybrid systems, diagnosis algorithms need to track both the discrete mode and the continuous state of the system in the presence of noise. Deterministic techniques like Livingstone cannot deal with the stochasticity in the system and models. Conversely Bayesian belief update techniques such as particle filters may require many computational resources to get a good approximation of the true belief state. In this paper we propose a fault detection and isolation architecture for stochastic hybrid systems that combines look-ahead Rao-Blackwellized Particle Filters (RBPF) with the Livingstone 3 (L3) diagnosis engine. In this approach RBPF is used to track the nominal behavior, a novel n-step prediction scheme is used for fault detection and L3 is used to generate a set of candidates that are consistent with the discrepant observations which then continue to be tracked by the RBPF scheme.
Tiwari, Saumya; Reddy, Vijaya B.; Bhargava, Rohit; Raman, Jaishankar
2015-01-01
Rejection is a common problem after cardiac transplantation, leading to a significant number of adverse events and deaths, particularly in the first year after transplantation. The gold standard for identifying rejection is endomyocardial biopsy. This technique is complex and cumbersome, and requires substantial expertise in the correct interpretation of stained biopsy sections. Traditional histopathology cannot be used actively or quickly during cardiac interventions or surgery. Our objective was to develop a stain-less approach using an emerging technology, Fourier transform infrared (FT-IR) spectroscopic imaging, to identify different components of cardiac tissue by their chemical and molecular basis, aided by computer recognition rather than visual examination by optical microscopy. We studied this technique in the assessment of cardiac transplant rejection to evaluate its efficacy in an example of complex cardiovascular pathology. We recorded data from human cardiac transplant patients' biopsies, used a Bayesian classification protocol, and developed a visualization scheme to observe chemical differences without the need for stains or human supervision. Using receiver operating characteristic curves, we observed probabilities of detection greater than 95% for four out of five histological classes at a 10% probability of false alarm at the cellular level, while correctly identifying samples with the hallmarks of the immune response in all cases. The efficacy of manual examination can be significantly increased by observing the inherent biochemical changes in tissues, which enables greater diagnostic confidence in an automated, label-free manner. We developed a computational pathology system that gives high-contrast images and appears superior to traditional staining procedures. This study is a prelude to the development of real-time in situ imaging systems, which can actively assist interventionists and surgeons during procedures. PMID:25932912
Finite element analysis of hysteresis effects in piezoelectric transducers
NASA Astrophysics Data System (ADS)
Simkovics, Reinhard; Landes, Hermann; Kaltenbacher, Manfred; Hoffelner, Johann; Lerch, Reinhard
2000-06-01
The design of ultrasonic transducers for high power applications, e.g. in medical therapy or production engineering, calls for effective computer-aided design tools to analyze the nonlinear effects that occur. In this paper the finite-element/boundary-element package CAPA is presented, which allows modeling of different types of electromechanical sensors and actuators. These transducers are based on various physical coupling effects, such as piezoelectricity or magneto-mechanical interactions. Their computer modeling requires the numerical solution of a multifield problem, such as coupled electric-mechanical fields or magnetic-mechanical fields as well as coupled mechanical-acoustic fields. With the reported software environment we are able to compute the dynamic behavior of electromechanical sensors and actuators by taking into account geometric nonlinearities, nonlinear wave propagation, and ferroelectric as well as magnetic material nonlinearities. After a short introduction to the basic theory of the numerical calculation schemes, two practical examples demonstrate the applicability of the numerical simulation tool. As a first example, an ultrasonic thickness-mode transducer consisting of a piezoceramic material used for high power ultrasound production is examined. Due to ferroelectric hysteresis, higher order harmonics can be detected in the actuator's input current. In the case of electrical and mechanical prestressing, a resonance frequency shift also occurs, caused by ferroelectric hysteresis and nonlinear dependencies of the material coefficients on electric field and mechanical stresses. As a second example, a power ultrasound transducer used in HIFU therapy (high intensity focused ultrasound) is presented. Due to the compressibility of and losses in the propagating fluid, nonlinear shock wave generation can be observed. For both examples good agreement between numerical simulation and experimental data has been achieved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Shiju; Qian, Wei; Guan, Yubao
2016-06-15
Purpose: This study aims to investigate the potential to improve lung cancer recurrence risk prediction performance for stage I NSCLC patients by integrating oversampling, feature selection, and score fusion techniques, and to develop an optimal prediction model. Methods: A dataset involving 94 early stage lung cancer patients was retrospectively assembled, which includes CT images, nine clinical and biological (CB) markers, and outcome of 3-yr disease-free survival (DFS) after surgery. Among the 94 patients, 74 remained DFS and 20 had cancer recurrence. Applying a computer-aided detection scheme, tumors were segmented from the CT images and 35 quantitative image (QI) features were initially computed. Two normalized Gaussian radial basis function network (RBFN) based classifiers were built based on QI features and CB markers separately. To improve prediction performance, the authors applied a synthetic minority oversampling technique (SMOTE) and a BestFirst based feature selection method to optimize the classifiers, and also tested fusion methods to combine QI and CB based prediction results. Results: Using a leave-one-case-out cross-validation method, the computed areas under a receiver operating characteristic curve (AUCs) were 0.716 ± 0.071 and 0.642 ± 0.061 when using the QI and CB based classifiers, respectively. By fusion of the scores generated by the two classifiers, AUC significantly increased to 0.859 ± 0.052 (p < 0.05) with an overall prediction accuracy of 89.4%. Conclusions: This study demonstrated the feasibility of improving prediction performance by integrating SMOTE, feature selection, and score fusion techniques. Combining QI features and CB markers and performing SMOTE prior to feature selection in classifier training enabled the RBFN based classifier to yield improved prediction accuracy.
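A toy sketch of the score-fusion step, assuming a simple weighted average of the two classifiers' outputs and a rank-based AUC; the weight and the scores below are illustrative stand-ins, not the study's RBFN outputs:

```python
def auc(scores, labels):
    # Rank-based AUC: the probability that a randomly chosen positive
    # case receives a higher score than a randomly chosen negative case
    # (ties count as half).
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def fuse_scores(qi_scores, cb_scores, weight=0.5):
    # Weighted-average fusion of the image-based (QI) and marker-based
    # (CB) classifier outputs; `weight` is an assumed tuning parameter.
    return [weight * a + (1 - weight) * b
            for a, b in zip(qi_scores, cb_scores)]
```

When the two classifiers make partly independent errors, as here, the averaged score can rank the cases better than either input alone, which is the effect the study reports.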
Advances in borehole geophysics for hydrology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, P.H.
1982-01-01
Borehole geophysical methods provide vital subsurface information on rock properties, fluid movement, and the condition of engineered borehole structures. Within the first category, salient advances include the continuing improvement of the borehole televiewer, refinement of the electrical conductivity dipmeter for fracture characterization, and the development of a gigahertz-frequency electromagnetic propagation tool for water saturation measurements. The exploration of the rock mass between boreholes remains a challenging problem with high potential; promising methods are now incorporating high-density spatial sampling and sophisticated data processing. Flow-rate measurement methods appear adequate for all but low-flow situations. At low rates the tagging method seems the most attractive. The current exploitation of neutron-activation techniques for tagging means that the wellbore fluid itself is tagged, thereby eliminating the mixing of an alien fluid into the wellbore. Another method uses the acoustic noise generated by flow through constrictions and in and behind casing to detect and locate flaws in the production system. With the advent of field-recorded digital data, the interpretation of logs from sedimentary sequences is now reaching a sophisticated level with the aid of computer processing and the application of statistical methods. Lagging behind are interpretive schemes for the low-porosity, fracture-controlled igneous and metamorphic rocks encountered in the geothermal reservoirs and in potential waste-storage sites. Progress is being made on the general problem of fracture detection by use of electrical and acoustical techniques, but the reliable definition of permeability continues to be an elusive goal.
A hybrid lung and vessel segmentation algorithm for computer aided detection of pulmonary embolism
NASA Astrophysics Data System (ADS)
Raghupathi, Laks; Lakare, Sarang
2009-02-01
Advances in multi-detector technology have made CT pulmonary angiography (CTPA) a popular radiological tool for pulmonary emboli (PE) detection. CTPA provides rich detail of lung anatomy and is a useful diagnostic aid in highlighting even very small PE. However, analyzing hundreds of slices is laborious and time-consuming for the practicing radiologist, and the presence of various PE look-alikes may also lead to misdiagnosis. Computer-aided diagnosis (CAD) can be a potential second reader in providing key diagnostic information. Since PE occurs only in the pulmonary arteries, it is important to mark this region of interest (ROI) during CAD preprocessing. In this paper, we present a new lung and vessel segmentation algorithm for extracting contrast-enhanced vessel ROI in CTPA. Existing approaches to segmentation either provide only the larger lung area without highlighting the vessels or are computationally prohibitive. We propose a hybrid lung and vessel segmentation which uses an initial lung ROI and determines the vessels through a series of refinement steps. We first identify a coarse vessel ROI by finding the "holes" in the lung ROI. We then use the initial ROI as seed points for a region-growing process while carefully excluding regions that are not relevant. The vessel segmentation mask covers 99% of the 259 PE from a real-world set of 107 CTPA. Further, our algorithm increases the net sensitivity of a prototype CAD system by 5-9% across all PE categories in the training and validation data sets. The average run time of the algorithm was only 100 seconds on a standard workstation.
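The seeded region-growing refinement described above can be sketched as follows, assuming a simple 4-connected intensity-window acceptance criterion in 2D; the actual CAD pipeline operates on 3D CTPA volumes with additional exclusion rules:

```python
from collections import deque

def region_grow(image, seeds, low, high):
    """Grow a mask from seed points over a 2-D intensity image.

    A 4-connected neighbor is absorbed when its intensity falls inside
    [low, high], a stand-in for the contrast-enhanced vessel range.
    """
    rows, cols = len(image), len(image[0])
    mask = [[False] * cols for _ in range(rows)]
    queue = deque(seeds)
    for r, c in seeds:
        mask[r][c] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and not mask[nr][nc]
                    and low <= image[nr][nc] <= high):
                mask[nr][nc] = True       # absorb the voxel/pixel
                queue.append((nr, nc))
    return mask
```

The breadth-first queue keeps the growth confined to regions connected to the seeds, so disconnected bright structures (the "not relevant" regions the paper excludes) never enter the mask.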
Cao, Mingshu; Fraser, Karl; Rasmussen, Susanne
2013-10-31
Mass spectrometry coupled with chromatography has become the major technical platform in metabolomics. Aided by peak detection algorithms, the detected signals are characterized by mass-over-charge ratio (m/z) and retention time. Chemical identities often remain elusive for the majority of the signals. Multi-stage mass spectrometry based on electrospray ionization (ESI) allows collision-induced dissociation (CID) fragmentation of selected precursor ions. These fragment ions can assist in structural inference for metabolites of low molecular weight. Computational investigation of fragmentation spectra has increasingly received attention in metabolomics, and various public databases house such data. We have developed an R package, "iontree", that can capture, store and analyze MS2 and MS3 mass spectral data from high-throughput metabolomics experiments. The package includes functions for ion tree construction, an algorithm (distMS2) for MS2 spectral comparison, and tools for building platform-independent ion tree (MS2/MS3) libraries. We have demonstrated the use of the package for the systematic analysis and annotation of fragmentation spectra collected on various metabolomics platforms, including direct infusion mass spectrometry and liquid chromatography coupled with either low- or high-resolution mass spectrometry. Assisted by the developed computational tools, we have demonstrated that spectral trees can provide informative evidence, complementary to retention time and accurate mass, to aid in annotating unknown peaks. These experimental spectral trees, once subjected to a quality control process, can be used for querying public MS2 databases or de novo interpretation. The putatively annotated spectral trees can be readily incorporated into reference libraries for routine identification of metabolites.
NASA Astrophysics Data System (ADS)
Cheng, Jie-Zhi; Ni, Dong; Chou, Yi-Hong; Qin, Jing; Tiu, Chui-Mei; Chang, Yeun-Chung; Huang, Chiun-Sheng; Shen, Dinggang; Chen, Chung-Ming
2016-04-01
This paper presents a comprehensive study of deep-learning-based computer-aided diagnosis (CADx) for the differential diagnosis of benign and malignant nodules/lesions, avoiding the potential errors caused by inaccurate image processing results (e.g., boundary segmentation), as well as the classification bias resulting from a less robust feature set, as involved in most conventional CADx algorithms. Specifically, the stacked denoising auto-encoder (SDAE) is exploited in two CADx applications: the differentiation of breast ultrasound lesions and of lung CT nodules. The SDAE architecture is well equipped with an automatic feature exploration mechanism and noise tolerance, and hence may be suitable for dealing with the intrinsically noisy nature of medical image data from various imaging modalities. To show the outperformance of SDAE-based CADx over the conventional scheme, two recent conventional CADx algorithms are implemented for comparison. Ten runs of 10-fold cross-validation are conducted to illustrate the efficacy of the SDAE-based CADx algorithm. The experimental results show a significant performance boost by the SDAE-based CADx algorithm over the two conventional methods, suggesting that deep learning techniques can potentially change the design paradigm of CADx systems without the need for explicit design and selection of problem-oriented features.
NASA Astrophysics Data System (ADS)
Schröder, Jörg; Viebahn, Nils; Wriggers, Peter; Auricchio, Ferdinando; Steeger, Karl
2017-09-01
In this work we investigate different mixed finite element formulations for the detection of critical loads for the possible occurrence of bifurcation and limit points. In detail, three- and two-field formulations for incompressible and quasi-incompressible materials are analyzed. In order to apply various penalty functions for the volume dilatation in displacement/pressure mixed elements, we propose a new consistent scheme capturing the nonlinearities of the penalty constraints. It is shown that for all mixed formulations which can be reduced to a generalized displacement scheme, a straightforward stability analysis is possible. However, problems based on the classical saddle-point structure require a different analysis based on the change of the signature of the underlying matrix system. The basis of these investigations is the work of Auricchio et al. (Comput Methods Appl Mech Eng 194:1075-1092, 2005; Comput Mech 52:1153-1167, 2013).
Henriksen, Emilie L; Carlsen, Jonathan F; Vejborg, Ilse Mm; Nielsen, Michael B; Lauridsen, Carsten A
2018-01-01
Background Early detection of breast cancer (BC) is crucial to lowering mortality. Purpose To present an overview of studies concerning computer-aided detection (CAD) in screening mammography for early detection of BC, and to compare the diagnostic accuracy and recall rates (RR) of single reading (SR) with SR + CAD and of double reading (DR) with SR + CAD. Material and Methods PRISMA guidelines were used as a review protocol. Articles on clinical trials concerning CAD for detection of BC in a screening population were included. The literature search resulted in 1522 records. A total of 1491 records were excluded by abstract and 18 were excluded by full-text reading, leaving 13 articles for inclusion. Results All but two studies from the SR vs. SR + CAD group showed an increased sensitivity and/or cancer detection rate (CDR) when adding CAD. The DR vs. SR + CAD group showed no significant differences in sensitivity and CDR. Adding CAD to SR increased the RR and decreased the specificity in all but one study. For the DR vs. SR + CAD group, only one study reported a significant difference in RR. Conclusion All but two studies showed an increase in RR, sensitivity, and CDR when adding CAD to SR. Compared to DR, no statistically significant differences in sensitivity or CDR were reported. Additional studies based on organized population-based screening programs, with longer follow-up time, high-volume readers, and digital mammography, are needed to evaluate the efficacy of CAD.
Semi-automated detection of anterior cruciate ligament injury from MRI.
Štajduhar, Ivan; Mamula, Mihaela; Miletić, Damir; Ünal, Gözde
2017-03-01
A radiologist's work in detecting various injuries or pathologies from radiological scans can be tiresome, time consuming and prone to errors. The field of computer-aided diagnosis aims to reduce these factors by introducing a level of automation into the process. In this paper, we deal with the problem of detecting the presence of anterior cruciate ligament (ACL) injury in a human knee. We examine the possibility of aiding the diagnosis process by building a decision-support model for detecting the presence of milder ACL injuries (not requiring operative treatment) and complete ACL ruptures (requiring operative treatment) from sagittal plane magnetic resonance (MR) volumes of human knees. Histogram of oriented gradients (HOG) descriptors and gist descriptors are extracted from manually selected rectangular regions of interest enveloping the wider cruciate ligament area. Two machine-learning models, a support vector machine (SVM) and a random forest, are explored, each coupled with both feature extraction methods. Model generalisation properties were determined by performing multiple iterations of stratified 10-fold cross validation whilst observing the area under the curve (AUC) score. Sagittal plane knee joint MR data was retrospectively gathered at the Clinical Hospital Centre Rijeka, Croatia, from 2007 until 2014. Type of ACL injury was established in a double-blind fashion by comparing the retrospectively set diagnosis against the prospective opinion of another radiologist. After clean up, the resulting dataset consisted of 917 usable labelled exam sequences of left or right knees. Experimental results suggest that a linear-kernel SVM learned from HOG descriptors has the best generalisation properties among the models compared, having an area under the curve of 0.894 for the injury-detection problem and 0.943 for the complete-rupture-detection problem.
Although the problem of performing semi-automated ACL-injury diagnosis by observing knee-joint MR volumes alone is a difficult one, experimental results suggest potential clinical application of computer-aided decision making, both for detecting milder injuries and detecting complete ruptures.
ERIC Educational Resources Information Center
Georgia Univ., Athens. Div. of Vocational Education.
This guide describes the requirements for courses in computer-aided design and computer-aided manufacturing (CAD/CAM) that are part of engineering technology programs conducted in vocational-technical schools in Georgia. The guide is organized in five sections. The first section provides a rationale for occupations in design and in production,…
Parametric optimization of optical signal detectors employing the direct photodetection scheme
NASA Astrophysics Data System (ADS)
Kirakosiants, V. E.; Loginov, V. A.
1984-08-01
The problem of optimizing the parameters of an optical signal detection scheme is addressed for a receiver with direct photodetection. An expression is derived which accurately approximates the field of view (FOV) values obtained by direct computer minimization of the probability of missing a signal; optimum values of the receiver FOV were found for different atmospheric conditions characterized by the number of coherence spots and the intensity fluctuations of a plane wave. It is further pointed out that the criterion presented can potentially be used for parametric optimization of detectors operating in accordance with the Neyman-Pearson criterion.
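Direct numerical minimization of the miss probability over the FOV, as described above, can be illustrated with a golden-section search; the `miss_probability` model below is an assumed stand-in (small FOV loses signal energy, large FOV admits background noise), not the paper's expression:

```python
import math

def golden_min(f, a, b, tol=1e-6):
    # Golden-section search for the minimizer of a unimodal f on [a, b].
    g = (math.sqrt(5) - 1) / 2
    c, d = b - g * (b - a), a + g * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c               # minimum lies in [a, d_old]
            c = b - g * (b - a)
        else:
            a, c = c, d               # minimum lies in [c_old, b]
            d = a + g * (b - a)
    return (a + b) / 2

def miss_probability(fov, signal=1.0, noise=0.2):
    # Illustrative trade-off model: exponentially decreasing miss term
    # from captured signal energy, plus a linearly growing noise penalty.
    return math.exp(-signal * fov) + noise * fov

best_fov = golden_min(miss_probability, 0.1, 10.0)
```

For this model the optimum has a closed form (the derivative vanishes where exp(-fov) = noise, i.e. fov = ln(1/noise)), which makes the numerical search easy to check.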
Employment Opportunities for the Handicapped in Programmable Automation.
ERIC Educational Resources Information Center
Swift, Richard; Leneway, Robert
A Computer Integrated Manufacturing System may make it possible for severely disabled people to custom design, machine, and manufacture either wood or metal parts. Programmable automation merges computer aided design, computer aided manufacturing, computer aided engineering, and computer integrated manufacturing systems with automated production…
Slaughter, Susan E; Zimmermann, Gabrielle L; Nuspl, Megan; Hanson, Heather M; Albrecht, Lauren; Esmail, Rosmin; Sauro, Khara; Newton, Amanda S; Donald, Maoliosa; Dyson, Michele P; Thomson, Denise; Hartling, Lisa
2017-12-06
As implementation science advances, the number of interventions to promote the translation of evidence into healthcare, health systems, or health policy is growing. Accordingly, classification schemes for these knowledge translation (KT) interventions have emerged. A recent scoping review identified 51 classification schemes of KT interventions to integrate evidence into healthcare practice; however, the review did not evaluate the quality of the classification schemes or provide detailed information to assist researchers in selecting a scheme for their context and purpose. This study aimed to further examine and assess the quality of these classification schemes of KT interventions, and to provide information to aid researchers when selecting a classification scheme. We abstracted the following information from each of the original 51 classification scheme articles: authors' objectives; purpose of the scheme and field of application; socioecologic level (individual, organizational, community, system); adaptability (broad versus specific); target group (patients, providers, policy-makers); intent (policy, education, practice); and purpose (dissemination versus implementation). Two reviewers independently evaluated the methodological quality of the development of each classification scheme using an adapted version of the AGREE II tool. Based on these assessments, two independent reviewers reached consensus about whether or not to recommend each scheme for researcher use. Of the 51 original classification schemes, we excluded seven that were not specific classification schemes, were not accessible, or were duplicates. Of the remaining 44 classification schemes, nine were not recommended. Of the 35 recommended classification schemes, ten focused on behaviour change and six focused on population health. Many schemes (n = 29) addressed practice considerations; fewer addressed educational or policy objectives.
Twenty-five classification schemes had broad applicability, six were specific, and four had elements of both. Twenty-three schemes targeted health providers, nine targeted both patients and providers and one targeted policy-makers. Most classification schemes were intended for implementation rather than dissemination. Thirty-five classification schemes of KT interventions were developed and reported with sufficient rigour to be recommended for use by researchers interested in KT in healthcare. Our additional categorization and quality analysis will aid in selecting suitable classification schemes for research initiatives in the field of implementation science.
Automatic Residential/Commercial Classification of Parcels with Solar Panel Detections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morton, April M; Omitaomu, Olufemi A; Kotikot, Susan
This software implements a computational method for automatically detecting solar panels on rooftops, developed to aid policy and financial assessment of solar distributed generation. The code classifies parcels containing solar panels in the U.S. as residential or commercial: the user specifies an input dataset of parcels and detected solar panels, and the code then uses information about both to classify each rooftop as residential or commercial using machine learning techniques. The zip file containing the code includes sample input and output datasets for the Boston and DC areas.
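The classification step's input/output shape can be sketched as follows. The abstract's method uses machine learning; this minimal stand-in uses a hand-set rule on two hypothetical parcel attributes (`panel_count`, `area_m2`) purely for illustration, and both the feature names and the thresholds are invented.

```python
def classify_parcel(parcel):
    """Label a parcel as 'residential' or 'commercial' from parcel and panel
    features. Rule-based stand-in for the ML classifier described above;
    feature names and thresholds are illustrative assumptions only."""
    if parcel["panel_count"] > 20 or parcel["area_m2"] > 2000:
        return "commercial"
    return "residential"

labels = [classify_parcel(p) for p in (
    {"panel_count": 4, "area_m2": 350},    # typical rooftop array
    {"panel_count": 60, "area_m2": 5000},  # large flat-roof installation
)]
```

A trained classifier would replace the hand-set rule, but would consume the same per-parcel feature records and emit the same labels.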
NASA Astrophysics Data System (ADS)
Ling, Jun
Achieving reliable underwater acoustic communications (UAC) has long been recognized as a challenging problem owing to the scarce available bandwidth and the reverberant spreading in both the time and frequency domains. To pursue high data rates, we consider a multi-input multi-output (MIMO) UAC system and focus on two main issues: (1) channel estimation, which involves the design of the training sequences and the development of a reliable channel estimation algorithm, and (2) symbol detection, which requires interference cancellation schemes because of simultaneous transmission from multiple transducers. To enhance channel estimation performance, we present a cyclic approach for designing training sequences with good auto- and cross-correlation properties, together with a channel estimation algorithm called the iterative adaptive approach (IAA). Sparse channel estimates can be obtained by combining IAA with the Bayesian information criterion (BIC). Moreover, we present sparse learning via iterative minimization (SLIM) and demonstrate that SLIM gives performance similar to IAA at a much lower computational cost. Furthermore, an extension of the SLIM algorithm, referred to as generalization of SLIM (GoSLIM), is introduced to estimate sparse and frequency-modulated acoustic channels. Regarding symbol detection, we present RELAX-BLAST, a linear minimum mean-squared-error detection scheme that combines the vertical Bell Labs layered space-time (V-BLAST) algorithm with the cyclic principle of the RELAX algorithm, and show that it outperforms V-BLAST. RELAX-BLAST can be implemented efficiently by exploiting the conjugate gradient method and the diagonalization properties of circulant matrices; this fast implementation requires only simple fast Fourier transform operations and facilitates parallel implementation.
The effectiveness of the proposed MIMO schemes is verified by both computer simulations and experimental results obtained by analyzing the measurements acquired in multiple in-water experiments.
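The fast implementation mentioned above rests on the standard fact that a circulant matrix is diagonalized by the discrete Fourier transform, so a circulant matrix-vector product reduces to elementwise multiplication of FFTs. A minimal NumPy sketch of that identity (not the authors' code):

```python
import numpy as np

def circulant_matvec(c, x):
    # For a circulant matrix C whose first column is c, C = F^{-1} diag(F c) F,
    # so C @ x costs two FFTs and one inverse FFT: O(n log n) instead of O(n^2).
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

# Check against an explicitly built circulant matrix: C[i, j] = c[(i - j) mod n]
c = np.array([1.0, 2.0, 3.0, 4.0])
x = np.array([1.0, 0.0, 2.0, -1.0])
n = len(c)
C = np.array([[c[(i - j) % n] for j in range(n)] for i in range(n)])
```

The same identity is what makes FFT-based implementations parallelize well: each frequency bin is processed independently.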
THE DEVELOPMENT OF A CLASSIFICATION SCHEME OF CONTEXTUAL AIDS.
ERIC Educational Resources Information Center
AMES, WILBUR S.
A study was conducted to determine, from the verbal responses of readers, the types of contextual aids that serve as clues to the meanings that might be attached to simulated words, and to classify these contextual aids on the basis of the elements of the verbal context that was utilized by the reader. An introspective technique was used in…
Digital Noise Reduction: An Overview
Bentler, Ruth; Chiou, Li-Kuei
2006-01-01
Digital noise reduction schemes are used in most hearing aids currently marketed. Unlike the earlier analog schemes, these manufacturer-specific algorithms are designed to acoustically analyze the incoming signal and alter the gain/output characteristics according to predetermined rules. Although most are modulation-based schemes (i.e., differentiating speech from noise based on temporal characteristics), spectral subtraction techniques are being applied as well. The purpose of this article is to review these schemes in terms of their differences and similarities. PMID:16959731
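A modulation-based rule of the kind described can be caricatured in a few lines: estimate the envelope modulation depth in a channel and reduce gain when modulation is shallow (steady noise) while leaving deeply modulated (speech-like) signals nearly untouched. The depth measure and gain floor below are illustrative assumptions, not any manufacturer's algorithm.

```python
import numpy as np

def modulation_gain(envelope, floor_db=-12.0):
    # Hypothetical modulation-based rule: deep envelope modulation suggests
    # speech (gain near 1); shallow modulation suggests steady noise (gain
    # reduced toward floor_db). Both the depth metric and the -12 dB floor
    # are illustrative choices.
    depth = (envelope.max() - envelope.min()) / (envelope.max() + 1e-12)
    g_floor = 10 ** (floor_db / 20)          # convert dB floor to linear gain
    return g_floor + (1 - g_floor) * np.clip(depth, 0.0, 1.0)
```

Applied per frequency channel, such a rule attenuates stationary noise without requiring an explicit noise estimate.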
High order filtering methods for approximating hyperbolic systems of conservation laws
NASA Technical Reports Server (NTRS)
Lafon, F.; Osher, S.
1991-01-01
The essentially nonoscillatory (ENO) schemes, while potentially useful in the computation of discontinuous solutions of hyperbolic conservation-law systems, are computationally costly relative to simple central-difference methods. A filtering technique is presented which employs central differencing of arbitrarily high-order accuracy except where a local test detects the presence of spurious oscillations and calls upon the full ENO apparatus to remove them. A factor-of-three speedup is thus obtained over the full-ENO method for a wide range of problems, with high-order accuracy in regions of smooth flow.
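The hybrid idea — cheap high-order central differences everywhere, with a local test triggering a more robust stencil only near suspected oscillations — can be sketched as follows. The oscillation test and the low-order fallback here are crude placeholders for the paper's ENO machinery.

```python
import numpy as np

def filtered_derivative(u, dx):
    # 4th-order central differences in smooth regions; where a local
    # oscillation test flags a steep jump, fall back to a robust 2nd-order
    # stencil (stand-in for the full ENO reconstruction). The 0.5 threshold
    # is a heuristic choice for this sketch.
    n = len(u)
    du = np.zeros(n)
    for i in range(2, n - 2):
        jump = abs(u[i + 1] - 2 * u[i] + u[i - 1])          # 2nd difference
        scale = abs(u[i + 1] - u[i]) + abs(u[i] - u[i - 1]) + 1e-12
        if jump > 0.5 * scale:
            du[i] = (u[i + 1] - u[i - 1]) / (2 * dx)        # low order near jump
        else:
            du[i] = (-u[i + 2] + 8 * u[i + 1]
                     - 8 * u[i - 1] + u[i - 2]) / (12 * dx)  # 4th-order central
    return du
```

On smooth data the test almost never fires, so the cost stays close to that of plain central differencing, which is the source of the reported speedup.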
Advanced instrumentation concepts for environmental control subsystems
NASA Technical Reports Server (NTRS)
Yang, P. Y.; Schubert, F. H.; Gyorki, J. R.; Wynveen, R. A.
1978-01-01
Design, evaluation and demonstration of advanced instrumentation concepts for improving performance of manned spacecraft environmental control and life support systems were successfully completed. Concepts to aid maintenance following fault detection and isolation were defined. A computer-guided fault correction instruction program was developed and demonstrated in a packaged unit which also contains the operator/system interface.
Computer-Aided Diagnosis Systems for Lung Cancer: Challenges and Methodologies
El-Baz, Ayman; Beache, Garth M.; Gimel'farb, Georgy; Suzuki, Kenji; Okada, Kazunori; Elnakib, Ahmed; Soliman, Ahmed; Abdollahi, Behnoush
2013-01-01
This paper overviews one of the most important, interesting, and challenging problems in oncology, the problem of lung cancer diagnosis. Developing an effective computer-aided diagnosis (CAD) system for lung cancer is of great clinical importance and can increase the patient's chance of survival. For this reason, CAD systems for lung cancer have been investigated in a huge number of research studies. A typical CAD system for lung cancer diagnosis is composed of four main processing steps: segmentation of the lung fields, detection of nodules inside the lung fields, segmentation of the detected nodules, and diagnosis of the nodules as benign or malignant. This paper overviews the current state-of-the-art techniques that have been developed to implement each of these CAD processing steps. For each technique, various aspects of technical issues, implemented methodologies, training and testing databases, and validation methods, as well as achieved performances, are described. In addition, the paper addresses several challenges that researchers face in each implementation step and outlines the strengths and drawbacks of the existing approaches for lung cancer CAD systems. PMID:23431282
Automated and real-time segmentation of suspicious breast masses using convolutional neural network
Gregory, Adriana; Denis, Max; Meixner, Duane D.; Bayat, Mahdi; Whaley, Dana H.; Fatemi, Mostafa; Alizad, Azra
2018-01-01
In this work, a computer-aided detection tool was developed to segment breast masses from clinical ultrasound (US) scans. The underlying Multi U-net algorithm is based on convolutional neural networks. Under the Mayo Clinic Institutional Review Board protocol, a prospective study of the automatic segmentation of suspicious breast masses was performed. The cohort consisted of 258 female patients who were clinically identified with suspicious breast masses and underwent clinical US scanning and breast biopsy. The computer-aided detection tool effectively segmented the breast masses, achieving a mean Dice coefficient of 0.82, a true positive fraction (TPF) of 0.84, and a false positive fraction (FPF) of 0.01. Because it requires no initial seed placement, the algorithm can segment images in real time (13–55 ms per image) and thus has potential clinical applications. The algorithm is on par with a conventional seeded algorithm, which had a mean Dice coefficient of 0.84, and performs significantly better (P < 0.0001) than the original U-net algorithm. PMID:29768415
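The Dice coefficient reported above measures overlap between a predicted mask and the ground truth: for binary masks A and B it is 2|A∩B| / (|A| + |B|). A minimal sketch on flattened binary masks:

```python
def dice_coefficient(pred, truth):
    # pred, truth: equal-length sequences of 0/1 pixel labels
    # (a 2-D mask flattened row by row).
    inter = sum(p and t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    # Convention: two empty masks count as perfect agreement.
    return 2.0 * inter / total if total else 1.0

score = dice_coefficient([1, 1, 1, 0, 0], [0, 1, 1, 1, 0])  # overlap 2 of 3+3
```

A Dice score of 1.0 means identical masks and 0.0 means no overlap, so the reported 0.82 indicates substantial agreement with the manual ground truth.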