Sample records for proposed technique involves

  1. Irrigated rice area estimation using remote sensing techniques: Project's proposal and preliminary results. [Rio Grande do Sul, Brazil]

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Deassuncao, G. V.; Moreira, M. A.; Novaes, R. A.

    1984-01-01

    The development of a methodology for annual estimates of irrigated rice crop in the State of Rio Grande do Sul, Brazil, using remote sensing techniques is proposed. The project involves interpretation, digital analysis, and sampling techniques of LANDSAT imagery. Results are discussed from a preliminary phase for identifying and evaluating irrigated rice crop areas in four counties of the State, for the crop year 1982/1983. This first phase involved just visual interpretation techniques of MSS/LANDSAT images.

  2. E-Learning System Using Segmentation-Based MR Technique for Learning Circuit Construction

    ERIC Educational Resources Information Center

    Takemura, Atsushi

    2016-01-01

    This paper proposes a novel e-Learning system using the mixed reality (MR) technique for technical experiments involving the construction of electronic circuits. The proposed system comprises experimenters' mobile computers and a remote analysis system. When constructing circuits, each learner uses a mobile computer to transmit image data from the…

  3. Activity Detection and Retrieval for Image and Video Data with Limited Training

    DTIC Science & Technology

    2015-06-10

    applications. Here we propose two techniques for image segmentation. The first involves an automata-based multiple threshold selection scheme, where a mixture of Gaussians is fitted to the... For our second approach to segmentation, we employ a region-based segmentation technique that is capable of handling intensity inhomogeneity...

  4. Least-squares deconvolution of evoked potentials and sequence optimization for multiple stimuli under low-jitter conditions.

    PubMed

    Bardy, Fabrice; Dillon, Harvey; Van Dun, Bram

    2014-04-01

    Rapid presentation of stimuli in an evoked response paradigm can lead to overlap of multiple responses and consequently difficulties interpreting waveform morphology. This paper presents a deconvolution method allowing overlapping multiple responses to be disentangled. The deconvolution technique uses a least-squared error approach. A methodology is proposed to optimize the stimulus sequence associated with the deconvolution technique under low-jitter conditions. It controls the condition number of the matrices involved in recovering the responses. Simulations were performed using the proposed deconvolution technique. Multiple overlapping responses can be recovered perfectly in noiseless conditions. In the presence of noise, the amount of error introduced by the technique can be controlled a priori by the condition number of the matrix associated with the used stimulus sequence. The simulation results indicate the need for a minimum amount of jitter, as well as a sufficient number of overlap combinations to obtain optimum results. An aperiodic model is recommended to improve reconstruction. We propose a deconvolution technique allowing multiple overlapping responses to be extracted and a method of choosing the stimulus sequence optimal for response recovery. This technique may allow audiologists, psychologists, and electrophysiologists to optimize their experimental designs involving rapidly presented stimuli, and to recover evoked overlapping responses. Copyright © 2013 International Federation of Clinical Neurophysiology. All rights reserved.
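
    The recovery step described above can be pictured as ordinary linear least squares on a convolution (design) matrix built from the stimulus sequence, with the condition number of that matrix indicating how strongly noise will be amplified. The following is a minimal numpy sketch of that idea for a single response class; the onset timings, lengths, and response shape are invented for illustration, and the paper itself handles several overlapping response classes.

```python
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(0)

L = 40                                   # samples per single evoked response
n = 400                                  # length of the recording
s = np.zeros(n)
onsets = np.cumsum(rng.integers(15, 25, size=15))   # jittered stimulus onsets
s[onsets[onsets < n - L]] = 1.0

# Convolution (design) matrix: y = A @ r + noise
A = toeplitz(s, np.zeros(L))             # shape (n, L)

# Condition number tells us a priori how much noise will be amplified
print("condition number:", np.linalg.cond(A))

# Simulate an overlapped, noisy recording from a known response
true_r = np.exp(-np.arange(L) / 10.0) * np.sin(np.arange(L) / 3.0)
y = A @ true_r + 0.05 * rng.standard_normal(n)

# Least-squares deconvolution of the overlapping responses
r_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
print("relative error:", np.linalg.norm(r_hat - true_r) / np.linalg.norm(true_r))
```

    Stacking one Toeplitz block per stimulus type in A would extend this sketch to multiple overlapping responses, which is the multi-stimulus case the abstract addresses.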

  5. Technique Feature Analysis or Involvement Load Hypothesis: Estimating Their Predictive Power in Vocabulary Learning.

    PubMed

    Gohar, Manoochehr Jafari; Rahmanian, Mahboubeh; Soleimani, Hassan

    2018-02-05

    Vocabulary learning has long been a central concern and has attracted the attention of many researchers. Among the vocabulary learning hypotheses, the involvement load hypothesis and technique feature analysis have been proposed, which attempt to bring concepts such as noticing, motivation, and generation into focus. In the current study, 90 high-proficiency EFL students were assigned to three vocabulary tasks (sentence making, composition, and reading comprehension) in order to examine the power of the involvement load hypothesis and technique feature analysis frameworks in predicting vocabulary learning. The results showed that the involvement load hypothesis was not a good predictor, whereas technique feature analysis predicted the pretest-to-posttest score change but not during-task activity. The implications of the results are discussed in the light of preparing vocabulary tasks.

  6. High-speed technique based on a parallel projection correlation procedure for digital image correlation

    NASA Astrophysics Data System (ADS)

    Zaripov, D. I.; Renfu, Li

    2018-05-01

    The implementation of high-efficiency digital image correlation methods based on a zero-normalized cross-correlation (ZNCC) procedure for high-speed, time-resolved measurements using a high-resolution digital camera is associated with big data processing and is often time consuming. In order to speed-up ZNCC computation, a high-speed technique based on a parallel projection correlation procedure is proposed. The proposed technique involves the use of interrogation window projections instead of its two-dimensional field of luminous intensity. This simplification allows acceleration of ZNCC computation up to 28.8 times compared to ZNCC calculated directly, depending on the size of interrogation window and region of interest. The results of three synthetic test cases, such as a one-dimensional uniform flow, a linear shear flow and a turbulent boundary-layer flow, are discussed in terms of accuracy. In the latter case, the proposed technique is implemented together with an iterative window-deformation technique. On the basis of the results of the present work, the proposed technique is recommended to be used for initial velocity field calculation, with further correction using more accurate techniques.
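
    The core saving comes from correlating one-dimensional projections of each interrogation window rather than its full two-dimensional field of luminous intensity. The sketch below illustrates that idea with plain row/column sums; it is not the authors' exact parallel projection formulation, and the window sizes and the averaging of the two projection correlations are illustrative choices.

```python
import numpy as np

def zncc_1d(a, b):
    """Zero-normalized cross-correlation of two 1-D signals of equal length."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))

def zncc_2d(w1, w2):
    """Direct ZNCC of two interrogation windows (uses every pixel pair)."""
    return zncc_1d(w1.ravel(), w2.ravel())

def zncc_projection(w1, w2):
    """Approximate ZNCC using row and column projections only.
    Illustrative: the projections of each window are correlated instead of
    the full two-dimensional field of luminous intensity."""
    px1, py1 = w1.sum(axis=0), w1.sum(axis=1)   # column and row projections
    px2, py2 = w2.sum(axis=0), w2.sum(axis=1)
    return 0.5 * (zncc_1d(px1, px2) + zncc_1d(py1, py2))

# Hypothetical pair of 32x32 interrogation windows, the second shifted by 2 px
rng = np.random.default_rng(1)
img = rng.random((64, 64))
w1 = img[10:42, 10:42]
w2 = img[12:44, 10:42]
print(zncc_2d(w1, w1), zncc_projection(w1, w1))   # identical windows -> ~1.0
print(zncc_2d(w1, w2), zncc_projection(w1, w2))   # shifted -> lower correlation
```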

  7. Epileptic seizure detection in EEG signal using machine learning techniques.

    PubMed

    Jaiswal, Abeg Kumar; Banka, Haider

    2018-03-01

    Epilepsy is a well-known nervous system disorder characterized by seizures. Electroencephalograms (EEGs), which capture brain neural activity, can detect epilepsy. Traditional methods for analyzing an EEG signal for epileptic seizure detection are time-consuming. Recently, several automated seizure detection frameworks using machine learning techniques have been proposed to replace these traditional methods. The two basic steps involved in machine learning are feature extraction and classification. Feature extraction reduces the input pattern space by keeping informative features, and the classifier assigns the appropriate class label. In this paper, we propose two effective approaches involving subpattern based PCA (SpPCA) and cross-subpattern correlation-based PCA (SubXPCA) with Support Vector Machine (SVM) for automated seizure detection in EEG signals. Feature extraction was performed using SpPCA and SubXPCA. Both techniques explore the subpattern correlation of EEG signals, which helps in the decision-making process. SVM is used for classification of seizure and non-seizure EEG signals. The SVM was trained with a radial basis kernel. All the experiments have been carried out on the benchmark epilepsy EEG dataset. The entire dataset consists of 500 EEG signals recorded under different scenarios. Seven different experimental cases for classification have been conducted. The classification accuracy was evaluated using tenfold cross-validation. The classification results of the proposed approaches have been compared with the results of some existing techniques proposed in the literature to establish the claim.
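
    As a rough illustration of the pipeline (subpattern-based PCA features followed by an RBF-kernel SVM with tenfold cross-validation), the sketch below uses scikit-learn on synthetic stand-in data. The segment count, component count, and SVM settings are assumptions, and a faithful evaluation would fit the PCA inside each cross-validation fold rather than once up front as done here for brevity.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def subpattern_pca_features(X, n_sub=8, n_comp=5):
    """SpPCA-style features (sketch): split each EEG epoch into n_sub equal
    segments, run PCA on each segment position across epochs, and concatenate
    the resulting projections."""
    n_epochs, n_samples = X.shape
    seg_len = n_samples // n_sub
    feats = []
    for s in range(n_sub):
        seg = X[:, s * seg_len:(s + 1) * seg_len]
        feats.append(PCA(n_components=n_comp).fit_transform(seg))
    return np.hstack(feats)

# Synthetic stand-in for the epilepsy EEG dataset (500 labelled signals)
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 4096))
y = rng.integers(0, 2, size=500)        # 0 = non-seizure, 1 = seizure

F = subpattern_pca_features(X)
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
scores = cross_val_score(clf, F, y, cv=10)   # tenfold cross-validation
print("mean accuracy:", scores.mean())
```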

  8. An element search ant colony technique for solving virtual machine placement problem

    NASA Astrophysics Data System (ADS)

    Srija, J.; Rani John, Rose; Kanaga, Grace Mary, Dr.

    2017-09-01

    Data centres in the cloud environment play a key role in providing infrastructure for ubiquitous, pervasive, and mobile computing. Such computing relies on utilizing the available resources in order to provide services. Hence, maintaining high resource utilization without wasting power has become a challenging task for researchers. In this paper we propose a direct guidance ant colony system for effectively mapping virtual machines to physical machines with maximal resource utilization and minimal power consumption. The proposed algorithm has been compared with an existing ant colony approach for the virtual machine placement problem and is shown to provide better results than the existing technique.

  9. An Empirical State Error Covariance Matrix for the Weighted Least Squares Estimation Method

    NASA Technical Reports Server (NTRS)

    Frisbee, Joseph H., Jr.

    2011-01-01

    State estimation techniques effectively provide mean state estimates. However, the theoretical state error covariance matrices provided as part of these techniques often suffer from a lack of confidence in their ability to describe the uncertainty in the estimated states. By a reinterpretation of the equations involved in the weighted least squares algorithm, it is possible to directly arrive at an empirical state error covariance matrix. This proposed empirical state error covariance matrix will contain the effect of all error sources, known or not. Results based on the proposed technique will be presented for a simple, two observer, measurement error only problem.
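
    One way to picture the reinterpretation is that the weighted least squares gain, which maps measurements to the state estimate, can equally be applied to the actual residuals, yielding a covariance that reflects whatever errors are really present rather than only the assumed noise model. The numpy sketch below shows that construction on a toy linear problem; it is a plausible reading of the idea, not necessarily the paper's exact expression.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical linear measurement model y = A x + noise
n_obs, n_state = 40, 4
A = rng.standard_normal((n_obs, n_state))
x_true = np.array([1.0, -2.0, 0.5, 3.0])
sigma = 0.1
y = A @ x_true + sigma * rng.standard_normal(n_obs)
W = np.eye(n_obs) / sigma**2             # weight matrix (inverse noise covariance)

# Weighted least squares estimate and gain
N = A.T @ W @ A                           # normal matrix
K = np.linalg.solve(N, A.T @ W)           # K = (A' W A)^-1 A' W
x_hat = K @ y

# Theoretical covariance, valid only if the assumed noise model is correct
P_theory = np.linalg.inv(N)

# Empirical covariance: map the actual residuals through the estimator gain,
# so unmodelled error sources also show up (sketch, not the paper's expression)
resid = y - A @ x_hat
P_emp = K @ np.outer(resid, resid) @ K.T

print(np.diag(P_theory))
print(np.diag(P_emp))
```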

  10. Application of Cross-Correlation Greens Function Along With FDTD for Fast Computation of Envelope Correlation Coefficient Over Wideband for MIMO Antennas

    NASA Astrophysics Data System (ADS)

    Sarkar, Debdeep; Srivastava, Kumar Vaibhav

    2017-02-01

    In this paper, the concept of cross-correlation Green's functions (CGF) is used in conjunction with the finite difference time domain (FDTD) technique for calculation of envelope correlation coefficient (ECC) of any arbitrary MIMO antenna system over wide frequency band. Both frequency-domain (FD) and time-domain (TD) post-processing techniques are proposed for possible application with this FDTD-CGF scheme. The FDTD-CGF time-domain (FDTD-CGF-TD) scheme utilizes time-domain signal processing methods and exhibits significant reduction in ECC computation time as compared to the FDTD-CGF frequency domain (FDTD-CGF-FD) scheme, for high frequency-resolution requirements. The proposed FDTD-CGF based schemes can be applied for accurate and fast prediction of wideband ECC response, instead of the conventional scattering parameter based techniques which have several limitations. Numerical examples of the proposed FDTD-CGF techniques are provided for two-element MIMO systems involving thin-wire half-wavelength dipoles in parallel side-by-side as well as orthogonal arrangements. The results obtained from the FDTD-CGF techniques are compared with results from commercial electromagnetic solver Ansys HFSS, to verify the validity of proposed approach.
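
    For context, the far-field definition of the ECC that such field-based schemes evaluate (as opposed to the scattering-parameter approximation) is the standard expression

    \[
    \rho_{e,ij} \;=\; \frac{\left|\oint_{4\pi} \vec{F}_i(\theta,\phi)\cdot \vec{F}_j^{\,*}(\theta,\phi)\, d\Omega\right|^{2}}{\oint_{4\pi} \left|\vec{F}_i(\theta,\phi)\right|^{2} d\Omega \;\oint_{4\pi} \left|\vec{F}_j(\theta,\phi)\right|^{2} d\Omega},
    \]

    where \(\vec{F}_i\) is the complex vector far-field pattern of element i with the other port terminated in a matched load; this is the textbook form and its notation may differ from the paper's CGF formulation.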

  11. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  12. Paratransit Integration Workshop Proceedings, October 12-13, 1977

    DOT National Transportation Integrated Search

    1978-08-01

    Experts in the field of demand responsive transportation dealt with seven areas of concern for a proposed manual of paratransit services: planning and institutional constraints, involvement of the private operator, estimation techniques, system desig...

  13. Realistic Real-Time Outdoor Rendering in Augmented Reality

    PubMed Central

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

    Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic over the last two decades, considering the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a technique to achieve realistic real-time outdoor rendering that takes into account the interaction between sky colours and objects in AR systems, with respect to shadows at any specific location, date, and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. First, the sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realistic AR systems. PMID:25268480

  14. Realistic real-time outdoor rendering in augmented reality.

    PubMed

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

    Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic over the last two decades, considering the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a technique to achieve realistic real-time outdoor rendering that takes into account the interaction between sky colours and objects in AR systems, with respect to shadows at any specific location, date, and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. First, the sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realistic AR systems.

  15. Combined Use of Terrestrial Laser Scanning and IR Thermography Applied to a Historical Building

    PubMed Central

    Costanzo, Antonio; Minasi, Mario; Casula, Giuseppe; Musacchio, Massimo; Buongiorno, Maria Fabrizia

    2015-01-01

    The conservation of architectural heritage usually requires a multidisciplinary approach involving a variety of specialist expertise and techniques. Nevertheless, destructive techniques should be avoided wherever possible in order to preserve the integrity of historical buildings; therefore, the development of non-destructive and non-contact techniques is extremely important. In this framework, a methodology combining terrestrial laser scanning and infrared thermal images is proposed in order to assess the conservation state of a historical building. The case study is the St. Augustine Monumental Compound, located in the historical centre of the town of Cosenza (Calabria, South Italy). Applying the proposed methodology, the paper illustrates the main results obtained for the test building by overlaying and comparing the data collected with both techniques, in order to outline their capability both to detect anomalies and to improve knowledge of the health state of the masonry building. The 3D model also provides a reference model, laying the groundwork for the implementation of a multisensor monitoring system based on non-destructive techniques. PMID:25609042

  16. Combined use of terrestrial laser scanning and IR thermography applied to a historical building.

    PubMed

    Costanzo, Antonio; Minasi, Mario; Casula, Giuseppe; Musacchio, Massimo; Buongiorno, Maria Fabrizia

    2014-12-24

    The conservation of architectural heritage usually requires a multidisciplinary approach involving a variety of specialist expertise and techniques. Nevertheless, destructive techniques should be avoided wherever possible in order to preserve the integrity of historical buildings; therefore, the development of non-destructive and non-contact techniques is extremely important. In this framework, a methodology combining terrestrial laser scanning and infrared thermal images is proposed in order to assess the conservation state of a historical building. The case study is the St. Augustine Monumental Compound, located in the historical centre of the town of Cosenza (Calabria, South Italy). Applying the proposed methodology, the paper illustrates the main results obtained for the test building by overlaying and comparing the data collected with both techniques, in order to outline their capability both to detect anomalies and to improve knowledge of the health state of the masonry building. The 3D model also provides a reference model, laying the groundwork for the implementation of a multisensor monitoring system based on non-destructive techniques.

  17. Newmark-Beta-FDTD method for super-resolution analysis of time reversal waves

    NASA Astrophysics Data System (ADS)

    Shi, Sheng-Bing; Shao, Wei; Ma, Jing; Jin, Congjun; Wang, Xiao-Hua

    2017-09-01

    In this work, a new unconditionally stable finite-difference time-domain (FDTD) method with the split-field perfectly matched layer (PML) is proposed for the analysis of time reversal (TR) waves. The proposed method is very suitable for multiscale problems involving microstructures. The spatial and temporal derivatives in this method are discretized by the central difference technique and Newmark-Beta algorithm, respectively, and the derivation results in the calculation of a banded-sparse matrix equation. Since the coefficient matrix keeps unchanged during the whole simulation process, the lower-upper (LU) decomposition of the matrix needs to be performed only once at the beginning of the calculation. Moreover, the reverse Cuthill-Mckee (RCM) technique, an effective preprocessing technique in bandwidth compression of sparse matrices, is used to improve computational efficiency. The super-resolution focusing of TR wave propagation in two- and three-dimensional spaces is included to validate the accuracy and efficiency of the proposed method.
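
    The efficiency argument (the coefficient matrix is constant, so it is factorized once and reused every time step, after an RCM reordering shrinks its bandwidth) can be sketched with scipy's sparse tools as below. The matrix here is a generic 2-D Laplacian standing in for the actual Newmark-Beta-FDTD system matrix, and the right-hand sides are placeholders.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import reverse_cuthill_mckee
from scipy.sparse.linalg import splu

# Generic banded-sparse system matrix (a 2-D Laplacian standing in for the
# constant implicit-FDTD coefficient matrix)
nx = 40
T = sp.diags([-1.0, 4.0, -1.0], [-1, 0, 1], shape=(nx, nx))
S = sp.diags([-1.0, -1.0], [-1, 1], shape=(nx, nx))
M = (sp.kron(sp.identity(nx), T) + sp.kron(S, sp.identity(nx))).tocsr()

# Reverse Cuthill-McKee reordering compresses the matrix bandwidth
perm = reverse_cuthill_mckee(M, symmetric_mode=True)
M_perm = M[perm, :][:, perm]

# LU decomposition is performed only once, before time stepping starts
lu = splu(M_perm.tocsc())

# Each time step then costs only a forward/backward substitution
rng = np.random.default_rng(0)
x_orig = np.empty(nx * nx)
for step in range(100):
    rhs = rng.standard_normal(nx * nx)     # placeholder right-hand side
    x_perm = lu.solve(rhs[perm])           # solve in the permuted ordering
    x_orig[perm] = x_perm                  # map back to the original ordering
```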

  18. Contemporary Topics in Science

    ERIC Educational Resources Information Center

    Aronstein, Laurence W.; Beam, Kathryn J.

    1974-01-01

    Discusses the offering of a Science for Poets course at the General Science Department of State University College at Buffalo, involving objectives, methods, and grouping techniques. Included are lists of problems proposed by teachers and students in the course. (CC)

  19. Empirical State Error Covariance Matrix for Batch Estimation

    NASA Technical Reports Server (NTRS)

    Frisbee, Joe

    2015-01-01

    State estimation techniques effectively provide mean state estimates. However, the theoretical state error covariance matrices provided as part of these techniques often suffer from a lack of confidence in their ability to describe the uncertainty in the estimated states. By a reinterpretation of the equations involved in the weighted batch least squares algorithm, it is possible to directly arrive at an empirical state error covariance matrix. The proposed empirical state error covariance matrix will contain the effect of all error sources, known or not. This empirical error covariance matrix may be calculated as a side computation for each unique batch solution. Results based on the proposed technique will be presented for a simple, two observer and measurement error only problem.

  20. A fuzzy Bayesian network approach to quantify the human behaviour during an evacuation

    NASA Astrophysics Data System (ADS)

    Ramli, Nurulhuda; Ghani, Noraida Abdul; Ahmad, Nazihah

    2016-06-01

    The Bayesian Network (BN) has been regarded as a successful representation of the inter-relationships among factors affecting human behavior during an emergency. This paper extends earlier work on quantifying the variables involved in a BN model of human behavior during an evacuation using a well-known direct probability elicitation technique. To overcome judgment bias and reduce the experts' burden in providing precise probability values, a new approach to elicitation is required. This study proposes a new fuzzy BN approach for quantifying human behavior during an evacuation. Three major methodological phases are involved, namely 1) development of a qualitative model representing human factors during an evacuation, 2) quantification of the BN model using fuzzy probabilities, and 3) inferencing and interpreting the BN results. A case study of three inter-dependent human evacuation factors, namely danger assessment ability, information about the threat, and stressful conditions, is used to illustrate the application of the proposed method. This approach serves as an alternative to the conventional probability elicitation technique for understanding human behavior during an evacuation.

  1. Visual Modelling of Data Warehousing Flows with UML Profiles

    NASA Astrophysics Data System (ADS)

    Pardillo, Jesús; Golfarelli, Matteo; Rizzi, Stefano; Trujillo, Juan

    Data warehousing involves complex processes that transform source data through several stages to deliver suitable information ready to be analysed. Though many techniques for visual modelling of data warehouses from the static point of view have been devised, only few attempts have been made to model the data flows involved in a data warehousing process. Besides, each attempt was mainly aimed at a specific application, such as ETL, OLAP, what-if analysis, data mining. Data flows are typically very complex in this domain; for this reason, we argue, designers would greatly benefit from a technique for uniformly modelling data warehousing flows for all applications. In this paper, we propose an integrated visual modelling technique for data cubes and data flows. This technique is based on UML profiling; its feasibility is evaluated by means of a prototype implementation.

  2. A Big Empty Space

    ERIC Educational Resources Information Center

    Blake, Anthony; Francis, David

    1973-01-01

    Approaches to developing management ability include systematic techniques, mental enlargement, self-analysis, and job-related counseling. A method is proposed to integrate them into a responsive program involving depth understanding, vision of the future, specialization commitment to change, and self-monitoring control. (MS)

  3. Simultaneous determination of apparent tortuosity and microstructure length scale and shape: Application to rigid open cell foams

    NASA Astrophysics Data System (ADS)

    Gómez Álvarez-Arenas, T. E.; de la Fuente, S.; González Gómez, I.

    2006-05-01

    A novel experimental technique based on phase spectroscopy and through transmission of high-frequency airborne ultrasonic pulses is used to study rigid open cell foams. Phase velocity shows an anomalous relaxation like behavior which is attributed to a frequency variation of the apparent tortuosity. An explanation is proposed in terms of the relationship between the different length scales involved: microstructure and macroscopic behavior. The experimental technique together with the proposed apparent tortuosity scheme provides a novel and unique procedure to determine simultaneously tortuosity and characteristic length dimension and shape of the solid constituent of foams and porous materials in general.

  4. Adjustment of localized alveolar ridge defects by soft tissue transplantation to improve mucogingival esthetics: a proposal for clinical classification and an evaluation of procedures.

    PubMed

    Studer, S; Naef, R; Schärer, P

    1997-12-01

    Esthetically correct treatment of a localized alveolar ridge defect is a frequent prosthetic challenge. Such defects can be overcome not only by a variety of prosthetic means, but also by several periodontal surgical techniques, notably soft tissue augmentations. Preoperative classification of the localized alveolar ridge defect can be greatly useful in evaluating the prognosis and technical difficulties involved. A semiquantitative classification, dependent on the severity of vertical and horizontal dimensional loss, is proposed to supplement the recognized qualitative classification of a ridge defect. Various methods of soft tissue augmentation are evaluated, based on initial volumetric measurements. The roll flap technique is proposed when the problem is related to ridge quality (single-tooth defect with little horizontal and vertical loss). Larger defects in which a volumetric problem must be solved are corrected through the subepithelial connective tissue technique. Additional mucogingival problems (eg, insufficient gingival width, high frenum, gingival scarring, or tattoo) should not be corrected simultaneously with augmentation procedures. In these cases, the onlay transplant technique is favored.

  5. Efficient live face detection to counter spoof attack in face recognition systems

    NASA Astrophysics Data System (ADS)

    Biswas, Bikram Kumar; Alam, Mohammad S.

    2015-03-01

    Face recognition is a critical tool used in almost all major biometrics based security systems. But recognition, authentication and liveness detection of the face of an actual user is a major challenge because an imposter or a non-live face of the actual user can be used to spoof the security system. In this research, a robust technique is proposed which detects liveness of faces in order to counter spoof attacks. The proposed technique uses a three-dimensional (3D) fast Fourier transform to compare spectral energies of a live face and a fake face in a mathematically selective manner. The mathematical model involves evaluation of energies of selective high frequency bands of average power spectra of both live and non-live faces. It also carries out proper recognition and authentication of the face of the actual user using the fringe-adjusted joint transform correlation technique, which has been found to yield the highest correlation output for a match. Experimental tests show that the proposed technique yields excellent results for identifying live faces.
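
    A bare-bones version of the spectral-energy test can be written with a 3-D FFT over a short stack of face frames: live sequences carry frame-to-frame micro-motion that adds energy in higher temporal-spatial frequency bands, while a static spoof does not. The band limits, frame counts, and data below are invented for illustration, and the paper's actual band selection and its correlation-based recognition step are not reproduced.

```python
import numpy as np

def high_freq_energy(frames, band=(0.25, 0.5)):
    """Fraction of spectral energy in a high-frequency band of the 3-D power
    spectrum of a stack of face frames (frames: array of shape (T, H, W))."""
    spec = np.fft.fftshift(np.fft.fftn(frames))
    power = np.abs(spec) ** 2
    # normalized radial frequency for every 3-D spectral bin
    axes = [np.linspace(-0.5, 0.5, s, endpoint=False) for s in frames.shape]
    grids = np.meshgrid(*axes, indexing="ij")
    radius = np.sqrt(sum(g ** 2 for g in grids))
    mask = (radius >= band[0]) & (radius < band[1])
    return power[mask].sum() / power.sum()

# Hypothetical data: a "live" stack with frame-to-frame micro-variations versus
# a "spoof" stack made of identical printed-photo frames
rng = np.random.default_rng(3)
base_face = rng.random((64, 64))
live = np.stack([base_face + 0.02 * rng.standard_normal((64, 64)) for _ in range(16)])
spoof = np.stack([base_face for _ in range(16)])

print("live :", high_freq_energy(live))
print("spoof:", high_freq_energy(spoof))
# a simple liveness rule would threshold this ratio (threshold chosen empirically)
```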

  6. Broadband photonic transport between waveguides by adiabatic elimination

    NASA Astrophysics Data System (ADS)

    Oukraou, Hassan; Coda, Virginie; Rangelov, Andon A.; Montemezzani, Germano

    2018-02-01

    We propose an adiabatic method for the robust transfer of light between the two outer waveguides in a three-waveguide directional coupler. Unlike the established technique inherited from stimulated Raman adiabatic passage (STIRAP), the method proposed here is symmetric with respect to an exchange of the left and right waveguides in the structure and permits the transfer in both directions. The technique uses the adiabatic elimination of the middle waveguide together with level crossing and adiabatic passage in an effective two-state system involving only the external waveguides. It requires a strong detuning between the outer and the middle waveguide and does not rely on the adiabatic transfer state (dark state) underlying the STIRAP process. The suggested technique is generalized to an array of N waveguides and verified by numerical beam propagation calculations.

  7. Comments on: Accuracy of Raman Lidar Water Vapor Calibration and its Applicability to Long-Term Measurements

    NASA Technical Reports Server (NTRS)

    Whiteman, David N.; Venable, Demetrius; Landulfo, Eduardo

    2012-01-01

    In a recent publication, LeBlanc and McDermid proposed a hybrid calibration technique for Raman water vapor lidar involving a tungsten lamp and radiosondes. Measurements made with the lidar telescope viewing the calibration lamp were used to stabilize the lidar calibration determined by comparison with radiosonde. The technique provided a significantly more stable calibration constant than radiosondes used alone. The technique involves the use of a calibration lamp in a fixed position in front of the lidar receiver aperture. We examine this configuration and find that such a configuration likely does not properly sample the full lidar system optical efficiency. While the technique is a useful addition to the use of radiosondes alone for lidar calibration, it is important to understand the scenarios under which it will not provide an accurate quantification of system optical efficiency changes. We offer examples of these scenarios.

  8. EVALUATION OF ACID DEPOSITION MODELS USING PRINCIPAL COMPONENT SPACES

    EPA Science Inventory

    An analytical technique involving principal components analysis is proposed for use in the evaluation of acid deposition models. Relationships among model predictions are compared to those among measured data, rather than the more common one-to-one comparison of predictions to mea...

  9. General solution for high dynamic range three-dimensional shape measurement using the fringe projection technique

    NASA Astrophysics Data System (ADS)

    Feng, Shijie; Zhang, Yuzhen; Chen, Qian; Zuo, Chao; Li, Rubin; Shen, Guochen

    2014-08-01

    This paper presents a general solution for realizing high dynamic range three-dimensional (3-D) shape measurement based on fringe projection. Three concrete techniques are involved in the solution for measuring object with large range of reflectivity (LRR) or one with shiny specular surface. For the first technique, the measured surface reflectivities are sub-divided into several groups based on its histogram distribution, then the optimal exposure time for each group can be predicted adaptively so that the bright as well as dark areas on the measured surface are able to be handled without any compromise. Phase-shifted images are then captured at the calculated exposure times and a composite phase-shifted image is generated by extracting the optimally exposed pixels in the raw fringes images. For the second technique, it is proposed by introducing two orthogonal polarizers which are placed separately in front of the camera and projector into the first technique and the third one is developed by combining the second technique with the strategy of properly altering the angle between the transmission axes of the two polarizers. Experimental results show that the first technique can effectively improve the measurement accuracy of diffuse objects with LRR, the second one is capable of measuring object with weak specular reflection (WSR: e.g. shiny plastic surface) and the third can inspect surface with strong specular reflection (SSR: e.g. highlight on aluminum alloy) precisely. Further, more complex scene, such as the one with LRR and WSR, or even the one simultaneously involving LRR, WSR and SSR, can be measured accurately by the proposed solution.
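
    The heart of the first technique is the per-pixel fusion of the multi-exposure fringe images: for each pixel, the brightest exposure that is not saturated is kept, so both dark and shiny regions end up well exposed in the composite phase-shifted image. The sketch below shows only that fusion step, with made-up exposure times, reflectivities, and saturation level; the prediction of optimal exposure times from the reflectivity histogram is omitted.

```python
import numpy as np

def fuse_exposures(stacks, saturation=250):
    """Build a composite fringe image from the same phase-shifted pattern
    captured at several exposure times.

    stacks: array of shape (E, H, W), one image per exposure time.
    For every pixel the brightest non-saturated exposure is kept, so dark
    regions come from long exposures and shiny regions from short ones."""
    stacks = np.asarray(stacks, dtype=float)
    usable = np.where(stacks < saturation, stacks, -np.inf)  # mask saturated pixels
    best = np.argmax(usable, axis=0)                         # best exposure per pixel
    rows, cols = np.indices(best.shape)
    return stacks[best, rows, cols]

# Hypothetical example: three exposures of one phase-shifted fringe pattern
pattern = 0.5 + 0.5 * np.cos(np.linspace(0, 8 * np.pi, 256))[None, :] * np.ones((256, 1))
reflectivity = np.concatenate([0.05 * np.ones((256, 128)),   # dark half
                               0.90 * np.ones((256, 128))],  # shiny half
                              axis=1)
exposures = [1.0, 4.0, 16.0]
stacks = np.clip([255 * e * reflectivity * pattern for e in exposures], 0, 255)
composite = fuse_exposures(stacks)
```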

  10. Analysis of the dynamic behavior of structures using the high-rate GNSS-PPP method combined with a wavelet-neural model: Numerical simulation and experimental tests

    NASA Astrophysics Data System (ADS)

    Kaloop, Mosbeh R.; Yigit, Cemal O.; Hu, Jong W.

    2018-03-01

    Recently, the high-rate global navigation satellite system precise point positioning (GNSS-PPP) technique has been used to detect the dynamic behavior of structures. This study aimed to increase the accuracy of extracting the oscillation properties of structural movements based on the high-rate (10 Hz) GNSS-PPP monitoring technique. A model based on the combination of wavelet packet transform (WPT) de-noising and neural network (NN) prediction was proposed to improve the estimated dynamic behavior of structures for the GNSS-PPP method. A complicated numerical simulation involving highly noisy data and 13 experimental cases with different loads were utilized to confirm the efficiency of the proposed model and the monitoring technique in detecting the dynamic behavior of structures. The results revealed that, when combined with the proposed model, the GNSS-PPP method can be used to accurately detect the dynamic behavior of engineering structures as an alternative to the relative GNSS method.

  11. Digital stereo-holographic microscopy for studying three-dimensional particle dynamics

    NASA Astrophysics Data System (ADS)

    Byeon, Hyeokjun; Go, Taesik; Lee, Sang Joon

    2018-06-01

    A digital stereo-holographic microscopy (DsHM) with two viewing angles is proposed to measure 3D information of microscale particles. This approach includes two volumetric recordings and numerical reconstruction, and it involves the combination of separately reconstructed holograms. The 3D positional information of a particle was determined by searching the center of the overlapped reconstructed volume. After confirming the proposed technique using static spherical particles, the 3D information of moving particles suspended in a Hagen-Poiseiulle flow was successfully obtained. Moreover, the 3D information of nonspherical particles, including ellipsoidal particles and red blood cells, were measured using the proposed technique. In addition to 3D positional information, the orientation and shape of the test samples were obtained from the plane images by slicing the overlapped volume perpendicular to the directions of the image recordings. This DsHM technique will be useful in analyzing the 3D dynamic behavior of various nonspherical particles, which cannot be measured by conventional digital holographic microscopy.

  12. Neural network for image compression

    NASA Astrophysics Data System (ADS)

    Panchanathan, Sethuraman; Yeap, Tet H.; Pilache, B.

    1992-09-01

    In this paper, we propose a new scheme for image compression using neural networks. Image data compression deals with minimizing the amount of data required to represent an image while maintaining an acceptable quality. Several image compression techniques have been developed in recent years. We note that the coding performance of these techniques may be improved by employing adaptivity. Over the last few years, neural networks have emerged as an effective tool for solving a wide range of problems involving adaptivity and learning. A multilayer feed-forward neural network trained using the backward error propagation algorithm is used in many applications. However, this model is not suitable for image compression because of its poor coding performance. Recently, a self-organizing feature map (SOFM) algorithm has been proposed which yields a good coding performance. However, this algorithm requires a long training time because the network starts with random initial weights. In this paper we have used the backward error propagation algorithm (BEP) to quickly obtain the initial weights, which are then used to speed up the training time required by the SOFM algorithm. The proposed approach (BEP-SOFM) combines the advantages of the two techniques and, hence, achieves a good coding performance in a shorter training time. Our simulation results demonstrate the potential gains of the proposed technique.

  13. Development of evaluation technique of GMAW welding quality based on statistical analysis

    NASA Astrophysics Data System (ADS)

    Feng, Shengqiang; Terasaki, Hidenri; Komizo, Yuichi; Hu, Shengsun; Chen, Donggao; Ma, Zhihua

    2014-11-01

    Nondestructive techniques for appraising gas metal arc welding (GMAW) faults play a very important role in on-line quality control and prediction of the GMAW process. Existing on-line welding quality control and prediction methods have several disadvantages, such as high cost, low efficiency, complexity, and strong sensitivity to the environment. An enhanced, efficient technique for evaluating welding faults based on the Mahalanobis distance (MD) and the normal distribution is presented. In addition, a new piece of equipment, designated the weld quality tester (WQT), is developed based on the proposed evaluation technique. MD is superior to other multidimensional distances such as the Euclidean distance because the covariance matrix used for calculating MD takes into account correlations in the data and scaling. The values of MD obtained from the welding current and arc voltage are assumed to follow a normal distribution, which has two parameters: the mean µ and standard deviation σ of the data. In the proposed evaluation technique used by the WQT, values of MD located in the range from zero to µ + 3σ are regarded as "good". Two experiments, involving changing the flow of shielding gas and smearing paint on the surface of the substrate, are conducted in order to verify the sensitivity of the proposed evaluation technique and the feasibility of the WQT. The experimental results demonstrate the usefulness of the WQT for evaluating welding quality. The proposed technique can be applied to on-line welding quality control and prediction, which is of great importance for designing novel equipment for weld quality detection.
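
    The decision rule itself is compact: estimate the mean and covariance of (current, voltage) features from known-good welds, compute the Mahalanobis distance of new samples, and flag anything beyond the µ + 3σ point of the MD distribution. A minimal numpy sketch, with invented reference statistics and test samples, is given below.

```python
import numpy as np

# Reference data: welding current and arc voltage samples from known-good welds
rng = np.random.default_rng(5)
good = rng.multivariate_normal(mean=[220.0, 24.0],
                               cov=[[25.0, 2.0], [2.0, 1.0]], size=500)

mu = good.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(good, rowvar=False))

def mahalanobis(x):
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

# Normal-distribution threshold on the MD values of the reference data
md_ref = np.array([mahalanobis(x) for x in good])
threshold = md_ref.mean() + 3.0 * md_ref.std()       # "good" range: [0, mu + 3 sigma]

# Evaluate new samples: one nominal, one with a shielding-gas-style disturbance
print(mahalanobis(np.array([221.0, 24.2])) <= threshold)   # expected: True (good)
print(mahalanobis(np.array([245.0, 29.0])) <= threshold)   # expected: False (fault)
```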

  14. Modeling Complex Dynamic Interactions of Nonlinear, Aeroelastic, Multistage, and Localization Phenomena in Turbine Engines

    DTIC Science & Technology

    2011-02-25

    fast method of predicting the number of iterations needed for converged results. A new hybrid technique is proposed to predict the convergence history...interchanging between the modes, whereas a smaller veering (or crossing) region shows fast mode switching. Then, the nonlinear vibration response of the...problems of interest involve dynamic (fast) crack propagation, then the nodes selected by the proposed approach at some time instant might not

  15. A Novel Approach with Time-Splitting Spectral Technique for the Coupled Schrödinger-Boussinesq Equations Involving Riesz Fractional Derivative

    NASA Astrophysics Data System (ADS)

    Saha Ray, S.

    2017-09-01

    In the present paper the Riesz fractional coupled Schrödinger-Boussinesq (S-B) equations have been solved by the time-splitting Fourier spectral (TSFS) method. This proposed technique is utilized for discretizing the Schrödinger-like equation and further, a pseudospectral discretization has been employed for the Boussinesq-like equation. Apart from that an implicit finite difference approach has also been proposed to compare the results with the solutions obtained from the time-splitting technique. Furthermore, the time-splitting method is proved to be unconditionally stable. The error norms along with the graphical solutions have also been presented here. Supported by NBHM, Mumbai, under Department of Atomic Energy, Government of India vide Grant No. 2/48(7)/2015/NBHM (R.P.)/R&D II/11403

  16. Manufacturing the Gas Diffusion Layer for PEM Fuel Cell Using a Novel 3D Printing Technique and Critical Assessment of the Challenges Encountered

    PubMed Central

    Singamneni, Sarat; Ramos, Maximiano; Al-Jumaily, Ahmed M

    2017-01-01

    The conventional gas diffusion layer (GDL) of polymer electrolyte membrane (PEM) fuel cells incorporates a carbon-based substrate, which suffers from electrochemical oxidation as well as mechanical degradation, resulting in reduced durability and performance. In addition, it involves a complex manufacturing process to produce it. The proposed technique aims to resolve both these issues by an advanced 3D printing technique, namely selective laser sintering (SLS). In the proposed work, polyamide (PA) is used as the base powder and titanium metal powder is added at an optimised level to enhance the electrical conductivity, thermal, and mechanical properties. The application of selective laser sintering to fabricate a robust gas diffusion substrate for PEM fuel cell applications is quite novel and is attempted here for the first time. PMID:28773156

  17. Manufacturing the Gas Diffusion Layer for PEM Fuel Cell Using a Novel 3D Printing Technique and Critical Assessment of the Challenges Encountered.

    PubMed

    Jayakumar, Arunkumar; Singamneni, Sarat; Ramos, Maximiano; Al-Jumaily, Ahmed M; Pethaiah, Sethu Sundar

    2017-07-14

    The conventional gas diffusion layer (GDL) of polymer electrolyte membrane (PEM) fuel cells incorporates a carbon-based substrate, which suffers from electrochemical oxidation as well as mechanical degradation, resulting in reduced durability and performance. In addition, it involves a complex manufacturing process to produce it. The proposed technique aims to resolve both these issues by an advanced 3D printing technique, namely selective laser sintering (SLS). In the proposed work, polyamide (PA) is used as the base powder and titanium metal powder is added at an optimised level to enhance the electrical conductivity, thermal, and mechanical properties. The application of selective laser sintering to fabricate a robust gas diffusion substrate for PEM fuel cell applications is quite novel and is attempted here for the first time.

  18. Improving semi-automated segmentation by integrating learning with active sampling

    NASA Astrophysics Data System (ADS)

    Huo, Jing; Okada, Kazunori; Brown, Matthew

    2012-02-01

    Interactive segmentation algorithms such as GrowCut usually require quite a few user interactions to perform well, and have poor repeatability. In this study, we developed a novel technique to boost the performance of the interactive segmentation method GrowCut involving: 1) a novel "focused sampling" approach for supervised learning, as opposed to conventional random sampling; 2) boosting GrowCut using the machine learned results. We applied the proposed technique to the glioblastoma multiforme (GBM) brain tumor segmentation, and evaluated on a dataset of ten cases from a multiple center pharmaceutical drug trial. The results showed that the proposed system has the potential to reduce user interaction while maintaining similar segmentation accuracy.

  19. Automatic Parameterization Strategy for Cardiac Electrophysiology Simulations.

    PubMed

    Costa, Caroline Mendonca; Hoetzl, Elena; Rocha, Bernardo Martins; Prassl, Anton J; Plank, Gernot

    2013-10-01

    Driven by recent advances in medical imaging, image segmentation and numerical techniques, computer models of ventricular electrophysiology account for increasingly finer levels of anatomical and biophysical detail. However, considering the large number of model parameters involved parameterization poses a major challenge. A minimum requirement in combined experimental and modeling studies is to achieve good agreement in activation and repolarization sequences between model and experiment or patient data. In this study, we propose basic techniques which aid in determining bidomain parameters to match activation sequences. An iterative parameterization algorithm is implemented which determines appropriate bulk conductivities which yield prescribed velocities. In addition, a method is proposed for splitting the computed bulk conductivities into individual bidomain conductivities by prescribing anisotropy ratios.
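
    The iterative bulk-conductivity step can be sketched as a simple fixed-point update exploiting the approximately square-root dependence of conduction velocity on conductivity: rescale the conductivity by the squared ratio of prescribed to measured velocity and repeat. In the sketch below the simulation call is a placeholder with an assumed velocity-conductivity relationship, so the loop converges almost immediately; with a real bidomain solver several iterations would be needed.

```python
import math

def measure_velocity(g):
    """Placeholder for running a bidomain/monodomain simulation with bulk
    conductivity g and measuring the resulting conduction velocity (m/s).
    Here it is faked with an assumed sqrt-like dependence."""
    return 0.12 * math.sqrt(g)

def tune_conductivity(v_target, g0=0.2, tol=1e-4, max_iter=20):
    """Iteratively rescale the bulk conductivity until the measured velocity
    matches the prescribed one (velocity ~ sqrt(conductivity))."""
    g = g0
    for _ in range(max_iter):
        v = measure_velocity(g)
        if abs(v - v_target) / v_target < tol:
            break
        g *= (v_target / v) ** 2          # sqrt relationship -> quadratic update
    return g

g_fit = tune_conductivity(v_target=0.6)
print(g_fit, measure_velocity(g_fit))      # measured velocity should now be ~0.6 m/s
```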

  20. Random Process Simulation for stochastic fatigue analysis. Ph.D. Thesis - Rice Univ., Houston, Tex.

    NASA Technical Reports Server (NTRS)

    Larsen, Curtis E.

    1988-01-01

    A simulation technique is described which directly synthesizes the extrema of a random process and is more efficient than the Gaussian simulation method. Such a technique is particularly useful in stochastic fatigue analysis because the required stress range moment E(R sup m), is a function only of the extrema of the random stress process. The family of autoregressive moving average (ARMA) models is reviewed and an autoregressive model is presented for modeling the extrema of any random process which has a unimodal power spectral density (psd). The proposed autoregressive technique is found to produce rainflow stress range moments which compare favorably with those computed by the Gaussian technique and to average 11.7 times faster than the Gaussian technique. The autoregressive technique is also adapted for processes having bimodal psd's. The adaptation involves using two autoregressive processes to simulate the extrema due to each mode and the superposition of these two extrema sequences. The proposed autoregressive superposition technique is 9 to 13 times faster than the Gaussian technique and produces comparable values for E(R sup m) for bimodal psd's having the frequency of one mode at least 2.5 times that of the other mode.
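
    A minimal sketch of the central idea, synthesizing the sequence of extrema directly with a low-order autoregressive model rather than simulating the full Gaussian process, is given below. The AR(1) coefficient, the sign-alternation heuristic, and the range computation are illustrative stand-ins for the calibrated ARMA models and rainflow counting discussed in the thesis.

```python
import numpy as np

def simulate_extrema_ar1(n, phi=-0.6, seed=0):
    """Directly synthesize a sequence of process extrema with an AR(1) model.
    A negative lag-one coefficient makes successive values tend to alternate
    in sign, mimicking the peak/valley ordering of a narrow-band process.
    In practice phi would be fitted so the extrema statistics match those
    implied by the target power spectral density."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    scale = np.sqrt(1.0 - phi ** 2)        # keep unit marginal variance
    for k in range(1, n):
        x[k] = phi * x[k - 1] + scale * rng.standard_normal()
    return x

extrema = simulate_extrema_ar1(100_000)

# Stress-range moment E[R^m], with ranges taken between successive extrema
m = 3
ranges = np.abs(np.diff(extrema))
print("E[R^m] estimate:", np.mean(ranges ** m))
```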

  1. Endovascular Treatment of a Symptomatic Thoracoabdominal Aortic Aneurysm by Chimney and Periscope Techniques for Total Visceral and Renal Artery Revascularization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cariati, Maurizio, E-mail: cariati.maurizio@sancarlo.mi.it; Mingazzini, Pietro; Dallatana, Raffaello

    2013-05-02

    Conventional endovascular therapy of thoracoabdominal aortic aneurysms involving the visceral and renal arteries is limited by the absence of a landing zone for the aortic endograft. Solutions have been proposed to overcome the problem of the missing landing zone; however, most of them are not feasible in urgent and high-risk patients. We describe a case that was successfully treated by a total endovascular technique with a two-by-two chimney-and-periscope approach in a patient with an acute symptomatic type IV thoracoabdominal aortic aneurysm with supra-anastomotic aneurysm formation involving the renal and visceral arteries and a pseudoaneurysmal sac localized in the left iliopsoas muscle.

  2. Experience factors in performing periodic physical evaluations

    NASA Technical Reports Server (NTRS)

    Hoffman, A. A.

    1969-01-01

    The lack of scientific basis in the so-called periodic health examinations on military personnel inclusive of the Executive Health Program is outlined. This latter program can well represent a management tool of the company involved in addition to being a status symbol. A multiphasic screening technique is proposed in conjunction with an automated medical history questionnaire for preventive occupational medicine methodology. The need to collate early sickness consultation or clinic visit histories with screening techniques is emphasized.

  3. Recognition of human activity characteristics based on state transitions modeling technique

    NASA Astrophysics Data System (ADS)

    Elangovan, Vinayak; Shirkhodaie, Amir

    2012-06-01

    Human Activity Discovery & Recognition (HADR) is a complex, diverse and challenging task but yet an active area of ongoing research in the Department of Defense. By detecting, tracking, and characterizing cohesive Human interactional activity patterns, potential threats can be identified which can significantly improve situation awareness, particularly, in Persistent Surveillance Systems (PSS). Understanding the nature of such dynamic activities, inevitably involves interpretation of a collection of spatiotemporally correlated activities with respect to a known context. In this paper, we present a State Transition model for recognizing the characteristics of human activities with a link to a prior contextbased ontology. Modeling the state transitions between successive evidential events determines the activities' temperament. The proposed state transition model poses six categories of state transitions including: Human state transitions of Object handling, Visibility, Entity-entity relation, Human Postures, Human Kinematics and Distance to Target. The proposed state transition model generates semantic annotations describing the human interactional activities via a technique called Casual Event State Inference (CESI). The proposed approach uses a low cost kinect depth camera for indoor and normal optical camera for outdoor monitoring activities. Experimental results are presented here to demonstrate the effectiveness and efficiency of the proposed technique.

  4. 24 CFR 91.115 - Citizen participation plan; States.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... encourage participation by low- and moderate-income persons, particularly those living in slum and blighted areas and in areas where CDBG funds are proposed to be used, and by residents of predominantly low- and... alternative public involvement techniques that encourage a shared vision of change for the community and the...

  5. 24 CFR 91.115 - Citizen participation plan; States.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... encourage participation by low- and moderate-income persons, particularly those living in slum and blighted areas and in areas where CDBG funds are proposed to be used, and by residents of predominantly low- and... alternative public involvement techniques that encourage a shared vision of change for the community and the...

  6. 24 CFR 91.115 - Citizen participation plan; States.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... encourage participation by low- and moderate-income persons, particularly those living in slum and blighted areas and in areas where CDBG funds are proposed to be used, and by residents of predominantly low- and... alternative public involvement techniques that encourage a shared vision of change for the community and the...

  7. A combined NLP-differential evolution algorithm approach for the optimization of looped water distribution systems

    NASA Astrophysics Data System (ADS)

    Zheng, Feifei; Simpson, Angus R.; Zecchin, Aaron C.

    2011-08-01

    This paper proposes a novel optimization approach for the least cost design of looped water distribution systems (WDSs). Three distinct steps are involved in the proposed optimization approach. In the first step, the shortest-distance tree within the looped network is identified using the Dijkstra graph theory algorithm, for which an extension is proposed to find the shortest-distance tree for multisource WDSs. In the second step, a nonlinear programming (NLP) solver is employed to optimize the pipe diameters for the shortest-distance tree (chords of the shortest-distance tree are allocated the minimum allowable pipe sizes). Finally, in the third step, the original looped water network is optimized using a differential evolution (DE) algorithm seeded with diameters in the proximity of the continuous pipe sizes obtained in step two. As such, the proposed optimization approach combines the traditional deterministic optimization technique of NLP with the emerging evolutionary algorithm DE via the proposed network decomposition. The proposed methodology has been tested on four looped WDSs with the number of decision variables ranging from 21 to 454. Results obtained show the proposed approach is able to find optimal solutions with significantly less computational effort than other optimization techniques.

  8. Evaluation of an Agricultural Meteorological Disaster Based on Multiple Criterion Decision Making and Evolutionary Algorithm

    PubMed Central

    Yu, Xiaobing; Yu, Xianrui; Lu, Yiqun

    2018-01-01

    The evaluation of a meteorological disaster can be regarded as a multiple-criteria decision making problem because it involves many indexes. Firstly, a comprehensive indexing system for an agricultural meteorological disaster is proposed, which includes the disaster rate, the inundated rate, and the complete loss rate. Following this, the relative weights of the three criteria are acquired using a novel proposed evolutionary algorithm. The proposed algorithm consists of a differential evolution algorithm and an evolution strategy. Finally, a novel evaluation model, based on the proposed algorithm and the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), is presented to estimate the agricultural meteorological disaster of 2008 in China. The geographic information system (GIS) technique is employed to depict the disaster. The experimental results demonstrated that the agricultural meteorological disaster of 2008 was very serious, especially in Hunan and Hubei provinces. Some useful suggestions are provided to relieve agriculture meteorological disasters. PMID:29597243
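
    A compact sketch of the TOPSIS step used to rank alternatives by the three indexes is shown below; the decision matrix, the criterion weights (which the paper obtains from its hybrid evolutionary algorithm), and the benefit/cost designations are all invented for illustration.

```python
import numpy as np

def topsis(X, w, benefit):
    """Rank alternatives with TOPSIS.
    X: (n_alternatives, n_criteria) decision matrix
    w: criteria weights (summing to 1)
    benefit: True where larger is better, False where smaller is better."""
    R = X / np.linalg.norm(X, axis=0)            # vector-normalize each criterion
    V = R * w                                    # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)               # closeness to the ideal solution

# Hypothetical disaster-severity matrix: rows = regions,
# columns = disaster rate, inundated rate, complete loss rate ("larger = more severe")
X = np.array([[0.42, 0.20, 0.08],
              [0.35, 0.25, 0.10],
              [0.15, 0.05, 0.02]])
w = np.array([0.40, 0.35, 0.25])                 # weights from the evolutionary algorithm
severity = topsis(X, w, benefit=np.array([True, True, True]))
print(severity)                                  # larger closeness = more severe disaster
```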

  9. A simple and effective procedure for treating burn contractures: releasing incision and quadra Z technique.

    PubMed

    Sen, Cenk; Karacalar, Ahmet; Agir, Hakan; Dinar, Serkan; Isil, Eda; Iscen, Deniz

    2007-03-01

    Burn contractures, particularly those involving the joints, are challenging problems that can cause severe functional impairment. Many surgical techniques have been described; however, an ideal method has yet to be found. A releasing incision is the most common and effective way to release wide and severe contractures, but it has some drawbacks. We propose a releasing incision technique combined with four Z-plasty incisions to overcome the disadvantages of the traditional releasing incision technique. We successfully used our releasing incision and quadra Z technique on seven consecutive patients with burn contractures between 2003 and 2005. We modified the classical releasing incision technique by adding four Z-plasties, two of them with a common base on each corner of the incision line. In this technique, the transposed flaps limit webbing following the incision, and unnecessary lateral extension of the incision and the defect is avoided, i.e. maximum release is gained with a minimal defect. Satisfactory results were achieved in the seven patients treated with this technique for significant burn contractures between 2003 and 2005, with no significant complications. We propose that this technique is suitable in all patients with severe burn contractures who require a releasing incision and grafting.

  10. An Integrated Environment for Efficient Formal Design and Verification

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.

  11. Quantum memory with a controlled homogeneous splitting

    NASA Astrophysics Data System (ADS)

    Hétet, G.; Wilkowski, D.; Chanelière, T.

    2013-04-01

    We propose a quantum memory protocol where an input light field can be stored onto and released from a single ground state atomic ensemble by controlling dynamically the strength of an external static and homogeneous field. The technique relies on the adiabatic following of a polaritonic excitation onto a state for which the forward collective radiative emission is forbidden. The resemblance with the archetypal electromagnetically induced transparency is only formal because no ground state coherence-based slow-light propagation is considered here. As compared to the other grand category of protocols derived from the photon-echo technique, our approach only involves a homogeneous static field. We discuss two physical situations where the effect can be observed, and show that, in the limit where the excited state lifetime is longer than the storage time, the protocols are perfectly efficient and noise free. We compare the technique with other quantum memories, and propose atomic systems where the experiment can be realized.

  12. Bag of Lines (BoL) for Improved Aerial Scene Representation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sridharan, Harini; Cheriyadat, Anil M.

    2014-09-22

    Feature representation is a key step in automated visual content interpretation. In this letter, we present a robust feature representation technique, referred to as bag of lines (BoL), for high-resolution aerial scenes. The proposed technique involves extracting and compactly representing low-level line primitives from the scene. The compact scene representation is generated by counting the different types of lines representing various linear structures in the scene. Through extensive experiments, we show that the proposed scene representation is invariant to scale changes and scene conditions and can discriminate urban scene categories accurately. We compare the BoL representation with the popular scale-invariant feature transform (SIFT) and Gabor wavelets for their classification and clustering performance on an aerial scene database consisting of images acquired by sensors with different spatial resolutions. The proposed BoL representation outperforms the SIFT- and Gabor-based representations.
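
    The letter's BoL descriptor has its own definition of line types; the sketch below only illustrates the general idea of turning detected line segments into a normalized count histogram, using OpenCV's probabilistic Hough transform as a stand-in line detector.

```python
import cv2
import numpy as np

def bag_of_lines(gray, n_angle_bins=8, n_length_bins=4, max_len=200.0):
    """Crude bag-of-lines style descriptor: detect line segments in a grayscale
    aerial image and count them in a 2-D histogram over orientation and length.
    Only a sketch of the counting idea, not the published BoL feature."""
    edges = cv2.Canny(gray, 50, 150)
    segments = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                               minLineLength=15, maxLineGap=3)
    hist = np.zeros((n_angle_bins, n_length_bins))
    if segments is None:
        return hist.ravel()
    for x1, y1, x2, y2 in segments[:, 0]:
        angle = np.arctan2(y2 - y1, x2 - x1) % np.pi           # undirected lines
        length = np.hypot(x2 - x1, y2 - y1)
        a = min(int(angle / np.pi * n_angle_bins), n_angle_bins - 1)
        l = min(int(length / max_len * n_length_bins), n_length_bins - 1)
        hist[a, l] += 1
    return hist.ravel() / max(hist.sum(), 1)   # normalized counts of line "types"
```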

  13. Correction to hill (2005).

    PubMed

    Hill, Clara E

    2006-01-01

    Reports an error in "Therapist Techniques, Client Involvement, and the Therapeutic Relationship: Inextricably Intertwined in the Therapy Process" by Clara E. Hill (Psychotherapy: Theory, Research, Practice, Training, 2005 Win, Vol 42(4), 431-442). An author's name was incorrectly spelled in a reference. The correct reference is presented. (The following abstract of the original article appeared in record 2006-03309-003.) I propose that therapist techniques, client involvement, and the therapeutic relationship are inextricably intertwined and need to be considered together in any discussion of the therapy process. Furthermore, I present a pantheoretical model of how these three variables evolve over four stages of successful therapy: initial impression formation, beginning the therapy (involves the components of facilitating client exploration and developing case conceptualization and treatment strategies), the core work of therapy (involves the components of theory-relevant tasks and overcoming obstacles), and termination. Theoretical propositions as well as implications for training and research are presented. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  14. A scheme to calculate higher-order homogenization as applied to micro-acoustic boundary value problems

    NASA Astrophysics Data System (ADS)

    Vagh, Hardik A.; Baghai-Wadji, Alireza

    2008-12-01

    Current technological challenges in materials science and high-tech device industry require the solution of boundary value problems (BVPs) involving regions of various scales, e.g. multiple thin layers, fibre-reinforced composites, and nano/micro pores. In most cases straightforward application of standard variational techniques to BVPs of practical relevance necessarily leads to unsatisfactorily ill-conditioned analytical and/or numerical results. To remedy the computational challenges associated with sub-sectional heterogeneities various sophisticated homogenization techniques need to be employed. Homogenization refers to the systematic process of smoothing out the sub-structural heterogeneities, leading to the determination of effective constitutive coefficients. Ordinarily, homogenization involves a sophisticated averaging and asymptotic order analysis to obtain solutions. In the majority of the cases only zero-order terms are constructed due to the complexity of the processes involved. In this paper we propose a constructive scheme for obtaining homogenized solutions involving higher order terms, and thus, guaranteeing higher accuracy and greater robustness of the numerical results. We present

  15. A mathematical model for the generation and control of a pH gradient in an immobilized enzyme system involving acid generation.

    PubMed

    Chen, G; Fournier, R L; Varanasi, S

    1998-02-20

    An optimal pH control technique has been developed for multistep enzymatic synthesis reactions where the optimal pH differs by several units for each step. This technique separates an acidic environment from a basic environment by the hydrolysis of urea within a thin layer of immobilized urease. With this technique, a two-step enzymatic reaction can take place simultaneously, in proximity to each other, and at their respective optimal pH. Because a reaction system involving acid generation represents a more challenging test of this pH control technique, a number of factors that affect the generation of such a pH gradient are considered in this study. The mathematical model proposed is based on several simplifying assumptions and represents a first attempt to provide an analysis of this complex problem. The results show that, by choosing appropriate parameters, the pH control technique can still generate the desired pH gradient even if there is an acid-generating reaction in the system. Copyright 1998 John Wiley & Sons, Inc.

  16. Automatic Parameterization Strategy for Cardiac Electrophysiology Simulations

    PubMed Central

    Costa, Caroline Mendonca; Hoetzl, Elena; Rocha, Bernardo Martins; Prassl, Anton J; Plank, Gernot

    2014-01-01

    Driven by recent advances in medical imaging, image segmentation and numerical techniques, computer models of ventricular electrophysiology account for increasingly finer levels of anatomical and biophysical detail. However, considering the large number of model parameters involved, parameterization poses a major challenge. A minimum requirement in combined experimental and modeling studies is to achieve good agreement in activation and repolarization sequences between model and experiment or patient data. In this study, we propose basic techniques that aid in determining bidomain parameters to match activation sequences. An iterative parameterization algorithm is implemented which determines appropriate bulk conductivities that yield prescribed conduction velocities. In addition, a method is proposed for splitting the computed bulk conductivities into individual bidomain conductivities by prescribing anisotropy ratios. PMID:24729986
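
    The velocity-matching idea can be sketched as a simple fixed-point iteration that exploits the familiar square-root scaling of conduction velocity with bulk conductivity; the forward simulator and all parameter values below are hypothetical stand-ins, not the authors' implementation.

```python
def tune_conductivity(simulate_velocity, v_target, sigma0=0.2,
                      tol=1e-3, max_iter=20):
    """Iteratively adjust a bulk conductivity so that a (hypothetical) strand
    simulation reproduces a prescribed conduction velocity. The update uses the
    classical v ~ sqrt(sigma) scaling of cable/bidomain models; this is a sketch
    of the iterative idea, not the paper's exact algorithm."""
    sigma = sigma0
    for _ in range(max_iter):
        v = simulate_velocity(sigma)          # run one forward simulation
        if abs(v - v_target) / v_target < tol:
            break
        sigma *= (v_target / v) ** 2          # velocity scales with sqrt(sigma)
    return sigma
```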

  17. Faster Double-Size Bipartite Multiplication out of Montgomery Multipliers

    NASA Astrophysics Data System (ADS)

    Yoshino, Masayuki; Okeya, Katsuyuki; Vuillaume, Camille

    This paper proposes novel algorithms for computing double-size modular multiplications with few modulus-dependent precomputations. Low-end devices such as smartcards are usually equipped with hardware Montgomery multipliers. However, due to progress in mathematical attacks, security institutions such as NIST have steadily demanded longer bit-lengths for public-key cryptography, making the multipliers quickly obsolete. In an attempt to extend the lifespan of such multipliers, double-size techniques compute modular multiplications with twice the bit-length of the multipliers. Techniques are known for extending the bit-length of classical Euclidean multipliers, of Montgomery multipliers and the combination thereof, namely bipartite multipliers. However, unlike classical and bipartite multiplications, Montgomery multiplications involve modulus-dependent precomputations, which amount to a large part of an RSA encryption or signature verification. The proposed double-size technique simulates double-size multiplications based on single-size Montgomery multipliers, and yet precomputations are essentially free: in a 2048-bit RSA encryption or signature verification with public exponent e = 2^16 + 1, the proposal with a 1024-bit Montgomery multiplier is at least 1.5 times faster than previous double-size Montgomery multiplications.
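
    For readers unfamiliar with the primitive being extended, the following minimal sketch shows standard single-size Montgomery reduction and multiplication, including the modulus-dependent precomputations the paper is concerned with; it is a textbook illustration, not the proposed double-size algorithm.

```python
def montgomery_setup(n, r_bits):
    """Precompute the modulus-dependent constants of Montgomery arithmetic:
    R = 2**r_bits, n' = -n^{-1} mod R, and R^2 mod n (used to convert operands
    into Montgomery form). n must be odd and smaller than R. Requires Python 3.8+
    for pow(n, -1, R)."""
    R = 1 << r_bits
    n_inv = pow(n, -1, R)
    return R, (-n_inv) % R, (R * R) % n

def redc(T, n, R, n_prime):
    """Montgomery reduction: returns T * R^{-1} mod n, for T < n*R,
    using only shifts and multiplications (no division by n)."""
    m = ((T & (R - 1)) * n_prime) & (R - 1)
    t = (T + m * n) >> (R.bit_length() - 1)
    return t - n if t >= n else t

def mont_mul(a_bar, b_bar, n, R, n_prime):
    """Multiply two values already in Montgomery form (x_bar = x*R mod n)."""
    return redc(a_bar * b_bar, n, R, n_prime)

# tiny demo with a toy odd modulus
n = 2**61 - 1
R, n_prime, R2 = montgomery_setup(n, 64)
a, b = 123456789, 987654321
a_bar = redc(a * R2, n, R, n_prime)        # a*R mod n
b_bar = redc(b * R2, n, R, n_prime)
prod = redc(mont_mul(a_bar, b_bar, n, R, n_prime), n, R, n_prime)
assert prod == (a * b) % n                 # back out of Montgomery form
```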

  18. Temporalis myofascial flap transfer into the oral cavity without zygomatic arch osteotomy

    PubMed Central

    Tauro, David P.; Mishra, Madan; Singh, Gaurav

    2013-01-01

    Among a plethora of options, the temporalis myofascial flap remains a workhorse for maxillofacial reconstruction. Its inherent advantages include reliable vascularity, adequate size, and proximity to the defect. Although contemporary surgical techniques provide fair surgical results with a low rate of complications, intraoral transposition involves additional surgical trauma from intentional fracturing of the zygomatic arch. We propose herein a simpler technique of temporalis myofascial flap transposition into the oral cavity without zygomatic arch osteotomy. PMID:24665182

  19. Photovoltaic pilot projects in the European community

    NASA Astrophysics Data System (ADS)

    Treble, F. C.; Grassi, G.; Schnell, W.

    The paper presents proposals received for the construction of photovoltaic pilot plants as part of the Commission of the European Communities' second 4-year solar energy R and D program. The proposed plants range from 30 to 300 kWp and cover a variety of applications including rural electrification, water pumping, desalination, dairy farming, factories, hospitals, schools and vacation centers. Fifteen projects will be accepted with a total generating capacity of 1 MWp, with preference given to those projects involving the development of new techniques, components and systems.

  20. Hyperbolic heat conduction problems involving non-Fourier effects - Numerical simulations via explicit Lax-Wendroff/Taylor-Galerkin finite element formulations

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Namburu, Raju R.

    1989-01-01

    Numerical simulations are presented for hyperbolic heat-conduction problems that involve non-Fourier effects, using explicit, Lax-Wendroff/Taylor-Galerkin FEM formulations as the principal computational tool. Also employed are smoothing techniques which stabilize the numerical noise and accurately predict the propagating thermal disturbances. The accurate capture of propagating thermal disturbances at characteristic time-step values is achieved; numerical test cases are presented which validate the proposed hyperbolic heat-conduction problem concepts.
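
    As a rough illustration of Lax-Wendroff time stepping for a non-Fourier (Maxwell-Cattaneo) conduction model, the sketch below uses a two-step finite-difference variant on a 1-D rod. The paper itself uses Lax-Wendroff/Taylor-Galerkin finite elements with smoothing, which are not reproduced here, and all parameter values are made up.

```python
import numpy as np

# Maxwell-Cattaneo (non-Fourier) conduction as a first-order system:
#   T_t + (q/C)_x = 0 ,   q_t + (k*T/tau)_x = -q/tau
# advanced with a two-step (Richtmyer) Lax-Wendroff scheme; the relaxation
# source term is treated explicitly for simplicity.
k, C, tau = 1.0, 1.0, 0.05           # conductivity, heat capacity, relaxation time
nx, L = 200, 1.0
dx = L / nx
c_wave = np.sqrt(k / (C * tau))      # finite thermal wave speed
dt = 0.8 * dx / c_wave               # CFL-limited time step

T = np.zeros(nx); T[:10] = 1.0       # step disturbance at the left end
q = np.zeros(nx)

def flux(T, q):
    return q / C, (k / tau) * T      # fluxes of T and q, respectively

for _ in range(200):
    fT, fq = flux(T, q)
    # half-step values at cell interfaces j+1/2
    Th = 0.5 * (T[:-1] + T[1:]) - 0.5 * dt / dx * (fT[1:] - fT[:-1])
    qh = (0.5 * (q[:-1] + q[1:]) - 0.5 * dt / dx * (fq[1:] - fq[:-1])
          - 0.25 * dt / tau * (q[:-1] + q[1:]))
    fTh, fqh = flux(Th, qh)
    # full step from interface fluxes; end values kept fixed for simplicity
    T[1:-1] -= dt / dx * (fTh[1:] - fTh[:-1])
    q[1:-1] -= dt / dx * (fqh[1:] - fqh[:-1]) + dt / tau * q[1:-1]
```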

  1. Stakeholder Partnerships as Collaborative Policymaking: Evaluation Criteria Applied to Watershed Management in California and Washington

    ERIC Educational Resources Information Center

    Leach, William D.; Pelkey, Neil W.; Sabatier, Paul A.

    2002-01-01

    Public policymaking and implementation in the United States are increasingly handled through local, consensus-seeking partnerships involving most affected stakeholders. This paper formalizes the concept of a stakeholder partnership, and proposes techniques for using interviews, surveys, and documents to measure each of six evaluation criteria.…

  2. Scalable and Cost-Effective Assignment of Mobile Crowdsensing Tasks Based on Profiling Trends and Prediction: The ParticipAct Living Lab Experience

    PubMed Central

    Bellavista, Paolo; Corradi, Antonio; Foschini, Luca; Ianniello, Raffaele

    2015-01-01

    Nowadays, sensor-rich smartphones potentially enable the harvesting of huge amounts of valuable sensing data in urban environments, by opportunistically involving citizens to play the role of mobile virtual sensors to cover Smart City areas of interest. This paper proposes an in-depth study of the challenging technical issues related to the efficient assignment of Mobile Crowd Sensing (MCS) data collection tasks to volunteers in a crowdsensing campaign. In particular, the paper originally describes how to increase the effectiveness of the proposed sensing campaigns through the inclusion of several new facilities, including accurate participant selection algorithms able to profile and predict user mobility patterns, gaming techniques, and timely geo-notification. The reported results show the feasibility of exploiting profiling trends/prediction techniques from volunteers’ behavior; moreover, they quantitatively compare different MCS task assignment strategies based on large-scale and real MCS data campaigns run in the ParticipAct living lab, an ongoing MCS real-world experiment that involved more than 170 students of the University of Bologna for more than one year. PMID:26263985

  3. Graph-cut based discrete-valued image reconstruction.

    PubMed

    Tuysuzoglu, Ahmet; Karl, W Clem; Stojanovic, Ivana; Castañòn, David; Ünlü, M Selim

    2015-05-01

    Efficient graph-cut methods have been used with great success for labeling and denoising problems occurring in computer vision. Unfortunately, the presence of linear image mappings has prevented the use of these techniques in most discrete-amplitude image reconstruction problems. In this paper, we develop a graph-cut based framework for the direct solution of discrete amplitude linear image reconstruction problems cast as regularized energy function minimizations. We first analyze the structure of discrete linear inverse problem cost functions to show that the obstacle to the application of graph-cut methods to their solution is the variable mixing caused by the presence of the linear sensing operator. We then propose to use a surrogate energy functional that overcomes the challenges imposed by the sensing operator yet can be utilized efficiently in existing graph-cut frameworks. We use this surrogate energy functional to devise a monotonic iterative algorithm for the solution of discrete valued inverse problems. We first provide experiments using local convolutional operators and show the robustness of the proposed technique to noise and stability to changes in regularization parameter. Then we focus on nonlocal, tomographic examples where we consider limited-angle data problems. We compare our technique with state-of-the-art discrete and continuous image reconstruction techniques. Experiments show that the proposed method outperforms state-of-the-art techniques in challenging scenarios involving discrete valued unknowns.

  4. [Surgical management of deep endometriosis with colorectal involvement: CNGOF-HAS Endometriosis Guidelines].

    PubMed

    Ballester, M; Roman, H

    2018-03-01

    Deep endometriosis with colorectal involvement is considered one of the most severe forms of the disease, both because of its impact on patients' quality of life and fertility and because of the difficulties encountered by clinicians when proposing a therapeutic strategy. Although the literature is very rich, evidence-based medicine remains poor, which explains the great heterogeneity in the management of such patients. Surgery therefore remains a therapeutic option. It improves the intensity of gynecological, digestive and general symptoms and the quality of life. Concerning the surgical approach, it appears that laparoscopy should be the first option; the laparoscopic robot-assisted route can also be proposed. Rectal shaving, discoid resection and segmental resection are the three techniques used for surgical excision of colorectal endometriosis. The parameters taken into account when choosing among these techniques are: the surgeon's experience, the depth of infiltration of the lesion within the rectosigmoid wall, the lesion size and circumference, multifocality, and the distance of the lesion from the anal margin. In the case of deep endometriosis with colorectal involvement, performing incomplete surgery increases the rate of pain recurrence and decreases postoperative fertility. In the case of surgery for colorectal endometriosis, pregnancy rates are similar to those obtained after ART in non-operated patients. Existing data are insufficient to formally recommend first-line surgery or ART in infertile patients with colorectal endometriosis. Surgery for colorectal endometriosis carries a risk of postoperative complications and recurrence, of which patients should be informed preoperatively. Copyright © 2018. Published by Elsevier Masson SAS.

  5. Concrete Condition Assessment Using Impact-Echo Method and Extreme Learning Machines

    PubMed Central

    Zhang, Jing-Kui; Yan, Weizhong; Cui, De-Mi

    2016-01-01

    The impact-echo (IE) method is a popular non-destructive testing (NDT) technique widely used for measuring the thickness of plate-like structures and for detecting certain defects inside concrete elements or structures. However, the IE method is not effective for full condition assessment (i.e., defect detection, defect diagnosis, defect sizing and location), because the simple frequency spectrum analysis involved in the existing IE method is not sufficient to capture the IE signal patterns associated with different conditions. In this paper, we attempt to enhance the IE technique and enable it for full condition assessment of concrete elements by introducing advanced machine learning techniques for performing comprehensive analysis and pattern recognition of IE signals. Specifically, we use wavelet decomposition for extracting signatures or features out of the raw IE signals and apply the extreme learning machine, a recently developed machine learning technique, as the classification model for full condition assessment. To validate the capabilities of the proposed method, we build a number of specimens with various types, sizes, and locations of defects and perform IE testing on these specimens in a lab environment. Based on analysis of the collected IE signals using the proposed machine learning based IE method, we demonstrate that the proposed method is effective in performing full condition assessment of concrete elements or structures. PMID:27023563
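
    A minimal sketch of the two ingredients named in the abstract, wavelet-energy features and an extreme learning machine classifier, is given below; the feature definition, network size and data are assumptions for illustration, not the authors' configuration.

```python
import numpy as np
import pywt

def wavelet_features(signal, wavelet="db4", level=5):
    """Band energies of a discrete wavelet decomposition as a crude IE signature."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

def train_elm(X, y, n_hidden=100, n_classes=4, reg=1e-3, seed=0):
    """Extreme learning machine: random hidden layer, ridge-regularized
    least-squares output weights fitted in closed form."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                      # random hidden-layer responses
    Y = np.eye(n_classes)[y]                    # one-hot condition labels
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ Y)
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)
```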

  6. Microfabrication Method using a Combination of Local Ion Implantation and Magnetorheological Finishing

    NASA Astrophysics Data System (ADS)

    Han, Jin; Kim, Jong-Wook; Lee, Hiwon; Min, Byung-Kwon; Lee, Sang Jo

    2009-02-01

    A new microfabrication method that combines localized ion implantation and magnetorheological finishing is proposed. The proposed technique involves two steps. First, selected regions of a silicon wafer are irradiated with gallium ions by using a focused ion beam system. The mechanical properties of the irradiated regions are altered as a result of the ion implantation. Second, the wafer is processed by using a magnetorheological finishing method. During the finishing process, the regions not implanted with ions are preferentially removed. This difference in material removal rate is utilized for microfabrication. The mechanisms of the proposed method are discussed, and applications are presented.

  7. Multivariate curve resolution of incomplete fused multiset data from chromatographic and spectrophotometric analyses for drug photostability studies.

    PubMed

    De Luca, Michele; Ragno, Gaetano; Ioele, Giuseppina; Tauler, Romà

    2014-07-21

    An advanced and powerful chemometric approach is proposed for the analysis of incomplete multiset data obtained by fusion of hyphenated liquid chromatographic DAD/MS data with UV spectrophotometric data from acid-base titration and kinetic degradation experiments. Column- and row-wise augmented data blocks were combined and simultaneously processed by means of a new version of the multivariate curve resolution-alternating least squares (MCR-ALS) technique, including the simultaneous analysis of incomplete multiset data from different instrumental techniques. The proposed procedure was applied to the detailed study of the kinetic photodegradation process of the amiloride (AML) drug. All chemical species involved in the degradation and equilibrium reactions were resolved and the pH dependent kinetic pathway described. Copyright © 2014 Elsevier B.V. All rights reserved.
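
    For orientation, the core alternating-least-squares idea behind MCR-ALS can be sketched on a single complete data matrix as below; the paper's contribution, handling incomplete column- and row-wise augmented multisets from several instruments, requires masked/augmented blocks that this toy version does not implement.

```python
import numpy as np

def mcr_als(D, n_components, n_iter=200, seed=0):
    """Bare-bones MCR-ALS on one complete data matrix D (rows = spectra recorded
    over time/pH, columns = wavelengths): alternate least-squares estimates of
    concentration profiles C and pure spectra S, with non-negativity imposed by
    clipping. Only a sketch of the bilinear model D ~ C S^T."""
    rng = np.random.default_rng(seed)
    C = np.abs(rng.normal(size=(D.shape[0], n_components)))
    for _ in range(n_iter):
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)    # spectra
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)  # profiles
    lack_of_fit = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
    return C, S, lack_of_fit
```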

  8. Vis-NIR spectrometric determination of Brix and sucrose in sugar production samples using kernel partial least squares with interval selection based on the successive projections algorithm.

    PubMed

    de Almeida, Valber Elias; de Araújo Gomes, Adriano; de Sousa Fernandes, David Douglas; Goicoechea, Héctor Casimiro; Galvão, Roberto Kawakami Harrop; Araújo, Mario Cesar Ugulino

    2018-05-01

    This paper proposes a new variable selection method for nonlinear multivariate calibration, combining the Successive Projections Algorithm for interval selection (iSPA) with the Kernel Partial Least Squares (Kernel-PLS) modelling technique. The proposed iSPA-Kernel-PLS algorithm is employed in a case study involving a Vis-NIR spectrometric dataset with complex nonlinear features. The analytical problem consists of determining Brix and sucrose content in samples from a sugar production system, on the basis of transflectance spectra. As compared to full-spectrum Kernel-PLS, the iSPA-Kernel-PLS models involve a smaller number of variables and display statistically significant superiority in terms of accuracy and/or bias in the predictions. Published by Elsevier B.V.

  9. Digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.; Stanley, W. D.; Harrington, R. F.

    1980-01-01

    A microprocessor based digital signal processing unit has been proposed to replace analog sections of a microwave radiometer. A brief introduction to the radiometer system involved and a description of problems encountered in the use of digital techniques in radiometer design are discussed. An analysis of the digital signal processor as part of the radiometer is then presented.

  10. 78 FR 67384 - 60-Day Notice of Proposed Information Collection: FHA-Insured Mortgage Loan Servicing Involving...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-12

    ... hearing or speech impairments may access this number through TTY by calling the toll-free Federal Relay... calling the toll-free Federal Relay Service at (800) 877-8339. Copies of available documents submitted to... techniques or other forms of information technology, e.g., permitting electronic submission of responses. HUD...

  11. Effects of motion on jet exhaust noise from aircraft

    NASA Technical Reports Server (NTRS)

    Chun, K. S.; Berman, C. H.; Cowan, S. J.

    1976-01-01

    The various problems involved in the evaluation of the jet noise field prevailing between an observer on the ground and an aircraft in flight in a typical takeoff or landing approach pattern were studied. Areas examined include: (1) literature survey and preliminary investigation, (2) propagation effects, (3) source alteration effects, and (4) investigation of verification techniques. Sixteen problem areas were identified and studied. Six follow-up programs were recommended for further work. The results and the proposed follow-on programs provide a practical general technique for predicting flyover jet noise for conventional jet nozzles.

  12. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  13. Nearest greedy for solving the waste collection vehicle routing problem: A case study

    NASA Astrophysics Data System (ADS)

    Mat, Nur Azriati; Benjamin, Aida Mauziah; Abdul-Rahman, Syariza; Wibowo, Antoni

    2017-11-01

    This paper presents a real case study of waste collection in the northern part of Malaysia using a constructive heuristic algorithm known as the Nearest Greedy (NG) technique. This technique has been widely used to construct initial solutions for vehicle routing problems. The waste collection cycle involves the following steps: (i) each vehicle starts from a depot, (ii) visits a number of customers to collect waste, (iii) unloads the waste at the disposal site, and (iv) returns to the depot. The sample data set used in this paper consisted of six areas, where each area involved up to 103 customers. The NG technique was employed to construct an initial route for each area. The solution produced by the technique was compared with the vehicle routes currently implemented by a waste collection company within the city. The comparison showed that NG offered better vehicle routes, with an 11.07% reduction in total distance traveled compared to the present routes.
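
    A minimal sketch of the NG construction rule is shown below; the coordinates are hypothetical, and capacity limits, multiple trips and the six-area data set are omitted.

```python
import math

def nearest_greedy_route(depot, customers, disposal):
    """Construct one collection route with the Nearest Greedy rule: from the
    depot, repeatedly visit the closest unserved customer, then unload at the
    disposal site and return to the depot."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    route, pos, unserved = [depot], depot, list(customers)
    total = 0.0
    while unserved:
        nxt = min(unserved, key=lambda c: dist(pos, c))   # greedy choice
        total += dist(pos, nxt)
        route.append(nxt)
        unserved.remove(nxt)
        pos = nxt
    total += dist(pos, disposal) + dist(disposal, depot)
    route += [disposal, depot]
    return route, total

# toy instance with made-up coordinates (km)
route, km = nearest_greedy_route((0, 0), [(2, 3), (5, 1), (1, 7)], (6, 6))
print(route, round(km, 2))
```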

  14. Integrating multi-criteria techniques with geographical information systems in waste facility location to enhance public participation.

    PubMed

    Higgs, Gary

    2006-04-01

    Despite recent U.K. Government commitments to encourage public participation in environmental decision making, the exercises conducted to date have been largely confined to 'traditional' modes of participation, such as the dissemination of information and the encouragement of feedback on proposals through, for example, questionnaires or surveys. It is the premise of this paper that participative approaches using IT-based methods, based on combined geographical information systems (GIS) and multi-criteria evaluation techniques that involve the public in the decision-making process, have the potential to build consensus and reduce the disputes and conflicts that arise, for example, from the siting of different types of waste facilities. The potential of these techniques is documented through a review of the existing literature in order to highlight the opportunities and challenges facing decision makers in increasing the involvement of the public at different stages of the waste facility management process. It is concluded that there are important lessons to be learned by researchers, consultants, managers and decision makers if the barriers hindering wider use of such techniques are to be overcome.

  15. Note: Model identification and analysis of bivalent analyte surface plasmon resonance data.

    PubMed

    Tiwari, Purushottam Babu; Üren, Aykut; He, Jin; Darici, Yesim; Wang, Xuewen

    2015-10-01

    Surface plasmon resonance (SPR) is a widely used, affinity based, label-free biophysical technique to investigate biomolecular interactions. The extraction of rate constants requires accurate identification of the particular binding model. The bivalent analyte model involves coupled non-linear differential equations. No clear procedure to identify the bivalent analyte mechanism has been established. In this report, we propose a unique signature for the bivalent analyte model. This signature can be used to distinguish the bivalent analyte model from other biphasic models. The proposed method is demonstrated using experimentally measured SPR sensorgrams.
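
    One common textbook parameterization of the bivalent analyte kinetics can be integrated as follows; conventions for the statistical factors vary between vendors and papers, so this is not necessarily the exact system used in the note.

```python
import numpy as np
from scipy.integrate import solve_ivp

def bivalent_analyte(t, y, C, ka1, kd1, ka2, kd2, Bmax):
    """Bivalent analyte model (one common parameterization): analyte A at constant
    concentration C binds a surface ligand once (AB) and can then bridge a second
    ligand (AB2). Free ligand sites: B = Bmax - AB - 2*AB2."""
    AB, AB2 = y
    B = Bmax - AB - 2.0 * AB2
    dAB = ka1 * C * B - kd1 * AB - ka2 * AB * B + kd2 * AB2
    dAB2 = ka2 * AB * B - kd2 * AB2
    return [dAB, dAB2]

# hypothetical rate constants and a 300 s association phase
t = np.linspace(0, 300, 600)
sol = solve_ivp(bivalent_analyte, (0, 300), [0.0, 0.0], t_eval=t,
                args=(10e-9, 1e5, 1e-2, 1e3, 1e-3, 100.0))
response = sol.y[0] + sol.y[1]        # SPR signal ~ total bound analyte
```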

  16. Multiple directed graph large-class multi-spectral processor

    NASA Technical Reports Server (NTRS)

    Casasent, David; Liu, Shiaw-Dong; Yoneyama, Hideyuki

    1988-01-01

    Numerical analysis techniques for the interpretation of high-resolution imaging-spectrometer data are described and demonstrated. The method proposed involves the use of (1) a hierarchical classifier with a tree structure generated automatically by a Fisher linear-discriminant-function algorithm and (2) a novel multiple-directed-graph scheme which reduces the local maxima and the number of perturbations required. Results for a 500-class test problem involving simulated imaging-spectrometer data are presented in tables and graphs; 100-percent-correct classification is achieved with an improvement factor of 5.

  17. Obfuscatable multi-recipient re-encryption for secure privacy-preserving personal health record services.

    PubMed

    Shi, Yang; Fan, Hongfei; Xiong, Guoyue

    2015-01-01

    With the rapid development of cloud computing techniques, it is attractive for personal health record (PHR) service providers to deploy their PHR applications and store the personal health data in the cloud. However, there could be a serious privacy leakage if the cloud-based system is breached by attackers, which makes it necessary for the PHR service provider to encrypt all patients' health data on cloud servers. Existing techniques are either insufficiently secure under circumstances where advanced threats are considered, or inefficient when many recipients are involved. Therefore, the objectives of our solution are (1) providing a secure implementation of re-encryption in white-box attack contexts and (2) assuring the efficiency of the implementation even in multi-recipient cases. We designed the multi-recipient re-encryption functionality by reusing randomness and protected the implementation by obfuscation. The proposed solution is secure even in white-box attack contexts. Furthermore, a comparison with other related work shows that the computational cost of the proposed solution is lower. The proposed technique can serve as a building block for supporting secure, efficient and privacy-preserving personal health record service systems.

  18. Forty-five degree cutting septoplasty.

    PubMed

    Hsiao, Yen-Chang; Chang, Chun-Shin; Chuang, Shiow-Shuh; Kolios, Georgios; Abdelrahman, Mohamed

    2016-01-01

    The crooked nose represents a challenge for rhinoplasty surgeons, and many methods have been proposed for its management; however, there is no ideal treatment. Accordingly, we propose the 45° cutting septoplasty technique, which involves a 45° cut at the junction of the L-shaped strut and repositioning of the strut to achieve a straight septum. From October 2010 to September 2014, 43 patients underwent the 45° cutting septoplasty technique. There were 28 men and 15 women, with ages ranging from 20 to 58 years (mean, 33 years). Standardized photographs were obtained at every visit. Established photogrammetric parameters were used to describe the degree of correction, defined as correction rate = (preoperative total deviation - postoperative residual deviation)/preoperative total deviation × 100%. The mean follow-up period for all patients was 12.3 months. The mean preoperative deviation was 64.3° and the mean postoperative deviation was 2.7°; the overall correction rate was 95.8%. One patient experienced composite implant deviation two weeks postoperatively and underwent revision rhinoplasty. There were no infections, hematomas or postoperative bleeding. Based on the clinical observation of all patients during the follow-up period, the 45° cutting septoplasty technique was shown to be effective for the treatment of the crooked nose.
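
    A quick check with the reported mean deviations reproduces the stated overall correction rate:

```latex
\text{Correction rate} \;=\; \frac{64.3^{\circ} - 2.7^{\circ}}{64.3^{\circ}} \times 100\% \;\approx\; 95.8\%
```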

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shumway, R.H.; McQuarrie, A.D.

    Robust statistical approaches to the problem of discriminating between regional earthquakes and explosions are developed. We compare linear discriminant analysis using descriptive features like amplitude and spectral ratios with signal discrimination techniques using the original signal waveforms and spectral approximations to the log likelihood function. Robust information theoretic techniques are proposed and all methods are applied to 8 earthquakes and 8 mining explosions in Scandinavia and to an event from Novaya Zemlya of unknown origin. It is noted that signal discrimination approaches based on discrimination information and Renyi entropy perform better in the test sample than conventional methods based on spectral ratios involving the P and S phases. Two techniques for identifying the ripple-firing pattern for typical mining explosions are proposed and shown to work well on simulated data and on several Scandinavian earthquakes and explosions. We use both cepstral analysis in the frequency domain and a time domain method based on the autocorrelation and partial autocorrelation functions. The proposed approach strips off underlying smooth spectral and seasonal spectral components corresponding to the echo pattern induced by two simple ripple-fired models. For two mining explosions, a pattern is identified whereas for two earthquakes, no pattern is evident.
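
    The cepstral half of the echo-pattern idea can be sketched as below: a periodic ripple-fired echo structure in the spectrum appears as a peak at the firing delay in the cepstrum. The detrending of smooth and seasonal spectral components described in the report is not reproduced, and the window limits are placeholders.

```python
import numpy as np

def cepstrum_echo_delay(signal, fs):
    """Real cepstrum of a seismogram segment. A ripple-fired explosion imposes a
    periodic ripple on the spectrum, which shows up as a cepstral peak at the
    firing delay (quefrency, in seconds)."""
    spectrum = np.abs(np.fft.rfft(signal))
    cep = np.fft.irfft(np.log(spectrum + 1e-12))
    quefrency = np.arange(cep.size) / fs
    lo = int(0.02 * fs)                      # skip the very low-quefrency ramp
    peak = lo + np.argmax(cep[lo:cep.size // 2])
    return quefrency[peak], cep, quefrency
```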

  20. A general procedure to generate models for urban environmental-noise pollution using feature selection and machine learning methods.

    PubMed

    Torija, Antonio J; Ruiz, Diego P

    2015-02-01

    The prediction of environmental noise in urban environments requires the solution of a complex and non-linear problem, since there are complex relationships among the multitude of variables involved in the characterization and modelling of environmental noise and environmental-noise magnitudes. Moreover, the inclusion of the great spatial heterogeneity characteristic of urban environments seems to be essential in order to achieve an accurate environmental-noise prediction in cities. This problem is addressed in this paper, where a procedure based on feature-selection techniques and machine-learning regression methods is proposed and applied. Three machine-learning regression methods, which are considered very robust in solving non-linear problems, are used to estimate the energy-equivalent sound-pressure level descriptor (LAeq). These three methods are: (i) multilayer perceptron (MLP), (ii) sequential minimal optimisation (SMO), and (iii) Gaussian processes for regression (GPR). In addition, because of the high number of input variables involved in environmental-noise modelling and estimation in urban environments, which make LAeq prediction models quite complex and costly in terms of time and resources for application to real situations, three different techniques are used to approach feature selection or data reduction. The feature-selection techniques used are: (i) correlation-based feature-subset selection (CFS) and (ii) wrapper for feature-subset selection (WFS), and the data reduction technique is principal-component analysis (PCA). The subsequent analysis leads to a proposal of different schemes, depending on the needs regarding data collection and accuracy. The use of WFS as the feature-selection technique with the implementation of SMO or GPR as the regression algorithm provides the best LAeq estimation (R^2 = 0.94 and mean absolute error (MAE) = 1.14-1.16 dB(A)). Copyright © 2014 Elsevier B.V. All rights reserved.
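
    A compressed sketch of one branch of the comparison (data reduction by PCA followed by Gaussian-process regression of LAeq, evaluated by cross-validated MAE) is given below using scikit-learn; the data are synthetic and the CFS and wrapper selectors are not shown.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical data: rows = street segments, columns = urban/traffic descriptors,
# target = measured LAeq in dB(A).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 25))
y = 60 + 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=1.0, size=300)

model = make_pipeline(StandardScaler(),
                      PCA(n_components=8),                       # data reduction
                      GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                               normalize_y=True))
mae = -cross_val_score(model, X, y, cv=5,
                       scoring="neg_mean_absolute_error").mean()
print(f"cross-validated MAE: {mae:.2f} dB(A)")
```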

  1. Detrended fluctuation analysis for major depressive disorder.

    PubMed

    Mumtaz, Wajid; Malik, Aamir Saeed; Ali, Syed Saad Azhar; Yasin, Mohd Azhar Mohd; Amin, Hafeezullah

    2015-01-01

    Clinical utility of Electroencephalography (EEG) based diagnostic studies is less clear for major depressive disorder (MDD). In this paper, a novel machine learning (ML) scheme is presented to discriminate MDD patients from healthy controls. The proposed method involves feature extraction, feature selection, classification and validation. The EEG data acquisition involved eyes closed (EC) and eyes open (EO) conditions. At the feature extraction stage, detrended fluctuation analysis (DFA) was performed on the EEG data to obtain scaling exponents; the DFA analyses the presence or absence of long-range temporal correlations (LRTC) in the recorded EEG. The scaling exponents were used as input features to the proposed system. At the feature selection stage, three different techniques were used for comparison purposes. A logistic regression (LR) classifier was employed, and the method was validated by 10-fold cross-validation. We also examined the effect of three different reference montages on the computed features. The results show that the DFA performed better on the LE data than on the IR and AR data, whereas in the Wilcoxon ranking the AR performed better than LE and IR. Based on the results, it was concluded that DFA provides useful information for discriminating MDD patients and, with further validation, could be employed in clinics for the diagnosis of MDD.
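
    The scaling-exponent feature itself is easy to sketch; the version below is a plain single-channel DFA with linear detrending and arbitrarily chosen window sizes, not the authors' exact pipeline.

```python
import numpy as np

def dfa_exponent(x, scales=None):
    """Detrended fluctuation analysis of one EEG channel: integrate the
    mean-removed signal, detrend it in windows of several sizes with a linear
    fit, and read the scaling exponent off the log-log slope of the fluctuation
    function. Values near 0.5 indicate uncorrelated noise, near 1.0 strong LRTC."""
    x = np.asarray(x, float)
    y = np.cumsum(x - x.mean())                       # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(2, np.log10(len(x) // 4), 15).astype(int))
    F = []
    for s in scales:
        n_win = len(y) // s
        segments = y[:n_win * s].reshape(n_win, s)
        t = np.arange(s)
        sq_res = []
        for seg in segments:
            coef = np.polyfit(t, seg, 1)              # local linear trend
            sq_res.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(sq_res)))            # RMS fluctuation at scale s
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope
```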

  2. Optimal PMU placement using topology transformation method in power systems.

    PubMed

    Rahman, Nadia H A; Zobaa, Ahmed F

    2016-09-01

    Optimal phasor measurement unit (PMU) placement involves minimizing the number of PMUs needed while ensuring that the entire power system is completely observable. A power system is observable when the voltages of all buses in the power system are known. This paper proposes selection rules for a topology transformation method that merges a zero-injection bus with one of its neighbors. The result of the merging process is influenced by which neighboring bus is selected to merge with the zero-injection bus. The proposed method determines the best candidate bus to merge with the zero-injection bus according to three rules, in order to determine the minimum number of PMUs required for full observability of the power system. In addition, this paper also considers the case of power flow measurements. The problem is formulated as an integer linear program (ILP). The proposed method is simulated in MATLAB for different IEEE bus systems and is demonstrated in detail on the IEEE 14-bus system. The results obtained in this paper prove the effectiveness of the proposed method, since the number of PMUs obtained is comparable with other available techniques.
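
    Before any topology transformation, the underlying placement problem is a small covering ILP; a sketch using SciPy's MILP interface is given below for a toy 5-bus system (zero-injection merging and power flow measurements are not modelled).

```python
import numpy as np
from scipy.optimize import Bounds, LinearConstraint, milp

def minimum_pmus(adjacency):
    """Basic PMU-placement covering ILP: place the fewest PMUs such that every
    bus is observed by a PMU on itself or on an adjacent bus."""
    A = np.asarray(adjacency, dtype=float)
    np.fill_diagonal(A, 1.0)                 # a PMU also observes its own bus
    n = A.shape[0]
    res = milp(c=np.ones(n),                                     # minimize PMU count
               constraints=LinearConstraint(A, lb=np.ones(n), ub=np.inf),
               integrality=np.ones(n),                           # 0/1 variables
               bounds=Bounds(0, 1))
    return np.flatnonzero(res.x > 0.5)

# toy 5-bus system with edges 0-1, 0-2, 1-2, 2-3, 3-4
adj = np.zeros((5, 5))
for i, j in [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]:
    adj[i, j] = adj[j, i] = 1
print(minimum_pmus(adj))                     # two PMUs suffice, e.g. buses 0 and 3
```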

  3. Forecasting conditional climate-change using a hybrid approach

    USGS Publications Warehouse

    Esfahani, Akbar Akbari; Friedel, Michael J.

    2014-01-01

    A novel approach is proposed to forecast the likelihood of climate-change across spatial landscape gradients. This hybrid approach involves reconstructing past precipitation and temperature using the self-organizing map technique; determining quantile trends in the climate-change variables by quantile regression modeling; and computing conditional forecasts of climate-change variables based on self-similarity in quantile trends using the fractionally differenced auto-regressive integrated moving average technique. The proposed modeling approach is applied to states (Arizona, California, Colorado, Nevada, New Mexico, and Utah) in the southwestern U.S., where conditional forecasts of climate-change variables are evaluated against recent (2012) observations, evaluated at a future time period (2030), and evaluated as future trends (2009–2059). These results have broad economic, political, and social implications because they quantify uncertainty in climate-change forecasts affecting various sectors of society. Another benefit of the proposed hybrid approach is that it can be extended to any spatiotemporal scale providing self-similarity exists.

  4. A New Data Representation Based on Training Data Characteristics to Extract Drug Name Entity in Medical Text

    PubMed Central

    Basaruddin, T.

    2016-01-01

    One essential task in information extraction from the medical corpus is drug name recognition. Compared with text sources from other domains, medical text mining poses more challenges, for example, more unstructured text, the fast growth of new terms, a wide range of name variations for the same drug, the lack of labeled datasets and external knowledge, and multiple token representations for a single drug name. Although many approaches have been proposed to address the task, performance has remained poor, with F-scores below 0.75. This paper presents a new treatment in data representation techniques to overcome some of those challenges. We propose three data representation techniques based on the characteristics of word distribution and word similarities as a result of word embedding training. The first technique is evaluated with the standard NN model, that is, MLP. The second technique involves two deep network classifiers, that is, DBN and SAE. The third technique represents the sentence as a sequence that is evaluated with a recurrent NN model, that is, LSTM. In extracting the drug name entities, the third technique gives the best F-score performance compared to the state of the art, with its average F-score being 0.8645. PMID:27843447

  5. A De-centralized Scheduling and Load Balancing Algorithm for Heterogeneous Grid Environments

    NASA Technical Reports Server (NTRS)

    Arora, Manish; Das, Sajal K.; Biswas, Rupak

    2002-01-01

    In the past two decades, numerous scheduling and load balancing techniques have been proposed for locally distributed multiprocessor systems. However, they all suffer from significant deficiencies when extended to a Grid environment: some use a centralized approach that renders the algorithm unscalable, while others assume the overhead involved in searching for appropriate resources to be negligible. Furthermore, classical scheduling algorithms do not consider a Grid node to be N-resource rich and merely work towards maximizing the utilization of one of the resources. In this paper, we propose a new scheduling and load balancing algorithm for a generalized Grid model of N-resource nodes that not only takes into account the node and network heterogeneity, but also considers the overhead involved in coordinating among the nodes. Our algorithm is decentralized, scalable, and overlaps the node coordination time with that of the actual processing of ready jobs, thus saving valuable clock cycles needed for making decisions. The proposed algorithm is studied by conducting simulations using the Message Passing Interface (MPI) paradigm.

  6. A De-Centralized Scheduling and Load Balancing Algorithm for Heterogeneous Grid Environments

    NASA Technical Reports Server (NTRS)

    Arora, Manish; Das, Sajal K.; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2002-01-01

    In the past two decades, numerous scheduling and load balancing techniques have been proposed for locally distributed multiprocessor systems. However, they all suffer from significant deficiencies when extended to a Grid environment: some use a centralized approach that renders the algorithm unscalable, while others assume the overhead involved in searching for appropriate resources to be negligible. Furthermore, classical scheduling algorithms do not consider a Grid node to be N-resource rich and merely work towards maximizing the utilization of one of the resources. In this paper we propose a new scheduling and load balancing algorithm for a generalized Grid model of N-resource nodes that not only takes into account the node and network heterogeneity, but also considers the overhead involved in coordinating among the nodes. Our algorithm is de-centralized, scalable, and overlaps the node coordination time with that of the actual processing of ready jobs, thus saving valuable clock cycles needed for making decisions. The proposed algorithm is studied by conducting simulations using the Message Passing Interface (MPI) paradigm.

  7. A proposal to encourage intuitive learning in a senior-level analogue electronics course

    NASA Astrophysics Data System (ADS)

    Berjano, E.; Lozano-Nieto, A.

    2011-05-01

    One of the most important issues in the reorganisation of engineering education is to consider new pedagogical techniques to help students develop skills and an adaptive expertise. This expertise consists of being able to recognise the nature of a problem intuitively, and also recognising recurring patterns in different types of problems. In the particular case of analogue electronics, an additional difficulty seems to be that understanding involves both analytic skills and an intuitive grasp of circuit characteristics. This paper presents a proposal to help senior students to think intuitively in order to identify the common issue involved in a group of problems of analogue electronics and build an abstract concept based on, for example, a theory or a mathematical model in order to use it to solve future problems. The preliminary results suggest that this proposal could be useful to promote intuitive reasoning in analogue electronics courses. The experience would later be useful to graduates in analytically solving new types of problems or in designing new electronic circuits.

  8. Non-randomized response model for sensitive survey with noncompliance.

    PubMed

    Wu, Qin; Tang, Man-Lai

    2016-12-01

    Collecting representative data on sensitive issues has long been problematic and challenging in public health prevalence investigations (e.g. non-suicidal self-injury), medical research (e.g. drug habits), social issue studies (e.g. history of child abuse), and their interdisciplinary studies (e.g. premarital sexual intercourse). Alternative data collection techniques that can be adopted to study sensitive questions validly have therefore become more important and necessary. As an alternative to the famous Warner randomized response model, the non-randomized response triangular model has recently been developed to encourage participants to provide truthful responses in surveys involving sensitive questions. Unfortunately, both randomized and non-randomized response models can underestimate the proportion of subjects with the sensitive characteristic, as some respondents do not believe that these techniques can protect their anonymity. As a result, some authors hypothesized that lack of trust and noncompliance should be highest among those who have the most to lose and the least use for the anonymity provided by these techniques. Some researchers noticed the existence of noncompliance and proposed new models to measure it in order to obtain reliable information. However, all proposed methods were based on randomized response models, which require randomizing devices, restrict the survey to face-to-face interviews, and lack reproducibility. Taking noncompliance into consideration, we introduce new non-randomized response techniques in which no covariate is required. Asymptotic properties of the proposed estimates for the sensitive characteristic as well as the noncompliance probabilities are developed. Our proposed techniques are empirically shown to yield accurate estimates for both sensitive and noncompliance probabilities. A real example about premarital sex among university students is used to demonstrate our methodologies. © The Author(s) 2014.

  9. Laser Doppler measurement techniques for spacecraft

    NASA Technical Reports Server (NTRS)

    Kinman, Peter W.; Gagliardi, Robert M.

    1986-01-01

    Two techniques are proposed for using laser links to measure the relative radial velocity of two spacecraft. The first technique determines the relative radial velocity from a measurement of the two-way Doppler shift on a transponded radio-frequency subcarrier. The subcarrier intensity-modulates reciprocating laser beams. The second technique determines the relative radial velocity from a measurement of the two-way Doppler shift on an optical frequency carrier which is transponded between spacecraft using optical Costas loops. The first technique might be used in conjunction with noncoherent optical communications, while the second technique is compatible with coherent optical communications. The first technique simultaneously exploits the diffraction advantage of laser beams and the maturity of radio-frequency phase-locked loop technology. The second technique exploits both the diffraction advantage of laser beams and the large Doppler effect at optical frequencies. The second technique has the potential for greater accuracy; unfortunately, it is more difficult to implement since it involves optical Costas loops.

  10. Combating Memory Corruption Attacks On Scada Devices

    NASA Astrophysics Data System (ADS)

    Bellettini, Carlo; Rrushi, Julian

    Memory corruption attacks on SCADA devices can cause significant disruptions to control systems and the industrial processes they operate. However, despite the presence of numerous memory corruption vulnerabilities, few, if any, techniques have been proposed for addressing the vulnerabilities or for combating memory corruption attacks. This paper describes a technique for defending against memory corruption attacks by enforcing logical boundaries between potentially hostile data and safe data in protected processes. The technique encrypts all input data using random keys; the encrypted data is stored in main memory and is decrypted according to the principle of least privilege just before it is processed by the CPU. The defensive technique affects the precision with which attackers can corrupt control data and pure data, protecting against code injection and arc injection attacks, and alleviating problems posed by the incomparability of mitigation techniques. An experimental evaluation involving the popular Modbus protocol demonstrates the feasibility and efficiency of the defensive technique.

  11. When to Intervene in Selective Mutism: The Multimodal Treatment of a Case of Persistent Selective Mutism.

    ERIC Educational Resources Information Center

    Powell, Shawn; Dalley, Mahlono

    1995-01-01

    An identification and treatment model differentiating transient mutism from persistent selective mutism is proposed. The case study of a six-year-old girl is presented, who was treated with a multimodal approach combining behavioral techniques with play therapy and family involvement. At posttreatment and follow-up, she was talking in a manner…

  12. Langevin Equation on Fractal Curves

    NASA Astrophysics Data System (ADS)

    Satin, Seema; Gangal, A. D.

    2016-07-01

    We analyze random motion of a particle on a fractal curve, using Langevin approach. This involves defining a new velocity in terms of mass of the fractal curve, as defined in recent work. The geometry of the fractal curve, plays an important role in this analysis. A Langevin equation with a particular model of noise is proposed and solved using techniques of the Fα-Calculus.

  13. Low-loss ultracompact optical power splitter using a multistep structure.

    PubMed

    Huang, Zhe; Chan, Hau Ping; Afsar Uddin, Mohammad

    2010-04-01

    We propose a low-loss ultracompact optical power splitter for broadband passive optical network applications. The design is based on a multistep structure involving a two-material (core/cladding) system. The performance of the proposed device was evaluated through the three-dimensional finite-difference beam propagation method. By using the proposed design, an excess loss of 0.4 dB was achieved at a full branching angle of 24 degrees. The wavelength-dependent loss was found to be less than 0.3 dB, and the polarization-dependent loss was less than 0.05 dB from O to L bands. The device offers the potential of being mass-produced using low-cost polymer-based embossing techniques.

  14. Negotiating behavioural change: therapists' proposal turns in Cognitive Behavioural Therapy.

    PubMed

    Ekberg, Katie; Lecouteur, Amanda

    2012-01-01

    Cognitive behavioural therapy (CBT) is an internationally recognised method for treating depression. However, many of the techniques involved in CBT are accomplished within the therapy interaction in diverse ways, and with varying consequences for the trajectory of the therapy session. This paper uses conversation analysis to examine some standard ways in which therapists propose suggestions for behavioural change to clients attending CBT sessions for depression in Australia. Therapists' proposal turns displayed their subordinate epistemic authority over the matter at hand, and emphasised a high degree of optionality on behalf of the client in accepting their suggestions. This practice was routinely accomplished via three standard proposal turns: (1) hedged recommendations; (2) interrogatives; and (3) information-giving. These proposal turns will be examined in relation to the negotiation of behavioural change, and the implications for CBT interactions between therapist and client will be discussed.

  15. Interactive segmentation of tongue contours in ultrasound video sequences using quality maps

    NASA Astrophysics Data System (ADS)

    Ghrenassia, Sarah; Ménard, Lucie; Laporte, Catherine

    2014-03-01

    Ultrasound (US) imaging is an effective and non invasive way of studying the tongue motions involved in normal and pathological speech, and the results of US studies are of interest for the development of new strategies in speech therapy. State-of-the-art tongue shape analysis techniques based on US images depend on semi-automated tongue segmentation and tracking techniques. Recent work has mostly focused on improving the accuracy of the tracking techniques themselves. However, occasional errors remain inevitable, regardless of the technique used, and the tongue tracking process must thus be supervised by a speech scientist who will correct these errors manually or semi-automatically. This paper proposes an interactive framework to facilitate this process. In this framework, the user is guided towards potentially problematic portions of the US image sequence by a segmentation quality map that is based on the normalized energy of an active contour model and automatically produced during tracking. When a problematic segmentation is identified, corrections to the segmented contour can be made on one image and propagated both forward and backward in the problematic subsequence, thereby improving the user experience. The interactive tools were tested in combination with two different tracking algorithms. Preliminary results illustrate the potential of the proposed framework, suggesting that the proposed framework generally improves user interaction time, with little change in segmentation repeatability.

  16. Teaching ethical analysis in occupational therapy.

    PubMed

    Haddad, A M

    1988-05-01

    Ethical decision making is a cognitive skill requiring education in ethical principles and an understanding of specific ethical issues. It is also a psychodynamic process involving personalities, values, opinions, and perceptions. This article proposes the use of case studies and role-playing techniques in teaching ethics in occupational therapy to supplement conventional methods of presenting ethical theories and principles. These two approaches invite students to discuss and analyze crucial issues in occupational therapy from a variety of viewpoints. The methodology for developing case studies and role-playing exercises is discussed. The techniques are evaluated and their application to the teaching of ethics is examined.

  17. [Is magnetic resonance imaging absolutely necessary for musculotendinous disease?].

    PubMed

    García González, Pedro; Meana Morís, Ana R

    2016-01-01

    Disorders of the musculoskeletal system are very prevalent in our society, especially those involving muscles and tendons, above all related to sports and work. These conditions are normally diagnosed and treated according to their clinical symptoms and signs, but a precise diagnosis is often necessary. The most widely used techniques for diagnosing these conditions are ultrasonography and magnetic resonance imaging. In this article, we propose ultrasonography as the technique of choice for diagnosing the most prevalent musculotendinous diseases, because it is accurate, versatile, dynamic, and effective. Copyright © 2015 SERAM. Published by Elsevier España, S.L.U. All rights reserved.

  18. History Matters: Incremental Ontology Reasoning Using Modules

    NASA Astrophysics Data System (ADS)

    Cuenca Grau, Bernardo; Halaschek-Wiener, Christian; Kazakov, Yevgeny

    The development of ontologies involves continuous but relatively small modifications. Existing ontology reasoners, however, do not take advantage of the similarities between different versions of an ontology. In this paper, we propose a technique for incremental reasoning—that is, reasoning that reuses information obtained from previous versions of an ontology—based on the notion of a module. Our technique does not depend on a particular reasoning calculus and thus can be used in combination with any reasoner. We have applied our results to incremental classification of OWL DL ontologies and found significant improvement over regular classification time on a set of real-world ontologies.

  19. Behavior driven testing in ALMA telescope calibration software

    NASA Astrophysics Data System (ADS)

    Gil, Juan P.; Garces, Mario; Broguiere, Dominique; Shen, Tzu-Chiang

    2016-07-01

    The ALMA software development cycle includes well-defined testing stages that involve developers, testers, and scientists. We adapted Behavior Driven Development (BDD) to testing activities applied to Telescope Calibration (TELCAL) software. BDD is an agile technique that encourages communication between roles by defining test cases in natural language that specify features and scenarios, which allows participants to share a common language and provides a high-level set of automated tests. This work describes how we implemented and maintain BDD testing for TELCAL, the infrastructure needed to support it, and proposals to expand this technique to other subsystems.
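
    As a rough, purely illustrative sketch of the Given/When/Then structure that BDD encourages (plain Python, not the actual ALMA/TELCAL test code; the FakeTelcalService class, method names, and values below are invented for this example):

```python
# Hypothetical BDD-style scenario written as plain Python steps so that
# developers, testers, and scientists can read the same specification.

class FakeTelcalService:
    """Stand-in for a calibration service; illustrative only."""
    def __init__(self):
        self.results = {}

    def submit_scan(self, scan_id, data):
        # Pretend to compute a phase calibration result for the scan.
        self.results[scan_id] = {"status": "DONE",
                                 "phase_rms": 0.5 * sum(data) / len(data)}

    def result(self, scan_id):
        return self.results[scan_id]

def given_a_running_telcal_service():
    return FakeTelcalService()

def when_a_calibration_scan_is_submitted(service, scan_id, data):
    service.submit_scan(scan_id, data)

def then_a_phase_calibration_result_is_published(service, scan_id):
    result = service.result(scan_id)
    assert result["status"] == "DONE"
    assert result["phase_rms"] >= 0.0

def test_phase_calibration_scenario():
    service = given_a_running_telcal_service()
    when_a_calibration_scan_is_submitted(service, "scan-001", [0.9, 1.1, 1.0])
    then_a_phase_calibration_result_is_published(service, "scan-001")

test_phase_calibration_scenario()
```

    In a real BDD setup the Given/When/Then text would live in feature files and be bound to step implementations by a BDD framework; the point of the sketch is only the shared, readable scenario structure.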

  20. A fast pulse design for parallel excitation with gridding conjugate gradient.

    PubMed

    Feng, Shuo; Ji, Jim

    2013-01-01

    Parallel excitation (pTx) is recognized as a crucial technique in high-field MRI to address the transmit field inhomogeneity problem. However, designing pTx pulses can be time consuming, which is undesirable. In this work, we propose a pulse design method with a gridding conjugate gradient (CG) approach based on the small-tip-angle approximation. The two major time-consuming matrix-vector multiplications are replaced by two operators that involve only FFT and gridding. Simulation results have shown that the proposed method is three times faster than the conventional method and that the memory cost is reduced by a factor of 1000.

  1. A Survey and Analysis of Frameworks and Framework Issues for Information Fusion Applications

    NASA Astrophysics Data System (ADS)

    Llinas, James

    This paper was stimulated by the proposed project for the Santander Bank-sponsored "Chairs of Excellence" program in Spain, of which the author is a recipient. That project involves research on characterizing a robust, problem-domain-agnostic framework in which Information Fusion (IF) processes of all descriptions, including artificial intelligence processes and techniques, could be developed. The paper describes the IF process and its requirements, a literature survey on IF frameworks, and a new proposed framework that will be implemented and evaluated at Universidad Carlos III de Madrid, Colmenarejo Campus.

  2. An intelligent content discovery technique for health portal content management.

    PubMed

    De Silva, Daswin; Burstein, Frada

    2014-04-23

    Continuous content management of health information portals is a feature vital for their sustainability and widespread acceptance. Knowledge and experience of a domain expert is essential for content management in the health domain. The rate of generation of online health resources is exponential, and thereby manual examination for relevance to a specific topic and audience is a formidable challenge for domain experts. Intelligent content discovery for effective content management is a less researched topic. An existing expert-endorsed content repository can provide the necessary leverage to automatically identify relevant resources and evaluate qualitative metrics. This paper reports on the design research towards an intelligent technique for automated content discovery and ranking for health information portals. The proposed technique aims to improve efficiency of the current mostly manual process of portal content management by utilising an existing expert-endorsed content repository as a supporting base and a benchmark to evaluate the suitability of new content. A model for content management was established based on a field study of potential users. The proposed technique is integral to this content management model and executes in several phases (i.e., query construction, content search, text analytics, and fuzzy multi-criteria ranking). The construction of multi-dimensional search queries with input from Wordnet, the use of multi-word and single-word terms as representative semantics for text analytics and the use of fuzzy multi-criteria ranking for subjective evaluation of quality metrics are original contributions reported in this paper. The feasibility of the proposed technique was examined with experiments conducted on an actual health information portal, the BCKOnline portal. Both intermediary and final results generated by the technique are presented in the paper and these help to establish the benefits of the technique and its contribution towards effective content management. The prevalence of large numbers of online health resources is a key obstacle for domain experts involved in content management of health information portals and websites. The proposed technique has proven successful at search and identification of resources and the measurement of their relevance. It can be used to support the domain expert in content management and thereby ensure the health portal is up-to-date and current.

  3. An intelligent clinical decision support system for patient-specific predictions to improve cervical intraepithelial neoplasia detection.

    PubMed

    Bountris, Panagiotis; Haritou, Maria; Pouliakis, Abraham; Margari, Niki; Kyrgiou, Maria; Spathis, Aris; Pappas, Asimakis; Panayiotides, Ioannis; Paraskevaidis, Evangelos A; Karakitsos, Petros; Koutsouris, Dimitrios-Dionyssios

    2014-01-01

    Nowadays, there are molecular biology techniques providing information related to cervical cancer and its cause: the human Papillomavirus (HPV), including DNA microarrays identifying HPV subtypes, mRNA techniques such as nucleic acid based amplification or flow cytometry identifying E6/E7 oncogenes, and immunocytochemistry techniques such as overexpression of p16. Each one of these techniques has its own performance, limitations and advantages, thus a combinatorial approach via computational intelligence methods could exploit the benefits of each method and produce more accurate results. In this article we propose a clinical decision support system (CDSS), composed by artificial neural networks, intelligently combining the results of classic and ancillary techniques for diagnostic accuracy improvement. We evaluated this method on 740 cases with complete series of cytological assessment, molecular tests, and colposcopy examination. The CDSS demonstrated high sensitivity (89.4%), high specificity (97.1%), high positive predictive value (89.4%), and high negative predictive value (97.1%), for detecting cervical intraepithelial neoplasia grade 2 or worse (CIN2+). In comparison to the tests involved in this study and their combinations, the CDSS produced the most balanced results in terms of sensitivity, specificity, PPV, and NPV. The proposed system may reduce the referral rate for colposcopy and guide personalised management and therapeutic interventions.

  4. An Intelligent Clinical Decision Support System for Patient-Specific Predictions to Improve Cervical Intraepithelial Neoplasia Detection

    PubMed Central

    Bountris, Panagiotis; Haritou, Maria; Pouliakis, Abraham; Margari, Niki; Kyrgiou, Maria; Spathis, Aris; Pappas, Asimakis; Panayiotides, Ioannis; Paraskevaidis, Evangelos A.; Karakitsos, Petros; Koutsouris, Dimitrios-Dionyssios

    2014-01-01

    Nowadays, there are molecular biology techniques providing information related to cervical cancer and its cause: the human Papillomavirus (HPV), including DNA microarrays identifying HPV subtypes, mRNA techniques such as nucleic acid based amplification or flow cytometry identifying E6/E7 oncogenes, and immunocytochemistry techniques such as overexpression of p16. Each one of these techniques has its own performance, limitations and advantages, thus a combinatorial approach via computational intelligence methods could exploit the benefits of each method and produce more accurate results. In this article we propose a clinical decision support system (CDSS), composed by artificial neural networks, intelligently combining the results of classic and ancillary techniques for diagnostic accuracy improvement. We evaluated this method on 740 cases with complete series of cytological assessment, molecular tests, and colposcopy examination. The CDSS demonstrated high sensitivity (89.4%), high specificity (97.1%), high positive predictive value (89.4%), and high negative predictive value (97.1%), for detecting cervical intraepithelial neoplasia grade 2 or worse (CIN2+). In comparison to the tests involved in this study and their combinations, the CDSS produced the most balanced results in terms of sensitivity, specificity, PPV, and NPV. The proposed system may reduce the referral rate for colposcopy and guide personalised management and therapeutic interventions. PMID:24812614

  5. RRW: repeated random walks on genome-scale protein networks for local cluster discovery

    PubMed Central

    Macropol, Kathy; Can, Tolga; Singh, Ambuj K

    2009-01-01

    Background We propose an efficient and biologically sensitive algorithm based on repeated random walks (RRW) for discovering functional modules, e.g., complexes and pathways, within large-scale protein networks. Compared to existing cluster identification techniques, RRW implicitly makes use of network topology, edge weights, and long range interactions between proteins. Results We apply the proposed technique on a functional network of yeast genes and accurately identify statistically significant clusters of proteins. We validate the biological significance of the results using known complexes in the MIPS complex catalogue database and well-characterized biological processes. We find that 90% of the created clusters have the majority of their catalogued proteins belonging to the same MIPS complex, and about 80% have the majority of their proteins involved in the same biological process. We compare our method to various other clustering techniques, such as the Markov Clustering Algorithm (MCL), and find a significant improvement in the RRW clusters' precision and accuracy values. Conclusion RRW, which is a technique that exploits the topology of the network, is more precise and robust in finding local clusters. In addition, it has the added flexibility of being able to find multi-functional proteins by allowing overlapping clusters. PMID:19740439
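
    A minimal sketch of the core idea, a random walk with restart on a weighted network, is given below (illustrative only; the adjacency matrix, seed node, and restart probability are invented, and this is not the authors' RRW implementation, which additionally handles repeated walks, cluster expansion, and overlapping clusters):

```python
import numpy as np

def random_walk_with_restart(adjacency, seed_nodes, restart_prob=0.3, tol=1e-8):
    """Stationary visit probabilities of a walk that restarts at seed_nodes."""
    adjacency = np.asarray(adjacency, dtype=float)
    # Column-normalise so each column sums to 1 (transition probabilities).
    col_sums = adjacency.sum(axis=0)
    col_sums[col_sums == 0] = 1.0
    transition = adjacency / col_sums
    n = adjacency.shape[0]
    restart = np.zeros(n)
    restart[list(seed_nodes)] = 1.0 / len(seed_nodes)
    p = restart.copy()
    while True:
        p_next = (1 - restart_prob) * transition @ p + restart_prob * restart
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next

# Toy weighted network: nodes 0-2 form a tight module, node 3 is peripheral.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(random_walk_with_restart(A, seed_nodes=[0]))
```

    Nodes with high stationary probability relative to the seed are natural candidates for membership in the seed's local cluster, which is the intuition the RRW approach builds on.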

  6. Reducing Water/Hull Drag By Injecting Air Into Grooves

    NASA Technical Reports Server (NTRS)

    Reed, Jason C.; Bushnell, Dennis M.; Weinstein, Leonard M.

    1991-01-01

    Proposed technique for reduction of friction drag on hydrodynamic body involves use of grooves and combinations of surfactants to control motion of layer on surface of such body. Surface contains many rows of side-by-side, evenly spaced, longitudinal grooves. Dimensions of grooves and sharpness of tips in specific case depend on conditions of flow about vessel. Requires much less air than does microbubble-injection method.

  7. A Proposed Methodology to Classify Frontier Capital Markets

    DTIC Science & Technology

    2011-07-31

    This project involves basic research in machine learning for classifying frontier capital markets. The algorithm consists of a unique binary classifier mechanism that combines three methods, including k-Nearest Neighbors (kNN) and ensemble classification techniques, and addresses capital market classification based on capital flows and trading architecture.

  8. Nonlinear adaptive control of grid-connected three-phase inverters for renewable energy applications

    NASA Astrophysics Data System (ADS)

    Mahdian-Dehkordi, N.; Namvar, M.; Karimi, H.; Piya, P.; Karimi-Ghartemani, M.

    2017-01-01

    Distributed generation (DG) units are often interfaced to the main grid using power electronic converters including voltage-source converters (VSCs). A VSC offers dc/ac power conversion, high controllability, and fast dynamic response. Because of the nonlinearities, uncertainties, and system parameter changes inherent in a grid-connected renewable DG system, conventional linear control methods cannot completely and efficiently address all control objectives. In this paper, a nonlinear adaptive control scheme based on an adaptive backstepping strategy is presented to control the operation of a grid-connected renewable DG unit. As compared to the popular vector control technique, the proposed controller offers smoother transient responses and a lower level of current distortion. The Lyapunov approach is used to establish global asymptotic stability of the proposed control system. A linearisation technique is employed to develop guidelines for parameter tuning of the controller. Extensive time-domain digital simulations are performed and presented to verify the performance of the proposed controller when employed in a VSC to control the operation of a two-stage DG unit and also that of a single-stage solar photovoltaic system. Desirable and superior performance of the proposed controller is observed.

  9. An integrated framework for detecting suspicious behaviors in video surveillance

    NASA Astrophysics Data System (ADS)

    Zin, Thi Thi; Tin, Pyke; Hama, Hiromitsu; Toriu, Takashi

    2014-03-01

    In this paper, we propose an integrated framework for detecting suspicious behaviors in video surveillance systems established in public places such as railway stations, airports, and shopping malls. In particular, people loitering suspiciously, objects left unattended, and the exchange of suspicious objects between persons are common security concerns in airports and other transit scenarios. Detecting these involves understanding the scene and events, analyzing human movements, recognizing controllable objects, and observing the effect of the human movement on those objects. In the proposed framework, a multiple-background modeling technique, a high-level motion feature extraction method, and embedded Markov chain models are integrated for detecting suspicious behaviors in real-time video surveillance systems. Specifically, the proposed framework employs a probability-based multiple-background modeling technique to detect moving objects. Then the velocity and distance measures are computed as the high-level motion features of interest. By using an integration of the computed features and the first passage time probabilities of the embedded Markov chain, the suspicious behaviors in video surveillance are analyzed for detecting loitering persons, objects left behind, and human interactions such as fighting. The proposed framework has been tested by using standard public datasets and our own video surveillance scenarios.

  10. Measurement of vibration using phase only correlation technique

    NASA Astrophysics Data System (ADS)

    Balachandar, S.; Vipin, K.

    2017-08-01

    A novel method for the measurement of vibration is proposed and demonstrated. The proposed setup is based on laser triangulation and consists of a line laser, the object under test, and a high-speed camera remotely controlled by software. The experiment involves launching a line-laser probe beam perpendicular to the axis of the vibrating object. The reflected probe beam is recorded by the high-speed camera. The dynamic position of the laser line in the camera plane is governed by the magnitude and frequency of the vibrating test object. Using the phase correlation technique, the maximum distance travelled by the laser line in the CCD plane is measured in pixels using MATLAB. The actual displacement of the object in mm is obtained by calibration. Using the displacement data over time, other vibration-associated quantities such as acceleration, velocity, and frequency are evaluated. Preliminary results of the proposed method are reported for accelerations from 1 g to 3 g and frequencies from 6 Hz to 26 Hz. The results closely match theoretical values. The advantage of the proposed method is that it is non-destructive, and the phase correlation algorithm allows subpixel displacements in the CCD plane to be measured with high accuracy.
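
    The core of phase-only correlation can be sketched in a few lines (a 1-D, integer-pixel version for illustration; the paper works on camera frames and reaches subpixel accuracy, and the synthetic profile and shift below are invented):

```python
import numpy as np

def phase_correlation_shift(signal_a, signal_b):
    """Estimate the circular shift (in pixels) of signal_a relative to signal_b."""
    fa = np.fft.fft(signal_a)
    fb = np.fft.fft(signal_b)
    cross_power = fa * np.conj(fb)
    cross_power /= np.abs(cross_power) + 1e-12   # keep only the phase
    correlation = np.fft.ifft(cross_power).real
    shift = int(np.argmax(correlation))
    if shift > len(signal_a) // 2:               # interpret large shifts as negative
        shift -= len(signal_a)
    return shift

x = np.zeros(256)
x[100:110] = 1.0                 # synthetic laser-line intensity profile
y = np.roll(x, 7)                # the same profile displaced by 7 pixels
print(phase_correlation_shift(y, x))   # -> 7
```

    Subpixel accuracy is typically obtained by fitting or interpolating around the correlation peak rather than taking the integer argmax, as done in this simplified sketch.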

  11. Fuzzy logic and optical correlation-based face recognition method for patient monitoring application in home video surveillance

    NASA Astrophysics Data System (ADS)

    Elbouz, Marwa; Alfalou, Ayman; Brosseau, Christian

    2011-06-01

    Home automation is being implemented into more and more domiciles of the elderly and disabled in order to maintain their independence and safety. For that purpose, we propose and validate a surveillance video system, which detects various posture-based events. One of the novel points of this system is the use of adapted VanderLugt correlator (VLC) and joint transform correlator (JTC) techniques to make decisions on the identity of a patient and his or her three-dimensional (3-D) position, in order to overcome the problem of crowded environments. We propose a fuzzy logic technique to make decisions about the subject's behavior. Our system is focused on the goals of accuracy, convenience, and cost, and in addition it does not require any devices attached to the subject. The system permits one to study and model subject responses to behavioral change intervention, because several levels of alarm can be incorporated according to the different situations considered. Our algorithm performs a fast 3-D recovery of the subject's head position by locating the eyes within the face image and involves model-based prediction and optical correlation techniques to guide the tracking procedure. The object detection is based on the (hue, saturation, value) color space. The system also involves an adapted fuzzy logic control algorithm to make a decision based on information given to the system. Furthermore, the principles described here are applicable to a very wide range of situations and robust enough to be implementable in ongoing experiments.

  12. Hardware realization of an SVM algorithm implemented in FPGAs

    NASA Astrophysics Data System (ADS)

    Wiśniewski, Remigiusz; Bazydło, Grzegorz; Szcześniak, Paweł

    2017-08-01

    The paper proposes a technique for hardware realization of space vector modulation (SVM) of state-function switching in a matrix converter (MC), oriented towards implementation in a single field-programmable gate array (FPGA). In an MC the SVM method is based on the instantaneous space-vector representation of input currents and output voltages. Traditional computation algorithms usually involve digital signal processors (DSPs), which must handle the large number of power transistors (18 transistors and 18 independent PWM outputs) and the "non-standard positions of control pulses" during the switching sequence. Recently, hardware implementations have become popular, since the computations can be executed much faster and more efficiently owing to the nature of digital devices (especially their concurrency). In the paper, we propose a hardware algorithm for SVM computation. In contrast to existing techniques, the presented solution applies the COordinate Rotation DIgital Computer (CORDIC) method to solve the trigonometric operations. Furthermore, the arithmetic modules (that is, sub-devices) used for intermediate calculations, such as code converters or sector selectors (for output voltages and input currents), are presented in detail. The proposed technique has been implemented as a design described in the Verilog hardware description language. Preliminary results of the logic implementation targeting a Xilinx FPGA (specifically, a low-cost device from the Artix-7 family) are also presented.
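
    For background, a minimal floating-point CORDIC rotation sketch is given below (illustrative only; the FPGA design described above would use fixed-point arithmetic, pipelining, and Verilog rather than this Python form, and the iteration count is an assumption):

```python
import math

def cordic_sin_cos(angle_rad, iterations=24):
    """Return (sin, cos) of angle_rad, |angle_rad| <= pi/2, via CORDIC rotations."""
    # Elementary rotation angles atan(2^-i) and the accumulated CORDIC gain.
    angles = [math.atan(2.0 ** -i) for i in range(iterations)]
    gain = 1.0
    for i in range(iterations):
        gain *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = 1.0, 0.0, angle_rad
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0          # rotate towards zero residual angle
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return y * gain, x * gain

print(cordic_sin_cos(math.pi / 6))   # roughly (0.5, 0.866)
```

    Only shifts, additions, and a small table of arctangent constants are needed, which is why CORDIC maps so well onto FPGA logic.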

  13. Optical asymmetric cryptography based on elliptical polarized light linear truncation and a numerical reconstruction technique.

    PubMed

    Lin, Chao; Shen, Xueju; Wang, Zhisong; Zhao, Cheng

    2014-06-20

    We demonstrate a novel optical asymmetric cryptosystem based on the principle of elliptical polarized light linear truncation and a numerical reconstruction technique. An array of linear polarizers is introduced to achieve linear truncation of the spatially resolved elliptical polarization distribution during image encryption. This encoding process can be characterized as confusion-based optical cryptography that involves no Fourier lens and no diffusion operation. Based on the Jones matrix formalism, the intensity transmittance for this truncation is deduced to perform elliptical polarized light reconstruction from two intensity measurements. Use of a quick response code makes the proposed cryptosystem practical, with versatile key sensitivity and fault tolerance. Both simulation and preliminary experimental results that support the theoretical analysis are presented. An analysis of the resistance of the proposed method against a known public-key attack is also provided.
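
    For reference, the textbook Jones matrix of an ideal linear polarizer, on which such a truncation step relies, can be written as follows (generic notation, not the specific formalism of the paper):

```latex
% Ideal linear polarizer with transmission axis at angle \theta,
% acting on an input Jones vector E_in.
P(\theta) =
\begin{pmatrix}
\cos^2\theta & \sin\theta\cos\theta \\
\sin\theta\cos\theta & \sin^2\theta
\end{pmatrix},
\qquad
E_{\mathrm{out}} = P(\theta)\, E_{\mathrm{in}},
\qquad
I_{\mathrm{out}} = \lVert E_{\mathrm{out}} \rVert^{2}.
```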

  14. Dating human skeletal remains: investigating the viability of measuring the equilibrium between 210Po and 210Pb as a means of estimating the post-mortem interval.

    PubMed

    Swift, B

    1998-11-30

    Estimating the post-mortem interval in skeletal remains is a notoriously difficult task; forensic pathologists often rely heavily upon experience in recognising morphological appearances. Previous techniques have involved measuring physical or chemical changes within the hydroxyapatite matrix, radiocarbon dating and 90Sr dating, though no individual test has been advocated. Within this paper it is proposed that measuring the equilibrium between two naturally occurring radio-isotopes, 210Po and 210Pb, and comparison with post-mortem examination samples would produce a new method of dating human skeletal remains. Possible limitations exist, notably the effect of diagenesis, time limitations and relative cost, though this technique could provide a relatively accurate means of determining the post-mortem interval. It is therefore proposed that a large study be undertaken to provide a calibration scale against which bones uncovered can be dated.
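
    As generic background (the standard parent-daughter activity relation, not the specific calibration model proposed in the paper, and neglecting the short-lived intermediate 210Bi), the activity of 210Po growing in from 210Pb follows the Bateman form:

```latex
A_{\mathrm{Po}}(t) \;=\;
A_{\mathrm{Pb}}(0)\,\frac{\lambda_{\mathrm{Po}}}{\lambda_{\mathrm{Po}}-\lambda_{\mathrm{Pb}}}
\left(e^{-\lambda_{\mathrm{Pb}} t}-e^{-\lambda_{\mathrm{Po}} t}\right)
\;+\;
A_{\mathrm{Po}}(0)\,e^{-\lambda_{\mathrm{Po}} t},
\qquad
A_{\mathrm{Pb}}(t) \;=\; A_{\mathrm{Pb}}(0)\,e^{-\lambda_{\mathrm{Pb}} t}.
```

    The activity ratio therefore drifts towards secular equilibrium at a rate set by the decay constants, which is why the measured degree of 210Po/210Pb disequilibrium could in principle serve as a clock for the post-mortem interval.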

  15. Distributed Adaptive Containment Control for a Class of Nonlinear Multiagent Systems With Input Quantization.

    PubMed

    Wang, Chenliang; Wen, Changyun; Hu, Qinglei; Wang, Wei; Zhang, Xiuyu

    2018-06-01

    This paper is devoted to distributed adaptive containment control for a class of nonlinear multiagent systems with input quantization. By employing a matrix factorization and a novel matrix normalization technique, some assumptions involving control gain matrices in existing results are relaxed. By fusing the techniques of sliding mode control and backstepping control, a two-step design method is proposed to construct controllers and, with the aid of neural networks, all system nonlinearities are allowed to be unknown. Moreover, a linear time-varying model and a similarity transformation are introduced to circumvent the obstacle brought by quantization, and the controllers need no information about the quantizer parameters. The proposed scheme is able to ensure the boundedness of all closed-loop signals and steer the containment errors into an arbitrarily small residual set. The simulation results illustrate the effectiveness of the scheme.

  16. Outfall siting with dye-buoy remote sensing of coastal circulation

    NASA Technical Reports Server (NTRS)

    Munday, J. C., Jr.; Welch, C. S.; Gordon, H. H.

    1978-01-01

    A dye-buoy remote sensing technique has been applied to estuarine siting problems that involve fine-scale circulation. Small hard cakes of sodium fluorescein and polyvinyl alcohol, in anchored buoys and low-windage current followers, dissolve to produce dye marks resolvable in 1:60,000 scale color and color infrared imagery. Lagrangian current vectors are determined from sequential photo coverage. Careful buoy placement reveals surface currents and submergence near fronts and convergence zones. The technique has been used in siting two sewage outfalls in Hampton Roads, Virginia: In case one, the outfall region during flood tide gathered floating materials in a convergence zone, which then acted as a secondary source during ebb; for better dispersion during ebb, the proposed outfall site was moved further offshore. In case two, flow during late flood was found to divide, with one half passing over shellfish beds; the proposed outfall site was consequently moved to keep effluent in the other half.

  17. Using an innovative criteria weighting tool for stakeholders involvement to rank MSW facility sites with the AHP.

    PubMed

    De Feo, Giovanni; De Gisi, Sabino

    2010-11-01

    The main aim of this study was to verify the efficacy of using an innovative criteria weighting tool (the "priority scale") for stakeholder involvement to rank a list of suitable municipal solid waste (MSW) facility sites with the multi-criteria decision-making (MCDM) technique known as the analytic hierarchy process (AHP). One of the main objectives of the study was to verify the behaviour of the "priority scale" with both technical and non-technical decision-makers. All over the world, the siting of MSW treatment or disposal plants is a complex process involving politicians, technicians, and citizens, where stakeholders who are not effectively involved strongly oppose (or even obstruct) the realization of new facilities. In this study, in order to pursue both the technical aim (select the best site) and the social aim (all the stakeholders have to make an informed contribution), the use of the "priority scale" is suggested as a tool to easily collect non-contradictory criteria preferences from the various decision-makers. Each decision-maker filled in a "priority scale", which was subsequently uploaded into the AHP tool in order to indirectly calculate the individual priority of alternatives given by each stakeholder (not using group aggregation techniques). The proposed method was applied to the siting of a composting plant in an area suffering from a serious MSW emergency, which has lasted for over 15 years, in the Campania Region, in Southern Italy. The best site (the "first choice") was taken as the one that appeared most often in first place in the decision-makers' ranking lists. The involved technical and non-technical decision-makers showed the same behaviour in (indirectly) selecting the best site as well as in terms of the most appraised criteria ("absence of areas of the highest value for natural habitats and species of plants and animals"). Moreover, they showed the same AHP inconsistency ratio as well as the same behaviour in comparison with a "balanced decision-maker" (who assigns identical weights to all the considered criteria). Therefore, the proposed criteria weighting tool could be widely and easily used for stakeholder involvement to rank MSW facility sites (or other kinds of alternatives) with the AHP or with other MCDM techniques, whether or not group aggregation methods are taken into consideration. Copyright © 2010 Elsevier Ltd. All rights reserved.
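
    To make the AHP step concrete, a minimal sketch of deriving criteria weights from a pairwise comparison matrix and checking the consistency ratio is given below (illustrative only; the 3x3 matrix is invented, and this is not the authors' tool, which collects preferences through the "priority scale" rather than direct pairwise judgments):

```python
import numpy as np

# Saaty's random consistency indices for small matrices (standard AHP values).
RANDOM_INDEX = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def ahp_weights(pairwise):
    """Criteria weights (principal eigenvector) and consistency ratio."""
    pairwise = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = int(np.argmax(eigvals.real))
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()
    n = pairwise.shape[0]
    lambda_max = eigvals[k].real
    consistency_index = (lambda_max - n) / (n - 1)
    consistency_ratio = consistency_index / RANDOM_INDEX[n]
    return weights, consistency_ratio

# Example: criterion A slightly preferred to B, strongly preferred to C.
M = [[1,   2,   5],
     [1/2, 1,   3],
     [1/5, 1/3, 1]]
w, cr = ahp_weights(M)
print(w, cr)   # weights sum to 1; CR below ~0.10 is usually taken as consistent
```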

  18. Arterial Mechanical Motion Estimation Based on a Semi-Rigid Body Deformation Approach

    PubMed Central

    Guzman, Pablo; Hamarneh, Ghassan; Ros, Rafael; Ros, Eduardo

    2014-01-01

    Arterial motion estimation in ultrasound (US) sequences is a hard task due to noise and discontinuities in the signal derived from US artifacts. Characterizing the mechanical properties of the artery is a promising novel imaging technique to diagnose various cardiovascular pathologies and a new way of obtaining relevant clinical information, such as determining the absence of the dicrotic peak, or estimating the Augmentation Index (AIx), the arterial pressure, or the arterial stiffness. One of the advantages of using US imaging is the non-invasive nature of the technique, unlike invasive techniques such as intravascular ultrasound (IVUS) or angiography, plus the relatively low cost of US units. In this paper, we propose a semi-rigid deformable method based on soft-body dynamics, realized by a hybrid motion approach combining cross-correlation and optical flow methods, to quantify the elasticity of the artery. We evaluate and compare different techniques (for instance optical flow methods) on which our approach is based. The goal of this comparative study is to identify the best model to be used and the impact of the accuracy of these different stages in the proposed method. To this end, an exhaustive assessment has been conducted in order to decide which model is the most appropriate for registering the variation of the arterial diameter over time. Our experiments involved a total of 1620 evaluations within nine simulated sequences of 84 frames each and the estimation of four error metrics. We conclude that our proposed approach obtains approximately 2.5 times higher accuracy than conventional state-of-the-art techniques. PMID:24871987

  19. GRID and Multiphonon States

    PubMed Central

    Robinson, S. J.

    2000-01-01

    The development of the GRID technique for determining nuclear level lifetimes of excited low-spin states populated in thermal neutron capture reactions has resulted in the ability to perform detailed studies of proposed multiphonon excitations for the first time. This paper discusses the experimental evidence for multiphonon excitations determined using the GRID technique. In deformed nuclei several good examples of γγKπ = 4+ excitations have been established, whereas the experimental evidence gathered on Kπ= 0+ bands is contradictory, and any interpretations will likely involve the mixing of several different configurations. In vibrational nuclei the GRID technique has helped to establish the existence of multiple quadrupole phonon excitations in 114Cd, and an almost complete set of quadrupole-octupole coupled states in 144Nd. PMID:27551594

  20. A Sub-filter Scale Noise Equation for Hybrid LES Simulations

    NASA Technical Reports Server (NTRS)

    Goldstein, Marvin E.

    2006-01-01

    Hybrid LES/subscale modeling approaches have an important advantage over the current noise prediction methods in that they only involve modeling of the relatively universal subscale motion and not the configuration-dependent larger scale turbulence. Previous hybrid approaches use approximate statistical techniques or extrapolation methods to obtain the requisite information about the sub-filter scale motion. An alternative approach would be to adopt the modeling techniques used in the current noise prediction methods and determine the unknown stresses from experimental data. The present paper derives an equation for predicting the subscale sound from information that can be obtained with currently available experimental procedures. The resulting prediction method would then be intermediate between the current noise prediction codes and previously proposed hybrid techniques.

  1. Multivariate statistical analysis software technologies for astrophysical research involving large data bases

    NASA Technical Reports Server (NTRS)

    Djorgovski, George

    1993-01-01

    The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.

  2. Multivariate statistical analysis software technologies for astrophysical research involving large data bases

    NASA Technical Reports Server (NTRS)

    Djorgovski, Stanislav

    1992-01-01

    The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.

  3. A hybrid nonlinear programming method for design optimization

    NASA Technical Reports Server (NTRS)

    Rajan, S. D.

    1986-01-01

    Solutions to engineering design problems formulated as nonlinear programming (NLP) problems usually require the use of more than one optimization technique. Moreover, the interaction between the user (analysis/synthesis) program and the NLP system can lead to interface, scaling, or convergence problems. An NLP solution system is presented that seeks to solve these problems by providing a programming system to ease the user-system interface. A simple set of rules is used to select an optimization technique or to switch from one technique to another in an attempt to detect, diagnose, and solve some potential problems. Numerical examples involving finite element based optimal design of space trusses and rotor bearing systems are used to illustrate the applicability of the proposed methodology.

  4. Testing single point incremental forming moulds for rotomoulding operations

    NASA Astrophysics Data System (ADS)

    Afonso, Daniel; de Sousa, Ricardo Alves; Torcato, Ricardo

    2017-10-01

    Low-pressure polymer processes such as thermoforming or rotational moulding use much simpler moulds than high-pressure processes like injection moulding. However, despite the low forces involved in the process, mould manufacturing for these applications is still a very material-, energy- and time-consuming operation. Particularly in rotational moulding there is no standard for mould manufacture, and very different techniques are applicable. The goal of this research is to develop and validate a method for manufacturing plastically formed sheet metal moulds by single point incremental forming (SPIF) for rotomoulding and rotocasting operations. A Stewart-platform-based SPIF machine allows the forming of thick metal sheets, granting the required structural stiffness for the mould surface while keeping a short manufacturing lead time and low thermal inertia. The experimental work involves the proposal of a hollow part, the design and fabrication of a sheet metal mould using dieless incremental forming techniques, and testing of its operation in the production of prototype parts.

  5. A novel technique for ventral orbital stabilization: the masseter muscle flap.

    PubMed

    Sivagurunathan, Amilan; Boy, Sonja C; Steenkamp, Gerhard

    2014-01-01

    Loss of the caudal maxilla and ventral orbit after tumor resections can have negative functional and esthetic influences on the eye involved. This article reports on a case of a caudal maxillary acanthomatous ameloblastoma involving the ventral orbit that was resected and stabilized with a masseter muscle flap. The masseter muscle flap was generated from the superficial belly of the masseter muscle in order to close a defect in the orbital rim, created by a caudal maxillectomy. None of the published complications such as enophthalmos, excessive lacrimation, globe deviation, or strabismus were noted, 8 months following the procedure. The only clinical sign present at the time of re-evaluation was mild lacrimation. The authors propose the use of a masseter muscle flap as a viable technique in stabilizing the ventral orbit after caudal maxillectomy and ventral orbitectomy, preventing the complications associated with this surgery. © 2013 American College of Veterinary Ophthalmologists.

  6. Automatic three-dimensional measurement of large-scale structure based on vision metrology.

    PubMed

    Zhu, Zhaokun; Guan, Banglei; Zhang, Xiaohu; Li, Daokui; Yu, Qifeng

    2014-01-01

    All relevant key techniques involved in photogrammetric vision metrology for fully automatic 3D measurement of large-scale structure are studied. A new kind of coded target consisting of circular retroreflective discs is designed, and corresponding detection and recognition algorithms based on blob detection and clustering are presented. Then a three-stage strategy starting with view clustering is proposed to achieve automatic network orientation. As for matching of noncoded targets, the concept of matching path is proposed, and matches for each noncoded target are found by determination of the optimal matching path, based on a novel voting strategy, among all possible ones. Experiments on the fixed keel of an airship have been conducted to verify the effectiveness and measuring accuracy of the proposed methods.

  7. Spectral-Spatial Classification of Hyperspectral Images Using Hierarchical Optimization

    NASA Technical Reports Server (NTRS)

    Tarabalka, Yuliya; Tilton, James C.

    2011-01-01

    A new spectral-spatial method for hyperspectral data classification is proposed. For a given hyperspectral image, probabilistic pixelwise classification is first applied. Then, a hierarchical step-wise optimization algorithm is performed by iteratively merging the neighboring regions with the smallest Dissimilarity Criterion (DC) and recomputing class labels for the new regions. The DC is computed by comparing region mean vectors, class labels, and the number of pixels in the two regions under consideration. The algorithm converges when all pixels have been involved in the region-merging procedure. Experimental results are presented on two remote sensing hyperspectral images acquired by the AVIRIS and ROSIS sensors. The proposed approach improves classification accuracies and provides maps with more homogeneous regions, when compared to previously proposed classification techniques.
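
    A toy sketch of the hierarchical merging step is shown below (illustrative only: the region means, sizes, adjacency, size-weighted dissimilarity criterion, and stopping rule are simplified assumptions, not the authors' exact DC or convergence test):

```python
import numpy as np

def merge_regions(means, sizes, adjacency, n_final):
    """Merge adjacent regions until only n_final remain."""
    means = {i: np.asarray(m, dtype=float) for i, m in enumerate(means)}
    sizes = dict(enumerate(sizes))
    adjacency = {i: set(nbrs) for i, nbrs in enumerate(adjacency)}
    while len(means) > n_final:
        # Dissimilarity: size-weighted squared distance between mean vectors.
        best = min(((i, j) for i in adjacency for j in adjacency[i] if i < j),
                   key=lambda ij: (sizes[ij[0]] * sizes[ij[1]] /
                                   (sizes[ij[0]] + sizes[ij[1]])) *
                                  np.sum((means[ij[0]] - means[ij[1]]) ** 2))
        i, j = best
        total = sizes[i] + sizes[j]
        means[i] = (sizes[i] * means[i] + sizes[j] * means[j]) / total
        sizes[i] = total
        adjacency[i] = (adjacency[i] | adjacency[j]) - {i, j}
        for k in adjacency[j]:
            adjacency[k].discard(j)
            if k != i:
                adjacency[k].add(i)
        del means[j], sizes[j], adjacency[j]
    return means, sizes

# Four regions in a toy 2-band image: 0-1 and 2-3 are spectrally similar pairs.
means = [[0.10, 0.20], [0.12, 0.21], [0.80, 0.70], [0.82, 0.69]]
sizes = [50, 40, 30, 60]
adjacency = [{1, 2}, {0, 3}, {0, 3}, {1, 2}]
print(merge_regions(means, sizes, adjacency, n_final=2))
```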

  8. Failure Analysis for Composition of Web Services Represented as Labeled Transition Systems

    NASA Astrophysics Data System (ADS)

    Nadkarni, Dinanath; Basu, Samik; Honavar, Vasant; Lutz, Robyn

    The Web service composition problem involves the creation of a choreographer that provides the interaction between a set of component services to realize a goal service. Several methods have been proposed and developed to address this problem. In this paper, we consider those scenarios where the composition process may fail due to incomplete specification of goal service requirements or due to the fact that the user is unaware of the functionality provided by the existing component services. In such cases, it is desirable to have a composition algorithm that can provide feedback to the user regarding the cause of failure in the composition process. Such feedback will help guide the user to re-formulate the goal service and iterate the composition process. We propose a failure analysis technique for composition algorithms that views Web service behavior as multiple sequences of input/output events. Our technique identifies the possible cause of composition failure and suggests possible recovery options to the user. We discuss our technique using a simple e-Library Web service in the context of the MoSCoE Web service composition framework.

  9. Patient-specific bronchoscopy visualization through BRDF estimation and disocclusion correction.

    PubMed

    Chung, Adrian J; Deligianni, Fani; Shah, Pallav; Wells, Athol; Yang, Guang-Zhong

    2006-04-01

    This paper presents an image-based method for virtual bronchoscope with photo-realistic rendering. The technique is based on recovering bidirectional reflectance distribution function (BRDF) parameters in an environment where the choice of viewing positions, directions, and illumination conditions are restricted. Video images of bronchoscopy examinations are combined with patient-specific three-dimensional (3-D) computed tomography data through two-dimensional (2-D)/3-D registration and shading model parameters are then recovered by exploiting the restricted lighting configurations imposed by the bronchoscope. With the proposed technique, the recovered BRDF is used to predict the expected shading intensity, allowing a texture map independent of lighting conditions to be extracted from each video frame. To correct for disocclusion artefacts, statistical texture synthesis was used to recreate the missing areas. New views not present in the original bronchoscopy video are rendered by evaluating the BRDF with different viewing and illumination parameters. This allows free navigation of the acquired 3-D model with enhanced photo-realism. To assess the practical value of the proposed technique, a detailed visual scoring that involves both real and rendered bronchoscope images is conducted.

  10. Efficient Feature Selection and Classification of Protein Sequence Data in Bioinformatics

    PubMed Central

    Faye, Ibrahima; Samir, Brahim Belhaouari; Md Said, Abas

    2014-01-01

    Bioinformatics has been an emerging area of research for the last three decades. The ultimate aims of bioinformatics are to store and manage biological data, and to develop and analyze computational tools to enhance their understanding. The size of data accumulated under various sequencing projects is increasing exponentially, which presents difficulties for the experimental methods. To reduce the gap between newly sequenced proteins and proteins with known functions, many computational techniques involving classification and clustering algorithms have been proposed in the past. The classification of protein sequences into existing superfamilies is helpful in predicting the structure and function of the large number of newly discovered proteins. The existing classification results are unsatisfactory due to the huge number of features obtained through various feature-encoding methods. In this work, a statistical metric-based feature selection technique has been proposed in order to reduce the size of the extracted feature vector. The proposed method of protein classification shows significant improvement in terms of performance measure metrics: accuracy, sensitivity, specificity, recall, F-measure, and so forth. PMID:25045727
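
    As a hedged illustration of a statistical metric-based feature selection step (a simple Fisher-score filter on synthetic data; the actual metric, feature encoding, and thresholds used in the paper may differ):

```python
import numpy as np

def fisher_scores(X, y):
    """Score each column of X by between-class over within-class variance."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    overall_mean = X.mean(axis=0)
    numerator = np.zeros(X.shape[1])
    denominator = np.zeros(X.shape[1])
    for label in np.unique(y):
        Xc = X[y == label]
        numerator += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        denominator += len(Xc) * Xc.var(axis=0)
    return numerator / (denominator + 1e-12)

rng = np.random.default_rng(0)
y = np.repeat([0, 1], 50)                 # two protein superfamilies (toy labels)
X = rng.normal(size=(100, 20))            # 20 encoded features per sequence
X[y == 1, 3] += 2.0                       # make feature 3 genuinely informative
scores = fisher_scores(X, y)
selected = np.argsort(scores)[::-1][:5]   # keep the 5 highest-scoring features
print(selected)
```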

  11. Focusing optical waves with a rotationally symmetric sharp-edge aperture

    NASA Astrophysics Data System (ADS)

    Hu, Yanwen; Fu, Shenhe; Li, Zhen; Yin, Hao; Zhou, Jianying; Chen, Zhenqiang

    2018-04-01

    While various kinds of patterned structures have been proposed for wave focusing, these patterned structures usually involve complicated lithographic techniques, since the element size of the patterned structures must be precisely controlled at the microscale or even nanoscale. Here we propose a new and straightforward method for focusing an optical plane wave in free space with a rotationally symmetric sharp-edge aperture. The focusing of the wave is realized by the superposition of a portion of the higher-order symmetric plane waves generated from the sharp edges of the apertures, in contrast to previous focusing techniques, which usually depend on a curved phase. We demonstrate both experimentally and theoretically the focusing effect with a series of apertures having different rotational symmetries, and find that the intensity of the hotspots can be controlled by the symmetric strength of the sharp-edge apertures. The presented results challenge the conventional wisdom that light diffracts in all directions and spreads when it propagates through an aperture. The proposed method is easy to implement and might open potential applications in interferometry, imaging, and superresolution.

  12. Design of a Programmable Gain, Temperature Compensated Current-Input Current-Output CMOS Logarithmic Amplifier.

    PubMed

    Ming Gu; Chakrabartty, Shantanu

    2014-06-01

    This paper presents the design of a programmable gain, temperature compensated, current-mode CMOS logarithmic amplifier that can be used for biomedical signal processing. Unlike conventional logarithmic amplifiers that use a transimpedance technique to generate a voltage signal as a logarithmic function of the input current, the proposed approach directly produces a current output as a logarithmic function of the input current. Also, unlike a conventional transimpedance amplifier the gain of the proposed logarithmic amplifier can be programmed using floating-gate trimming circuits. The synthesis of the proposed circuit is based on the Hart's extended translinear principle which involves embedding a floating-voltage source and a linear resistive element within a translinear loop. Temperature compensation is then achieved using a translinear-based resistive cancelation technique. Measured results from prototypes fabricated in a 0.5 μm CMOS process show that the amplifier has an input dynamic range of 120 dB and a temperature sensitivity of 230 ppm/°C (27 °C- 57°C), while consuming less than 100 nW of power.

  13. Signal Processing and Interpretation Using Multilevel Signal Abstractions.

    DTIC Science & Technology

    1986-06-01

    Previously proposed causal analysis techniques for diagnosis are based on the analysis of intermediate data. Microphone data can be processed either as individual one-dimensional waveforms or as multichannel data for source detection and direction determination. The signal processing for both spectral analysis of microphone signals and direction determination of acoustic sources involves multiple levels of signal abstraction.

  14. Solar-terrestrial research for the 1980's

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The solar-terrestrial system is described. Techniques for observations involving all relevant platforms: spacecraft, the Earth's surface, aircraft, balloons, and rockets are proposed. The need for interagency coordination of programs, efficient data management, theoretical studies and modeling, the continuity of long time series observations, and innovative instrument design is emphasized. Examples of the practical impact of interactions between solar terrestrial phenomena and the environment, including technological systems are presented.

  15. Fractional Programming for Communication Systems—Part I: Power Control and Beamforming

    NASA Astrophysics Data System (ADS)

    Shen, Kaiming; Yu, Wei

    2018-05-01

    This two-part paper explores the use of fractional programming (FP) in the design and optimization of communication systems. Part I of this paper focuses on FP theory and on solving continuous problems. The main theoretical contribution is a novel quadratic transform technique for tackling the multiple-ratio concave-convex FP problem, in contrast to conventional FP techniques, which mostly can only deal with the single-ratio or max-min-ratio case. Multiple-ratio FP problems are important for the optimization of communication networks, because system-level design often involves multiple signal-to-interference-plus-noise ratio terms. This paper considers the applications of FP to solving continuous problems in communication system design, particularly for power control, beamforming, and energy efficiency maximization. These application cases illustrate that the proposed quadratic transform can greatly facilitate optimization involving ratios by recasting the original nonconvex problem as a sequence of convex problems. This FP-based problem reformulation gives rise to an efficient iterative optimization algorithm with provable convergence to a stationary point. The paper further demonstrates close connections between the proposed FP approach and other well-known algorithms in the literature, such as the fixed-point iteration and the weighted minimum mean-square-error beamforming. The optimization of discrete problems is discussed in Part II of this paper.
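
    The quadratic transform at the heart of this approach can be summarised as follows (stated here in its commonly cited form for concave-convex ratios; the notation is generic, and conditions such as A_m(x) >= 0 and B_m(x) > 0 are assumed):

```latex
% Single-ratio case: introduce an auxiliary variable y.
\max_{x}\; \frac{A(x)}{B(x)}
\quad\Longleftrightarrow\quad
\max_{x,\,y}\; 2y\sqrt{A(x)} - y^{2}B(x),
\qquad
y^{\star} = \frac{\sqrt{A(x)}}{B(x)} .

% Multiple-ratio (sum-of-ratios) case, e.g. a sum of SINR-type terms:
\max_{x}\; \sum_{m=1}^{M} \frac{A_m(x)}{B_m(x)}
\quad\Longleftrightarrow\quad
\max_{x,\,\mathbf{y}}\; \sum_{m=1}^{M}\bigl(2y_m\sqrt{A_m(x)} - y_m^{2}B_m(x)\bigr).
```

    Alternating between updating the auxiliary variables in closed form and solving the resulting convex problem in x yields the iterative algorithm whose convergence to a stationary point is discussed in the abstract above.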

  16. Optical flow and driver's kinematics analysis for state of alert sensing.

    PubMed

    Jiménez-Pinto, Javier; Torres-Torriti, Miguel

    2013-03-28

    Road accident statistics from different countries show that a significant number of accidents occur due to driver's fatigue and lack of awareness to traffic conditions. In particular, about 60% of the accidents in which long haul truck and bus drivers are involved are attributed to drowsiness and fatigue. It is thus fundamental to improve non-invasive systems for sensing a driver's state of alert. One of the main challenges to correctly resolve the state of alert is measuring the percentage of eyelid closure over time (PERCLOS), despite the driver's head and body movements. In this paper, we propose a technique that involves optical flow and driver's kinematics analysis to improve the robustness of the driver's alert state measurement under pose changes using a single camera with near-infrared illumination. The proposed approach infers and keeps track of the driver's pose in 3D space in order to ensure that eyes can be located correctly, even after periods of partial occlusion, for example, when the driver stares away from the camera. Our experiments show the effectiveness of the approach with a correct eyes detection rate of 99.41%, on average. The results obtained with the proposed approach in an experiment involving fifteen persons under different levels of sleep deprivation also confirm the discriminability of the fatigue levels. In addition to the measurement of fatigue and drowsiness, the pose tracking capability of the proposed approach has potential applications in distraction assessment and alerting of machine operators.
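
    The PERCLOS measure itself is straightforward to compute once a per-frame eyelid openness signal is available; a minimal sketch is given below (the openness signal, closure threshold, and window length are invented for illustration and are not the values used by the authors):

```python
import numpy as np

def perclos(eye_openness, closed_threshold=0.2, window_frames=900):
    """Percentage of the last window_frames in which openness < closed_threshold."""
    eye_openness = np.asarray(eye_openness, dtype=float)
    window = eye_openness[-window_frames:]
    return 100.0 * np.mean(window < closed_threshold)

# Synthetic openness signal in [0, 1]: 1.0 means fully open eyelids.
openness = np.clip(np.random.default_rng(1).normal(0.6, 0.25, size=1800), 0, 1)
print(f"PERCLOS over last window: {perclos(openness):.1f}%")
```

    The difficult part, which the paper addresses, is producing a reliable openness signal despite head and body movements; the statistic itself is a simple thresholded fraction over a sliding window.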

  17. Optical Flow and Driver's Kinematics Analysis for State of Alert Sensing

    PubMed Central

    Jiménez-Pinto, Javier; Torres-Torriti, Miguel

    2013-01-01

    Road accident statistics from different countries show that a significant number of accidents occur due to driver's fatigue and lack of awareness to traffic conditions. In particular, about 60% of the accidents in which long haul truck and bus drivers are involved are attributed to drowsiness and fatigue. It is thus fundamental to improve non-invasive systems for sensing a driver's state of alert. One of the main challenges to correctly resolve the state of alert is measuring the percentage of eyelid closure over time (PERCLOS), despite the driver's head and body movements. In this paper, we propose a technique that involves optical flow and driver's kinematics analysis to improve the robustness of the driver's alert state measurement under pose changes using a single camera with near-infrared illumination. The proposed approach infers and keeps track of the driver's pose in 3D space in order to ensure that eyes can be located correctly, even after periods of partial occlusion, for example, when the driver stares away from the camera. Our experiments show the effectiveness of the approach with a correct eyes detection rate of 99.41%, on average. The results obtained with the proposed approach in an experiment involving fifteen persons under different levels of sleep deprivation also confirm the discriminability of the fatigue levels. In addition to the measurement of fatigue and drowsiness, the pose tracking capability of the proposed approach has potential applications in distraction assessment and alerting of machine operators. PMID:23539029

  18. Remote monitoring of environmental particulate pollution - A problem in inversion of first-kind integral equations

    NASA Technical Reports Server (NTRS)

    Fymat, A. L.

    1975-01-01

    The determination of the microstructure, chemical nature, and dynamical evolution of scattering particulates in the atmosphere is considered. A description is given of indirect sampling techniques which can circumvent most of the difficulties associated with direct sampling techniques, taking into account methods based on scattering, extinction, and diffraction of an incident light beam. Approaches for reconstructing the particulate size distribution from the direct and the scattered radiation are discussed. A new method is proposed for determining the chemical composition of the particulates and attention is given to the relevance of methods of solution involving first kind Fredholm integral equations.
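
    Schematically, the retrieval problem referred to here takes the form of a Fredholm integral equation of the first kind (generic notation, not the report's own symbols):

```latex
g(\theta) \;=\; \int_{r_{\min}}^{r_{\max}} K(\theta, r)\, n(r)\, \mathrm{d}r ,
```

    where g is the measured optical signal (from scattering, extinction, or diffraction), K is the kernel fixed by the measurement physics, and n(r) is the unknown particulate size distribution to be reconstructed; the ill-posedness of inverting such equations is what motivates the careful solution methods discussed above.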

  19. Comet composition and density analyzer

    NASA Technical Reports Server (NTRS)

    Clark, B. C.

    1982-01-01

    Distinctions between cometary material and other extraterrestrial materials (meteorite suites and stratospherically-captured cosmic dust) are addressed. The work involves the technique of X-ray fluorescence (XRF) for analysis of elemental composition. Concomitant with these investigations, the problem of collecting representative samples of comet dust (for rendezvous missions) was solved, and several related techniques such as mineralogic analysis (X-ray diffraction), direct analysis of the nucleus without docking (electron macroprobe), dust flux rate measurement, and test sample preparation were evaluated. An explicit experiment concept based upon X-ray fluorescence analysis of biased and unbiased sample collections was scoped and proposed for a future rendezvous mission with a short-period comet.

  20. A new approach of watermarking technique by means multichannel wavelet functions

    NASA Astrophysics Data System (ADS)

    Agreste, Santa; Puccio, Luigia

    2012-12-01

    Digital piracy involving images, music, movies, books, and so on is a legal problem that has not yet found a solution. It is therefore crucial to develop methods and numerical algorithms to address copyright problems. In this paper we focus on a new watermarking approach applied to digital color images. Our aim is to describe the realized watermarking algorithm based on multichannel wavelet functions with multiplicity r = 3, called MCWM 1.0. We report extensive experiments and numerical results that show the robustness of the proposed algorithm to geometrical attacks.

  1. Reduced basis technique for evaluating the sensitivity coefficients of the nonlinear tire response

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Tanner, John A.; Peters, Jeanne M.

    1992-01-01

    An efficient reduced-basis technique is proposed for calculating the sensitivity of nonlinear tire response to variations in the design variables. The tire is modeled using a 2-D, moderate rotation, laminated anisotropic shell theory, including the effects of variation in material and geometric parameters. The vector of structural response and its first-order and second-order sensitivity coefficients are each expressed as a linear combination of a small number of basis vectors. The effectiveness of the basis vectors used in approximating the sensitivity coefficients is demonstrated by a numerical example involving the Space Shuttle nose-gear tire, which is subjected to uniform inflation pressure.
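
    In schematic form (generic notation, not the paper's own symbols), the reduced-basis idea is to expand the full response vector and each of its sensitivity derivatives in a small number of basis vectors:

```latex
X \;\approx\; \Gamma\,\psi ,
\qquad
\frac{\partial X}{\partial d_i} \;\approx\; \Gamma_i\,\psi_i ,
\qquad
\frac{\partial^2 X}{\partial d_i\,\partial d_j} \;\approx\; \Gamma_{ij}\,\psi_{ij},
```

    where each matrix of basis vectors has only a few columns and the small coefficient vectors are determined from a correspondingly reduced system of equations, which is what makes the sensitivity evaluation inexpensive compared with the full finite element model.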

  2. Validation of catchment models for predicting land-use and climate change impacts. 1. Method

    NASA Astrophysics Data System (ADS)

    Ewen, J.; Parkin, G.

    1996-02-01

    Computer simulation models are increasingly being proposed as tools capable of giving water resource managers accurate predictions of the impact of changes in land-use and climate. Previous validation testing of catchment models is reviewed, and it is concluded that the methods used do not clearly test a model's fitness for such a purpose. A new generally applicable method is proposed. This involves the direct testing of fitness for purpose, uses established scientific techniques, and may be implemented within a quality assured programme of work. The new method is applied in Part 2 of this study (Parkin et al., J. Hydrol., 175:595-613, 1996).

  3. Passive Optical Locking Techniques for Diode Lasers

    NASA Astrophysics Data System (ADS)

    Zhang, Quan

    1995-01-01

    Most current diode-based nonlinear frequency converters utilize electronic frequency-locking techniques. However, this type of locking typically involves very complex electronics and suffers from the 'power-drop' problem. This dissertation is devoted to the development of an all-optical passive locking technique that locks the diode laser frequency to the external cavity resonance stably without using any kind of electronic servo. The amplitude noise problem associated with strong optical locking has been studied. Single-mode operation of a passively locked single-stripe diode with an amplitude stability better than 1% has been achieved. This passive optical locking technique applies to broad-area diodes as well as single-stripe diodes, and can easily be used to generate blue light. A schematic of a milliwatt-level blue laser based on the single-stripe diode locking technique has been proposed. A 120 mW 467 nm blue laser has been built using the tapered amplifier locking technique. In addition to diode-based blue lasers, this passive locking technique has applications in nonlinear frequency conversion, resonant spectroscopy, particle counter devices, telecommunications, and medical devices.

  4. Integration of Scale Invariant Generator Technique and S-A Technique for Characterizing 2-D Patterns for Information Retrieve

    NASA Astrophysics Data System (ADS)

    Cao, L.; Cheng, Q.

    2004-12-01

    The scale invariant generator technique (SIG) and the spectrum-area analysis technique (S-A) were developed independently in connection with the concept of generalized scale invariance (GSI). The former was developed to characterize the parameters involved in the GSI for characterizing and simulating multifractal measures, whereas the latter was developed to identify scaling breaks for decomposition of superimposed multifractal measures caused by multiple geophysical processes. A natural integration of these two techniques may yield a new technique serving two purposes: on the one hand, it can enrich the power of S-A by increasing the interpretability of decomposed patterns in some applications of S-A; on the other hand, it can provide a means to test the uniqueness of the multifractality of measures, which is essential for applying the SIG technique in more complicated environments. The proposed technique has been implemented as a Dynamic Link Library (DLL) in Visual C++. The program can readily be used for method validation and application in different fields.

  5. A constrained reconstruction technique of hyperelasticity parameters for breast cancer assessment

    NASA Astrophysics Data System (ADS)

    Mehrabian, Hatef; Campbell, Gordon; Samani, Abbas

    2010-12-01

    In breast elastography, breast tissue usually undergoes large compression, resulting in significant geometric and structural changes. This implies that breast elastography is associated with tissue nonlinear behavior. In this study, an elastography technique is presented and an inverse problem formulation is proposed to reconstruct parameters characterizing tissue hyperelasticity. Such parameters can potentially be used for tumor classification. This technique can also have other important clinical applications such as measuring normal tissue hyperelastic parameters in vivo. Such parameters are essential in planning and conducting computer-aided interventional procedures. The proposed parameter reconstruction technique uses a constrained iterative inversion; it can be viewed as an inverse problem. To solve this problem, we used a nonlinear finite element model corresponding to its forward problem. In this research, we applied Veronda-Westmann, Yeoh and polynomial models to model tissue hyperelasticity. To validate the proposed technique, we conducted studies involving numerical and tissue-mimicking phantoms. The numerical phantom consisted of a hemisphere connected to a cylinder, while the tissue-mimicking phantom was constructed from polyvinyl alcohol, which exhibits nonlinear mechanical behavior after freeze-thaw cycling. Both phantoms consisted of three types of soft tissue which mimic adipose, fibroglandular tissue and a tumor. The results of the simulations and experiments show the feasibility of accurate reconstruction of tumor tissue hyperelastic parameters using the proposed method. In the numerical phantom, all hyperelastic parameters corresponding to the three models were reconstructed with less than 2% error. With the tissue-mimicking phantom, we were able to reconstruct the ratios of the hyperelastic parameters reasonably accurately. Compared to the uniaxial test results, the average error of the parameter ratios reconstructed for the inclusion relative to the middle and external layers was 13% and 9.6%, respectively. Given that the parameter ratios of abnormal tissues to normal ones range from three times to more than ten times, this accuracy is sufficient for tumor classification.
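
    A minimal sketch of a constrained iterative inversion of the kind described above is given below. The toy forward model stands in for the nonlinear finite element solver, and the function names, parameter values and bounds are hypothetical illustrations only:

    ```python
    import numpy as np

    def toy_forward_model(params):
        """Stand-in for the nonlinear finite element forward problem: maps two
        'hyperelastic-like' parameters to a vector of displacements.  This toy
        response is linear in the parameters, unlike a real FE model."""
        c1, c2 = params
        x = np.linspace(0.1, 1.0, 20)
        return c1 * x + c2 * x ** 3

    def constrained_inversion(u_meas, p0, lower, upper, n_iter=20, eps=1e-6):
        """Constrained iterative (Gauss-Newton style) inversion: update the
        parameters so the forward model matches the measured displacements,
        clipping each iterate to physically admissible bounds."""
        p = np.asarray(p0, dtype=float)
        for _ in range(n_iter):
            r = toy_forward_model(p) - u_meas                 # residual vector
            # finite-difference Jacobian of the forward model w.r.t. parameters
            J = np.column_stack([
                (toy_forward_model(p + eps * np.eye(len(p))[i])
                 - toy_forward_model(p)) / eps
                for i in range(len(p))
            ])
            dp, *_ = np.linalg.lstsq(J, -r, rcond=None)       # Gauss-Newton step
            p = np.clip(p + dp, lower, upper)                 # enforce constraints
        return p

    # synthetic "measurement" generated from known parameters, then recovered
    true_params = np.array([2.0, 5.0])
    u_meas = toy_forward_model(true_params)
    estimate = constrained_inversion(u_meas, p0=[1.0, 1.0],
                                     lower=[0.0, 0.0], upper=[10.0, 10.0])
    print(estimate)   # expected to be close to [2.0, 5.0]
    ```

    In the actual technique, each iteration calls a nonlinear finite element simulation of the compressed tissue, and each hyperelastic model (Veronda-Westmann, Yeoh, polynomial) supplies its own parameter set.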

  6. Optic Nerve Lymphoma. Report of Two Cases and Review of the Literature

    PubMed Central

    Kim, Jennifer L.; Mendoza, Pia; Rashid, Alia; Hayek, Brent; Grossniklaus, Hans E.

    2014-01-01

    Lymphoma may involve the optic nerve as isolated optic nerve lymphoma or in association with CNS or systemic lymphoma. We present two biopsy-proven non-Hodgkin lymphomas of the optic nerve and compare our findings with previously reported cases. We discuss the mechanism of metastasis, classification of optic nerve involvement, clinical features, radiologic findings, optic nerve biopsy indications and techniques, histologic features, and treatments. We propose a classification system of optic nerve lymphoma: isolated optic nerve involvement, optic nerve involvement with CNS disease, optic nerve involvement with systemic disease, and optic nerve involvement with primary intraocular lymphoma. Although it is an uncommon cause of infiltrative optic neuropathy, optic nerve metastasis should be considered in patients with a history of lymphoma. The recommended approach to a patient with presumed optic nerve lymphoma includes neuroimaging, and cerebrospinal fluid evaluation as part of the initial work-up, then judicious use of optic nerve biopsy, depending on the clinical situation. PMID:25595061

  7. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1982-01-01

    Models, measures, and techniques for evaluating the effectiveness of aircraft computing systems were developed. By "effectiveness" in this context we mean the extent to which the user, i.e., a commercial air carrier, may expect to benefit from the computational tasks accomplished by a computing system in the environment of an advanced commercial aircraft. Thus, the concept of effectiveness involves aspects of system performance, reliability, and worth (value, benefit) which are appropriately integrated in the process of evaluating system effectiveness. Specifically, the primary objectives are: the development of system models that provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer.

  8. Culture in the Cockpit-CRM in a Multicultural World

    NASA Technical Reports Server (NTRS)

    Engle, Michael

    2000-01-01

    Crew Resource Management (CRM) is fundamentally a method for enhancing personal interactions among crewmembers so that safety and efficiency are increased, and at its core involves issues of culture and social interaction. Since CRM is increasingly being adopted by foreign carriers, it is important to evaluate standard CRM techniques from a cultural standpoint, especially if some of these techniques may be enhanced by adapting them to particular cultures. The purpose of this paper is to propose a model for an ideal CRM culture, and to suggest ways that CRM may be adapted to suit particular cultures. The research method was a simple literature search to gather data on CRM techniques and multicultural crews. The results indicate that CRM can be tailored to specific cultures for maximum effectiveness.

  9. Physiological correlates of mental workload

    NASA Technical Reports Server (NTRS)

    Zacharias, G. L.

    1980-01-01

    A literature review was conducted to assess the basis of and techniques for physiological assessment of mental workload. The study findings reviewed had shortcomings involving one or more of the following basic problems: (1) physiologic arousal can be easily driven by nonworkload factors, confounding any proposed metric; (2) the profound absence of underlying physiologic models has promulgated a multiplicity of seemingly arbitrary signal processing techniques; (3) the unspecified multidimensional nature of physiological "state" has given rise to a broad spectrum of competing noncommensurate metrics; and (4) the lack of an adequate definition of workload compels physiologic correlations to suffer either from the vagueness of implicit workload measures or from the variance of explicit subjective assessments. Using specific studies as examples, two basic signal processing/data reduction techniques in current use, time and ensemble averaging, are discussed.

  10. High efficiency processing for reduced amplitude zones detection in the HRECG signal

    NASA Astrophysics Data System (ADS)

    Dugarte, N.; Álvarez, A.; Balacco, J.; Mercado, G.; Gonzalez, A.; Dugarte, E.; Olivares, A.

    2016-04-01

    This article presents part of a broader research effort, planned for the medium to long term, intended to establish a new philosophy of surface electrocardiogram analysis. The research aims to find indicators of cardiovascular disease in its early stage that may go unnoticed with conventional electrocardiography. This paper reports the development of processing software which combines some existing techniques and incorporates novel methods for detection of reduced amplitude zones (RAZ) in the high-resolution electrocardiographic (HRECG) signal. The algorithm consists of three stages: an efficient processing step for QRS detection, an averaging filter using correlation techniques, and a stage for RAZ detection. Preliminary results show the efficiency of the system and point to the incorporation of new techniques using signal analysis involving 12 leads.
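
    A minimal sketch of the correlation-based alignment and ensemble-averaging stage is shown below on synthetic data; the window length, search range and signal construction are arbitrary illustrations, not the authors' settings:

    ```python
    import numpy as np

    def align_and_average(signal, approx_onsets, template, search=40):
        """Align each detected beat to a template by maximizing the
        cross-correlation within a small search window, then average the
        aligned beats to attenuate uncorrelated noise (ensemble averaging)."""
        L = len(template)
        beats = []
        for onset in approx_onsets:
            best_lag, best_corr = 0, -np.inf
            for lag in range(-search, search + 1):
                start = onset + lag
                seg = signal[start:start + L]
                if start < 0 or len(seg) < L:
                    continue
                c = np.dot(seg - seg.mean(), template - template.mean())
                if c > best_corr:
                    best_corr, best_lag = c, lag
            beats.append(signal[onset + best_lag: onset + best_lag + L])
        return np.mean(beats, axis=0)          # averaged high-resolution beat

    # synthetic example: a repeating QRS-like pulse with timing jitter and noise
    rng = np.random.default_rng(0)
    beat = np.exp(-0.5 * ((np.arange(200) - 100) / 8.0) ** 2)
    ecg = np.zeros(5000)
    true_onsets = np.arange(200, 4600, 500) + rng.integers(-20, 20, size=9)
    for o in true_onsets:
        ecg[o:o + 200] += beat
    ecg += 0.05 * rng.standard_normal(ecg.size)

    avg_beat = align_and_average(ecg, np.arange(200, 4600, 500), beat)
    print(avg_beat.shape)                      # (200,)
    ```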

  11. A novel technique to solve nonlinear higher-index Hessenberg differential-algebraic equations by Adomian decomposition method.

    PubMed

    Benhammouda, Brahim

    2016-01-01

    Since 1980, the Adomian decomposition method (ADM) has been extensively used as a simple, powerful tool that applies directly to solve different kinds of nonlinear equations, including functional, differential, integro-differential and algebraic equations. However, for differential-algebraic equations (DAEs) the ADM has been applied in only four earlier works. There, the DAEs are first pre-processed by transformations such as index reductions before applying the ADM. The drawback of such transformations is that they can involve complex algorithms, can be computationally expensive and may lead to non-physical solutions. The purpose of this paper is to propose a novel technique that applies the ADM directly to solve a class of nonlinear higher-index Hessenberg DAE systems efficiently. The main advantage of this technique is that, firstly, it avoids complex transformations such as index reductions and leads to a simple general algorithm. Secondly, it reduces the computational work by solving only linear algebraic systems with a constant coefficient matrix at each iteration, except for the first iteration, where the algebraic system is nonlinear (if the DAE is nonlinear with respect to the algebraic variable). To demonstrate the effectiveness of the proposed technique, we apply it to a nonlinear index-three Hessenberg DAE system with nonlinear algebraic constraints. This technique is straightforward and can be programmed in Maple or Mathematica to simulate real application problems.
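
    For context, the core ADM recursion can be sketched in generic notation (this is the standard textbook form, not the paper's specific Hessenberg formulation): the solution and the nonlinear term are expanded as series, with the nonlinearity handled through Adomian polynomials,

    ```latex
    % Standard Adomian decomposition: the solution u and the nonlinearity N(u)
    % are expanded as series, and the A_n are the Adomian polynomials.
    u = \sum_{n=0}^{\infty} u_n, \qquad
    N(u) = \sum_{n=0}^{\infty} A_n, \qquad
    A_n = \frac{1}{n!} \left[ \frac{\mathrm{d}^n}{\mathrm{d}\lambda^n}
          N\!\Bigl( \sum_{k=0}^{\infty} \lambda^k u_k \Bigr) \right]_{\lambda = 0}
    ```

    Each component u_{n+1} is then computed from u_0, ..., u_n, which for the DAE class considered reduces to the constant-coefficient linear algebraic systems mentioned above.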

  12. A Review of the Piezoelectric Electromechanical Impedance Based Structural Health Monitoring Technique for Engineering Structures.

    PubMed

    Na, Wongi S; Baek, Jongdae

    2018-04-24

    The birth of smart materials such as piezoelectric (PZT) transducers has aided in revolutionizing the field of structural health monitoring (SHM) based on non-destructive testing (NDT) methods. While a relatively new NDT method known as the electromechanical impedance (EMI) technique has been investigated for more than two decades, there are still various problems that must be solved before it is applied to real structures. The technique, which has significant potential to contribute to the creation of one of the most effective SHM systems, involves the use of a single PZT for both exciting and sensing the host structure. In this paper, studies from the past decade related to the EMI technique are reviewed to understand its trends. In addition, new concepts and ideas proposed by various authors are also surveyed, and the paper concludes with a discussion of potential directions for future work.

  13. A Review of the Piezoelectric Electromechanical Impedance Based Structural Health Monitoring Technique for Engineering Structures

    PubMed Central

    Na, Wongi S.; Baek, Jongdae

    2018-01-01

    The birth of smart materials such as piezoelectric (PZT) transducers has aided in revolutionizing the field of structural health monitoring (SHM) based on non-destructive testing (NDT) methods. While a relatively new NDT method known as the electromechanical impedance (EMI) technique has been investigated for more than two decades, there are still various problems that must be solved before it is applied to real structures. The technique, which has significant potential to contribute to the creation of one of the most effective SHM systems, involves the use of a single PZT for both exciting and sensing the host structure. In this paper, studies from the past decade related to the EMI technique are reviewed to understand its trends. In addition, new concepts and ideas proposed by various authors are also surveyed, and the paper concludes with a discussion of potential directions for future work. PMID:29695067

  14. Requirements for radiation emergency urine bioassay techniques for the public and first responders.

    PubMed

    Li, Chunsheng; Vlahovich, Slavica; Dai, Xiongxin; Richardson, Richard B; Daka, Joseph N; Kramer, Gary H

    2010-11-01

    Following a radiation emergency, the affected public and the first responders may need to be quickly assessed for internal contamination by the radionuclides involved. Urine bioassay is one of the most commonly used methods for assessing radionuclide intake and radiation dose. This paper attempts to derive the sensitivity requirements (from inhalation exposure) for the urine bioassay techniques for the top 10 high-risk radionuclides that might be used in a terrorist attack. The requirements are based on a proposed reference dose to adults of 0.1 Sv (CED, committed effective dose). In addition, requirements related to sample turnaround time and field deployability of the assay techniques are also discussed. A review of currently available assay techniques summarized in this paper reveals that method development for ²⁴¹Am, ²²⁶Ra, ²³⁸Pu, and ⁹⁰Sr urine bioassay is needed.

  15. Anonymity and Historical-Anonymity in Location-Based Services

    NASA Astrophysics Data System (ADS)

    Bettini, Claudio; Mascetti, Sergio; Wang, X. Sean; Freni, Dario; Jajodia, Sushil

    The problem of protecting users' privacy in Location-Based Services (LBS) has been extensively studied recently and several defense techniques have been proposed. In this contribution, we first present a categorization of privacy attacks and related defenses. Then, we consider the class of defense techniques that aim at providing privacy through anonymity, in particular algorithms achieving “historical k-anonymity” when the adversary obtains a trace of requests recognized as being issued by the same (anonymous) user. Finally, we investigate the issues involved in the experimental evaluation of anonymity-based defense techniques; we show that user movement simulations based on mostly random movements can lead to overestimating the privacy protection in some cases and to overprotective techniques in other cases. The above results are obtained by comparison with a more realistic simulation using an agent-based simulator, considering a specific deployment scenario.

  16. History of surgical treatments for hallux valgus.

    PubMed

    Galois, Laurent

    2018-05-31

    In the nineteenth century, the prevalent understanding of hallux valgus was that it was purely an enlargement of the soft tissue, the first metatarsal head, or both, most commonly caused by ill-fitting footwear. Thus, treatment had varying results, with controversy over whether to remove the overlying bursa alone or in combination with an exostectomy of the medial head. Since 1871, when a surgical technique was first described, many surgical treatments for the correction of hallux valgus have been proposed. A number of these techniques have come into fashion, and others have fallen into oblivion. Progress in biomechanical knowledge and improvements in materials and supports have allowed new techniques to be developed over the years. Techniques have been developed that sacrifice the metatarsophalangeal joint (arthrodesis, arthroplasty), as well as conservative procedures, among which one can distinguish those that involve only the soft tissues from those combined with a first-ray osteotomy.

  17. Uncluttered Single-Image Visualization of Vascular Structures using GPU and Integer Programming

    PubMed Central

    Won, Joong-Ho; Jeon, Yongkweon; Rosenberg, Jarrett; Yoon, Sungroh; Rubin, Geoffrey D.; Napel, Sandy

    2013-01-01

    Direct projection of three-dimensional branching structures, such as networks of cables, blood vessels, or neurons, onto a 2D image creates the illusion of intersecting structural parts and creates challenges for understanding and communication. We present a method for visualizing such structures, and demonstrate its utility in visualizing the abdominal aorta and its branches, whose tomographic images might be obtained by computed tomography or magnetic resonance angiography, in a single two-dimensional stylistic image, without overlaps among branches. The visualization method, termed uncluttered single-image visualization (USIV), involves optimization of geometry. This paper proposes a novel optimization technique that exploits an interesting connection between the USIV optimization problem and the protein structure prediction problem. Adopting the integer linear programming-based formulation of the protein structure prediction problem, we tested the proposed technique using 30 visualizations produced from five patient scans with representative anatomical variants in the abdominal aortic vessel tree. The novel technique can exploit commodity-level parallelism, enabling use of general-purpose graphics processing unit (GPGPU) technology that yields a significant speedup. Comparison of the results with the optimization technique previously reported elsewhere suggests that, in most aspects, the quality of the visualization is comparable to that of the previous technique, with a significant gain in the computation time of the algorithm. PMID:22291148

  18. An Adaptive Image Enhancement Technique by Combining Cuckoo Search and Particle Swarm Optimization Algorithm

    PubMed Central

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

    Image enhancement is an important procedure in image processing and analysis. This paper presents a new technique using a modified measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to enhance low-contrast images adaptively. Contrast enhancement is obtained by global transformation of the input intensities; the method employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality that considers three factors: threshold, entropy value, and gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods such as the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization, in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper. PMID:25784928

  19. An adaptive image enhancement technique by combining cuckoo search and particle swarm optimization algorithm.

    PubMed

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

    Image enhancement is an important procedure in image processing and analysis. This paper presents a new technique using a modified measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to enhance low-contrast images adaptively. Contrast enhancement is obtained by global transformation of the input intensities; the method employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality that considers three factors: threshold, entropy value, and gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods such as the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization, in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper.
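
    A minimal sketch of the global intensity transformation that such methods optimize is shown below, with fixed shape parameters standing in for the CS-PSO search (the parameter values are arbitrary examples, not optimized ones):

    ```python
    import numpy as np
    from scipy.special import betainc

    def beta_enhance(image, alpha=2.0, beta=3.0):
        """Global contrast enhancement through the regularized incomplete Beta
        function.  In CS-PSO based enhancement, alpha and beta would be chosen
        by the optimizer to maximize an image-quality criterion; here they are
        fixed for illustration."""
        img = image.astype(float)
        lo, hi = img.min(), img.max()
        x = (img - lo) / (hi - lo + 1e-12)       # normalize intensities to [0, 1]
        y = betainc(alpha, beta, x)              # nonlinear transfer curve
        return (255.0 * y).astype(np.uint8)

    # usage on a synthetic low-contrast image
    rng = np.random.default_rng(42)
    low_contrast = (100 + 40 * rng.random((64, 64))).astype(np.uint8)
    enhanced = beta_enhance(low_contrast, alpha=2.0, beta=2.0)
    print(low_contrast.std(), enhanced.std())    # the spread (contrast) increases
    ```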

  20. Effective gene prediction by high resolution frequency estimator based on least-norm solution technique

    PubMed Central

    2014-01-01

    The linear algebraic concept of subspace plays a significant role in recent techniques of spectrum estimation. In this article, the authors have utilized the noise subspace concept for finding hidden periodicities in DNA sequences. With the vast growth of genomic sequences, the demand to identify protein-coding regions in DNA accurately is rising. Several techniques for DNA feature extraction, involving various cross-disciplinary fields, have emerged in the recent past, among which the application of digital signal processing tools is of prime importance. It is known that coding segments have a 3-base periodicity, while non-coding regions do not have this unique feature. One of the most important spectrum analysis techniques based on the concept of subspace is the least-norm method. The least-norm estimator developed in this paper shows sharp period-3 peaks in coding regions while completely eliminating background noise. The proposed method is compared with the existing sliding discrete Fourier transform (SDFT) method, popularly known as the modified periodogram method, on several genes from various organisms, and the results show that the proposed method offers a better and more effective approach to gene prediction. Resolution, quality factor, sensitivity, specificity, miss rate, and wrong rate are used to establish the superiority of the least-norm gene prediction method over the existing method. PMID:24386895
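
    As background for the period-3 feature exploited above, the sketch below computes a classical periodogram-style measure from the DFT of the four binary indicator sequences; it illustrates the 3-base periodicity only and is not the least-norm estimator proposed in the paper:

    ```python
    import numpy as np

    def period3_strength(dna_window):
        """Spectral content at period 3 for a DNA window, computed from the DFT
        of the four binary indicator sequences (classical periodogram approach)."""
        N = len(dna_window)
        k = N // 3                              # DFT bin corresponding to period 3
        total = 0.0
        for base in "ACGT":
            u = np.array([1.0 if b == base else 0.0 for b in dna_window])
            U = np.fft.fft(u)
            total += abs(U[k]) ** 2
        return total / N

    # toy comparison: an artificial 3-base repeat versus a random window
    rng = np.random.default_rng(1)
    coding_like = "ATG" * 117                   # strong 3-base periodicity
    random_window = "".join(rng.choice(list("ACGT"), size=351))
    print(period3_strength(coding_like), period3_strength(random_window))
    ```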

  1. Lightweight composites for modular panelized construction

    NASA Astrophysics Data System (ADS)

    Vaidya, Amol S.

    Rapid advances in construction materials technology have enabled civil engineers to achieve impressive gains in the safety, economy, and functionality of structures built to serve the common needs of society. Modular building systems are a fast-growing, modern form of construction gaining recognition for their increased efficiency and ability to apply modern technology to the needs of the marketplace. In the modular construction technique, a single structural panel can perform a number of functions such as providing thermal insulation, vibration damping, and structural strength. These multifunctional panels can be prefabricated in a manufacturing facility and then transferred to the construction site. A system that uses prefabricated panels for construction is called a "panelized construction system". This study focuses on the development of pre-cast, lightweight, multifunctional sandwich composite panels to be used for panelized construction. Two thermoplastic composite panels are proposed in this study, namely Composite Structural Insulated Panels (CSIPs) for exterior walls, floors and roofs, and Open Core Sandwich composite for multifunctional interior walls of a structure. Special manufacturing techniques are developed for manufacturing these panels. The structural behavior of these panels is analyzed based on various building design codes. Detailed descriptions of the design, cost analysis, manufacturing, finite element modeling and structural testing of the proposed panels are included in this study in the form of five peer-reviewed journal articles. The structural testing of the proposed panels included flexural testing, axial compression testing, and low- and high-velocity impact testing. Based on the current study, the proposed CSIP wall and floor panels were found satisfactory according to building design codes ASCE-7-05 and ACI-318-05. Joining techniques are proposed in this study for connecting the precast panels on the construction site. Keywords: Modular panelized construction, sandwich composites, composite structural insulated panels (CSIPs).

  2. A hybrid artificial bee colony algorithm and pattern search method for inversion of particle size distribution from spectral extinction data

    NASA Astrophysics Data System (ADS)

    Wang, Li; Li, Feng; Xing, Jian

    2017-10-01

    In this paper, a hybrid artificial bee colony (ABC) algorithm and pattern search (PS) method is proposed and applied to the recovery of particle size distributions (PSDs) from spectral extinction data. To be more useful and practical, the size distribution function is modelled as the general Johnson's SB function, which overcomes the difficulty, encountered in many real circumstances, of not knowing the exact distribution type beforehand. The proposed hybrid algorithm is evaluated through simulated examples involving unimodal, bimodal and trimodal PSDs with different widths and mean particle diameters. For comparison, all examples are additionally validated by the single ABC algorithm. In addition, the performance of the proposed algorithm is further tested with actual extinction measurements of real standard polystyrene samples immersed in water. Simulation and experimental results illustrate that the hybrid algorithm can be used as an effective technique to retrieve PSDs with high reliability and accuracy. Compared with the single ABC algorithm, the proposed algorithm produces more accurate and robust inversion results while requiring nearly the same CPU time as the ABC algorithm alone. The superiority of the ABC and PS hybridization strategy in reaching a better balance between estimation accuracy and computational effort increases its potential as an inversion technique for reliable and efficient measurement of PSDs.

  3. [Present-day metal-cutting tools and working conditions].

    PubMed

    Kondratiuk, V P

    1990-01-01

    Polyfunctional machine tools of the machining-centre type have a set of hygienic advantages compared to universal machine tools. However, the low degree of mechanization and automation of some auxiliary processes, and design defects that degrade the ergonomic characteristics of the tools, make multi-machine operation labour-intensive. The article specifies techniques for assessing allowable noise levels and proposes hygienic recommendations, some of which have been introduced into practice.

  4. Neural correlates of cognitive improvements following cognitive remediation in schizophrenia: a systematic review of randomized trials

    PubMed Central

    Isaac, Clémence; Januel, Dominique

    2016-01-01

    Background Cognitive impairments are a core feature in schizophrenia and are linked to poor social functioning. Numerous studies have shown that cognitive remediation can enhance cognitive and functional abilities in patients with this pathology. The underlying mechanism of these behavioral improvements seems to be related to structural and functional changes in the brain. However, studies on neural correlates of such enhancement remain scarce. Objectives We explored the neural correlates of cognitive enhancement following cognitive remediation interventions in schizophrenia and the differential effect between cognitive training and other therapeutic interventions or patients’ usual care. Method We searched MEDLINE, PsycInfo, and ScienceDirect databases for studies on cognitive remediation therapy in schizophrenia that used neuroimaging techniques and a randomized design. Search terms included randomized controlled trial, cognitive remediation, cognitive training, rehabilitation, magnetic resonance imaging, positron emission tomography, electroencephalography, magnetoencephalography, near infrared spectroscopy, and diffusion tensor imaging. We selected randomized controlled trials that proposed multiple sessions of cognitive training to adult patients with a schizophrenia spectrum disorder and assessed its efficacy with imaging techniques. Results In total, 15 reports involving 19 studies were included in the systematic review. They involved a total of 455 adult patients, 271 of whom received cognitive remediation. Cognitive remediation therapy seems to provide a neurobiological enhancing effect in schizophrenia. After therapy, increased activations are observed in various brain regions mainly in frontal – especially prefrontal – and also in occipital and anterior cingulate regions during working memory and executive tasks. Several studies provide evidence of an improved functional connectivity after cognitive training, suggesting a neuroplastic effect of therapy through mechanisms of functional reorganization. Neurocognitive and social-cognitive training may have a cumulative effect on neural networks involved in social cognition. The variety of proposed programs, imaging tasks, and techniques may explain the heterogeneity of observed neural improvements. Future studies would need to specify the effect of cognitive training depending on those variables. PMID:26993787

  5. Effect of Boron Neutron Capture Therapy (BNCT) on Normal Liver Regeneration: Towards a Novel Therapy for Liver Metastases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jorge E. Cardoso; Elisa M. Heber; David W. Nigg

    2007-10-01

    The “TAORMINA project” developed a new method for Boron Neutron Capture Therapy (BNCT) of human multifocal unresectable liver metastases based on whole-liver ex-situ BNCT mediated by boronophenylalanine (BPA), followed by whole-liver autograft. This technique involved a high-risk, prolonged anhepatic phase. The Roffo Institute liver surgeons (JEC) herein propose a novel technique to pursue ex-situ liver BNCT studies with a drastically lower surgical risk for the patient. The technique would involve, sequentially, ex-situ BNCT of left liver segments II and III, partial liver autograft, and induction of partial atrophy of the untreated right liver. The working hypothesis is that the atrophy of the right, untreated, diseased liver would stimulate regeneration of the left, treated, “cured” liver to yield a healthy liver mass, allowing for the resection of the remaining portion of diseased liver. This technique does not involve an anhepatic phase and would thus pose a drastically lower surgical risk to the patient, but it requires, as a sine qua non, that BNCT not impair the regenerative capacity of normal hepatocytes. The aim of the present study was to assess the effect of therapeutic doses of BNCT mediated by BPA, GB-10 (Na2 10B10H10) or (GB-10 + BPA) on normal liver regeneration in the Wistar rat, employing partial hepatectomy as a regenerative stimulus. BNCT did not cause alterations in the outcome of normal liver regeneration, regenerated liver function or histology. We provide proof of principle to support the development of a novel, promising BNCT technique for the treatment of liver metastases.

  6. Defining Malaysian Knowledge Society: Results from the Delphi Technique

    NASA Astrophysics Data System (ADS)

    Hamid, Norsiah Abdul; Zaman, Halimah Badioze

    This paper outlines the findings of research whose central idea is to define the term Knowledge Society (KS) in the Malaysian context. The research focuses on three important dimensions, namely knowledge, ICT and human capital. This study adopts a modified Delphi technique to identify the important dimensions that can contribute to the development of Malaysia's KS. The Delphi technique involved ten experts in a five-round iterative and controlled-feedback procedure to obtain consensus on the important dimensions and to verify the proposed definition of KS. The findings show that all three dimensions proposed initially scored high or moderate consensus. Round One (R1) proposed an initial definition of KS and solicited comments and inputs from the panel. These inputs were then used to develop items for the R2 questionnaire. In R2, 56 out of 73 items scored high consensus, and in R3, 63 out of 90 items scored high. R4 was conducted to re-rate the new items, of which 8 out of 17 scored high. Other items scored moderate consensus, and no item scored low or no consensus in any round. The final round (R5) was employed to verify the final definition of KS. The findings of this study are significant for the definition of KS and the development of a framework in the Malaysian context.

  7. An Intelligent Content Discovery Technique for Health Portal Content Management

    PubMed Central

    2014-01-01

    Background Continuous content management of health information portals is a feature vital for its sustainability and widespread acceptance. Knowledge and experience of a domain expert is essential for content management in the health domain. The rate of generation of online health resources is exponential, and thereby manual examination for relevance to a specific topic and audience is a formidable challenge for domain experts. Intelligent content discovery for effective content management is a less researched topic. An existing expert-endorsed content repository can provide the necessary leverage to automatically identify relevant resources and evaluate qualitative metrics. Objective This paper reports on design research towards an intelligent technique for automated content discovery and ranking for health information portals. The proposed technique aims to improve the efficiency of the current, mostly manual process of portal content management by utilising an existing expert-endorsed content repository as a supporting base and a benchmark to evaluate the suitability of new content. Methods A model for content management was established based on a field study of potential users. The proposed technique is integral to this content management model and executes in several phases (ie, query construction, content search, text analytics and fuzzy multi-criteria ranking). The construction of multi-dimensional search queries with input from Wordnet, the use of multi-word and single-word terms as representative semantics for text analytics, and the use of fuzzy multi-criteria ranking for subjective evaluation of quality metrics are original contributions reported in this paper. Results The feasibility of the proposed technique was examined with experiments conducted on an actual health information portal, the BCKOnline portal. Both intermediary and final results generated by the technique are presented in the paper, and these help to establish the benefits of the technique and its contribution towards effective content management. Conclusions The prevalence of large numbers of online health resources is a key obstacle for domain experts involved in content management of health information portals and websites. The proposed technique has proven successful at searching for and identifying resources and measuring their relevance. It can be used to support the domain expert in content management and thereby ensure the health portal is up-to-date and current. PMID:25654440

  8. A hybrid technique for speech segregation and classification using a sophisticated deep neural network

    PubMed Central

    Nawaz, Tabassam; Mehmood, Zahid; Rashid, Muhammad; Habib, Hafiz Adnan

    2018-01-01

    Recent research on speech segregation and music fingerprinting has led to improvements in speech segregation and music identification algorithms. Speech and music segregation generally involves the identification of music followed by speech segregation. However, music segregation becomes a challenging task in the presence of noise. This paper proposes a novel method of speech segregation for unlabelled stationary noisy audio signals using the deep belief network (DBN) model. The proposed method successfully segregates a music signal from noisy audio streams. A recurrent neural network (RNN)-based hidden-layer segregation model is applied to remove stationary noise. Dictionary-based Fisher algorithms are employed for speech classification. The proposed method is tested on three datasets (TIMIT, MIR-1K, and MusicBrainz), and the results indicate the robustness of the proposed method for speech segregation. The qualitative and quantitative analyses carried out on the three datasets demonstrate the efficiency of the proposed method compared to state-of-the-art speech segregation and classification-based methods. PMID:29558485

  9. Multi-Objective Approach for Energy-Aware Workflow Scheduling in Cloud Computing Environments

    PubMed Central

    Kadima, Hubert; Granado, Bertrand

    2013-01-01

    We address the problem of scheduling workflow applications on heterogeneous computing systems such as cloud computing infrastructures. In general, cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on optimization constrained by time or cost, without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds and to present a hybrid PSO algorithm to optimize the scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate at different voltage supply levels by sacrificing clock frequency; the use of multiple voltages involves a compromise between the quality of schedules and energy consumption. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach. PMID:24319361

  10. Quantitative measurement of binary liquid distributions using multiple-tracer x-ray fluorescence and radiography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halls, Benjamin R.; Meyer, Terrence R.; Kastengren, Alan L.

    2015-01-01

    The complex geometry and large index-of-refraction gradients that occur near the point of impingement of binary liquid jets present a challenging environment for optical interrogation. A simultaneous quadruple-tracer x-ray fluorescence and line-of-sight radiography technique is proposed as a means of distinguishing and quantifying individual liquid component distributions prior to, during, and after jet impact. Two different pairs of fluorescence tracers are seeded into each liquid stream to maximize their attenuation ratio for reabsorption correction and differentiation of the two fluids during mixing. This approach for instantaneous correction of x-ray fluorescence reabsorption is compared with a more time-intensive approach of using stereographic reconstruction of x-ray attenuation along multiple lines of sight. The proposed methodology addresses the need for a quantitative measurement technique capable of interrogating optically complex, near-field liquid distributions in many mixing systems of practical interest involving two or more liquid streams.

  11. A mechanistic review on vermifiltration of wastewater: Design, operation and performance.

    PubMed

    Singh, Rajneesh; Bhunia, Puspendu; Dash, Rajesh R

    2017-07-15

    With global population explosion, the available water resources are slowly being polluted due to excessive human interference. To counter this, there is an urgent need to find sustainable pollution-remediation technologies to meet the stringent discharge standards for domestic as well as industrial wastewaters. In addition, those techniques should be capable of effective implementation even in developing countries. Based on the available literature, one such technique, the vermifilter, has been identified, which satisfies almost all of the sustainability and economic criteria for effective implementation even in developing countries. The aim of this meta-analysis is to provide a comprehensive review of the mechanisms involved, the factors affecting the process, and the performance of vermifiltration under different scenarios. The present review covers the current state of knowledge regarding the physical, chemical and biological aspects related to the treatment mechanisms and the effective functioning of earthworms. This review also proposes several suggested plans for its application at any proposed site. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    PubMed Central

    Balasubramaniam, S.; Kavitha, V.

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud. PMID:25767826

  13. Geometric data perturbation-based personal health record transactions in cloud computing.

    PubMed

    Balasubramaniam, S; Kavitha, V

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.
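
    A minimal sketch of geometric data perturbation as commonly described (a secret random rotation, translation, and additive noise applied to the numeric attributes of each record) is given below; the matrix sizes and noise level are arbitrary illustrations, not the scheme's actual parameters:

    ```python
    import numpy as np

    def geometric_perturbation(data, noise_std=0.05, seed=7):
        """Perturb an (n_records x d_attributes) numeric matrix with a random
        orthogonal rotation R, a random translation t, and additive noise:
        Y = X @ R.T + t + noise.  (R, t) act as the secret perturbation key
        kept by the data owner."""
        rng = np.random.default_rng(seed)
        n, d = data.shape
        # random orthogonal matrix via QR decomposition of a Gaussian matrix
        r, _ = np.linalg.qr(rng.standard_normal((d, d)))
        t = rng.standard_normal(d)
        noise = noise_std * rng.standard_normal((n, d))
        return data @ r.T + t + noise, (r, t)

    # usage: perturb toy numeric health-record attributes before outsourcing
    records = np.random.rand(100, 4)          # e.g., normalized vitals per patient
    perturbed, key = geometric_perturbation(records)
    print(perturbed.shape)                    # (100, 4), same shape, masked values
    ```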

  14. An image segmentation method based on fuzzy C-means clustering and Cuckoo search algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Mingwei; Wan, Youchuan; Gao, Xianjun; Ye, Zhiwei; Chen, Maolin

    2018-04-01

    Image segmentation is a significant step in image analysis and machine vision. Many approaches have been presented for this task; among them, fuzzy C-means (FCM) clustering is one of the most widely used methods due to its high efficiency and its ability to handle ambiguity in images. However, the success of FCM cannot be guaranteed because it is easily trapped in local optima. Cuckoo search (CS) is a novel evolutionary algorithm, which has been tested on several optimization problems and proved to be highly efficient. Therefore, a new segmentation technique blending FCM and the CS algorithm is put forward in this paper. Further, the proposed method has been evaluated on several images and compared with other existing FCM techniques, such as genetic algorithm (GA) based FCM and particle swarm optimization (PSO) based FCM, in terms of fitness value. Experimental results indicate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper.

  15. Multi-objective approach for energy-aware workflow scheduling in cloud computing environments.

    PubMed

    Yassa, Sonia; Chelouah, Rachid; Kadima, Hubert; Granado, Bertrand

    2013-01-01

    We address the problem of scheduling workflow applications on heterogeneous computing systems such as cloud computing infrastructures. In general, cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on optimization constrained by time or cost, without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds and to present a hybrid PSO algorithm to optimize the scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate at different voltage supply levels by sacrificing clock frequency; the use of multiple voltages involves a compromise between the quality of schedules and energy consumption. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach.
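
    A minimal sketch of the DVFS trade-off exploited by such schedulers, using the standard dynamic-power relation P ≈ C V^2 f, is shown below; the voltage/frequency pairs and constants are hypothetical, not values from the paper:

    ```python
    # Illustrative DVFS trade-off: lower voltage/frequency levels save energy
    # but lengthen a task's execution time (all values are hypothetical).
    LEVELS = [  # (voltage in volts, frequency in GHz)
        (1.2, 2.0),
        (1.0, 1.4),
        (0.8, 0.8),
    ]
    CAPACITANCE = 1.0e-9      # effective switched capacitance (arbitrary units)
    TASK_CYCLES = 2.0e9       # cycles required by the task

    for v, f_ghz in LEVELS:
        f = f_ghz * 1e9
        exec_time = TASK_CYCLES / f                    # seconds
        power = CAPACITANCE * v ** 2 * f               # dynamic power, P = C V^2 f
        energy = power * exec_time                     # joules = C V^2 * cycles
        print(f"V={v:.1f} V, f={f_ghz:.1f} GHz -> "
              f"time={exec_time:.2f} s, energy={energy:.2f} J")
    ```

    The printout shows execution time growing as frequency drops while energy falls with the square of the supply voltage, which is exactly the schedule-quality versus energy compromise the multi-objective formulation balances.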

  16. Hardware Implementation of a MIMO Decoder Using Matrix Factorization Based Channel Estimation

    NASA Astrophysics Data System (ADS)

    Islam, Mohammad Tariqul; Numan, Mostafa Wasiuddin; Misran, Norbahiah; Ali, Mohd Alauddin Mohd; Singh, Mandeep

    2011-05-01

    This paper presents an efficient hardware realization of a multiple-input multiple-output (MIMO) wireless communication decoder that utilizes the available resources by adopting the technique of parallelism. The hardware is designed and implemented on a Xilinx Virtex™-4 XC4VLX60 field programmable gate array (FPGA) device in a modular approach which simplifies and eases hardware updates and facilitates testing of the various modules independently. The decoder involves an efficient channel estimation module that employs matrix factorization in least squares (LS) estimation to reduce a full-rank matrix to a simpler form and thereby eliminate matrix inversion. This results in performance improvement and complexity reduction of the MIMO system. Performance evaluation of the proposed method is validated through MATLAB simulations, which indicate a 2 dB improvement in terms of SNR compared to LS estimation. Moreover, a complexity comparison is performed in terms of mathematical operations, which shows that the proposed approach appreciably outperforms LS estimation at a lower complexity and represents a good solution for channel estimation.
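
    A minimal sketch of least-squares channel estimation solved through a QR factorization instead of an explicit matrix inverse is shown below; the antenna counts, pilot length and noise level are arbitrary, and this is a generic illustration rather than the authors' exact FPGA formulation:

    ```python
    import numpy as np
    from scipy.linalg import solve_triangular

    def ls_channel_estimate_qr(Y, X):
        """Least-squares MIMO channel estimate using QR factorization rather
        than an explicit matrix inverse.  Model: Y = H @ X + noise, with
        X : (n_tx, L) known pilot symbols, Y : (n_rx, L) received samples;
        returns H_hat : (n_rx, n_tx)."""
        # Solve X^H H^H = Y^H in the least-squares sense via thin QR of X^H.
        Q, R = np.linalg.qr(X.conj().T)                 # Q: (L, n_tx), R: (n_tx, n_tx)
        H_herm = solve_triangular(R, Q.conj().T @ Y.conj().T)
        return H_herm.conj().T

    # toy usage: 4 receive and 2 transmit antennas with QPSK-like pilots
    rng = np.random.default_rng(3)
    n_rx, n_tx, L = 4, 2, 16
    H = (rng.standard_normal((n_rx, n_tx))
         + 1j * rng.standard_normal((n_rx, n_tx))) / np.sqrt(2)
    X = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=(n_tx, L)) / np.sqrt(2)
    Y = H @ X + 0.01 * (rng.standard_normal((n_rx, L))
                        + 1j * rng.standard_normal((n_rx, L)))
    H_hat = ls_channel_estimate_qr(Y, X)
    print(np.linalg.norm(H - H_hat) / np.linalg.norm(H))   # small relative error
    ```

    Replacing the explicit inverse with a triangular back-substitution is one way a factorization can reduce the arithmetic that must be mapped onto hardware.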

  17. Tracking Organs Composed of One or Multiple Regions Using Geodesic Active Region Models

    NASA Astrophysics Data System (ADS)

    Martínez, A.; Jiménez, J. J.

    In radiotherapy treatment it is very important to locate the target organs in the medical image sequence in order to determine and apply the proper dose. The techniques to achieve this goal can be classified into extrinsic and intrinsic. Intrinsic techniques use only image processing of the medical images associated with the radiotherapy treatment, as addressed in this chapter. To perform this organ tracking accurately it is necessary to find segmentation and tracking models that can be applied to the several image modalities involved in a radiotherapy session (CT, MRI, etc.). The movements of the organs are mainly affected by two factors: breathing and involuntary movements associated with the internal organs or patient positioning. Among the several alternatives for tracking the organs of interest, a model based on geodesic active regions is proposed. This model has been tested on CT images from the pelvic, cardiac, and thoracic areas. A new model for the segmentation of organs composed of more than one region is also proposed.

  18. Energy prediction using spatiotemporal pattern networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Zhanhong; Liu, Chao; Akintayo, Adedotun

    This paper presents a novel data-driven technique based on the spatiotemporal pattern network (STPN) for energy/power prediction for complex dynamical systems. Built on symbolic dynamical filtering, the STPN framework is used to capture not only the individual system characteristics but also the pair-wise causal dependencies among different sub-systems. To quantify causal dependencies, a mutual information based metric is presented and an energy prediction approach is subsequently proposed based on the STPN framework. To validate the proposed scheme, two case studies are presented, one involving wind turbine power prediction (supply side energy) using the Western Wind Integration data set generated by the National Renewable Energy Laboratory (NREL) for identifying spatiotemporal characteristics, and the other, residential electric energy disaggregation (demand side energy) using the Building America 2010 data set from NREL for exploring temporal features. In the energy disaggregation context, convex programming techniques beyond the STPN framework are developed and applied to achieve improved disaggregation performance.

  19. Quantitative measurement of binary liquid distributions using multiple-tracer x-ray fluorescence and radiography

    DOE PAGES

    Halls, Benjamin R.; Meyer, Terrence R.; Kastengren, Alan L.

    2015-01-23

    The complex geometry and large index-of-refraction gradients that occur near the point of impingement of binary liquid jets present a challenging environment for optical interrogation. A simultaneous quadruple-tracer x-ray fluorescence and line-of-sight radiography technique is proposed as a means of distinguishing and quantifying individual liquid component distributions prior to, during, and after jet impact. Two different pairs of fluorescence tracers are seeded into each liquid stream to maximize their attenuation ratio for reabsorption correction and differentiation of the two fluids during mixing. This approach for instantaneous correction of x-ray fluorescence reabsorption is compared with a more time-intensive approach of using stereographic reconstruction of x-ray attenuation along multiple lines of sight. The proposed methodology addresses the need for a quantitative measurement technique capable of interrogating optically complex, near-field liquid distributions in many mixing systems of practical interest involving two or more liquid streams.

  20. Fabrication of strain gauge based sensors for tactile skins

    NASA Astrophysics Data System (ADS)

    Baptist, Joshua R.; Zhang, Ruoshi; Wei, Danming; Saadatzi, Mohammad Nasser; Popa, Dan O.

    2017-05-01

    Fabricating cost-effective, reliable and functional sensors for electronic skins has been a challenging undertaking for the last several decades. Applications of such skins include haptic interfaces, robotic manipulation, and physical human-robot interaction. Much of our recent work has focused on producing compliant sensors that can be easily formed around objects to sense normal, tension, or shear forces. Our past designs have involved the use of flexible sensors and interconnects fabricated on Kapton substrates, and piezoresistive inks that are 3D printed using Electro Hydro Dynamic (EHD) jetting onto interdigitated electrode (IDE) structures. However, EHD print heads require a specialized nozzle and the application of a high-voltage electric field, for which tuning process parameters can be difficult depending on the choice of inks and substrates. Therefore, in this paper we explore sensor fabrication techniques using a novel wet lift-off photolithographic technique for patterning the base polymer piezoresistive material, specifically Poly(3,4-ethylenedioxythiophene)-poly(styrenesulfonate), or PEDOT:PSS. Fabricated sensors are electrically and thermally characterized, and temperature-compensated designs are proposed and validated. Packaging techniques for sensors in polymer encapsulants are proposed and demonstrated to produce a tactile interface device for a robot.

  1. Analysis and synthesis of laughter

    NASA Astrophysics Data System (ADS)

    Sundaram, Shiva; Narayanan, Shrikanth

    2004-10-01

    There is much enthusiasm in the text-to-speech community for the synthesis of emotional and natural speech. One idea being proposed is to include emotion-dependent paralinguistic cues during synthesis to convey emotions effectively. This requires modeling and synthesis techniques for the various cues associated with different emotions. Motivated by this, a technique to synthesize human laughter is proposed. Laughter is a complex mechanism of expression and has high variability in terms of types and usage in human-human communication. People have their own characteristic way of laughing. Laughter can be seen as a controlled/uncontrolled physiological process of a person resulting from an initial excitation in context. A parametric model based on damped simple harmonic motion is developed here to effectively capture this diversity while maintaining individual characteristics. Limited laughter/speech data from actual humans and ease of synthesis are the constraints imposed on the accuracy of the model. Analysis techniques are also developed to determine the parameters of the model for a given individual or laughter type. Finally, the effectiveness of the model in capturing individual characteristics and naturalness, compared to real human laughter, has been analyzed. Through this, the factors involved in individual human laughter and their importance can be better understood.
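
    A damped simple harmonic oscillator of the kind such a model builds on can be written generically as follows; the symbols and their interpretation here are illustrative, not the authors' exact parameterization:

    ```latex
    % Damped simple harmonic motion: natural frequency \omega_0, damping ratio
    % \zeta, amplitude A and phase \phi; for laughter synthesis these parameters
    % would be fitted per individual or per laughter type from recorded data.
    \ddot{x}(t) + 2\zeta\omega_0\,\dot{x}(t) + \omega_0^{2}\,x(t) = 0
    \quad\Longrightarrow\quad
    x(t) = A\,e^{-\zeta\omega_0 t}\,
           \sin\!\left(\omega_0\sqrt{1-\zeta^{2}}\;t + \phi\right)
    ```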

  2. "Enteroatmospheric fistulae"--gastrointestinal openings in the open abdomen: a review and recent proposal of a surgical technique.

    PubMed

    Marinis, A; Gkiokas, G; Argyra, E; Fragulidis, G; Polymeneas, G; Voros, D

    2013-01-01

    The occurrence of an enteric fistula in the middle of an open abdomen is called an enteroatmospheric fistula, which is the most challenging and feared complication for a surgeon to deal with. It is in fact not a true fistula because it neither has a fistula tract nor is covered by a well-vascularized tissue. The mortality of enteroatmospheric fistulae was as high as 70% in past decades but is currently approximately 40% due to advanced modern intensive care and improved surgical techniques. Management of patients with an open abdomen and an enteroatmospheric fistula is very challenging. Intensive care support of organs and systems is vital in order to manage the severely septic patient and the associated multiple organ failure syndrome. Many of the principles applied to classic enterocutaneous fistulae are used as well. Control of enteric spillage, attempts to seal the fistula, and techniques of peritoneal access for excision of the involved loop are reviewed in this report. Additionally, we describe our recent proposal of a lateral surgical approach via the circumference of the open abdomen in order to avoid the hostile and granulated surface of the abdominal trauma, which is adhered to the intraperitoneal organs.

  3. Elastic issues and vibration reduction in a tethered deorbiting mission

    NASA Astrophysics Data System (ADS)

    Sabatini, Marco; Gasbarri, Paolo; Palmerini, Giovanni B.

    2016-05-01

    Recently proposed mission concepts involving harpoons or nets to capture and de-orbit debris represent an interesting application of the tethered systems, where the orbiting bodies are connected by a flexible link. These systems present a complex behavior, as flexible characteristics combine with orbital dynamics. The focus of the paper is on the dynamic behavior of the tethered system in the final phase of the de-orbiting mission, when a powerful apogee motor is used to change the debris orbit. The thrust action introduces significant issues, as elastic waves propagate along the tether, and the relevant oscillations couple with the orbital dynamics. Input shaping techniques are proposed to limit or cancel these oscillations. However, the performance of these techniques drops when non-ideal scenarios are considered. In particular, an initially slack tether is a serious issue that must be solved if acceptably low oscillations of the tether are to be obtained. Three strategies are proposed and discussed in this paper to remove the slack condition: a natural drift of the chaser by means of a single impulse, a controlled maneuver for precisely adjusting the relative distance between chaser spacecraft and debris, and a retrieval mechanism for changing the tether length.

  4. Unlocking hidden genomic sequence

    PubMed Central

    Keith, Jonathan M.; Cochran, Duncan A. E.; Lala, Gita H.; Adams, Peter; Bryant, Darryn; Mitchelson, Keith R.

    2004-01-01

    Despite the success of conventional Sanger sequencing, significant regions of many genomes still present major obstacles to sequencing. Here we propose a novel approach with the potential to alleviate a wide range of sequencing difficulties. The technique involves extracting target DNA sequence from variants generated by introduction of random mutations. The introduction of mutations does not destroy original sequence information, but distributes it amongst multiple variants. Some of these variants lack problematic features of the target and are more amenable to conventional sequencing. The technique has been successfully demonstrated with mutation levels up to an average 18% base substitution and has been used to read previously intractable poly(A), AT-rich and GC-rich motifs. PMID:14973330

  5. Employing wavelet-based texture features in ammunition classification

    NASA Astrophysics Data System (ADS)

    Borzino, Ángelo M. C. R.; Maher, Robert C.; Apolinário, José A.; de Campos, Marcello L. R.

    2017-05-01

    Pattern recognition, a branch of machine learning, involves the classification of information in images, sounds, and other digital representations. This paper uses pattern recognition to identify which kind of ammunition was used when a bullet was fired, based on a carefully constructed set of gunshot sound recordings. To do this, we show that texture features obtained from the wavelet transform of a component of the gunshot signal, treated as an image and quantized in gray levels, are good ammunition discriminators. We test the technique with eight different calibers and achieve a classification rate better than 95%. We also compare the performance of the proposed method with results obtained by standard temporal and spectrographic techniques.
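
    As a rough Python sketch of this kind of pipeline (not the authors' implementation), the code below converts a one-dimensional gunshot recording into a quantized wavelet scalogram treated as a gray-level image, extracts simple first-order texture statistics, and hands them to an off-the-shelf SVM; PyWavelets and scikit-learn are assumed available, and the names in the usage comment (recordings, labels) are hypothetical.

        import numpy as np
        import pywt
        from sklearn.svm import SVC

        def scalogram_texture_features(signal, n_scales=64, n_gray=16):
            """Quantized wavelet scalogram of a gunshot record -> texture features.

            The CWT coefficient magnitudes are treated as a gray-level image and
            summarized by simple first-order texture statistics.
            """
            scales = np.arange(1, n_scales + 1)
            coeffs, _ = pywt.cwt(signal, scales, 'morl')
            img = np.abs(coeffs)
            img = (img - img.min()) / (img.max() - img.min() + 1e-12)
            q = np.minimum((img * n_gray).astype(int), n_gray - 1)  # gray levels
            hist = np.bincount(q.ravel(), minlength=n_gray) / q.size
            energy = np.sum(hist ** 2)
            entropy = -np.sum(hist * np.log2(hist + 1e-12))
            contrast = np.var(q)
            return np.array([energy, entropy, contrast])

        # Hypothetical usage: each element of `recordings` is one gunshot signal
        # and `labels` holds the corresponding calibers.
        # X = np.vstack([scalogram_texture_features(r) for r in recordings])
        # clf = SVC(kernel='rbf').fit(X, labels)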

  6. Water sprays in space retrieval operations. [for disabled spacecraft detumbling and despinning

    NASA Technical Reports Server (NTRS)

    Freesland, D. C.

    1978-01-01

    The water spray technique (WST) for nullifying the angular momentum of a disabled spacecraft is examined. Such a despinning operation is necessary before a disabled spacecraft can be retrieved by the Space Shuttle. The WST, involving the use of liquid sprays, appears to be less complex and costly than other techniques proposed to despin a disabled vehicle. A series of experiments has been conducted to determine the physical properties of water sprays exhausting into a vacuum. A computer model is built which, together with the experimental results, yields satellite despin performance parameters. The selection and retrieval of an actual disabled spacecraft is considered to demonstrate an application of the WST.

  7. Architecture and Key Techniques of Augmented Reality Maintenance Guiding System for Civil Aircrafts

    NASA Astrophysics Data System (ADS)

    Hong, Zhou; Wenhua, Lu

    2017-01-01

    Augmented reality technology is introduced into the maintenance field to enrich real-world scenarios with virtual maintenance-assistance information. This can lower the difficulty of maintenance, reduce maintenance errors, and improve the maintenance efficiency and quality of civil aviation crews. An architecture for an augmented reality virtual maintenance guiding system is proposed on the basis of the definition of augmented reality and an analysis of the characteristics of augmented reality virtual maintenance. Key techniques involved, such as the standardization and organization of maintenance data, 3D registration, modeling of maintenance guidance information, and virtual maintenance man-machine interaction, are elaborated, and solutions are given.

  8. A new detection scheme for ultrafast 2D J-resolved spectroscopy

    NASA Astrophysics Data System (ADS)

    Giraudeau, Patrick; Akoka, Serge

    2007-06-01

    Recent ultrafast techniques enable 2D NMR spectra to be obtained in a single scan. A modification of the detection scheme involved in this technique is proposed, permitting the achievement of 2D 1H J-resolved spectra in 500 ms. The detection gradient echoes are substituted by spin echoes to obtain spectra where the coupling constants are encoded along the direct ν2 domain. The use of this new J-resolved detection block after continuous phase-encoding excitation schemes is discussed in terms of resolution and sensitivity. J-resolved spectra obtained on cinnamic acid and 3-ethyl bromopropionate are presented, revealing the expected 2D J-patterns with coupling constants as small as 2 Hz.

  9. In-vivo determination of chewing patterns using FBG and artificial neural networks

    NASA Astrophysics Data System (ADS)

    Pegorini, Vinicius; Zen Karam, Leandro; Rocha Pitta, Christiano S.; Ribeiro, Richardson; Simioni Assmann, Tangriani; Cardozo da Silva, Jean Carlos; Bertotti, Fábio L.; Kalinowski, Hypolito J.; Cardoso, Rafael

    2015-09-01

    This paper reports the pattern classification of the chewing process of ruminants. We propose a simplified signal processing scheme for optical fiber Bragg grating (FBG) sensors based on machine learning techniques. The FBG sensors measure the biomechanical forces during jaw movements, and an artificial neural network is responsible for the classification of the associated chewing pattern. In this study, three patterns associated with dietary supplement, hay and ryegrass were considered. Additionally, two other events important for ingestive behavior studies were monitored: rumination and idle periods. Experimental results show that the proposed approach for pattern classification is capable of differentiating the materials involved in the chewing process with a small classification error.
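
    A minimal Python sketch of the classification stage is given below; it is not the authors' network, and the window length, feature set and label names are assumptions. It windows an FBG wavelength-shift trace, computes a few time-domain features per window and trains a small multilayer perceptron with scikit-learn.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        def window_features(bragg_shift, fs=100, win_s=2.0):
            """Split an FBG wavelength-shift trace into windows and compute simple
            time-domain features (mean, std, peak-to-peak, mean absolute derivative)."""
            win = int(fs * win_s)
            n = len(bragg_shift) // win
            feats = []
            for k in range(n):
                seg = bragg_shift[k * win:(k + 1) * win]
                feats.append([seg.mean(), seg.std(), seg.max() - seg.min(),
                              np.abs(np.diff(seg)).mean()])
            return np.array(feats)

        # Hypothetical usage: `traces` is a list of FBG recordings and `trace_labels`
        # the class of each, drawn from {"supplement", "hay", "ryegrass",
        # "rumination", "idle"}.
        # X = np.vstack([window_features(tr) for tr in traces])
        # y = np.concatenate([[lbl] * len(window_features(tr))
        #                     for tr, lbl in zip(traces, trace_labels)])
        # clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)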

  10. A hybrid flower pollination algorithm based modified randomized location for multi-threshold medical image segmentation.

    PubMed

    Wang, Rui; Zhou, Yongquan; Zhao, Chengyan; Wu, Haizhou

    2015-01-01

    Multi-threshold image segmentation is a powerful image processing technique that is used for the preprocessing of pattern recognition and computer vision. However, traditional multilevel thresholding methods are computationally expensive because they involve exhaustively searching the optimal thresholds to optimize the objective functions. To overcome this drawback, this paper proposes a flower pollination algorithm with a randomized location modification. The proposed algorithm is used to find optimal threshold values for maximizing Otsu's objective functions with regard to eight medical grayscale images. When benchmarked against other state-of-the-art evolutionary algorithms, the new algorithm proves itself to be robust and effective through numerical experimental results including Otsu's objective values and standard deviations.
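
    The core of such a method is the between-class-variance (Otsu) objective evaluated for a candidate threshold vector, which the metaheuristic then maximizes. The Python sketch below implements that objective and, purely as a stand-in for the flower pollination algorithm with randomized location modification, optimizes it with a plain random search; the function and variable names are illustrative.

        import numpy as np

        def otsu_objective(hist, thresholds):
            """Between-class variance of a gray-level histogram for given thresholds."""
            p = hist / hist.sum()
            levels = np.arange(len(p))
            bounds = [0] + sorted(int(t) for t in thresholds) + [len(p)]
            mu_total = np.sum(levels * p)
            sigma_b = 0.0
            for lo, hi in zip(bounds[:-1], bounds[1:]):
                w = p[lo:hi].sum()
                if w <= 0:
                    continue
                mu = np.sum(levels[lo:hi] * p[lo:hi]) / w
                sigma_b += w * (mu - mu_total) ** 2
            return sigma_b

        def random_search_thresholds(hist, k=3, iters=2000, seed=0):
            """Stand-in optimizer: random search over k thresholds (the paper instead
            uses a flower pollination algorithm with randomized location modification)."""
            rng = np.random.default_rng(seed)
            best_t, best_f = None, -np.inf
            for _ in range(iters):
                t = np.sort(rng.choice(np.arange(1, len(hist) - 1), size=k, replace=False))
                f = otsu_objective(hist, t)
                if f > best_f:
                    best_t, best_f = t, f
            return best_t, best_f

        # Hypothetical usage on an 8-bit grayscale image `img`:
        # hist, _ = np.histogram(img, bins=256, range=(0, 256))
        # thresholds, score = random_search_thresholds(hist, k=3)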

  11. A Personal View of How Paleomicrobiology Aids Our Understanding of the Role of Lice in Plague Pandemics.

    PubMed

    Raoult, Didier

    2016-08-01

    We have been involved in the field of paleomicrobiology since 1998, when we used dental pulp to identify Yersinia pestis as the causative agent of the great plague of Marseille (1720). We recently designed a specific technique, "suicide PCR," that can prevent contamination. A controversy arose between two teams, with one claiming that DNA must be altered to amplify it and the other group claiming that demographic data did not support the role of Y. pestis in the Black Death (i.e., the great plague of the Middle Ages). These controversies led us to evaluate other epidemiological models and to propose the body louse as the vector of this pandemic. This proposal was substantiated by experimental models, the recovery of Y. pestis from lice in the Congo, and the identification of epidemics involving both Y. pestis and Bartonella quintana (the agent of trench fever, transmitted by the body louse) in ancient corpses from mass graves. Paleomicrobiology has led to a re-evaluation of plague pandemics.

  12. Combined stamping-forging for non-axisymmetric product

    NASA Astrophysics Data System (ADS)

    Taureza, Muhammad; Danno, Atsushi; Song, Xu; Oh, Jin An

    2016-10-01

    Successive combined stamping-forging (CSF) is proposed to produce multi-thickness non-axisymmetric components. This method involves successive compression to create exclusively outward metal flow. Hitherto, the development of CSF has been mostly done for axisymmetric geometry. Using this technique, defect-free rectangular case component with length to thickness ratio of 40 is produced with lower forging pressure. This technology has potential for high throughput production of parts with multiple thicknesses and high width to thickness ratio.

  13. Estimation and tracking of AP-diameter of the inferior vena cava in ultrasound images using a novel active circle algorithm.

    PubMed

    Karami, Ebrahim; Shehata, Mohamed S; Smith, Andrew

    2018-05-04

    Medical research suggests that the anterior-posterior (AP)-diameter of the inferior vena cava (IVC) and its associated temporal variation, as imaged by bedside ultrasound, are useful in guiding fluid resuscitation of the critically ill patient. Unfortunately, indistinct edges and gaps in vessel walls are frequently present, which impede accurate estimation of the IVC AP-diameter for both human operators and segmentation algorithms. The majority of research involving use of the IVC to guide fluid resuscitation involves manual measurement of the maximum and minimum AP-diameter as it varies over time. This effort proposes using a time-varying circle fitted inside the typically ellipsoidal IVC as an efficient, consistent and novel approach to tracking and approximating the AP-diameter even in the context of poor image quality. In this active-circle algorithm, a novel evolution functional is proposed and shown to be a useful tool for ultrasound image processing. The proposed algorithm is compared with expert manual measurement and relevant state-of-the-art algorithms. It is shown that the algorithm outperforms the other techniques and performs very close to manual measurement. Copyright © 2018 Elsevier Ltd. All rights reserved.
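
    The evolution functional of the active-circle algorithm is not reproduced here. As a simpler illustration of the underlying idea, summarizing the roughly elliptical IVC cross-section with a circle whose diameter tracks the AP-diameter, the Python sketch below performs an algebraic (Kasa) least-squares circle fit to candidate vessel-wall edge points; `edge_pts` in the usage comment is a hypothetical input.

        import numpy as np

        def fit_circle(points):
            """Algebraic (Kasa) least-squares circle fit to 2-D edge points.

            Solves x^2 + y^2 + D*x + E*y + F = 0 in the least-squares sense and
            returns (center_x, center_y, radius); 2*radius approximates the AP-diameter.
            """
            pts = np.asarray(points, dtype=float)
            x, y = pts[:, 0], pts[:, 1]
            A = np.column_stack([x, y, np.ones_like(x)])
            b = -(x ** 2 + y ** 2)
            D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
            cx, cy = -D / 2.0, -E / 2.0
            r = np.sqrt(cx ** 2 + cy ** 2 - F)
            return cx, cy, r

        # Hypothetical usage: `edge_pts` are (x, y) samples on the detected IVC wall
        # in one ultrasound frame; tracking 2*r across frames approximates the
        # AP-diameter variation used to guide fluid resuscitation.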

  14. Right ventricle myectomy.

    PubMed

    Borisov, Konstantin V

    2017-07-01

    Right ventricular (RV) hypertrophy is common in patients with hypertrophic cardiomyopathy (HCM), and is associated with more severe disease. Conventional surgical strategies such as the traditional Morrow procedure pose a particularly high risk to patients with severe hypertrophy and RV obstruction, for whom the most appropriate therapeutic approach has not yet been established. We have proposed a new technique for surgical correction in patients with hypertrophic obstructive cardiomyopathy and severe hypertrophy, which involves approaching the area of obstruction by entering through the conal part of the RV. This novel technique provides effective elimination of biventricular obstruction and the precise removal of the areas of septal fibrosis in patients with hypertrophic obstructive cardiomyopathy. The current literature review analyzes the indications and various techniques for performing a RV myectomy, and presents the results of follow-up assessments in patients with biventricular obstruction and severe hypertrophy.

  15. Correlative fractography: combining scanning electron microscopy and light microscopes for qualitative and quantitative analysis of fracture surfaces.

    PubMed

    Hein, Luis Rogerio de Oliveira; de Oliveira, José Alberto; de Campos, Kamila Amato

    2013-04-01

    Correlative fractography is a new expression proposed here to describe a new method for the association between scanning electron microscopy (SEM) and light microscopy (LM) for the qualitative and quantitative analysis of fracture surfaces. This article presents a new method involving the fusion of an elevation map, obtained by extended depth-from-focus reconstruction from LM, with imaging of exactly the same area by SEM and associated techniques, such as X-ray mapping. The true topographic information is perfectly associated with local fracture mechanisms with this new technique, presented here as an alternative to stereo-pair reconstruction for the investigation of fractured components. The great advantage of this technique resides in the possibility of combining any imaging methods associated with LM and SEM for the same observed field of the fracture surface.

  16. Optimal steering for kinematic vehicles with applications to spatially distributed agents

    NASA Astrophysics Data System (ADS)

    Brown, Scott; Praeger, Cheryl E.; Giudici, Michael

    While there is no universal method to address control problems involving networks of autonomous vehicles, there exist a few promising schemes that apply to different specific classes of problems, which have attracted the attention of many researchers from different fields. In particular, one way to extend techniques that address problems involving a single autonomous vehicle to those involving teams of autonomous vehicles is to use the concept of the Voronoi diagram. The Voronoi diagram provides a spatial partition of the environment the team of vehicles operates in, where each element of this partition is associated with a unique vehicle from the team. The partition induces a graph abstraction of the operating space that is in a one-to-one correspondence with the network abstraction of the team of autonomous vehicles; a fact that can provide both conceptual and analytical advantages during mission planning and execution. In this dissertation, we propose the use of a new class of Voronoi-like partitioning schemes with respect to state-dependent proximity (pseudo-) metrics rather than the Euclidean distance or other generalized distance functions, which are typically used in the literature. An important nuance here is that, in contrast to the Euclidean distance, state-dependent metrics can succinctly capture system theoretic features of each vehicle from the team (e.g., vehicle kinematics), as well as the environment-vehicle interactions, which are induced, for example, by local winds/currents. We subsequently illustrate how the proposed concept of state-dependent Voronoi-like partition can induce local control schemes for problems involving networks of spatially distributed autonomous vehicles by examining a sequential pursuit problem of a maneuvering target by a group of pursuers distributed in the plane. The construction of generalized Voronoi diagrams with respect to state-dependent metrics poses some significant challenges. First, the generalized distance metric may be a function of the direction of motion of the vehicle (anisotropic pseudo-distance function) and/or may not be expressible in closed form. Second, such problems fall under the general class of partitioning problems for which the vehicles' dynamics must be taken into account. The topology of the vehicle's configuration space may be non-Euclidean; for example, it may be a manifold embedded in a Euclidean space. In other words, these problems may not be reducible to generalized Voronoi diagram problems for which efficient construction schemes, analytical and/or computational, exist in the literature. This research effort pursues three main objectives. First, we present the complete solution of different steering problems involving a single vehicle in the presence of motion constraints imposed by the maneuverability envelope of the vehicle and/or the presence of a drift field induced by winds/currents in its vicinity. The analysis of each steering problem involving a single vehicle provides us with a state-dependent generalized metric, such as the minimum time-to-go/come. We subsequently use these state-dependent generalized distance functions as the proximity metrics in the formulation of generalized Voronoi-like partitioning problems. The characterization of the solutions of these state-dependent Voronoi-like partitioning problems using either analytical or computational techniques constitutes the second main objective of this dissertation.
The third objective of this research effort is to illustrate the use of the proposed concept of state-dependent Voronoi-like partition as a means for passing from control techniques that apply to problems involving a single vehicle to problems involving networks of spatially distributed autonomous vehicles. To this aim, we formulate the problem of sequential/relay pursuit of a maneuvering target by a group of spatially distributed pursuers and subsequently propose a distributed group pursuit strategy that directly derives from the solution of a state-dependent Voronoi-like partitioning problem. (Abstract shortened by UMI.)

  17. Content-Based Image Retrieval System for Pulmonary Nodules: Assisting Radiologists in Self-Learning and Diagnosis of Lung Cancer.

    PubMed

    Dhara, Ashis Kumar; Mukhopadhyay, Sudipta; Dutta, Anirvan; Garg, Mandeep; Khandelwal, Niranjan

    2017-02-01

    Visual information of similar nodules could assist budding radiologists in self-learning. This paper presents a content-based image retrieval (CBIR) system for pulmonary nodules observed in lung CT images. The reported CBIR systems of pulmonary nodules cannot be put into practice as radiologists need to draw the boundary of nodules during query formation and feature database creation. In the proposed retrieval system, the pulmonary nodules are segmented using a semi-automated technique, which requires a seed point on the nodule from the end-user. The involvement of radiologists in feature database creation is also reduced, as only a seed point is expected from radiologists instead of manual delineation of the boundary of the nodules. The performance of the retrieval system depends on the accuracy of the segmentation technique. Several 3D features are explored to improve the performance of the proposed retrieval system. A set of relevant shape and texture features are considered for efficient representation of the nodules in the feature space. The proposed CBIR system is evaluated for three configurations: configuration-1 (composite rank of malignancy "1","2" as benign and "4","5" as malignant), configuration-2 (composite rank of malignancy "1","2","3" as benign and "4","5" as malignant), and configuration-3 (composite rank of malignancy "1","2" as benign and "3","4","5" as malignant). Considering the top 5 retrieved nodules and the Euclidean distance metric, the precision achieved by the proposed method for configuration-1, configuration-2, and configuration-3 is 82.14, 75.91, and 74.27%, respectively. The performance of the proposed CBIR system is close to that of the most recent technique, which is dependent on radiologists for manual segmentation of nodules. A computer-aided diagnosis (CAD) system is also developed based on the CBIR paradigm. The performance of the proposed CBIR-based CAD system is close to that of a CAD system using a support vector machine.

  18. A hybrid approach to parameter identification of linear delay differential equations involving multiple delays

    NASA Astrophysics Data System (ADS)

    Marzban, Hamid Reza

    2018-05-01

    In this paper, we are concerned with the parameter identification of linear time-invariant systems containing multiple delays. The approach is based upon a hybrid of block-pulse functions and Legendre polynomials. The convergence of the proposed procedure is established, and an upper error bound with respect to the L2-norm associated with the hybrid functions is derived. The problem under consideration is first transformed into a system of algebraic equations. The least squares technique is then employed for the identification of the desired parameters. Several multi-delay systems of varying complexity are investigated to evaluate the performance and capability of the proposed approximation method. It is shown that the proposed approach is also applicable to a class of nonlinear multi-delay systems. It is demonstrated that the suggested procedure provides accurate results for the desired parameters.
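
    The hybrid block-pulse/Legendre machinery is not reproduced here, but the final least-squares step can be illustrated in Python on a scalar system x'(t) = a*x(t) + b*x(t - tau). The sketch below, which substitutes finite differences and interpolation for the paper's hybrid-function expansion, regresses the approximate derivative on the current and delayed states to recover (a, b).

        import numpy as np

        def identify_delay_parameters(t, x, tau):
            """Least-squares estimate of (a, b) in x'(t) = a*x(t) + b*x(t - tau).

            The derivative is approximated by finite differences and the delayed
            trajectory by interpolation; the regression is then solved in the
            least-squares sense.
            """
            dt = t[1] - t[0]
            dx = np.gradient(x, dt)                      # numerical derivative
            x_del = np.interp(t - tau, t, x, left=x[0])  # delayed trajectory
            mask = (t - tau) >= t[0]                     # drop samples before the record starts
            Phi = np.column_stack([x[mask], x_del[mask]])
            theta, *_ = np.linalg.lstsq(Phi, dx[mask], rcond=None)
            return theta  # [a_hat, b_hat]

        if __name__ == "__main__":
            # simulate x'(t) = -1.2 x(t) + 0.5 x(t - 0.3) with constant history x = 1
            a_true, b_true, tau = -1.2, 0.5, 0.3
            dt, T = 1e-3, 5.0
            t = np.arange(0.0, T, dt)
            n_tau = int(round(tau / dt))
            x = np.ones_like(t)
            for i in range(1, len(t)):
                xd = x[max(i - 1 - n_tau, 0)]
                x[i] = x[i - 1] + dt * (a_true * x[i - 1] + b_true * xd)
            print(identify_delay_parameters(t, x, tau))  # approximately [-1.2, 0.5]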

  19. Lattice Boltzmann simulations of immiscible displacement process with large viscosity ratios

    NASA Astrophysics Data System (ADS)

    Rao, Parthib; Schaefer, Laura

    2017-11-01

    Immiscible displacement is a key physical mechanism involved in enhanced oil recovery and carbon sequestration processes. This multiphase flow phenomenon involves a complex interplay of viscous, capillary, inertial and wettability effects. The lattice Boltzmann (LB) method is an accurate and efficient technique for modeling and simulating multiphase/multicomponent flows, especially in complex flow configurations and media. In this presentation we present numerical simulation results of the displacement process in thin, long channels. The results are based on a new pseudo-potential multicomponent LB model with a multiple-relaxation-time (MRT) collision model and an explicit forcing scheme. We demonstrate that the proposed model is capable of accurately simulating the displacement process involving fluids with a wider range of viscosity ratios (>100), which also leads to viscosity-independent interfacial tension and a reduction of some important numerical artifacts.

  20. New Approach For Prediction Groundwater Depletion

    NASA Astrophysics Data System (ADS)

    Moustafa, Mahmoud

    2017-01-01

    Current approaches to quantifying groundwater depletion involve water balance and satellite gravity. However, the water balance technique includes uncertain estimation of parameters such as evapotranspiration and runoff, while the satellite method consumes time and effort. The work reported in this paper proposes using failure theory in a novel way to predict the depletion of the groundwater saturated thickness. An important issue in the proposed failure theory is determining the failure point (depletion case). The proposed technique uses the depth of water, as the net result of recharge/discharge processes in the aquifer, to calculate the remaining saturated thickness resulting from the pumping rates applied in an area, and thereby to evaluate groundwater depletion. The Weibull function and Bayes analysis were used to model and analyze data collected from 1962 to 2009. The proposed methodology was tested in a nonrenewable aquifer with no recharge; consequently, the continuous decline in water depth has been the main criterion used to estimate the depletion. The value of the proposed approach is to predict the probable effect of the currently applied pumping rates on the saturated thickness based on the remaining saturated thickness data. The limitation of the suggested approach is that it assumes the applied management practices are constant during the prediction period. The study predicted that after 300 years there would be an 80% probability that the saturated aquifer would be depleted. Lifetime or failure theory can thus give a simple alternative way to predict the remaining saturated thickness depletion without time-consuming processes or sophisticated software.
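
    As a toy Python illustration of the lifetime/failure idea (with synthetic numbers, not the study's data), the sketch below fits a two-parameter Weibull survival curve to a normalized remaining-saturated-thickness record and evaluates the probability of depletion at a 300-year horizon; SciPy's curve_fit is used for the fit.

        import numpy as np
        from scipy.optimize import curve_fit

        def weibull_reliability(t, scale, shape):
            """Weibull survival function R(t): fraction of saturated thickness remaining."""
            return np.exp(-(t / scale) ** shape)

        # Hypothetical record: years since 1962 and normalized remaining saturated
        # thickness (1.0 = initial thickness); real values would come from well logs.
        years = np.array([0, 5, 10, 15, 20, 25, 30, 35, 40, 47], dtype=float)
        thickness = np.array([1.00, 0.97, 0.93, 0.90, 0.86, 0.83, 0.79, 0.76, 0.72, 0.68])

        params, _ = curve_fit(weibull_reliability, years, thickness, p0=(100.0, 1.5))
        scale, shape = params
        p_depleted_300 = 1.0 - weibull_reliability(300.0, scale, shape)
        print(f"scale={scale:.1f} yr, shape={shape:.2f}, "
              f"P(depleted by year 300) = {p_depleted_300:.2f}")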

  1. Planum Sphenoidale and Tuberculum Sellae Meningiomas: Operative Nuances of a Modern Surgical Technique with Outcome and Proposal of a New Classification System.

    PubMed

    Mortazavi, Martin M; Brito da Silva, Harley; Ferreira, Manuel; Barber, Jason K; Pridgeon, James S; Sekhar, Laligam N

    2016-02-01

    The resection of planum sphenoidale and tuberculum sellae meningiomas is challenging. A universally accepted classification system predicting surgical risk and outcome is still lacking. We report a modern surgical technique specific for planum sphenoidale and tuberculum sellae meningiomas with the associated outcome. A new classification system that can guide the surgical approach and may predict surgical risk is proposed. We conducted a retrospective review of the patients who, between 2005 and March 2015, underwent a craniotomy or endoscopic surgery for the resection of meningiomas involving the suprasellar region. Operative nuances of a modified frontotemporal craniotomy and orbital osteotomy technique for meningioma removal and reconstruction are described. Twenty-seven patients were found to have tumors arising mainly from the planum sphenoidale or the tuberculum sellae; 25 underwent frontotemporal craniotomy and tumor removal with orbital osteotomy and bilateral optic canal decompression, and 2 patients underwent endonasal transsphenoidal resection. The most common presenting symptom was visual disturbance (77%). Vision improved in 90% of those who presented with visual decline, and there was no permanent visual deterioration. Cerebrospinal fluid leak occurred in 1 of the 25 cranial cases (4%) and in 1 of 2 transsphenoidal cases (50%), and in both cases it resolved with treatment. There was no surgical mortality. An orbitotomy and early decompression of the involved optic canal are important for achieving gross total resection, maximizing visual improvement, and avoiding recurrence. The visual outcomes were excellent. A new classification system that allows the comparison of different series and approaches and indicates cases that are more suitable for an endoscopic transsphenoidal approach is presented. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Automatic topic identification of health-related messages in online health community using text classification.

    PubMed

    Lu, Yingjie

    2013-01-01

    To facilitate patient involvement in online health communities and help patients obtain the informational and emotional support they need, a topic identification approach is proposed in this paper for automatically identifying the topics of health-related messages in an online health community, thus assisting patients in reaching the most relevant messages for their queries efficiently. A feature-based classification framework is presented for automatic topic identification in our study. We first collected messages related to some predefined topics in an online health community. Then we combined three different types of features, n-gram-based features, domain-specific features and sentiment features, to build four feature sets for health-related text representation. Finally, three different text classification techniques, C4.5, Naïve Bayes and SVM, were adopted to evaluate our topic classification model. By comparing different feature sets and different classification techniques, we found that n-gram-based features, domain-specific features and sentiment features were all effective in distinguishing different types of health-related topics. In addition, a feature reduction technique based on information gain was also effective in improving the topic classification performance. In terms of classification techniques, SVM outperformed C4.5 and Naïve Bayes significantly. The experimental results demonstrated that the proposed approach could identify the topics of online health-related messages efficiently.
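
    A compressed Python sketch of such a feature-based pipeline is shown below, with placeholder messages and topic labels: word and bigram features, a mutual-information (information-gain-style) feature reduction step, and a linear SVM, all via scikit-learn. The tiny corpus and the value of k are illustrative only.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.feature_selection import SelectKBest, mutual_info_classif
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import LinearSVC

        # Hypothetical training data: health-forum messages and their topic labels.
        messages = [
            "Has anyone tried the new chemo schedule? Side effects were rough.",
            "Feeling really anxious before my scan next week, any advice?",
            "What dosage of metformin are you on after diagnosis?",
            "So grateful for the support here, you all keep me going.",
        ]
        topics = ["treatment", "emotional", "medication", "emotional"]

        # n-gram features -> mutual-information feature reduction -> linear SVM;
        # k would normally be tuned on held-out data.
        clf = make_pipeline(
            TfidfVectorizer(ngram_range=(1, 2), min_df=1),
            SelectKBest(mutual_info_classif, k=20),
            LinearSVC(),
        )
        clf.fit(messages, topics)
        print(clf.predict(["Does anyone get headaches from this medication?"]))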

  3. Nonlinear optical enhancement induced by synergistic effect of graphene nanosheets and CdS nanocrystals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Baohua, E-mail: bhzhu@henu.edu.cn, E-mail: yzgu@henu.edu.cn; Cao, Yawan; Wang, Chong

    2016-06-20

    CdS nanocrystals are attached to graphene nanosheets and their nonlinear optical properties are investigated by the picosecond Z-scan technique at 532 nm. We found that the synergistic effect between the graphene and CdS produces a major enhancement of the nonlinear optical absorption of the graphene/CdS nanohybrid in comparison with the cooperative effect, and that the synergistic improvement is restricted by nonradiative defects in the hybrid. A synergistic mechanism involving local field theory and charge transfer evolution is proposed.

  4. Get Your Requirements Straight: Storyboarding Revisited

    NASA Astrophysics Data System (ADS)

    Haesen, Mieke; Luyten, Kris; Coninx, Karin

    Current user-centred software engineering (UCSE) approaches provide many techniques to combine know-how available in multidisciplinary teams. Although the involvement of various disciplines is beneficial for the user experience of the future application, the transition from a user needs analysis to a structured interaction analysis and UI design is not always straightforward. We propose storyboards, enriched by metadata, to specify functional and non-functional requirements. Accompanying tool support should facilitate the creation and use of storyboards. We used a meta-storyboard for the verification of storyboarding approaches.

  5. New gas phase inorganic ion cluster species and their atmospheric implications

    NASA Technical Reports Server (NTRS)

    Maerk, T. D.; Peterson, K. I.; Castleman, A. W., Jr.

    1980-01-01

    Recent experimental laboratory observations, with high-pressure mass spectroscopy, have revealed the existence of previously unreported species involving water clustered to sodium dimer ions, and alkali metal hydroxides clustered to alkali metal ions. The important implications of these results concerning the existence of such species are here discussed, as well as how from a practical aspect they confirm the stability of certain cluster species proposed by Ferguson (1978) to explain masses recently detected at upper altitudes using mass spectrometric techniques.

  6. Flow field description of the Space Shuttle Vernier reaction control system exhaust plumes

    NASA Technical Reports Server (NTRS)

    Cerimele, Mary P.; Alred, John W.

    1987-01-01

    The flow field for the Vernier Reaction Control System (VRCS) jets of the Space Shuttle Orbiter has been calculated from the nozzle throat to the far-field region. The calculations involved the use of recently improved rocket engine nozzle/plume codes. The flow field is discussed, and a brief overview of the calculation techniques is presented. In addition, a proposed on-orbit plume measurement experiment, designed to improve future estimations of the Vernier flow field, is addressed.

  7. Quantum teleportation and entanglement swapping of electron spins in superconducting hybrid structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bubanja, Vladimir, E-mail: vladimir.bubanja@callaghaninnovation.govt.nz

    2015-06-15

    We present schemes for quantum teleportation and entanglement swapping of electronic spin states in hybrid superconductor–normal-metal systems. The proposed schemes employ subgap transport whereby the lowest order processes involve Cooper pair-electron and double Cooper-pair cotunneling in quantum teleportation and entanglement swapping protocols, respectively. The competition between elastic cotunneling and Cooper-pair splitting results in the success probability of 25% in both cases. Described implementations of these protocols are within reach of present-day experimental techniques.

  8. Scrutinizing UML Activity Diagrams

    NASA Astrophysics Data System (ADS)

    Al-Fedaghi, Sabah

    Building an information system involves two processes: conceptual modeling of the “real world domain” and designing the software system. Object-oriented methods and languages (e.g., UML) are typically used for describing the software system. For the system analysis process that produces the conceptual description, object-oriented techniques or semantics extensions are utilized. Specifically, UML activity diagrams are the “flow charts” of object-oriented conceptualization tools. This chapter proposes an alternative to UML activity diagrams through the development of a conceptual modeling methodology based on the notion of flow.

  9. A Robust Geometric Model for Argument Classification

    NASA Astrophysics Data System (ADS)

    Giannone, Cristina; Croce, Danilo; Basili, Roberto; de Cao, Diego

    Argument classification is the task of assigning semantic roles to syntactic structures in natural language sentences. Supervised learning techniques for frame semantics have been recently shown to benefit from rich sets of syntactic features. However argument classification is also highly dependent on the semantics of the involved lexicals. Empirical studies have shown that domain dependence of lexical information causes large performance drops in outside domain tests. In this paper a distributional approach is proposed to improve the robustness of the learning model against out-of-domain lexical phenomena.

  10. Protein fold recognition using geometric kernel data fusion.

    PubMed

    Zakeri, Pooya; Jeuris, Ben; Vandebril, Raf; Moreau, Yves

    2014-07-01

    Various approaches based on features extracted from protein sequences and often machine learning methods have been used in the prediction of protein folds. Finding an efficient technique for integrating these different protein features has received increasing attention. In particular, kernel methods are an interesting class of techniques for integrating heterogeneous data. Various methods have been proposed to fuse multiple kernels. Most techniques for multiple kernel learning focus on learning a convex linear combination of base kernels. In addition to the limitation of linear combinations, working with such approaches could cause a loss of potentially useful information. We design several techniques to combine kernel matrices by taking more involved, geometry-inspired means of these matrices instead of convex linear combinations. We consider various sequence-based protein features including information extracted directly from position-specific scoring matrices and local sequence alignment. We evaluate our methods for classification on the SCOP PDB-40D benchmark dataset for protein fold recognition. The best overall accuracy on the protein fold recognition test set obtained by our methods is ∼ 86.7%. This is an improvement over the results of the best existing approach. Moreover, our computational model has been developed by incorporating the functional domain composition of proteins through a hybridization model. It is observed that by using our proposed hybridization model, the protein fold recognition accuracy is further improved to 89.30%. Furthermore, we investigate the performance of our approach on the protein remote homology detection problem by fusing multiple string kernels. The MATLAB code used for our proposed geometric kernel fusion frameworks is publicly available at http://people.cs.kuleuven.be/∼raf.vandebril/homepage/software/geomean.php?menu=5/. © The Author 2014. Published by Oxford University Press.
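
    For two base kernels, the simplest geometry-inspired combination is the matrix geometric mean A # B = A^(1/2) (A^(-1/2) B A^(-1/2))^(1/2) A^(1/2). The Python sketch below computes it with SciPy; the kernel names in the usage comment are hypothetical, and a small ridge is added because empirical kernel matrices may be only positive semidefinite.

        import numpy as np
        from scipy.linalg import sqrtm

        def geometric_mean_kernel(K1, K2, ridge=1e-8):
            """Two-matrix geometric mean A # B = A^{1/2} (A^{-1/2} B A^{-1/2})^{1/2} A^{1/2}.

            Both kernels are regularized to be strictly positive definite; the result
            is symmetrized to suppress numerical asymmetry from sqrtm.
            """
            n = K1.shape[0]
            A = K1 + ridge * np.eye(n)
            B = K2 + ridge * np.eye(n)
            A_half = np.real(sqrtm(A))
            A_half_inv = np.linalg.inv(A_half)
            middle = np.real(sqrtm(A_half_inv @ B @ A_half_inv))
            G = A_half @ middle @ A_half
            return 0.5 * (G + G.T)

        # Hypothetical usage: K_pssm and K_align are base kernels (e.g., from PSSM
        # features and local-alignment scores) over the same set of proteins; the
        # fused kernel can then be passed to any kernel classifier, e.g.
        # sklearn.svm.SVC(kernel='precomputed').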

  11. Proposed alternative revision strategy for broken S1 pedicle screw: radiological study, review of the literature, and case reports.

    PubMed

    Elgafy, Hossein; Miller, Jacob D; Benedict, Gregory M; Seal, Ryan J; Liu, Jiayong

    2013-07-01

    There have been many reports outlining differing methods for managing a broken S1 screw. To the authors' best knowledge, the technique used in the present study has not been described previously. It involves insertion of a second pedicle screw without removing the broken screw shaft. Radiological study, literature review, and two case reports of the surgical technique. To report a proposed new surgical technique for the management of broken S1 pedicle screws. Computed tomography (CT) scans of 50 patients with a total of 100 S1 pedicles were analyzed. There were 25 male and 25 female patients with an average age of 51 years, ranging from 36 to 68 years. The cephalad-caudal length, medial-lateral width, and cross-sectional area of the S1 pedicle were measured and compared with the diameter of a pedicle screw to illustrate the possibility of inserting a second screw in the S1 pedicle without removal of the broken screw shaft. Two case reports of the proposed technique are presented. The left and right S1 pedicle cross-sectional areas in females measured 456.00 ± 4.00 and 457.00 ± 3.00 mm², respectively. The left and right S1 pedicle cross-sectional areas in males measured 638.00 ± 2.00 and 639.00 ± 1.00 mm², respectively. There were statistically significant differences when comparing male and female S1 pedicle length, width, and cross-sectional area (p<.05). At 2-year follow-up, the two case reports of the proposed technique showed resolution of low back pain and radicular pain. Plain radiograph and CT scan showed posterolateral fusion mass and hardware in good position with no evidence of screw loosening. The S1 pedicle dimensions measured on the CT scans reviewed in the present study showed that it may be anatomically feasible to place a second screw through the S1 pedicle without removal of the broken screw shaft. This treatment method will reduce the complications associated with other described revision strategies for broken S1 screws. Published by Elsevier Inc.

  12. Parallel computing in genomic research: advances and applications

    PubMed Central

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today’s genomic experiments have to process the so-called “biological big data” that is now reaching the size of Terabytes and Petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analyses of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature that surveys the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801

  13. Parallel computing in genomic research: advances and applications.

    PubMed

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of Terabytes and Petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analyses of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature that surveys the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities.

  14. Report to TRMM

    NASA Technical Reports Server (NTRS)

    Jameson, Arthur R.

    1997-01-01

    The effort involved three elements, all related to the measurement of rain and clouds using microwaves: (1) Examine recently proposed techniques for measuring rainfall rate and rain water content using data from ground-based radars and the TRMM microwave link in order to develop improved ground validation and radar calibration techniques; (2) Develop dual-polarization, multiple frequency radar techniques for estimating rain water content and cloud water content to interpret the vertical profiles of radar reflectivity factors (Z) measured by the TRMM Precipitation Radar; and (3) Investigate theoretically and experimentally the potential biases in TRMM Z measurements due to spatial inhomogeneities in precipitation. The research succeeded in addressing all of these topics, resulting in several refereed publications. In addition, the research indicated that the effects of non-Rayleigh statistics resulting from the nature of the precipitation inhomogeneities will probably not result in serious errors for the TRMM radar measurements, but the TRMM radiometers may be subject to significant bias due to the inhomogeneities.

  15. An Improved Computational Technique for Calculating Electromagnetic Forces and Power Absorptions Generated in Spherical and Deformed Body in Levitation Melting Devices

    NASA Technical Reports Server (NTRS)

    Zong, Jin-Ho; Szekely, Julian; Schwartz, Elliot

    1992-01-01

    An improved computational technique for calculating the electromagnetic force field, the power absorption and the deformation of an electromagnetically levitated metal sample is described. The technique is based on the volume integral method, but represents a substantial refinement; the coordinate transformation employed allows the efficient treatment of a broad class of rotationally symmetrical bodies. Computed results are presented to represent the behavior of levitation-melted metal samples in a multi-coil, multi-frequency levitation unit to be used in microgravity experiments. The theoretical predictions are compared with both analytical solutions and the results of previous computational efforts for spherical samples, and the agreement has been very good. The treatment of problems involving deformed surfaces and actually predicting the deformed shape of the specimens breaks new ground and should constitute the main usefulness of the proposed method.

  16. Sub-wavelength terahertz beam profiling of a THz source via an all-optical knife-edge technique.

    PubMed

    Phing, Sze Ho; Mazhorova, Anna; Shalaby, Mostafa; Peccianti, Marco; Clerici, Matteo; Pasquazi, Alessia; Ozturk, Yavuz; Ali, Jalil; Morandotti, Roberto

    2015-02-25

    Terahertz technologies have recently emerged as outstanding candidates for a variety of applications in sectors such as security, biomedicine, pharmaceuticals, and aerospace. Imaging the terahertz field, however, still remains a challenge, particularly when sub-wavelength resolutions are involved. Here we demonstrate an all-optical technique for terahertz near-field imaging directly at the source plane. A thin layer (<100 nm thick) of photocarriers is induced on the surface of the terahertz generation crystal, which acts as an all-optical, virtual blade for terahertz near-field imaging via a knife-edge technique. Remarkably, and in spite of the fact that the proposed approach does not require any mechanical probe, such as tips or apertures, we are able to demonstrate the imaging of a terahertz source with deeply sub-wavelength features (<30 μm) directly in its emission plane.

  17. Micro/nano-fabrication technologies for cell biology.

    PubMed

    Qian, Tongcheng; Wang, Yingxiao

    2010-10-01

    Micro/nano-fabrication techniques, such as soft lithography and electrospinning, have been well-developed and widely applied in many research fields in the past decade. Due to the low costs and simple procedures, these techniques have become important and popular for biological studies. In this review, we focus on the studies integrating micro/nano-fabrication work to elucidate the molecular mechanism of signaling transduction in cell biology. We first describe different micro/nano-fabrication technologies, including techniques generating three-dimensional scaffolds for tissue engineering. We then introduce the application of these technologies in manipulating the physical or chemical micro/nano-environment to regulate the cellular behavior and response, such as cell life and death, differentiation, proliferation, and cell migration. Recent advancement in integrating the micro/nano-technologies and live cell imaging are also discussed. Finally, potential schemes in cell biology involving micro/nano-fabrication technologies are proposed to provide perspectives on the future research activities.

  18. Micro/nano-fabrication technologies for cell biology

    PubMed Central

    Qian, Tongcheng

    2012-01-01

    Micro/nano-fabrication techniques, such as soft lithography and electrospinning, have been well-developed and widely applied in many research fields in the past decade. Due to the low costs and simple procedures, these techniques have become important and popular for biological studies. In this review, we focus on the studies integrating micro/nano-fabrication work to elucidate the molecular mechanism of signaling transduction in cell biology. We first describe different micro/nano-fabrication technologies, including techniques generating three-dimensional scaffolds for tissue engineering. We then introduce the application of these technologies in manipulating the physical or chemical micro/nano-environment to regulate the cellular behavior and response, such as cell life and death, differentiation, proliferation, and cell migration. Recent advancement in integrating the micro/nano-technologies and live cell imaging are also discussed. Finally, potential schemes in cell biology involving micro/nano-fabrication technologies are proposed to provide perspectives on the future research activities. PMID:20490938

  19. Restoration of out-of-focus images based on circle of confusion estimate

    NASA Astrophysics Data System (ADS)

    Vivirito, Paolo; Battiato, Sebastiano; Curti, Salvatore; La Cascia, M.; Pirrone, Roberto

    2002-11-01

    In this paper a new method for fast out-of-focus blur estimation and restoration is proposed. It is suitable for CFA (Color Filter Array) images acquired by typical CCD/CMOS sensors. The method is based on the analysis of a single image and consists of two steps: 1) out-of-focus blur estimation via Bayer pattern analysis; 2) image restoration. Blur estimation is based on a block-wise edge detection technique. This edge detection is carried out on the green pixels of the CFA sensor image, also called the Bayer pattern. Once the blur level has been estimated, the image is restored through the application of a new inverse filtering technique. This algorithm yields sharp images while reducing ringing and crisping artifacts over a wider range of frequencies. Experimental results show the effectiveness of the method, both subjectively and numerically, by comparison with other techniques found in the literature.
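
    A rough Python sketch of the two-step idea, under simplifying assumptions and not the authors' filter, is given below: green samples are pulled from an RGGB Bayer mosaic, a crude circle-of-confusion radius is derived from a gradient-concentration proxy for edge sharpness, and the image is restored with a Wiener-type inverse filter built on a uniform-disk PSF.

        import numpy as np

        def green_channel(bayer):
            """Extract green samples from an RGGB Bayer mosaic (half-resolution grid)."""
            g1 = bayer[0::2, 1::2]
            g2 = bayer[1::2, 0::2]
            h = min(g1.shape[0], g2.shape[0])
            w = min(g1.shape[1], g2.shape[1])
            return 0.5 * (g1[:h, :w] + g2[:h, :w])

        def estimate_blur_radius(green, grad_thresh=0.05):
            """Crude circle-of-confusion estimate: the less the gradient energy is
            concentrated on a few strong edge pixels, the larger the assumed blur."""
            gx = np.abs(np.diff(green, axis=1))
            strong = gx > grad_thresh * green.max()
            sharpness = gx[strong].mean() / (gx.mean() + 1e-12) if strong.any() else 1.0
            return max(1.0, 10.0 / sharpness)  # radius in pixels (heuristic scaling)

        def wiener_deconvolve(image, radius, nsr=1e-2):
            """Wiener inverse filter with a uniform-disk PSF of the estimated radius."""
            h, w = image.shape
            yy, xx = np.mgrid[:h, :w]
            cy, cx = h // 2, w // 2
            psf = ((yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2).astype(float)
            psf /= psf.sum()
            H = np.fft.fft2(np.fft.ifftshift(psf))
            G = np.fft.fft2(image)
            F = np.conj(H) / (np.abs(H) ** 2 + nsr) * G
            return np.real(np.fft.ifft2(F))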

  20. Auxiliary principle technique and iterative algorithm for a perturbed system of generalized multi-valued mixed quasi-equilibrium-like problems.

    PubMed

    Rahaman, Mijanur; Pang, Chin-Tzong; Ishtyak, Mohd; Ahmad, Rais

    2017-01-01

    In this article, we introduce a perturbed system of generalized mixed quasi-equilibrium-like problems involving multi-valued mappings in Hilbert spaces. To calculate the approximate solutions of the perturbed system of generalized multi-valued mixed quasi-equilibrium-like problems, firstly we develop a perturbed system of auxiliary generalized multi-valued mixed quasi-equilibrium-like problems, and then by using the celebrated Fan-KKM technique, we establish the existence and uniqueness of solutions of the perturbed system of auxiliary generalized multi-valued mixed quasi-equilibrium-like problems. By deploying an auxiliary principle technique and an existence result, we formulate an iterative algorithm for solving the perturbed system of generalized multi-valued mixed quasi-equilibrium-like problems. Lastly, we study the strong convergence analysis of the proposed iterative sequences under monotonicity and some mild conditions. These results are new and generalize some known results in this field.

  1. Torque measurement at the single-molecule level.

    PubMed

    Forth, Scott; Sheinin, Maxim Y; Inman, James; Wang, Michelle D

    2013-01-01

    Methods for exerting and measuring forces on single molecules have revolutionized the study of the physics of biology. However, it is often the case that biological processes involve rotation or torque generation, and these parameters have been more difficult to access experimentally. Recent advances in the single-molecule field have led to the development of techniques that add the capability of torque measurement. By combining force, displacement, torque, and rotational data, a more comprehensive description of the mechanics of a biomolecule can be achieved. In this review, we highlight a number of biological processes for which torque plays a key mechanical role. We describe the various techniques that have been developed to directly probe the torque experienced by a single molecule, and detail a variety of measurements made to date using these new technologies. We conclude by discussing a number of open questions and propose systems of study that would be well suited for analysis with torsional measurement techniques.

  2. Describing litho-constrained layout by a high-resolution model filter

    NASA Astrophysics Data System (ADS)

    Tsai, Min-Chun

    2008-05-01

    A novel high-resolution model (HRM) filtering technique is proposed to describe litho-constrained layouts. Litho-constrained layouts are layouts that are difficult to pattern or are highly sensitive to process fluctuations under current lithography technologies. HRM applies a short-wavelength (or high-NA) model simulation directly to the pre-OPC, original design layout to filter out low spatial-frequency regions and retain the high spatial-frequency components, which are litho-constrained. Since neither OPC nor mask-synthesis steps are involved, this new technique is highly efficient in run time and can be used at the design stage to detect and fix litho-constrained patterns. This method has successfully captured all the hot-spots with less than 15% overshoots on a realistic 80 mm² full-chip M1 layout in the 65nm technology node. A step-by-step derivation of this HRM technique is presented in this paper.

  3. Report to TRMM

    NASA Technical Reports Server (NTRS)

    Jameson, Arthur R.

    1997-01-01

    The effort involved three elements all related to the measurement of rain and clouds using microwaves: (1) Examine recently proposed techniques for measuring rainfall rate and rain water content using data from ground-based radars and the TRMM microwave link in order to develop improved ground validation and radar calibration techniques; (2) Develop dual-polarization, multiple frequency radar techniques for estimating rain water content and cloud water content to interpret the vertical profiles of radar reflectivity factors (Z) measured by the TRMM Precipitation Radar; and (3) Investigate theoretically and experimentally the potential biases in TRMM Z measurements due to spatial inhomogeneities in precipitation. The research succeeded in addressing all of these topics, resulting in several refereed publications. In addition, the research indicated that the effects of non-Rayleigh statistics resulting from the nature of the precipitation inhomogeneities will probably not result in serious errors for the TRMM radar measurements, but the TRMM radiometers may be subject to significant bias due to the inhomogeneities.

  4. Advances in Instrumental Analysis of Brominated Flame Retardants: Current Status and Future Perspectives

    PubMed Central

    2014-01-01

    This review aims to highlight the recent advances and methodological improvements in instrumental techniques applied for the analysis of different brominated flame retardants (BFRs). The literature search strategy was based on the recent analytical reviews published on BFRs. The main selection criteria involved the successful development and application of analytical methods for determination of the target compounds in various environmental matrices. Different factors affecting chromatographic separation and mass spectrometric detection of brominated analytes were evaluated and discussed. Techniques using advanced instrumentation to achieve outstanding results in quantification of different BFRs and their metabolites/degradation products were highlighted. Finally, research gaps in the field of BFR analysis were identified and recommendations for future research were proposed. PMID:27433482

  5. Enriching mission planning approach with state transition graph heuristics for deep space exploration

    NASA Astrophysics Data System (ADS)

    Jin, Hao; Xu, Rui; Xu, Wenming; Cui, Pingyuan; Zhu, Shengying

    2017-10-01

    To support China's Mars exploration missions, automated mission planning is required to enhance the security and robustness of deep space probes. Deep space mission planning requires the modeling of complex operational constraints and focuses on the temporal state transitions of the involved subsystems. State transitions are ubiquitous in physical systems but have been elusive to capture in knowledge descriptions. We introduce a modeling approach that copes with these difficulties by taking state transitions into consideration. The key technique we build on is the notion of extended states and state transition graphs. Furthermore, a heuristic based on state transition graphs is proposed to avoid redundant work. Finally, we run comprehensive experiments on selected domains, and our techniques show excellent performance.

  6. Automatic video segmentation and indexing

    NASA Astrophysics Data System (ADS)

    Chahir, Youssef; Chen, Liming

    1999-08-01

    Indexing is an important aspect of video database management. Video indexing involves the analysis of video sequences, which is a computationally intensive process. However, effective management of digital video requires robust indexing techniques. The purpose of our proposed video segmentation is twofold. First, we develop an algorithm that identifies camera shot boundaries; the approach is based on a combination of color histograms and a block-based technique. Next, each temporal segment is represented by a color reference frame, which captures shot similarity and is used to constitute scenes. Experimental results using a variety of videos selected from the corpus of the French National Audiovisual Institute are presented to demonstrate the effectiveness of shot detection, shot content characterization, and scene constitution.
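
    The shot-boundary step can be illustrated with a plain color-histogram difference between consecutive frames. The sketch below, in Python, assumes frames are available as RGB arrays; the bin count and threshold are illustrative, and the authors' block-based refinement is not reproduced.

    ```python
    import numpy as np

    def color_histogram(frame, bins=8):
        """3D RGB histogram of a frame (H x W x 3 uint8), normalized to sum to 1."""
        hist, _ = np.histogramdd(
            frame.reshape(-1, 3),
            bins=(bins, bins, bins),
            range=((0, 256), (0, 256), (0, 256)),
        )
        return hist.ravel() / hist.sum()

    def detect_shot_boundaries(frames, threshold=0.4):
        """Return indices where the histogram difference between consecutive
        frames exceeds a threshold (a simple stand-in for the paper's combined
        histogram / block-based test)."""
        boundaries = []
        prev_hist = color_histogram(frames[0])
        for i in range(1, len(frames)):
            hist = color_histogram(frames[i])
            diff = np.abs(hist - prev_hist).sum()   # L1 distance, in [0, 2]
            if diff > threshold:
                boundaries.append(i)
            prev_hist = hist
        return boundaries

    # Example with synthetic frames: a hard cut is simulated at frame 5.
    shot_a = [np.full((48, 64, 3), 40, dtype=np.uint8) for _ in range(5)]
    shot_b = [np.full((48, 64, 3), 200, dtype=np.uint8) for _ in range(5)]
    print(detect_shot_boundaries(shot_a + shot_b))   # expected: [5]
    ```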

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nour, Ali, E-mail: ali.nour@polymtl.ca; Hydro Quebec, Montreal, Quebec, H2L 4P5; Massicotte, Bruno

    This study is aimed at proposing a simple analytical model to investigate the post-cracking behaviour of FRC panels, using an arbitrary tension-softening (stress versus crack-opening) diagram as the input. A new relationship that links the crack opening to the panel deflection is proposed. Due to the stochastic nature of material properties, the random fibre distribution, and other uncertainties that are involved in the concrete mix, this relationship is developed from the analysis of beams having the same thickness using the Monte Carlo simulation (MCS) technique. The softening diagrams obtained from direct tensile tests are used as the input for the calculation, in a deterministic way, of the mean load-displacement response of round panels. A good agreement is found between the model predictions and the experimental results.

  8. Nonlinear robust control of hypersonic aircrafts with interactions between flight dynamics and propulsion systems.

    PubMed

    Li, Zhaoying; Zhou, Wenjie; Liu, Hao

    2016-09-01

    This paper addresses the nonlinear robust tracking controller design problem for hypersonic vehicles. This problem is challenging due to strong coupling between the aerodynamics and the propulsion system, and the uncertainties involved in the vehicle dynamics including parametric uncertainties, unmodeled model uncertainties, and external disturbances. By utilizing the feedback linearization technique, a linear tracking error system is established with prescribed references. For the linear model, a robust controller is proposed based on the signal compensation theory to guarantee that the tracking error dynamics is robustly stable. Numerical simulation results are given to show the advantages of the proposed nonlinear robust control method, compared to the robust loop-shaping control approach. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  9. Robust H(infinity) tracking control of boiler-turbine systems.

    PubMed

    Wu, J; Nguang, S K; Shen, J; Liu, G; Li, Y G

    2010-07-01

    In this paper, the problem of designing a fuzzy H(infinity) state feedback tracking control of a boiler-turbine is solved. First, the Takagi and Sugeno fuzzy model is used to model a boiler-turbine system. Next, based on the Takagi and Sugeno fuzzy model, sufficient conditions for the existence of a fuzzy H(infinity) nonlinear state feedback tracking control are derived in terms of linear matrix inequalities. The advantage of the proposed tracking control design is that it does not involve the feedback linearization technique or a complicated adaptive scheme. An industrial boiler-turbine system is used to illustrate the effectiveness of the proposed design as compared with a linearized approach. 2010 ISA. Published by Elsevier Ltd. All rights reserved.

  10. A Standard Platform for Testing and Comparison of MDAO Architectures

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Moore, Kenneth T.; Hearn, Tristan A.; Naylor, Bret A.

    2012-01-01

    The Multidisciplinary Design Analysis and Optimization (MDAO) community has developed a multitude of algorithms and techniques, called architectures, for performing optimizations on complex engineering systems which involve coupling between multiple discipline analyses. These architectures seek to efficiently handle optimizations with computationally expensive analyses including multiple disciplines. We propose a new testing procedure that can provide a quantitative and qualitative means of comparison among architectures. The proposed test procedure is implemented within the open source framework, OpenMDAO, and comparative results are presented for five well-known architectures: MDF, IDF, CO, BLISS, and BLISS-2000. We also demonstrate how using open source software development methods can allow the MDAO community to submit new problems and architectures to keep the test suite relevant.

  11. Enhanced ICP for the Registration of Large-Scale 3D Environment Models: An Experimental Study

    PubMed Central

    Han, Jianda; Yin, Peng; He, Yuqing; Gu, Feng

    2016-01-01

    One of the main applications of mobile robots is the large-scale perception of the outdoor environment. One of the main challenges of this application is fusing environmental data obtained by multiple robots, especially heterogeneous robots. This paper proposes an enhanced iterative closest point (ICP) method for the fast and accurate registration of 3D environmental models. First, a hierarchical searching scheme is combined with the octree-based ICP algorithm. Second, an early-warning mechanism is used to perceive the local minimum problem. Third, a heuristic escape scheme based on sampled potential transformation vectors is used to avoid local minima and achieve optimal registration. Experiments involving one unmanned aerial vehicle and one unmanned surface vehicle were conducted to verify the proposed technique. The experimental results were compared with those of normal ICP registration algorithms to demonstrate the superior performance of the proposed method. PMID:26891298
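
    For reference, the baseline that the enhanced method builds on is standard point-to-point ICP: find nearest-neighbour correspondences, solve for the best rigid transform, and iterate. The Python sketch below shows that baseline only; the octree hierarchy, early-warning mechanism, and heuristic escape scheme of the paper are not reproduced.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def best_rigid_transform(src, dst):
        """Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:            # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = dst_c - R @ src_c
        return R, t

    def icp(source, target, iterations=30, tol=1e-6):
        """Plain point-to-point ICP: returns an aligned copy of `source`."""
        tree = cKDTree(target)
        current = source.copy()
        prev_err = np.inf
        for _ in range(iterations):
            dist, idx = tree.query(current)            # nearest-neighbour correspondences
            R, t = best_rigid_transform(current, target[idx])
            current = current @ R.T + t
            err = dist.mean()
            if abs(prev_err - err) < tol:              # converged
                break
            prev_err = err
        return current

    # Example: recover a known rotation + translation of a random point cloud.
    rng = np.random.default_rng(1)
    target = rng.random((500, 3))
    angle = np.deg2rad(10)
    Rz = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
    source = (target - 0.05) @ Rz.T
    aligned = icp(source, target)
    print("mean residual:", np.linalg.norm(aligned - target, axis=1).mean())
    ```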

  12. Singular spectrum decomposition of Bouligand-Minkowski fractal descriptors: an application to the classification of texture Images

    NASA Astrophysics Data System (ADS)

    Florindo, João Batista

    2018-04-01

    This work proposes the use of Singular Spectrum Analysis (SSA) for the classification of texture images, more specifically, to enhance the performance of the Bouligand-Minkowski fractal descriptors in this task. Fractal descriptors are known to be a powerful approach to model and, particularly, identify complex patterns in natural images. Nevertheless, the multiscale analysis involved in those descriptors makes them highly correlated. Although other attempts to address this point were proposed in the literature, none of them investigated the relation between the fractal correlation and the well-established analyses employed for time series, of which SSA is one of the most powerful techniques. The proposed method was employed for the classification of benchmark texture images and the results were compared with other state-of-the-art classifiers, confirming the potential of this analysis in image classification.

  13. Hybrid power system intelligent operation and protection involving distributed architectures and pulsed loads

    NASA Astrophysics Data System (ADS)

    Mohamed, Ahmed

    Efficient and reliable techniques for power delivery and utilization are needed to account for the increased penetration of renewable energy sources in electric power systems. Such methods are also required for current and future demands of plug-in electric vehicles and high-power electronic loads. Distributed control and optimal power network architectures will lead to viable solutions to the energy management issue with a high level of reliability and security. This dissertation is aimed at developing and verifying new techniques for distributed control by deploying DC microgrids, involving distributed renewable generation and energy storage, through the operating AC power system. To achieve the findings of this dissertation, an energy system architecture was developed involving AC and DC networks, both with distributed generation and demands. The various components of the DC microgrid were designed and built, including DC-DC converters, voltage source inverters (VSI) and AC-DC rectifiers featuring novel designs developed by the candidate. New control techniques were developed and implemented to maximize the operating range of the power conditioning units used for integrating renewable energy into the DC bus. The control and operation of the DC microgrids in the hybrid AC/DC system involve intelligent energy management. Real-time energy management algorithms were developed and experimentally verified. These algorithms are based on intelligent decision-making elements along with an optimization process. This was aimed at enhancing the overall performance of the power system and mitigating the effect of heavy non-linear loads with variable intensity and duration. The developed algorithms were also used for managing the charging/discharging process of plug-in electric vehicle emulators. The protection of the proposed hybrid AC/DC power system was studied. Fault analysis, protection scheme and coordination, in addition to ideas on how to retrofit currently available protection concepts and devices for AC systems in a DC network, were presented. A study was also conducted on the effect of changing the distribution architecture and distributing the storage assets over the various zones of the network on the system's dynamic security and stability. A practical shipboard power system was studied as an example of a hybrid AC/DC power system involving pulsed loads. The proposed hybrid AC/DC power system, along with most of the ideas, controls and algorithms presented in this dissertation, was experimentally verified at the Smart Grid Testbed of the Energy Systems Research Laboratory.

  14. Infrared spectroscopy as a screening technique for colitis

    NASA Astrophysics Data System (ADS)

    Titus, Jitto; Ghimire, Hemendra; Viennois, Emilie; Merlin, Didier; Perera, A. G. Unil

    2017-05-01

    There remains a great need for diagnosis of inflammatory bowel disease (IBD), for which the current technique, colonoscopy, is not cost-effective and presents a non-negligible risk of complications. Attenuated Total Reflectance Fourier Transform Infrared (ATR-FTIR) spectroscopy is a new screening technique for evaluating colitis. Comparing infrared spectra of sera to study the differences between them can prove challenging due to the complexity of their biological constituents, which gives rise to a plethora of vibrational modes. Overcoming these inherent difficulties of infrared spectral analysis, which involve highly overlapping absorbance peaks, and analyzing the data by curve fitting to improve the resolution are discussed. The proposed technique uses dried serum from colitic and normal wild-type mice to obtain ATR-FTIR spectra that effectively differentiate colitic mice from normal mice. Using this method, the Amide I group frequency (specifically, the alpha-helix to beta-sheet ratio of the protein secondary structure) was identified as a disease-associated spectral signature, in addition to the previously reported glucose and mannose signatures in sera of chronic and acute mouse models of colitis. Hence, this technique will be able to identify changes in sera due to various diseases.
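
    The curve-fitting step mentioned above can be illustrated by decomposing an overlapping Amide I band into two Gaussian components and taking their amplitude ratio. The Python sketch below uses synthetic data; the band centers, widths, and noise level are illustrative assumptions, not values from the study.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def two_gaussians(x, a1, c1, w1, a2, c2, w2):
        """Sum of two Gaussian bands (amplitude, center, width for each)."""
        return (a1 * np.exp(-((x - c1) / w1) ** 2)
                + a2 * np.exp(-((x - c2) / w2) ** 2))

    # Synthetic Amide I region (~1600-1700 cm^-1) with two overlapping bands:
    # an alpha-helix band near 1655 cm^-1 and a beta-sheet band near 1630 cm^-1.
    x = np.linspace(1600, 1700, 400)
    rng = np.random.default_rng(6)
    truth = two_gaussians(x, 1.0, 1655, 12, 0.6, 1630, 10)
    spectrum = truth + rng.normal(scale=0.02, size=x.size)

    # Initial guesses (illustrative values) near the expected band positions.
    p0 = [0.8, 1650, 15, 0.5, 1635, 15]
    params, _ = curve_fit(two_gaussians, x, spectrum, p0=p0)
    a_helix, a_sheet = params[0], params[3]
    print("alpha-helix / beta-sheet amplitude ratio:", round(a_helix / a_sheet, 2))
    ```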

  15. A Secure and Robust Approach to Software Tamper Resistance

    NASA Astrophysics Data System (ADS)

    Ghosh, Sudeep; Hiser, Jason D.; Davidson, Jack W.

    Software tamper-resistance mechanisms have increasingly assumed significance as a technique to prevent unintended uses of software. Closely related to anti-tampering techniques are obfuscation techniques, which make code difficult to understand or analyze and, therefore, challenging to modify meaningfully. This paper describes a secure and robust approach to software tamper resistance and obfuscation using process-level virtualization. The proposed techniques involve novel uses of software checksumming guards and encryption to protect an application. In particular, a virtual machine (VM) is assembled with the application at software build time such that the application cannot run without the VM. The VM provides just-in-time decryption of the program and dynamism for the application's code. The application's code is used to protect the VM to ensure a level of circular protection. Finally, to prevent the attacker from obtaining an analyzable snapshot of the code, the VM periodically discards all decrypted code. We describe a prototype implementation of these techniques and evaluate the run-time performance of applications using our system. We also discuss how our system provides stronger protection against tampering attacks than previously described tamper-resistance approaches.

  16. Interdental papillary house: a new concept and guide for clinicians.

    PubMed

    Gonzalez, Marly Kimie Sonohara; Almeida, Ana Lucia Pompeia Fraga; Greghi, Sebastiao Luiz Aguiar; Pegoraro, Luiz Fernando; Mondelli, Jose; Moreno, Tatiana

    2011-01-01

    Surgical and nonsurgical techniques have been proposed to regenerate interdental papillae. The results are influenced by the morphology of the interdental space, which is the housing for the papilla. The concept of the interdental papillary "house" has been established not only to allow diagnosis of the causes of papillary loss, but also to manage and predict reconstruction of the interdental gingival tissue. The outline of the house is determined by the adjacent teeth in contact, including the proximal contact, the contour and shape of the teeth, the course of the cementoenamel junction, the interdental distance, and the underlying bone crest. Since these components act in combination, an understanding of each allows adequate treatment planning involving interdisciplinary procedures. This new concept serves as a guide and teaching aid for the practitioner.

  17. Genetic causes of male infertility.

    PubMed

    Stouffs, Katrien; Seneca, Sara; Lissens, Willy

    2014-05-01

    Male infertility, affecting around half of couples who have difficulty conceiving, is a very heterogeneous condition. Some patients have a defect in spermatogenesis, the underlying causes of which (including genetic ones) remain largely unknown. The only genetic tests routinely used in the diagnosis of male infertility are analyses for the presence of Yq microdeletions and/or chromosomal abnormalities. Various other single-gene or polygenic defects have been proposed to be involved in male fertility, yet their causative effect often remains to be proven. The recent evolution in the development of whole genome-based techniques may help in clarifying the role of genes and other genetic factors involved in spermatogenesis and spermatogenesis defects. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  18. TripSense: A Trust-Based Vehicular Platoon Crowdsensing Scheme with Privacy Preservation in VANETs

    PubMed Central

    Hu, Hao; Lu, Rongxing; Huang, Cheng; Zhang, Zonghua

    2016-01-01

    In this paper, we propose a trust-based vehicular platoon crowdsensing scheme, named TripSense, in VANETs. The proposed TripSense scheme introduces a trust-based system to evaluate vehicles' sensing abilities and then selects the more capable vehicles in order to improve the accuracy of sensing results. In addition, the sensing tasks are accomplished by platoon member (PM) vehicles and preprocessed by platoon head vehicles before the data are uploaded to the server. Hence, it is less time-consuming and more efficient than schemes in which the data are submitted by individual platoon member vehicles, making it more suitable for ephemeral networks like VANETs. Moreover, our proposed TripSense scheme integrates unlinkable pseudo-ID techniques to achieve PM vehicle identity privacy, and employs a privacy-preserving sensing vehicle selection scheme without involving the PM vehicle's trust score to keep its location private. Detailed security analysis shows that our proposed TripSense scheme not only achieves desirable privacy requirements but also resists attacks launched by adversaries. In addition, extensive simulations are conducted to show the correctness and effectiveness of our proposed scheme. PMID:27258287

  19. Reliability based impact localization in composite panels using Bayesian updating and the Kalman filter

    NASA Astrophysics Data System (ADS)

    Morse, Llewellyn; Sharif Khodaei, Zahra; Aliabadi, M. H.

    2018-01-01

    In this work, a reliability-based impact detection strategy for a sensorized composite structure is proposed. Impacts are localized using Artificial Neural Networks (ANNs), with guided waves recorded from impacts used as inputs. To account for variability in the recorded data under operational conditions, Bayesian updating and Kalman filter techniques are applied to improve the reliability of the detection algorithm. The possibility of having one or more faulty sensors is considered, and a decision fusion algorithm based on sub-networks of sensors is proposed to improve the application of the methodology to real structures. A strategy for reliably categorizing impacts into high-energy impacts, which are likely to cause damage to the structure (true impacts), and low-energy non-damaging impacts (false impacts), has also been proposed to reduce the false alarm rate. This strategy involves employing classification ANNs with different features extracted from the captured signals used as inputs. The proposed methodologies are validated by experimental results on a quasi-isotropic composite coupon impacted with a range of impact energies.
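
    The Kalman filter component can be illustrated by fusing repeated noisy location estimates into a single, lower-variance estimate. The Python sketch below is a generic measurement-update loop with illustrative noise covariances; it is not the authors' exact formulation.

    ```python
    import numpy as np

    def kalman_update(x, P, z, H, R):
        """One Kalman filter measurement update.

        x, P : prior state estimate and covariance
        z    : measurement (e.g., an ANN impact-location estimate)
        H    : measurement matrix
        R    : measurement noise covariance
        """
        y = z - H @ x                       # innovation
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x_new = x + K @ y
        P_new = (np.eye(len(x)) - K @ H) @ P
        return x_new, P_new

    # Toy example: fuse repeated noisy (x, y) impact-location estimates.
    rng = np.random.default_rng(2)
    true_location = np.array([120.0, 80.0])          # mm, illustrative
    x = np.zeros(2)                                  # initial guess
    P = np.eye(2) * 1e4                              # large initial uncertainty
    H = np.eye(2)                                    # the location is measured directly
    R = np.eye(2) * 25.0                             # assumed estimate noise (5 mm std)

    for _ in range(10):
        z = true_location + rng.normal(scale=5.0, size=2)
        x, P = kalman_update(x, P, z, H, R)

    print("fused estimate:", x, "covariance diagonal:", np.diag(P))
    ```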

  20. Novel techniques for data decomposition and load balancing for parallel processing of vision systems: Implementation and evaluation using a motion estimation system

    NASA Technical Reports Server (NTRS)

    Choudhary, Alok Nidhi; Leung, Mun K.; Huang, Thomas S.; Patel, Janak H.

    1989-01-01

    Computer vision systems employ a sequence of vision algorithms in which the output of an algorithm is the input of the next algorithm in the sequence. Algorithms that constitute such systems exhibit vastly different computational characteristics, and therefore, require different data decomposition techniques and efficient load balancing techniques for parallel implementation. However, since the input data for a task is produced as the output data of the previous task, this information can be exploited to perform knowledge based data decomposition and load balancing. Presented here are algorithms for a motion estimation system. The motion estimation is based on the point correspondence between the involved images which are a sequence of stereo image pairs. Researchers propose algorithms to obtain point correspondences by matching feature points among stereo image pairs at any two consecutive time instants. Furthermore, the proposed algorithms employ non-iterative procedures, which results in saving considerable amounts of computation time. The system consists of the following steps: (1) extraction of features; (2) stereo match of images in one time instant; (3) time match of images from consecutive time instants; (4) stereo match to compute final unambiguous points; and (5) computation of motion parameters.

  1. Characterization of Structure and Damage in Materials in Four Dimensions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, I. M.; Schuh, C. A.; Vetrano, J. S.

    2010-09-30

    The materials characterization toolbox has recently experienced a number of parallel revolutionary advances, foreshadowing a time in the near future when materials scientists can quantify material structure across orders of magnitude in length and time scales (i.e., in four dimensions) completely. This paper presents a viewpoint on the materials characterization field, reviewing its recent past, evaluating its present capabilities, and proposing directions for its future development. Electron microscopy; atom-probe tomography; X-ray, neutron and electron tomography; serial sectioning tomography; and diffraction-based analysis methods are reviewed, and opportunities for their future development are highlighted. Particular attention is paid to studies that have pioneered the synergetic use of multiple techniques to provide complementary views of a single structure or process; several of these studies represent the state-of-the-art in characterization, and suggest a trajectory for the continued development of the field. Based on this review, a set of grand challenges for characterization science is identified, including suggestions for instrumentation advances, scientific problems in microstructure analysis, and complex structure evolution problems involving materials damage. The future of microstructural characterization is proposed to be one not only where individual techniques are pushed to their limits, but where the community devises strategies of technique synergy to address complex multiscale problems in materials science and engineering.

  2. An Efficient Adaptive Window Size Selection Method for Improving Spectrogram Visualization.

    PubMed

    Nisar, Shibli; Khan, Omar Usman; Tariq, Muhammad

    2016-01-01

    Short Time Fourier Transform (STFT) is an important technique for the time-frequency analysis of a time-varying signal. The basic approach behind it involves the application of a Fast Fourier Transform (FFT) to a signal multiplied with an appropriate window function of fixed resolution. The selection of an appropriate window size is difficult when no background information about the input signal is known. In this paper, a novel empirical model is proposed that adaptively adjusts the window size for a narrow-band signal using a spectrum sensing technique. For wide-band signals, where a fixed time-frequency resolution is undesirable, the approach adopts the constant Q transform (CQT). Unlike the STFT, the CQT provides a varying time-frequency resolution. This results in a high spectral resolution at low frequencies and a high temporal resolution at high frequencies. In this paper, a simple but effective switching framework is provided between the STFT and the CQT. The proposed method also allows for the dynamic construction of a filter bank according to user-defined parameters, which helps in reducing redundant entries in the filter bank. Results obtained from the proposed method not only improve the spectrogram visualization but also reduce the computation cost and achieve 87.71% accuracy in appropriate window length selection.
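
    A minimal way to picture the adaptive window selection is to measure how narrow-band the signal is and pick the STFT window accordingly. The Python sketch below uses spectral flatness as the sensing statistic; the threshold, window lengths, and the omission of the CQT branch are simplifications of the paper's model.

    ```python
    import numpy as np
    from scipy.signal import stft

    def spectral_flatness(x):
        """Geometric mean / arithmetic mean of the power spectrum (0..1).
        Low values indicate a narrow-band signal, high values a wide-band one."""
        psd = np.abs(np.fft.rfft(x)) ** 2 + 1e-12
        return np.exp(np.mean(np.log(psd))) / np.mean(psd)

    def adaptive_stft(x, fs, flatness_threshold=0.2):
        """Pick a long window for narrow-band signals (better frequency resolution)
        and a short one otherwise (better time resolution). The threshold and
        window lengths are illustrative choices."""
        if spectral_flatness(x) < flatness_threshold:
            nperseg = 2048          # narrow-band: favour frequency resolution
        else:
            nperseg = 256           # wide-band: favour time resolution
            # (The paper switches to a constant-Q transform here instead.)
        f, t, Z = stft(x, fs=fs, nperseg=nperseg)
        return f, t, np.abs(Z), nperseg

    fs = 8000
    t = np.arange(0, 1.0, 1 / fs)
    narrow = np.sin(2 * np.pi * 440 * t)                      # single tone
    wide = np.random.default_rng(3).normal(size=t.size)       # broadband noise

    for name, sig in [("narrow-band", narrow), ("wide-band", wide)]:
        _, _, S, n = adaptive_stft(sig, fs)
        print(f"{name}: chose window length {n}, spectrogram shape {S.shape}")
    ```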

  3. Progress Toward an Efficient and General CFD Tool for Propulsion Design/Analysis

    NASA Technical Reports Server (NTRS)

    Cox, C. F.; Cinnella, P.; Westmoreland, S.

    1996-01-01

    The simulation of propulsive flows inherently involves chemical activity. Recent years have seen substantial strides made in the development of numerical schemes for reacting flowfields, in particular those involving finite-rate chemistry. However, finite-rate calculations are computationally intensive and require knowledge of the actual kinetics, which are not always known with sufficient accuracy. Alternatively, flow simulations based on the assumption of local chemical equilibrium are capable of obtaining physically reasonable results at far less computational cost. The present study summarizes the development of efficient numerical techniques for the simulation of flows in local chemical equilibrium, whereby a 'Black Box' chemical equilibrium solver is coupled to the usual gasdynamic equations. The generalization of the methods enables the modelling of any arbitrary mixture of thermally perfect gases, including air, combustion mixtures and plasmas. As demonstration of the potential of the methodologies, several solutions, involving reacting and perfect gas flows, will be presented. Included is a preliminary simulation of the SSME startup transient. Future enhancements to the proposed techniques will be discussed, including more efficient finite-rate and hybrid (partial equilibrium) schemes. The algorithms that have been developed and are being optimized provide for an efficient and general tool for the design and analysis of propulsion systems.

  4. Two-Stage Thoracoscopic Repair of Long-Gap Esophageal Atresia Using Internal Traction Is Safe and Feasible.

    PubMed

    Tainaka, Takahisa; Uchida, Hiroo; Tanano, Akihide; Shirota, Chiyoe; Hinoki, Akinari; Murase, Naruhiko; Yokota, Kazuki; Oshima, Kazuo; Shirotsuki, Ryo; Chiba, Kosuke; Amano, Hizuru; Kawashima, Hiroshi; Tanaka, Yujiro

    2017-01-01

    The treatment of long-gap esophageal atresia remains an issue for pediatric surgeons. Many techniques for treating long-gap esophageal atresia have been proposed, but the optimal method has not been established. A thoracoscopic esophageal elongation technique has recently been developed. We previously reported a case in which two-stage thoracoscopic repair was performed using internal esophageal traction without esophageal tearing, and in this study we retrospectively reviewed the outcomes of this procedure. Between November 2010 and November 2015, five patients underwent thoracoscopic treatment involving internal esophageal traction for esophageal atresia involving a long gap or a vascular ring. All of these patients successfully underwent thoracoscopic delayed primary anastomosis, and conversion to open thoracotomy was not required in any case. The postoperative complications experienced by the patients included minor anastomotic leakage in 2 cases, anastomotic stenosis in 1 case, gastroesophageal reflux (GER) in 4 cases, and a hiatal hernia in 1 case. None of the patients died. Two-stage thoracoscopic repair for esophageal atresia involving a long gap or vascular ring is a safe and feasible procedure; however, methods must be developed for treating minor anastomotic complications and GER due to esophageal traction in the future.

  5. An Assessment of Radiation Modification from a European Perspective

    NASA Astrophysics Data System (ADS)

    Kristjansson, J. E.; Lawrence, M. G.; Boucher, O.; Haywood, J. M.; Irvine, P. J.; Muri, H.; Schmidt, H.; Schulz, M.; Vaughan, N.; Watson, M.; Born, W.; Schaefer, S.; Stelzer, H.

    2014-12-01

    The European Transdisciplinary Assessment of Climate Engineering (EuTRACE) project (2012-2014) is funded by the European Commission (EC). In EuTRACE, researchers from the natural sciences, social sciences and the humanities have joined forces to assess various proposed geoengineering techniques concerning their radiative forcing potential and side effects, ethical aspects, economics aspects, as well as governance and regulation aspects. A comprehensive assessment report will be submitted to the EC in autumn 2014. We will present some highlights of the part of the EuTRACE assessment that deals with the natural science aspects of proposed Radiation Modification (RM) techniques. The techniques considered are: a) Stratospheric Sulfur Injections; b) Marine Cloud Brightening; c) Desert Brightening; d) Vegetation Brightening; and e) Cirrus Cloud Thinning. A large number of publications in the scientific literature has been considered, as well as recently published assessment reports by the Royal Society in the UK and the German Federal Ministry of Education and Research. Some of the findings of the assessment are: Globally averaged, the current anthropogenic radiative forcing could conceivably be offset by the RM techniques considered. The RM techniques could have a significant global effect already after 1 year or less. Model simulations consistently show that Solar RM leads to regional imbalances due to different spatial footprints of solar and carbon dioxide radiative forcings. This may have significant consequences for precipitation patterns and the hydrological cycle. Very rapid warming is virtually certain if RM were to be stopped abruptly or over a period of one to a few years. Model studies of RM usually assume that the techniques are technologically feasible. In fact, the technological challenges are poorly known, and in many cases the physical processes involved are poorly understood. We will end by discussing key research questions and knowledge gaps.

  6. Automatic selection of landmarks in T1-weighted head MRI with regression forests for image registration initialization.

    PubMed

    Wang, Jianing; Liu, Yuan; Noble, Jack H; Dawant, Benoit M

    2017-10-01

    Medical image registration establishes a correspondence between images of biological structures, and it is at the core of many applications. Commonly used deformable image registration methods depend on a good preregistration initialization. We develop a learning-based method to automatically find a set of robust landmarks in three-dimensional MR image volumes of the head. These landmarks are then used to compute a thin plate spline-based initialization transformation. The process involves two steps: (1) identifying a set of landmarks that can be reliably localized in the images and (2) selecting among them the subset that leads to a good initial transformation. To validate our method, we use it to initialize five well-established deformable registration algorithms that are subsequently used to register an atlas to MR images of the head. We compare our proposed initialization method with a standard approach that involves estimating an affine transformation with an intensity-based approach. We show that for all five registration algorithms the final registration results are statistically better when they are initialized with the method that we propose than when a standard approach is used. The technique that we propose is generic and could be used to initialize nonrigid registration algorithms for other applications.

  7. Determination of the pathological state of skin samples by optical polarimetry parameters

    NASA Astrophysics Data System (ADS)

    Fanjul-Vélez, F.; Ortega-Quijano, N.; Buelta, L.; Arce-Diego, J. L.

    2008-11-01

    Polarimetry is widely known to involve a series of powerful optical techniques that characterize the polarization behaviour of a sample. In this work, we propose a method for applying polarimetric procedures to the characterization of biological tissues, in order to differentiate between healthy and pathologic tissues on a polarimetric basis. Usually, morphological diseases are diagnosed based on histological alterations of the tissue. The fact that these alterations are reflected in the polarization information highlights the suitability of polarimetric procedures for diagnostic purposes. The analysis is mainly focused on the depolarization properties of the media, since the internal structure strongly affects the polarization state of the light that interacts with the sample. Therefore, a method is developed to determine the correlation between pathological ultrastructural characteristics and the resulting variations in the polarimetric parameters of the backscattered light. This study is applied to three samples of porcine skin corresponding to a healthy region, a mole, and a cancerous region. The results show that the proposed method is indeed an adequate technique for achieving early, accurate and effective cancer detection.

  8. Using postural synergies to animate a low-dimensional hand avatar in haptic simulation.

    PubMed

    Mulatto, Sara; Formaglio, Alessandro; Malvezzi, Monica; Prattichizzo, Domenico

    2013-01-01

    A technique to animate a realistic hand avatar with 20 DoFs based on the biomechanics of the human hand is presented. The animation does not use any sensor glove or advanced tracker with markers. The proposed approach is based on the knowledge of a set of kinematic constraints on the model of the hand, referred to as postural synergies, which allows the hand posture to be represented using fewer variables than the number of joints of the hand model. This low-dimensional set of parameters is estimated from direct measurement of the motion of the thumb and index finger, tracked using two haptic devices. A kinematic inversion algorithm has been developed that takes synergies into account and estimates the kinematic configuration of the whole hand, i.e., also of the fingers whose tips are not directly tracked by the two haptic devices. The hand skin is deformable, and its deformation is computed using a linear vertex blending technique. The proposed synergy-based animation of the hand avatar involves only algebraic computations and is suitable for real-time implementation, as required in haptics.

  9. Performance analysis of robust road sign identification

    NASA Astrophysics Data System (ADS)

    Ali, Nursabillilah M.; Mustafah, Y. M.; Rashid, N. K. A. M.

    2013-12-01

    This study describes the performance analysis of a robust road sign identification system that incorporates two stages with different algorithms. The proposed algorithms consist of HSV color filtering and PCA techniques for the detection and recognition stages, respectively. The proposed algorithms are able to detect the three standard sign colors, namely red, yellow and blue. The hypothesis of the study is that road signs can be detected and identified even in the presence of occlusions and rotational changes. PCA is a feature extraction technique that reduces dimensionality; sign images can be easily recognized and identified by the PCA method, which has been used in many application areas. The experimental results show that the HSV stage is robust in road sign detection, with minimum success rates of 88% and 77% for non-occluded and partially occluded images, respectively. Successful recognition rates using PCA are in the range of 94-98%, with all classes recognized successfully at occlusion levels between 5% and 10%.
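
    The detection stage can be sketched as an HSV threshold that isolates candidate sign regions. The Python/OpenCV snippet below shows this for red signs only; the hue/saturation/value bounds are illustrative, and the PCA recognition stage is not shown.

    ```python
    import cv2
    import numpy as np

    def red_sign_mask(bgr_image):
        """Binary mask of red regions in a BGR image using HSV thresholds.

        The bounds are illustrative; in OpenCV the red hue wraps around 0/180,
        so two ranges are combined.
        """
        hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
        lower1 = np.array([0, 100, 60]);   upper1 = np.array([10, 255, 255])
        lower2 = np.array([170, 100, 60]); upper2 = np.array([180, 255, 255])
        mask = cv2.inRange(hsv, lower1, upper1) | cv2.inRange(hsv, lower2, upper2)
        # Clean the mask a little before passing candidate regions to the
        # recognition stage (PCA-based matching, not shown here).
        kernel = np.ones((5, 5), np.uint8)
        return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

    # Example on a synthetic image with a red square on a grey background.
    img = np.full((120, 160, 3), 128, dtype=np.uint8)
    img[30:90, 50:110] = (0, 0, 255)          # pure red in BGR
    mask = red_sign_mask(img)
    print("red pixels detected:", int(np.count_nonzero(mask)))
    ```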

  10. Independent component analysis (ICA) and self-organizing map (SOM) approach to multidetection system for network intruders

    NASA Astrophysics Data System (ADS)

    Abdi, Abdi M.; Szu, Harold H.

    2003-04-01

    With the growing rate of interconnection among computer systems, network security is becoming a real challenge. An Intrusion Detection System (IDS) is designed to protect the availability, confidentiality and integrity of critical network information systems. Today's approach to network intrusion detection involves the use of rule-based expert systems to identify indications of known attacks or anomalies. However, these techniques are less successful in identifying today's attacks. Hackers are perpetually inventing new and previously unanticipated techniques to compromise information infrastructure. This paper proposes a dynamic way of detecting network intruders in time series data. The proposed approach consists of a two-step process. First, an efficient multi-user detection method is obtained, employing the recently introduced complexity minimization approach as a generalization of standard ICA. Second, an unsupervised learning neural network architecture based on Kohonen's Self-Organizing Map is identified for potential functional clustering. These two steps, working together adaptively, provide a pseudo-real-time novelty detection attribute to supplement the current intrusion detection statistical methodology.

  11. A Simple Configuration for Quantitative Phase Contrast Microscopy of Transmissible Samples

    NASA Astrophysics Data System (ADS)

    Sengupta, Chandan; Dasgupta, Koustav; Bhattacharya, K.

    Phase microscopy attempts to visualize and quantify the phase distribution of samples which are otherwise invisible under a microscope without the use of stains. The two principal approaches to phase microscopy are essentially those of Fourier-plane modulation and interferometric techniques. Although the former, first proposed by Zernike, had been the harbinger of phase microscopy, it was the latter that allowed for quantitative evaluation of phase samples. However, interferometric techniques are fraught with associated problems such as a complicated setup involving mirrors and beam-splitters, the need for a matched objective in the reference arm, and the need for vibration isolation. The present work proposes a single-element cube beam-splitter (CBS) interferometer combined with a microscope objective (MO) for interference microscopy. Because of the monolithic nature of the interferometer, the system is almost insensitive to vibrations and relatively simple to align. It is shown that phase shifting properties may also be introduced by suitable use of polarizing devices. Initial results showing the quantitative three-dimensional phase profiles of simulated and actual biological specimens are presented.

  12. Flood Detection/Monitoring Using Adjustable Histogram Equalization Technique

    PubMed Central

    Riaz, Muhammad Mohsin; Ghafoor, Abdul

    2014-01-01

    A flood monitoring technique using adjustable histogram equalization is proposed. The technique overcomes the limitations (over-enhancement, artifacts, and an unnatural look) of the existing technique by adjusting the contrast of images. The proposed technique takes pre- and post-event images and applies different processing steps to generate a flood map without user interaction. The resultant flood maps can be used for flood monitoring and detection. Simulation results show that the proposed technique provides better output quality compared to the existing state-of-the-art technique. PMID:24558332
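
    The general idea of limiting over-enhancement can be illustrated with a clip-limited global histogram equalization. The Python sketch below is not the paper's exact adjustment scheme; the clip fraction and synthetic image are illustrative.

    ```python
    import numpy as np

    def clipped_histogram_equalization(image, clip_fraction=0.02):
        """Global histogram equalization with a clip limit to avoid
        over-enhancement. The paper's adjustment mechanism differs in detail;
        the clip_fraction value here is an illustrative choice.

        image : 2D uint8 array
        """
        hist = np.bincount(image.ravel(), minlength=256).astype(float)
        clip = clip_fraction * image.size
        excess = np.maximum(hist - clip, 0).sum()
        hist = np.minimum(hist, clip) + excess / 256.0   # redistribute the excess
        cdf = np.cumsum(hist)
        cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())
        lut = np.round(255 * cdf).astype(np.uint8)
        return lut[image]

    # Example: a low-contrast synthetic "pre-flood" image.
    rng = np.random.default_rng(4)
    low_contrast = rng.integers(100, 140, size=(64, 64), dtype=np.uint8)
    enhanced = clipped_histogram_equalization(low_contrast)
    print("input range :", low_contrast.min(), low_contrast.max())
    print("output range:", enhanced.min(), enhanced.max())
    ```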

  13. Liver fibrosis in human immunodeficiency virus/hepatitis C virus coinfection: Diagnostic methods and clinical impact

    PubMed Central

    Sagnelli, Caterina; Martini, Salvatore; Pisaturo, Mariantonietta; Pasquale, Giuseppe; Macera, Margherita; Zampino, Rosa; Coppola, Nicola; Sagnelli, Evangelista

    2015-01-01

    Several non-invasive surrogate methods have recently challenged the main role of liver biopsy in assessing liver fibrosis in hepatitis C virus (HCV)-monoinfected and human immunodeficiency virus (HIV)/HCV-coinfected patients, applied to avoid the well-known side effects of liver puncture. Serological tests involve the determination of biochemical markers of synthesis or degradation of fibrosis, tests not readily available in clinical practice, or combinations of routine tests used in chronic hepatitis and HIV/HCV coinfection. Several radiologic techniques have also been proposed, some of which are commonly used in clinical practice. The studies performed to compare the prognostic value of non-invasive surrogate methods with that of the degree of liver fibrosis assessed on liver tissue have not as yet provided conclusive results. Each surrogate technique has shown some limitations, including the risk of over- or under-estimating the extent of liver fibrosis. The current knowledge on liver fibrosis in HIV/HCV-coinfected patients is summarized in this review article, which is addressed in particular to physicians involved in this setting in their clinical practice. PMID:26523204

  14. Integrating surveillance data on water-related diseases and drinking-water quality; action-research in a Brazilian municipality.

    PubMed

    Queiroz, Ana Carolina Lanza; Cardoso, Laís Santos de Magalhães; Heller, Léo; Cairncross, Sandy

    2015-12-01

    The Brazilian Ministry of Health proposed a research study involving municipal professional staff conducting both epidemiological and water quality surveillance, to facilitate the integration of the data which they collected. It aimed to improve intersectoral collaboration and health promotion activities in the municipalities, especially regarding drinking-water quality. We then conducted a study using the action-research approach. At its evaluation phase, a technique which we called 'the tree analogy' was applied in order to identify both possibilities and challenges related to the proposed interlinkage. Results showed that integrating the two data collection systems cannot be achieved without prior institutional adjustments. This suggests the need to address issues that go beyond the selection and interrelation of indicators and the compatibility of software, to include political, administrative and personal matters. The evaluation process led those involved to re-think their practice by sharing experiences encountered in everyday practice, and formulating constructive criticisms. All this inevitably unleashes a process of empowerment. From this perspective, we have certainly gathered some fruit from the tree, but not necessarily the most visible.

  15. Postextraction Dental Implant in the Aesthetic Zone, Socket Shield Technique Versus Conventional Protocol.

    PubMed

    Bramanti, Ennio; Norcia, Antonio; Cicciù, Marco; Matacena, Giada; Cervino, Gabriele; Troiano, Giuseppe; Zhurakivska, Khrystyna; Laino, Luigi

    2018-06-01

    The aim of this randomized controlled trial was to evaluate the survival rate, the marginal bone level, and the aesthetic outcome, at 3 years' follow-up, of dental implants placed in the aesthetic zone by comparing 2 techniques of postextraction implant placement with immediate loading: the socket shield technique and the conventional insertion technique. Several clinical studies suggested that the avulsion of a dental element causes dimensional alterations of both soft and hard tissues at the postextractive site. To improve the aesthetic outcomes, the "socket shield technique" has been proposed. This method involves maintaining the vestibular root portion and immediate insertion of the dental implant in close proximity to the root. Patients enrolled in this study were randomized to receive a postextraction implant in the aesthetic zone, either with the socket shield technique or with the conventional insertion technique. Implant survival, marginal bone level, and the pink aesthetic score were the outcomes evaluated. The implant survival rate was 100% in both groups at 3 years. Implants inserted with the socket shield technique showed better values of both marginal bone level and pink aesthetic score (P < 0.05). Although these preliminary results need to be further confirmed, the socket shield technique seems to be a safe surgical technique that allows an implant rehabilitation characterized by better aesthetic outcomes.

  16. New layer-based imaging and rapid prototyping techniques for computer-aided design and manufacture of custom dental restoration.

    PubMed

    Lee, M-Y; Chang, C-C; Ku, Y C

    2008-01-01

    Fixed dental restoration by conventional methods greatly relies on the skill and experience of the dental technician. The quality and accuracy of the final product depends mostly on the technician's subjective judgment. In addition, the traditional manual operation involves many complex procedures, and is a time-consuming and labour-intensive job. Most importantly, no quantitative design and manufacturing information is preserved for future retrieval. In this paper, a new device for scanning the dental profile and reconstructing 3D digital information of a dental model based on a layer-based imaging technique, called abrasive computer tomography (ACT) was designed in-house and proposed for the design of custom dental restoration. The fixed partial dental restoration was then produced by rapid prototyping (RP) and computer numerical control (CNC) machining methods based on the ACT scanned digital information. A force feedback sculptor (FreeForm system, Sensible Technologies, Inc., Cambridge MA, USA), which comprises 3D Touch technology, was applied to modify the morphology and design of the fixed dental restoration. In addition, a comparison of conventional manual operation and digital manufacture using both RP and CNC machining technologies for fixed dental restoration production is presented. Finally, a digital custom fixed restoration manufacturing protocol integrating proposed layer-based dental profile scanning, computer-aided design, 3D force feedback feature modification and advanced fixed restoration manufacturing techniques is illustrated. The proposed method provides solid evidence that computer-aided design and manufacturing technologies may become a new avenue for custom-made fixed restoration design, analysis, and production in the 21st century.

  17. Early detection of glaucoma using fully automated disparity analysis of the optic nerve head (ONH) from stereo fundus images

    NASA Astrophysics Data System (ADS)

    Sharma, Archie; Corona, Enrique; Mitra, Sunanda; Nutter, Brian S.

    2006-03-01

    Early detection of structural damage to the optic nerve head (ONH) is critical in the diagnosis of glaucoma, because such glaucomatous damage precedes clinically identifiable visual loss. Early detection of glaucoma can prevent progression of the disease and consequent loss of vision. Traditional early detection techniques involve observing changes in the ONH through an ophthalmoscope. Stereo fundus photography is also routinely used to detect subtle changes in the ONH. However, clinical evaluation of stereo fundus photographs suffers from inter- and intra-subject variability. Even the Heidelberg Retina Tomograph (HRT) has not been found to be sufficiently sensitive for early detection. A semi-automated algorithm for quantitative representation of the optic disc and cup contours by computing accumulated disparities in the disc and cup regions from stereo fundus image pairs has already been developed using advanced digital image analysis methodologies. A 3-D visualization of the disc and cup is achieved assuming camera geometry. A high correlation between computer-generated and manually segmented cup-to-disc ratios in a longitudinal study involving 159 stereo fundus image pairs has already been demonstrated. However, clinical usefulness of the proposed technique can only be tested by a fully automated algorithm. In this paper, we present a fully automated algorithm for segmentation of optic cup and disc contours from corresponding stereo disparity information. Because this technique does not involve human intervention, it eliminates subjective variability encountered in currently used clinical methods and provides ophthalmologists with a cost-effective and quantitative method for detection of ONH structural damage for early detection of glaucoma.

  18. Evaluating biomechanical properties of murine embryos using Brillouin microscopy and optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Raghunathan, Raksha; Zhang, Jitao; Wu, Chen; Rippy, Justin; Singh, Manmohan; Larin, Kirill V.; Scarcelli, Giuliano

    2017-08-01

    Embryogenesis is regulated by numerous changes in mechanical properties of the cellular microenvironment. Thus, studying embryonic mechanophysiology can provide a more thorough perspective of embryonic development, potentially improving early detection of congenital abnormalities as well as evaluating and developing therapeutic interventions. A number of methods and techniques have been used to study cellular biomechanical properties during embryogenesis. While some of these techniques are invasive or involve the use of external agents, others are compromised in terms of spatial and temporal resolutions. We propose the use of Brillouin microscopy in combination with optical coherence tomography (OCT) to measure stiffness as well as structural changes in a developing embryo. While Brillouin microscopy assesses the changes in stiffness among different organs of the embryo, OCT provides the necessary structural guidance.

  19. PROMETHEE II: A knowledge-driven method for copper exploration

    NASA Astrophysics Data System (ADS)

    Abedi, Maysam; Ali Torabi, S.; Norouzi, Gholam-Hossain; Hamzeh, Mohammad; Elyasi, Gholam-Reza

    2012-09-01

    This paper describes the application of a well-known Multi Criteria Decision Making (MCDM) technique called Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE II) to explore porphyry copper deposits. Various raster-based evidential layers involving geological, geophysical, and geochemical geo-datasets are integrated to prepare a mineral prospectivity map (MPM). In a case study, thirteen layers of the Now Chun copper deposit, located in the Kerman province of Iran, are used to explore the region of interest. The PROMETHEE II technique is applied to produce the desired MPM, and the outputs are validated using twenty-one boreholes that have been classified into five classes. The proposed method shows high performance in producing the MPM while reducing the cost of exploratory drilling in the study area.
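
    PROMETHEE II itself reduces to computing pairwise preferences, averaging them into positive and negative outranking flows, and ranking by the net flow. The Python sketch below implements that calculation with the simple 'usual' preference function; the layer scores and weights are illustrative, not the Now Chun data.

    ```python
    import numpy as np

    def promethee_ii(scores, weights):
        """PROMETHEE II net outranking flows.

        scores  : (n_alternatives, n_criteria) matrix, larger = better
        weights : (n_criteria,) weights summing to 1
        Uses the simple 'usual' preference function (1 if better, else 0);
        PROMETHEE allows richer preference functions per criterion.
        """
        n, m = scores.shape
        phi_plus = np.zeros(n)
        phi_minus = np.zeros(n)
        for a in range(n):
            for b in range(n):
                if a == b:
                    continue
                d = scores[a] - scores[b]
                phi_plus[a] += weights @ (d > 0)     # aggregated preference pi(a, b)
                phi_minus[a] += weights @ (d < 0)    # aggregated preference pi(b, a)
        phi_plus /= n - 1
        phi_minus /= n - 1
        return phi_plus - phi_minus                  # net flow, higher = more favourable

    # Illustrative example: 4 grid cells scored on 3 evidential layers
    # (e.g., geological, geophysical, geochemical), with assumed weights.
    scores = np.array([[0.8, 0.6, 0.7],
                       [0.4, 0.9, 0.5],
                       [0.6, 0.5, 0.9],
                       [0.2, 0.3, 0.4]])
    weights = np.array([0.5, 0.3, 0.2])
    net_flow = promethee_ii(scores, weights)
    print("net flows:", np.round(net_flow, 3))
    print("ranking (best first):", np.argsort(-net_flow))
    ```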

  20. Diagonalizing the Hamiltonian of λϕ4 theory in 2 space-time dimensions

    NASA Astrophysics Data System (ADS)

    Christensen, Neil

    2018-01-01

    We propose a new non-perturbative technique for calculating the scattering amplitudes of a field theory directly from the eigenstates of the Hamiltonian. Our method involves a discretized momentum space and a momentum cutoff, thereby truncating the Hilbert space and making numerical diagonalization of the Hamiltonian achievable. We show how to do this in the context of a simplified λϕ4 theory in two space-time dimensions. We present the results of our diagonalization, its dependence on time, its dependence on the parameters of the theory, and its renormalization.
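
    The truncate-and-diagonalize strategy can be illustrated on a single-mode toy problem: an anharmonic oscillator expressed in a truncated harmonic-oscillator basis. The Python sketch below is only an analogue of the approach, not the paper's discretized-momentum λϕ4 Hamiltonian; the cutoff values are illustrative.

    ```python
    import numpy as np

    def anharmonic_spectrum(lam, n_max=60, n_levels=5):
        """Low-lying spectrum of H = p^2/2 + x^2/2 + lam * x^4 obtained by
        truncating the harmonic-oscillator basis at n_max states and
        diagonalizing numerically. This is a single-mode toy analogue of the
        truncation-and-diagonalization strategy, not the field-theory Hamiltonian."""
        n = np.arange(n_max)
        # Position operator in the oscillator basis: <n|x|n+1> = sqrt((n+1)/2).
        x = np.zeros((n_max, n_max))
        off = np.sqrt((n[:-1] + 1) / 2.0)
        x[np.arange(n_max - 1), np.arange(1, n_max)] = off
        x[np.arange(1, n_max), np.arange(n_max - 1)] = off

        h0 = np.diag(n + 0.5)                     # free (harmonic) part
        h = h0 + lam * np.linalg.matrix_power(x, 4)
        eigvals = np.linalg.eigh(h)[0]            # sorted ascending
        return eigvals[:n_levels]

    # Convergence check: the low-lying levels should stabilize as the cutoff grows.
    for n_max in (20, 40, 80):
        print(n_max, np.round(anharmonic_spectrum(lam=0.1, n_max=n_max), 6))
    ```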

  1. Copper Corrosion and Biocorrosion Events in Premise Plumbing

    PubMed Central

    Fischer, Diego A.; Alsina, Marco A.; Pastén, Pablo A.

    2017-01-01

    Corrosion of copper pipes may release high amounts of copper into the water, exceeding the maximum concentration of copper for drinking water standards. Typically, the events with the highest release of copper into drinking water are related to the presence of biofilms. This article reviews this phenomenon, focusing on copper ingestion and its health impacts, the physicochemical mechanisms and the microbial involvement on copper release, the techniques used to describe and understand this phenomenon, and the hydrodynamic effects. A conceptual model is proposed and the mathematical models are reviewed. PMID:28872628

  2. Multidimensional Hermite-Gaussian quadrature formulae and their application to nonlinear estimation

    NASA Technical Reports Server (NTRS)

    Mcreynolds, S. R.

    1975-01-01

    A simplified technique is proposed for calculating multidimensional Hermite-Gaussian quadratures that involves taking the square root of a matrix by the Cholesky algorithm rather than computation of the eigenvectors of the matrix. Ways of reducing the dimension, number, and order of the quadratures are set forth. If the function f(x) under the integral sign is not well approximated by a low-order algebraic expression, the order of the quadrature may be reduced by factoring f(x) into an expression that is nearly algebraic and one that is Gaussian.
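
    A common way to realize this idea is to generate a tensor-product Gauss-Hermite rule and map its nodes through the Cholesky factor of the covariance matrix instead of its eigendecomposition. The Python sketch below shows that construction for a Gaussian expectation; the test integrand and covariance are illustrative.

    ```python
    import numpy as np
    from itertools import product

    def gauss_hermite_nd(f, mean, cov, order=5):
        """Approximate E[f(x)] for x ~ N(mean, cov) with a tensor-product
        Gauss-Hermite rule, transforming the nodes with the Cholesky factor of
        the covariance (no eigendecomposition needed). `order` is the number of
        1D nodes per dimension."""
        dim = len(mean)
        nodes, weights = np.polynomial.hermite.hermgauss(order)   # physicists' Hermite
        L = np.linalg.cholesky(cov)                                # "square root" of cov
        total = 0.0
        for idx in product(range(order), repeat=dim):
            xi = nodes[list(idx)]                                  # one 1D node per dimension
            w = np.prod(weights[list(idx)])
            x = mean + np.sqrt(2.0) * L @ xi
            total += w * f(x)
        return total / np.pi ** (dim / 2.0)

    # Check against a known moment: for x ~ N(mu, Sigma), E[x0 * x1] = mu0*mu1 + Sigma01.
    mean = np.array([1.0, -2.0])
    cov = np.array([[2.0, 0.5],
                    [0.5, 1.0]])
    approx = gauss_hermite_nd(lambda x: x[0] * x[1], mean, cov, order=5)
    print("quadrature:", approx, " exact:", mean[0] * mean[1] + cov[0, 1])
    ```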

  3. Cellular solidification in a monotectic system

    NASA Technical Reports Server (NTRS)

    Kaukler, W. F.; Curreri, P. A.

    1987-01-01

    The succinonitrile-glycerol (SN-G) transparent organic monotectic alloy is studied, with particular attention to cellular growth. The phase diagram is determined, near the monotectic composition, with greater accuracy than in previous studies. A solidification interface stability diagram is determined for planar growth. The planar-to-cellular transition is compared to predictions from the Burton-Prim-Slichter theory. A new technique to determine the solute segregation by Fourier transform infrared spectroscopy is developed. Proposed models that involve the cellular interface for alignment of monotectic second-phase spheres or rods are compared with observations.

  4. Inclusion for People with Developmental Disabilities: Measuring an Elusive Construct.

    PubMed

    Neely-Barnes, Susan Louise; Elswick, Susan E

    2016-01-01

    The philosophy of inclusion for people with intellectual and developmental disabilities (IDD) has evolved over the last 50 years. Over time, inclusion research has shifted from a focus on deinstitutionalization to understanding the extent to which individuals with IDD are meaningfully involved in the community and social relationships. Yet, there has been no agreed-upon way to measure inclusion. Many different measurement and data collection techniques have been used in the literature. This study proposes a brief measure of inclusion that can be used with family members and on survey instruments.

  5. An Approach to Vicinal t-Boc-Amino Dibromides via Catalytic Aminobromination of Nitrostyrenes without using Chromatography and Recrystallization

    PubMed Central

    Sun, Hao; Han, Jianlin; Kattamuri, Padmanabha V.; Pan, Yi; Li, Guigen

    2013-01-01

    1.0 mol % of K3PO4·3H2O was found to catalyze the aminohalogenation reaction of nitrostyrenes with N,N-dibromo-tert-butylcarbamate (t-Boc-NBr2) in a dichloroethane system. Good to excellent yields and complete regioselectivity have been achieved by taking advantage of the GAP work-up, without using traditional purification techniques such as column chromatography and recrystallization. A new mechanism is proposed involving radical and ionic catalytic cycles and an intramolecular migration. PMID:23311641

  6. Copper Corrosion and Biocorrosion Events in Premise Plumbing.

    PubMed

    Vargas, Ignacio T; Fischer, Diego A; Alsina, Marco A; Pavissich, Juan P; Pastén, Pablo A; Pizarro, Gonzalo E

    2017-09-05

    Corrosion of copper pipes may release high amounts of copper into the water, exceeding the maximum concentration of copper for drinking water standards. Typically, the events with the highest release of copper into drinking water are related to the presence of biofilms. This article reviews this phenomenon, focusing on copper ingestion and its health impacts, the physicochemical mechanisms and the microbial involvement on copper release, the techniques used to describe and understand this phenomenon, and the hydrodynamic effects. A conceptual model is proposed and the mathematical models are reviewed.

  7. Surgical options in benign parotid tumors: a proposal for classification.

    PubMed

    Quer, Miquel; Vander Poorten, Vincent; Takes, Robert P; Silver, Carl E; Boedeker, Carsten C; de Bree, Remco; Rinaldo, Alessandra; Sanabria, Alvaro; Shaha, Ashok R; Pujol, Albert; Zbären, Peter; Ferlito, Alfio

    2017-11-01

    Different surgical options are currently available for treating benign tumors of the parotid gland, and the discussion on optimal treatment continues despite several meta-analyses. These options include more limited resections (extracapsular dissection, partial lateral parotidectomy) versus more extensive and traditional options (lateral parotid lobectomy, total parotidectomy). Different schools favor one option or another based on their experience, skills and tradition. This review provides a critical analysis of the literature regarding these options. The main limitation of all the studies is the bias of selection for different surgical approaches. For this reason, we propose a staging system that could facilitate clinical decision making and the comparison of results. We propose four categories based on the size of the tumor and its location within the parotid gland. Category I includes tumors up to 3 cm, which are mobile, close to the outer surface and close to the parotid borders. Category II includes deeper tumors up to 3 cm. Category III comprises tumors greater than 3 cm involving two levels of the parotid gland, and category IV tumors are greater than 3 cm and involve more than 2 levels. For each category and for the various pathologic types, a guideline of surgical extent is proposed. The objective of this classification is to facilitate prospective multicentric studies on surgical techniques in the treatment of benign parotid tumors and to enable the comparison of results of different clinical studies.

  8. Deep learning architecture for recognition of abnormal activities

    NASA Astrophysics Data System (ADS)

    Khatrouch, Marwa; Gnouma, Mariem; Ejbali, Ridha; Zaied, Mourad

    2018-04-01

    Video surveillance is one of the key areas of computer vision research. The scientific challenge in this field involves the implementation of automatic systems to obtain detailed information about individual and group behaviors. In particular, the detection of abnormal movements of groups or individuals requires a fine analysis of frames in the video stream. In this article, we propose a new method to detect anomalies in crowded scenes. We try to categorize the video in a supervised mode, accompanied by unsupervised learning using the principle of the autoencoder. In order to construct an informative concept for the recognition of these behaviors, we use a representation technique based on the superposition of human silhouettes. Evaluation on the UMN dataset demonstrates the effectiveness of the proposed approach.
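    As a rough illustration of the autoencoder principle this abstract relies on, the sketch below trains a small reconstruction network on feature vectors of normal frames and flags frames whose reconstruction error is large. It is not the authors' architecture, and the silhouette-superposition features are replaced here by generic synthetic feature vectors.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Stand-ins for frame descriptors (e.g., superposed-silhouette features).
normal = rng.normal(0.0, 1.0, size=(500, 32))
abnormal = rng.normal(3.0, 1.0, size=(20, 32))

# An MLP trained to reproduce its own input acts as a simple autoencoder.
autoencoder = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
autoencoder.fit(normal, normal)

def reconstruction_error(x):
    return np.mean((autoencoder.predict(x) - x) ** 2, axis=1)

threshold = np.percentile(reconstruction_error(normal), 99)
flags = reconstruction_error(abnormal) > threshold   # True marks an anomaly
print("fraction of abnormal frames detected:", flags.mean())
```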

  9. Enhanced low-temperature lithium storage performance of multilayer graphene made through an improved ionic liquid-assisted synthesis

    NASA Astrophysics Data System (ADS)

    Raccichini, Rinaldo; Varzi, Alberto; Chakravadhanula, Venkata Sai Kiran; Kübel, Christian; Balducci, Andrea; Passerini, Stefano

    2015-05-01

    The electrochemical properties of graphene depend strongly on its synthesis. Among the different methods proposed so far, liquid-phase exfoliation is a promising route for the production of graphene. Unfortunately, the low yield of this technique, in terms of solid material obtained, still limits its use to small-scale applications. In this article we propose a low-cost and environmentally friendly method for producing multilayer crystalline graphene with high yield. Such an innovative approach, involving an improved ionic liquid-assisted microwave exfoliation of expanded graphite, allows the production of graphene with, for the first time, advanced lithium-ion storage performance at low temperatures (<0 °C, down to -30 °C) with respect to commercially available graphite.

  10. Neural architecture design based on extreme learning machine.

    PubMed

    Bueno-Crespo, Andrés; García-Laencina, Pedro J; Sancho-Gómez, José-Luis

    2013-12-01

    Selection of the optimal neural architecture to solve a pattern classification problem entails choosing the relevant input units, the number of hidden neurons and their corresponding interconnection weights. This problem has been widely studied in many research works, but the proposed solutions usually involve excessive computational cost and do not provide a unique solution. This paper proposes a new technique to efficiently design the MultiLayer Perceptron (MLP) architecture for classification using the Extreme Learning Machine (ELM) algorithm. The proposed method provides a high generalization capability and a unique solution for the architecture design. Moreover, the selected final network only retains those input connections that are relevant for the classification task. Experimental results show these advantages. Copyright © 2013 Elsevier Ltd. All rights reserved.
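    For readers unfamiliar with ELM, the core computation the paper builds on can be summarized in a few lines: hidden-layer weights are drawn at random and only the output weights are obtained in closed form by a least-squares fit. The sketch below is a generic single-hidden-layer ELM classifier, not the authors' architecture-selection procedure.

```python
import numpy as np

def elm_train(X, y, n_hidden=50, seed=0):
    """Train a basic Extreme Learning Machine classifier.

    X : (n_samples, n_features) inputs
    y : (n_samples,) integer class labels
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    T = np.eye(y.max() + 1)[y]                    # one-hot targets
    beta = np.linalg.pinv(H) @ T                  # least-squares output weights
    return W, b, beta

def elm_predict(X, model):
    W, b, beta = model
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

# Toy usage on two Gaussian blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
model = elm_train(X, y)
print("training accuracy:", (elm_predict(X, model) == y).mean())
```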

  11. Edge compression techniques for visualization of dense directed graphs.

    PubMed

    Dwyer, Tim; Henry Riche, Nathalie; Marriott, Kim; Mears, Christopher

    2013-12-01

    We explore the effectiveness of visualizing dense directed graphs by replacing individual edges with edges connected to 'modules'-or groups of nodes-such that the new edges imply aggregate connectivity. We only consider techniques that offer a lossless compression: that is, where the entire graph can still be read from the compressed version. The techniques considered are: a simple grouping of nodes with identical neighbor sets; Modular Decomposition which permits internal structure in modules and allows them to be nested; and Power Graph Analysis which further allows edges to cross module boundaries. These techniques all have the same goal--to compress the set of edges that need to be rendered to fully convey connectivity--but each successive relaxation of the module definition permits fewer edges to be drawn in the rendered graph. Each successive technique also, we hypothesize, requires a higher degree of mental effort to interpret. We test this hypothetical trade-off with two studies involving human participants. For Power Graph Analysis we propose a novel optimal technique based on constraint programming. This enables us to explore the parameter space for the technique more precisely than could be achieved with a heuristic. Although applicable to many domains, we are motivated by--and discuss in particular--the application to software dependency analysis.
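    The simplest of the three compressions, grouping nodes with identical neighbor sets, is easy to make concrete. The sketch below groups nodes of a directed graph whose in- and out-neighbor sets coincide; it illustrates only this first technique, not Modular Decomposition or Power Graph Analysis, and the tiny example graph is made up.

```python
from collections import defaultdict

# Directed edges of a small example graph.
edges = [("a", "x"), ("b", "x"), ("a", "y"), ("b", "y"), ("c", "x")]

succ, pred = defaultdict(set), defaultdict(set)
for u, v in edges:
    succ[u].add(v)
    pred[v].add(u)

nodes = {n for e in edges for n in e}
groups = defaultdict(list)
for n in nodes:
    # Nodes sharing both successor and predecessor sets form one module.
    key = (frozenset(succ[n]), frozenset(pred[n]))
    groups[key].append(n)

modules = [sorted(g) for g in groups.values() if len(g) > 1]
print(modules)   # [['a', 'b']] -- 'a' and 'b' connect identically to x and y
```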

  12. Modified kernel-based nonlinear feature extraction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, J.; Perkins, S. J.; Theiler, J. P.

    2002-01-01

    Feature Extraction (FE) techniques are widely used in many applications to pre-process data in order to reduce the complexity of subsequent processes. A group of kernel-based nonlinear FE (KFE) algorithms has attracted much attention due to their high performance. However, a serious limitation inherent in these algorithms -- the maximal number of features extracted by them is limited by the number of classes involved -- dramatically degrades their flexibility. Here we propose a modified version of those KFE algorithms (MKFE). This algorithm is developed from a special form of scatter-matrix, whose rank is not determined by the number of classes involved, and thus breaks the inherent limitation in those KFE algorithms. Experimental results suggest that the MKFE algorithm is especially useful when the training set is small.

  13. Process modelling for space station experiments

    NASA Technical Reports Server (NTRS)

    Rosenberger, Franz; Alexander, J. Iwan D.

    1988-01-01

    The work performed during the first year (1 Oct. 1987 to 30 Sept. 1988) involved analyses of crystal growth from the melt and from solution. The particular melt growth technique under investigation is directional solidification by the Bridgman-Stockbarger method. Two types of solution growth systems are also being studied. One involves growth from solution in a closed container; the other concerns growth of protein crystals by the hanging drop method. Following discussions with Dr. R. J. Naumann of the Low Gravity Science Division at MSFC, it was decided to tackle the analysis of crystal growth from the melt earlier than originally proposed. Rapid progress was made in this area. Work is on schedule, and full calculations have been underway for some time. Progress was also made in the formulation of the two solution growth models.

  14. Object extraction method for image synthesis

    NASA Astrophysics Data System (ADS)

    Inoue, Seiki

    1991-11-01

    The extraction of component objects from images is fundamentally important for image synthesis. In TV program production, one useful method is the video-matte technique for specifying the necessary boundary of an object. This, however, involves some intricate and tedious manual processes. The new method proposed in this paper can reduce the level of operator skill needed and simplify object extraction. The object is automatically extracted from just a simple drawing of a thick boundary line. The basic principle involves thinning the thick-boundary-line binary image using the edge intensity of the original image. This method has many practical advantages, including the simplicity of specifying an object, the high accuracy of the thinned-out boundary line, its ease of application to moving images, and the lack of any need for adjustment.

  15. Advances in Testing Techniques for Digital Microfluidic Biochips

    PubMed Central

    Shukla, Vineeta; Hussin, Fawnizu Azmadi; Hamid, Nor Hisham; Zain Ali, Noohul Basheer

    2017-01-01

    With the advancement of digital microfluidics technology, applications such as on-chip DNA analysis, point-of-care diagnosis and automated drug discovery are common nowadays. The use of Digital Microfluidics Biochips (DMFBs) in disease assessment and the recognition of target molecules has become popular during the past few years. The reliability of these DMFBs is crucial when they are used in various medical applications. Errors found in these biochips are mainly due to the defects developed during droplet manipulation, chip degradation and inaccuracies in the bio-assay experiments. The recently proposed Micro-electrode-dot Array (MEDA)-based DMFBs involve both fluidic and electronic domains in the micro-electrode cell. Thus, the testing techniques for these biochips should be revised in order to ensure proper functionality. This paper describes recent advances in the testing technologies for digital microfluidics biochips, which would serve as a useful platform for developing revised/new testing techniques for MEDA-based biochips. Therefore, the relevancy of these techniques with respect to testing of MEDA-based biochips is analyzed in order to exploit the full potential of these biochips. PMID:28749411

  16. Single-molecule comparison of DNA Pol I activity with native and analog nucleotides

    NASA Astrophysics Data System (ADS)

    Gul, Osman; Olsen, Tivoli; Choi, Yongki; Corso, Brad; Weiss, Gregory; Collins, Philip

    2014-03-01

    DNA polymerases are critical enzymes for DNA replication, and because of their complex catalytic cycle they are excellent targets for investigation by single-molecule experimental techniques. Recently, we studied the Klenow fragment (KF) of DNA polymerase I using a label-free, electronic technique involving single KF molecules attached to carbon nanotube transistors. The electronic technique allowed long-duration monitoring of a single KF molecule while processing thousands of template strands. Processivity of up to 42 nucleotide bases was directly observed, and statistical analysis of the recordings determined key kinetic parameters for the enzyme's open and closed conformations. Subsequently, we have used the same technique to compare the incorporation of canonical nucleotides like dATP to analogs like 1-thio-2'-dATP. The analog had almost no effect on the duration of the closed conformation, during which the nucleotide is incorporated. On the other hand, the analog increased the rate-limiting duration of the open conformation by almost 40%. We propose that the thiolated analog interferes with KF's recognition and binding, two key steps that determine its ensemble turnover rate.

  17. Advances in Testing Techniques for Digital Microfluidic Biochips.

    PubMed

    Shukla, Vineeta; Hussin, Fawnizu Azmadi; Hamid, Nor Hisham; Zain Ali, Noohul Basheer

    2017-07-27

    With the advancement of digital microfluidics technology, applications such as on-chip DNA analysis, point-of-care diagnosis and automated drug discovery are common nowadays. The use of Digital Microfluidics Biochips (DMFBs) in disease assessment and the recognition of target molecules has become popular during the past few years. The reliability of these DMFBs is crucial when they are used in various medical applications. Errors found in these biochips are mainly due to the defects developed during droplet manipulation, chip degradation and inaccuracies in the bio-assay experiments. The recently proposed Micro-electrode-dot Array (MEDA)-based DMFBs involve both fluidic and electronic domains in the micro-electrode cell. Thus, the testing techniques for these biochips should be revised in order to ensure proper functionality. This paper describes recent advances in the testing technologies for digital microfluidics biochips, which would serve as a useful platform for developing revised/new testing techniques for MEDA-based biochips. Therefore, the relevancy of these techniques with respect to testing of MEDA-based biochips is analyzed in order to exploit the full potential of these biochips.

  18. Location of planar targets in three space from monocular images

    NASA Technical Reports Server (NTRS)

    Cornils, Karin; Goode, Plesent W.

    1987-01-01

    Many pieces of existing and proposed space hardware that would be targets of interest for a telerobot can be represented as planar or near-planar surfaces. Examples include the biostack modules on the Long Duration Exposure Facility, the panels on Solar Max, large diameter struts, and refueling receptacles. Robust and temporally efficient methods for locating such objects with sufficient accuracy are therefore worth developing. Two techniques that derive the orientation and location of an object from its monocular image are discussed and the results of experiments performed to determine translational and rotational accuracy are presented. Both the quadrangle projection and elastic matching techniques extract three-space information using a minimum of four identifiable target points and the principles of the perspective transformation. The selected points must describe a convex polygon whose geometric characteristics are prespecified in a data base. The rotational and translational accuracy of both techniques was tested at various ranges. This experiment is representative of the sensing requirements involved in a typical telerobot target acquisition task. Both techniques determined target location to an accuracy sufficient for consistent and efficient acquisition by the telerobot.
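    The geometric core of both techniques, recovering position and orientation from four identified coplanar target points under the perspective transformation, can be reproduced today with a standard planar pose solver. The sketch below uses OpenCV's solvePnP as a stand-in; the target geometry, image points, and camera intrinsics are made-up values, and this is not the original quadrangle-projection or elastic-matching code.

```python
import numpy as np
import cv2

# Four coplanar target points (metres) in the target's own frame (z = 0).
object_pts = np.array([[-0.1, -0.1, 0.0],
                       [ 0.1, -0.1, 0.0],
                       [ 0.1,  0.1, 0.0],
                       [-0.1,  0.1, 0.0]], dtype=np.float64)

# Their observed image locations (pixels) from a monocular camera.
image_pts = np.array([[312.0, 248.0],
                      [402.0, 251.0],
                      [398.0, 341.0],
                      [309.0, 337.0]], dtype=np.float64)

# Assumed pinhole intrinsics (focal length and principal point in pixels).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None,
                              flags=cv2.SOLVEPNP_IPPE)  # planar-target solver
R, _ = cv2.Rodrigues(rvec)          # rotation matrix of the target
print("translation (m):", tvec.ravel())
print("rotation matrix:\n", R)
```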

  19. Distributed reinforcement learning for adaptive and robust network intrusion response

    NASA Astrophysics Data System (ADS)

    Malialis, Kleanthis; Devlin, Sam; Kudenko, Daniel

    2015-07-01

    Distributed denial of service (DDoS) attacks constitute a rapidly evolving threat in the current Internet. Multiagent Router Throttling is a novel approach to defend against DDoS attacks where multiple reinforcement learning agents are installed on a set of routers and learn to rate-limit or throttle traffic towards a victim server. The focus of this paper is on online learning and scalability. We propose an approach that incorporates task decomposition, team rewards and a form of reward shaping called difference rewards. One of the novel characteristics of the proposed system is that it provides a decentralised coordinated response to the DDoS problem, thus being resilient to DDoS attacks themselves. The proposed system learns remarkably fast, thus being suitable for online learning. Furthermore, its scalability is successfully demonstrated in experiments involving 1000 learning agents. We compare our approach against a baseline and a popular state-of-the-art throttling technique from the network security literature and show that the proposed approach is more effective, adaptive to sophisticated attack rate dynamics and robust to agent failures.
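    To make the learning setup concrete, the sketch below shows a single tabular Q-learning throttling agent on a toy one-router environment: the state is the discretised load seen at the router, the action is a throttle fraction, and the reward penalises both victim overload and dropped legitimate traffic. It is a deliberately simplified stand-in for the paper's multi-agent system with task decomposition and difference rewards; all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
ACTIONS = [0.0, 0.25, 0.5, 0.75]     # fraction of traffic to drop
CAPACITY = 10.0                      # victim server capacity (arbitrary units)
Q = np.zeros((5, len(ACTIONS)))      # 5 discretised load levels x actions
alpha, gamma, eps = 0.1, 0.9, 0.1

def observe():
    legit = rng.uniform(2.0, 5.0)
    attack = rng.uniform(0.0, 12.0)
    total = legit + attack
    state = min(int(total // 4), 4)  # crude load discretisation
    return state, legit, total

state, legit, total = observe()
for step in range(20000):
    a = rng.integers(len(ACTIONS)) if rng.random() < eps else int(np.argmax(Q[state]))
    passed = total * (1.0 - ACTIONS[a])
    overload = max(0.0, passed - CAPACITY)
    dropped_legit = legit * ACTIONS[a]
    reward = -overload - dropped_legit          # penalise both failure modes
    next_state, legit, total = observe()
    Q[state, a] += alpha * (reward + gamma * Q[next_state].max() - Q[state, a])
    state = next_state

print("learned throttle per load level:",
      [ACTIONS[int(i)] for i in Q.argmax(axis=1)])
```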

  20. Collaborative care: Using six thinking hats for decision making.

    PubMed

    Cioffi, Jane Marie

    2017-12-01

    To apply the six thinking hats technique for decision making in collaborative care. In collaborative partnerships, effective communication needs to occur in patient, family, and health care professional meetings. The effectiveness of these meetings depends on the engagement of participants and the quality of the meeting process. The use of the six thinking hats technique to engage all participants in effective dialogue is proposed. Discussion paper. Electronic databases, CINAHL, PubMed, and Science Direct, were searched for the years 1990 to 2017. Using the six thinking hats technique in patient and family meetings, nurses can guide a process of dialogue that focuses decision making to build equal care partnerships inclusive of all participants. Nurses will need to develop the skills for using the six thinking hats technique and provide support to all participants during the meeting process. Collaborative decision making can be augmented by the six thinking hats technique to provide patients, families, and health professionals with opportunities to make informed decisions about care that consider key issues for all involved. Nurses, who are most often advocates for patients and their families, are in a unique position to lead this initiative in meetings as they network with all health professionals. © 2017 John Wiley & Sons Australia, Ltd.

  1. Quantitative filter forensics for indoor particle sampling.

    PubMed

    Haaland, D; Siegel, J A

    2017-03-01

    Filter forensics is a promising indoor air investigation technique involving the analysis of dust which has collected on filters in central forced-air heating, ventilation, and air conditioning (HVAC) or portable systems to determine the presence of indoor particle-bound contaminants. In this study, we summarize past filter forensics research to explore what it reveals about the sampling technique and the indoor environment. There are 60 investigations in the literature that have used this sampling technique for a variety of biotic and abiotic contaminants. Many studies identified differences between contaminant concentrations in different buildings using this technique. Based on this literature review, we identified a lack of quantification as a gap in the past literature. Accordingly, we propose an approach to quantitatively link contaminants extracted from HVAC filter dust to time-averaged integrated air concentrations. This quantitative filter forensics approach has great potential to measure indoor air concentrations of a wide variety of particle-bound contaminants. Future studies directly comparing quantitative filter forensics to alternative sampling techniques are required to fully assess this approach, but analysis of past research suggests the enormous possibility of this approach. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
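    Although the quantitative model itself is not given here, the kind of relation such an approach presumably rests on is a simple mass balance: the mass of a contaminant extracted from the filter divided by the volume of air that deposited particles on it. The sketch below illustrates that idea with made-up numbers; the variable names and values are hypothetical and are not taken from the study.

```python
# Hypothetical mass-balance estimate of a time-averaged air concentration
# from HVAC filter dust (all numbers are illustrative only).

mass_on_filter_ug = 150.0     # contaminant mass extracted from the filter (micrograms)
airflow_m3_per_h = 1200.0     # volumetric flow rate through the filter
runtime_h = 24 * 30           # HVAC runtime over the sampling period (hours)
capture_efficiency = 0.6      # fraction of particle-bound contaminant retained by the filter

air_volume_m3 = airflow_m3_per_h * runtime_h
concentration_ug_per_m3 = mass_on_filter_ug / (capture_efficiency * air_volume_m3)
print(f"time-averaged concentration ~ {concentration_ug_per_m3:.2e} ug/m^3")
```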

  2. Wavelet filtered shifted phase-encoded joint transform correlation for face recognition

    NASA Astrophysics Data System (ADS)

    Moniruzzaman, Md.; Alam, Mohammad S.

    2017-05-01

    A new wavelet-filter-based shifted phase-encoded joint transform correlation (WPJTC) technique is proposed for efficient face recognition. The proposed technique uses discrete wavelet decomposition for preprocessing and can effectively accommodate various 3D facial distortions, the effects of noise, and illumination variations. After analyzing different forms of wavelet basis functions, an optimal choice is proposed by considering discrimination capability and processing speed as performance trade-offs. The proposed technique yields better correlation discrimination than alternative pattern recognition techniques such as the phase-shifted phase-encoded fringe-adjusted joint transform correlator. The performance of the proposed WPJTC has been tested using the Yale facial database and the extended Yale facial database under different environments such as illumination variation, noise, and 3D changes in facial expressions. Test results show that the proposed WPJTC yields better performance than alternative JTC-based face recognition techniques.
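    The wavelet preprocessing step described here amounts to a 2-D discrete wavelet decomposition of the face image before correlation. A minimal sketch of that step using PyWavelets is shown below; the choice of the 'db2' basis and of the approximation subband is an assumption for illustration, not the authors' selection.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
face = rng.random((128, 128))          # stand-in for a grey-level face image

# Single-level 2-D discrete wavelet decomposition.
cA, (cH, cV, cD) = pywt.dwt2(face, "db2")

# Keeping the low-frequency approximation subband suppresses noise and
# illumination detail before the joint transform correlation stage.
preprocessed = cA / np.abs(cA).max()
print(face.shape, "->", preprocessed.shape)
```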

  3. Total body water measurements using resonant cavity perturbation techniques.

    PubMed

    Stone, Darren A; Robinson, Martin P

    2004-05-07

    A recent paper proposed a novel technique for determining the total body water (TBW) of patients suffering from abnormal hydration levels, using a resonant cavity perturbation method. Current techniques to measure TBW are limited by resolution and technical constraints. However, this new method involves measuring the dielectric properties of the body by placing a subject in a large cavity resonator and measuring the subsequent change in its resonant frequency, fres, and its Q-factor. Utilizing the relationship that water content correlates with these dielectric properties, it has been shown that the measured response of these parameters enables determination of TBW. Results are presented for a preliminary study using data estimated from anthropometric measurements, where volunteers were asked to lie and stand in an electromagnetically screened room, before and after drinking between 1 and 2 l of water, and in some cases, after voiding the bladder. Notable changes in the parameters were observed; fres showed a negative shift and Q was reduced. Preliminary calibration curves using estimated values of water content have been developed from these results, showing that for each subject the measured resonant frequency is a linear function of TBW. Because the gradients of these calibration curves correlate with the mass-to-height ratio of the volunteers, this demonstrates that a system in which TBW can be unequivocally obtained is feasible. Measured values of TBW have been determined using this new pilot technique, and the values obtained correlate well with theoretical values of body water (r = 0.87); the resolution is very good (750 ml). The results obtained are measurable, repeatable and statistically significant. This leads to confidence in the integrity of the proposed technique.

  4. Total body water measurements using resonant cavity perturbation techniques

    NASA Astrophysics Data System (ADS)

    Stone, Darren A.; Robinson, Martin P.

    2004-05-01

    A recent paper proposed a novel technique for determining the total body water (TBW) of patients suffering from abnormal hydration levels, using a resonant cavity perturbation method. Current techniques to measure TBW are limited by resolution and technical constraints. However, this new method involves measuring the dielectric properties of the body by placing a subject in a large cavity resonator and measuring the subsequent change in its resonant frequency, fres, and its Q-factor. Utilizing the relationship that water content correlates with these dielectric properties, it has been shown that the measured response of these parameters enables determination of TBW. Results are presented for a preliminary study using data estimated from anthropometric measurements, where volunteers were asked to lie and stand in an electromagnetically screened room, before and after drinking between 1 and 2 l of water, and in some cases, after voiding the bladder. Notable changes in the parameters were observed; fres showed a negative shift and Q was reduced. Preliminary calibration curves using estimated values of water content have been developed from these results, showing that for each subject the measured resonant frequency is a linear function of TBW. Because the gradients of these calibration curves correlate with the mass-to-height ratio of the volunteers, this demonstrates that a system in which TBW can be unequivocally obtained is feasible. Measured values of TBW have been determined using this new pilot technique, and the values obtained correlate well with theoretical values of body water (r = 0.87); the resolution is very good (750 ml). The results obtained are measurable, repeatable and statistically significant. This leads to confidence in the integrity of the proposed technique.

  5. A Novel Image Encryption Based on Algebraic S-box and Arnold Transform

    NASA Astrophysics Data System (ADS)

    Farwa, Shabieh; Muhammad, Nazeer; Shah, Tariq; Ahmad, Sohail

    2017-09-01

    Recent studies show that a substitution box (S-box) alone cannot be reliably used in image encryption techniques. In this paper, we propose a novel and secure image encryption scheme that utilizes the combined effect of an algebraic substitution box and the scrambling effect of the Arnold transform. The underlying algorithm involves the application of the S-box, which is the most imperative source of confusion and diffusion in the data. The strength of the proposed algorithm lies, firstly, in the high sensitivity of our S-box to the choice of initial conditions, which makes this S-box stronger than chaos-based S-boxes while saving computational labour by deploying a comparatively simple and direct approach based on the algebraic structure of the multiplicative cyclic group of the Galois field. Secondly, the proposed method becomes more secure by combining the S-box with a certain number of iterations of the Arnold transform. The strength of the S-box is examined in terms of various performance indices such as nonlinearity, the strict avalanche criterion, the bit independence criterion, and linear and differential approximation probabilities. We prove, through the most significant techniques used for the statistical analysis of encrypted images, that our image encryption algorithm satisfies all the necessary criteria to be usefully and reliably implemented in image encryption applications.
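    The Arnold transform used for scrambling is a simple area-preserving map on the pixel grid of a square image: each coordinate pair (x, y) is sent to ((x + y) mod N, (x + 2y) mod N). The sketch below implements that map and its iteration with NumPy; the algebraic S-box stage of the proposed scheme is not reproduced here.

```python
import numpy as np

def arnold_transform(img, iterations=1):
    """Apply the Arnold cat map to a square image 'iterations' times."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "Arnold transform needs a square image"
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    out = img.copy()
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        # (x, y) -> ((x + y) mod n, (x + 2y) mod n), an area-preserving shuffle
        scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

img = np.arange(64, dtype=np.uint8).reshape(8, 8)
print(arnold_transform(img, iterations=3))
```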

  6. Measuring glomerular number from kidney MRI images

    NASA Astrophysics Data System (ADS)

    Thiagarajan, Jayaraman J.; Natesan Ramamurthy, Karthikeyan; Kanberoglu, Berkay; Frakes, David; Bennett, Kevin; Spanias, Andreas

    2016-03-01

    Measuring the glomerular number in the entire, intact kidney using non-destructive techniques is of immense importance in studying several renal and systemic diseases. Commonly used approaches either require destruction of the entire kidney or perform extrapolation from measurements obtained from a few isolated sections. A recent magnetic resonance imaging (MRI) method, based on the injection of a contrast agent (cationic ferritin), has been used to effectively identify glomerular regions in the kidney. In this work, we propose a robust, accurate, and low-complexity method for estimating the number of glomeruli from such kidney MRI images. The proposed technique has a training phase and a low-complexity testing phase. In the training phase, organ segmentation is performed on a few expert-marked training images, and glomerular and non-glomerular image patches are extracted. Using non-local sparse coding to compute similarity and dissimilarity graphs between the patches, the subspace in which the glomerular regions can be discriminated from the rest are estimated. For novel test images, the image patches extracted after pre-processing are embedded using the discriminative subspace projections. The testing phase is of low computational complexity since it involves only matrix multiplications, clustering, and simple morphological operations. Preliminary results with MRI data obtained from five kidneys of rats show that the proposed non-invasive, low-complexity approach performs comparably to conventional approaches such as acid maceration and stereology.

  7. Planning and executing motions for multibody systems in free-fall. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Cameron, Jonathan M.

    1991-01-01

    The purpose of this research is to develop an end-to-end system that can be applied to a multibody system in free-fall to analyze its possible motions, save those motions in a database, and design a controller that can execute those motions. A goal is for the process to be highly automated and involve little human intervention. Ideally, the output of the system would be data and algorithms that could be put in ROM to control the multibody system in free-fall. The research applies to more than just robots in space. It applies to any multibody system in free-fall. Mathematical techniques from nonlinear control theory were used to study the nature of the system dynamics and its possible motions. Optimization techniques were applied to plan motions. Image compression techniques were proposed to compress the precomputed motion data for storage. A linearized controller was derived to control the system while it executes preplanned trajectories.

  8. A-posteriori error estimation for second order mechanical systems

    NASA Astrophysics Data System (ADS)

    Ruiner, Thomas; Fehr, Jörg; Haasdonk, Bernard; Eberhard, Peter

    2012-06-01

    One important issue in the simulation of flexible multibody systems is the reduction of the flexible bodies' degrees of freedom. As far as safety questions are concerned, knowledge about the error introduced by the reduction of the flexible degrees of freedom is helpful and very important. In this work, an a-posteriori error estimator for linear first-order systems is extended to error estimation for mechanical second-order systems. Due to the special second-order structure of mechanical systems, an improvement of the a-posteriori error estimator is achieved. A major advantage of the a-posteriori error estimator is that it is independent of the reduction technique used. Therefore, it can be used for moment-matching-based, Gramian-matrix-based or modal model reduction techniques. The capability of the proposed technique is demonstrated by the a-posteriori error estimation of a mechanical system, and a sensitivity analysis of the parameters involved in the error estimation process is conducted.
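    The abstract does not state the estimator itself; for orientation only, residual-based a-posteriori bounds for reduced first-order systems are commonly written in a form like the one below, where x_r is the reduced state, V the reduction basis, r(s) the reduction residual and L a suitable growth constant. This is a generic textbook form, not necessarily the estimator used by the authors.

```latex
\| e(t) \| \;\le\; \int_0^t e^{\,L\,(t-s)}\,\| r(s) \|\,\mathrm{d}s ,
\qquad e(t) := x(t) - V\,x_r(t)
```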

  9. Efficient Stochastic Rendering of Static and Animated Volumes Using Visibility Sweeps.

    PubMed

    von Radziewsky, Philipp; Kroes, Thomas; Eisemann, Martin; Eisemann, Elmar

    2017-09-01

    Stochastically solving the rendering integral (particularly visibility) is the de-facto standard for physically-based light transport but it is computationally expensive, especially when displaying heterogeneous volumetric data. In this work, we present efficient techniques to speed-up the rendering process via a novel visibility-estimation method in concert with an unbiased importance sampling (involving environmental lighting and visibility inside the volume), filtering, and update techniques for both static and animated scenes. Our major contributions include a progressive estimate of partial occlusions based on a fast sweeping-plane algorithm. These occlusions are stored in an octahedral representation, which can be conveniently transformed into a quadtree-based hierarchy suited for a joint importance sampling. Further, we propose sweep-space filtering, which suppresses the occurrence of fireflies, and investigate different update schemes for animated scenes. Our technique is unbiased, requires little precomputation, is highly parallelizable, and is applicable to various volume data sets, dynamic transfer functions, animated volumes and changing environmental lighting.

  10. Torque Measurement at the Single Molecule Level

    PubMed Central

    Forth, Scott; Sheinin, Maxim Y.; Inman, James; Wang, Michelle D.

    2017-01-01

    Methods for exerting and measuring forces on single molecules have revolutionized the study of the physics of biology. However, it is often the case that biological processes involve rotation or torque generation, and these parameters have been more difficult to access experimentally. Recent advances in the single molecule field have led to the development of techniques which add the capability of torque measurement. By combining force, displacement, torque, and rotational data, a more comprehensive description of the mechanics of a biomolecule can be achieved. In this review, we highlight a number of biological processes for which torque plays a key mechanical role. We describe the various techniques that have been developed to directly probe the torque experienced by a single molecule, and detail a variety of measurements made to date using these new technologies. We conclude by discussing a number of open questions and propose systems of study which would be well suited for analysis with torsional measurement techniques. PMID:23541162

  11. Three-Dimensional Printing of Medicinal Products and the Challenge of Personalized Therapy.

    PubMed

    Zema, Lucia; Melocchi, Alice; Maroni, Alessandra; Gazzaniga, Andrea

    2017-07-01

    By 3-dimensional (3D) printing, solid objects of any shape are fabricated through layer-by-layer addition of materials based on a digital model. At present, such a technique is broadly exploited in many industrial fields because of major advantages in terms of reduced times and costs of development and production. In the biomedical and pharmaceutical domains, the interest in 3D printing is growing in step with the needs of personalized medicine. Printed scaffolds and prostheses have partly replaced medical devices produced by more established techniques, and more recently, 3D printing has been proposed for the manufacturing of drug products. Notably, the availability of patient-tailored pharmaceuticals would be of utmost importance for children, elderly subjects, poor and high metabolizers, and individuals undergoing multiple drug treatments. 3D printing encompasses a range of differing techniques, each involving advantages and open issues. Particularly, solidification of powder, extrusion, and stereolithography have been applied to the manufacturing of drug products. The main challenge to their exploitation for personalized pharmacologic therapy is likely to be related to the regulatory issues involved and to implementation of production models that may allow to efficiently turn the therapeutic needs of individual patients into small batches of appropriate drug products meeting preset quality requirements. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  12. 45 CFR 2102.10 - Timing, scope and content of submissions for proposed projects involving land, buildings, or...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... proposed projects involving land, buildings, or other structures. 2102.10 Section 2102.10 Public Welfare... for proposed projects involving land, buildings, or other structures. (a) A party proposing a project... historical information about the building or other structure to be altered or razed; (ii) The identity of the...

  13. Freehand three-dimensional ultrasound imaging of carotid artery using motion tracking technology.

    PubMed

    Chung, Shao-Wen; Shih, Cho-Chiang; Huang, Chih-Chung

    2017-02-01

    Ultrasound imaging has been extensively used for determining the severity of carotid atherosclerotic stenosis. In particular, the morphological characterization of carotid plaques can be performed for risk stratification of patients. However, using 2D ultrasound imaging for detecting morphological changes in plaques has several limitations. Because the scan is performed on a single longitudinal cross-section, the selected 2D image can hardly represent the entire morphology and volume of the plaque and vessel lumen. In addition, because the precise positions of 2D ultrasound images depend highly on the radiologist's experience, it is difficult in serial long-term examinations of anti-atherosclerotic therapies to relocate the same corresponding planes using 2D B-mode images. This has led to the recent development of three-dimensional (3D) ultrasound imaging, which offers improved visualization and quantification of the complex morphologies of carotid plaques. In the present study, a freehand 3D ultrasound imaging technique based on optical motion tracking technology is proposed. Unlike other optical tracking systems, the marker is a small rigid body that is attached to the ultrasound probe and is tracked by eight high-performance digital cameras. The probe positions in 3D space coordinates are then calibrated at spatial and temporal resolutions of 10 μm and 0.01 s, respectively. The image segmentation procedure involves Otsu's and the active contour model algorithms and accurately detects the contours of the carotid arteries. The proposed imaging technique was verified using normal artery and atherosclerotic stenosis phantoms. Human experiments involving freehand scanning of the carotid artery of a volunteer were also performed. The results indicated that, compared with manual segmentation, the lowest percentage errors of the proposed segmentation procedure were 7.8% and 9.1% for the external and internal carotid arteries, respectively. Finally, the effect of hand shaking was calibrated using the optical tracking system for reconstructing a 3D image. Copyright © 2016 Elsevier B.V. All rights reserved.
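    The segmentation stage named in the abstract combines Otsu thresholding with an active contour model; both are available off the shelf in scikit-image, and the sketch below chains them on a synthetic vessel-like image. The phantom and parameter values are illustrative choices, not those of the study.

```python
import numpy as np
from skimage.filters import threshold_otsu, gaussian
from skimage.segmentation import active_contour

# Synthetic cross-section: a bright disc standing in for a vessel lumen.
yy, xx = np.mgrid[0:200, 0:200]
image = gaussian(((yy - 100) ** 2 + (xx - 100) ** 2 < 40 ** 2).astype(float), 3)

# 1) Otsu's threshold gives a coarse binary lumen mask.
mask = image > threshold_otsu(image)

# 2) An active contour (snake) initialised as a circle refines the boundary.
theta = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([100 + 60 * np.sin(theta), 100 + 60 * np.cos(theta)])
snake = active_contour(image, init, alpha=0.015, beta=10, gamma=0.001)

print("mask pixels:", int(mask.sum()), "snake points:", snake.shape)
```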

  14. Simulation of multivariate stationary stochastic processes using dimension-reduction representation methods

    NASA Astrophysics Data System (ADS)

    Liu, Zhangjun; Liu, Zenghui; Peng, Yongbo

    2018-03-01

    In view of the Fourier-Stieltjes integral formula of multivariate stationary stochastic processes, a unified formulation accommodating spectral representation method (SRM) and proper orthogonal decomposition (POD) is deduced. By introducing random functions as constraints correlating the orthogonal random variables involved in the unified formulation, the dimension-reduction spectral representation method (DR-SRM) and the dimension-reduction proper orthogonal decomposition (DR-POD) are addressed. The proposed schemes are capable of representing the multivariate stationary stochastic process with a few elementary random variables, bypassing the challenges of high-dimensional random variables inherent in the conventional Monte Carlo methods. In order to accelerate the numerical simulation, the technique of Fast Fourier Transform (FFT) is integrated with the proposed schemes. For illustrative purposes, the simulation of horizontal wind velocity field along the deck of a large-span bridge is proceeded using the proposed methods containing 2 and 3 elementary random variables. Numerical simulation reveals the usefulness of the dimension-reduction representation methods.
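    For readers unfamiliar with the starting point, the classical single-variate spectral representation method that the unified formulation generalizes can be written in a few lines: a stationary process is synthesized as a sum of cosines whose deterministic amplitudes are set by the target power spectrum and whose phases are independent and random. The sketch below shows only this baseline SRM with an assumed target spectrum, not the proposed dimension-reduction schemes.

```python
import numpy as np

def srm_sample(S, omega_max, N=512, T=100.0, n_t=2000, seed=0):
    """Generate one sample of a zero-mean stationary process from its
    one-sided power spectral density S(omega) via the spectral
    representation method (sum of cosines with random phases)."""
    rng = np.random.default_rng(seed)
    d_omega = omega_max / N
    omegas = (np.arange(N) + 0.5) * d_omega
    amps = np.sqrt(2.0 * S(omegas) * d_omega)       # deterministic amplitudes
    phases = rng.uniform(0.0, 2.0 * np.pi, N)       # independent random phases
    t = np.linspace(0.0, T, n_t)
    x = (amps[:, None] * np.cos(np.outer(omegas, t) + phases[:, None])).sum(axis=0)
    return t, x

# Example: a simple low-pass target spectrum.
t, x = srm_sample(lambda w: 1.0 / (1.0 + w ** 2), omega_max=20.0)
# For long records the time-averaged variance approximates the integral of S.
print(x.var())
```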

  15. Moving object detection and tracking in videos through turbulent medium

    NASA Astrophysics Data System (ADS)

    Halder, Kalyan Kumar; Tahtali, Murat; Anavatti, Sreenatha G.

    2016-06-01

    This paper addresses the problem of identifying and tracking moving objects in a video sequence with a time-varying background. This is a fundamental task in many computer vision applications, though a very challenging one because turbulence causes blurring and spatiotemporal movements of the background images. Our proposed approach involves two major steps. First, a moving object detection algorithm detects real motions by separating out the turbulence-induced motions using a two-level thresholding technique. In the second step, a feature-based generalized regression neural network is applied to track the detected objects throughout the frames of the video sequence. The proposed approach uses the centroid and area features of the moving objects and creates the reference regions instantly by selecting the objects within a circle. Simulation experiments are carried out on several turbulence-degraded video sequences, and comparisons with an earlier method confirm that the proposed approach provides more effective tracking of the targets.
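    The idea behind the first step, separating genuine object motion from small turbulence-induced jitter with two thresholds on the frame difference, can be sketched as below. The threshold values and the synthetic frames are illustrative choices, not the paper's.

```python
import numpy as np

def detect_real_motion(prev_frame, frame, low=0.05, high=0.25):
    """Two-level thresholding of the absolute frame difference.

    Differences below 'low' are treated as sensor noise, those between
    'low' and 'high' as turbulence-induced jitter, and only differences
    above 'high' are kept as candidate real object motion.
    """
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    diff /= diff.max() + 1e-9
    turbulence = (diff >= low) & (diff < high)
    real_motion = diff >= high
    return real_motion, turbulence

rng = np.random.default_rng(0)
prev_frame = rng.random((120, 160)) * 0.1
frame = prev_frame + rng.normal(0, 0.02, prev_frame.shape)   # jitter everywhere
frame[40:60, 70:90] += 0.8                                   # a genuinely moving object
moving, jitter = detect_real_motion(prev_frame, frame)
print("moving pixels:", int(moving.sum()), "jitter pixels:", int(jitter.sum()))
```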

  16. Alternative Speech Communication System for Persons with Severe Speech Disorders

    NASA Astrophysics Data System (ADS)

    Selouani, Sid-Ahmed; Sidi Yakoub, Mohammed; O'Shaughnessy, Douglas

    2009-12-01

    Assistive speech-enabled systems are proposed to help both French and English speaking persons with various speech disorders. The proposed assistive systems use automatic speech recognition (ASR) and speech synthesis in order to enhance the quality of communication. These systems aim at improving the intelligibility of pathologic speech making it as natural as possible and close to the original voice of the speaker. The resynthesized utterances use new basic units, a new concatenating algorithm and a grafting technique to correct the poorly pronounced phonemes. The ASR responses are uttered by the new speech synthesis system in order to convey an intelligible message to listeners. Experiments involving four American speakers with severe dysarthria and two Acadian French speakers with sound substitution disorders (SSDs) are carried out to demonstrate the efficiency of the proposed methods. An improvement of the Perceptual Evaluation of the Speech Quality (PESQ) value of 5% and more than 20% is achieved by the speech synthesis systems that deal with SSD and dysarthria, respectively.

  17. Improving Prediction Accuracy for WSN Data Reduction by Applying Multivariate Spatio-Temporal Correlation

    PubMed Central

    Carvalho, Carlos; Gomes, Danielo G.; Agoulmine, Nazim; de Souza, José Neuman

    2011-01-01

    This paper proposes a method based on multivariate spatial and temporal correlation to improve prediction accuracy in data reduction for Wireless Sensor Networks (WSN). Prediction of data not sent to the sink node is a technique used to save energy in WSNs by reducing the amount of data traffic. However, it may not be very accurate. Simulations were made involving simple linear regression and multiple linear regression functions to assess the performance of the proposed method. The results show a higher correlation between gathered inputs when compared to time, which is an independent variable widely used for prediction and forecasting. Prediction accuracy is lower when simple linear regression is used, whereas multiple linear regression is the most accurate one. In addition to that, our proposal outperforms some current solutions by about 50% in humidity prediction and 21% in light prediction. To the best of our knowledge, we believe that we are probably the first to address prediction based on multivariate correlation for WSN data reduction. PMID:22346626
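    The comparison made here, simple versus multiple linear regression on correlated sensor inputs, is easy to reproduce in outline. The sketch below fits both models on synthetic humidity data driven by correlated temperature and light readings; the data are fabricated purely to show the mechanics and do not reproduce the reported 50% and 21% gains.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 500
time = np.arange(n, dtype=float)
temperature = 20 + 5 * np.sin(time / 50) + rng.normal(0, 0.5, n)
light = 300 + 100 * np.sin(time / 50 + 0.3) + rng.normal(0, 10, n)
humidity = 80 - 1.5 * temperature - 0.05 * light + rng.normal(0, 1, n)

# Simple linear regression: predict humidity from time alone.
simple = LinearRegression().fit(time.reshape(-1, 1), humidity)
# Multiple linear regression: exploit the multivariate correlation.
multi = LinearRegression().fit(np.column_stack([temperature, light]), humidity)

print("R^2 (time only):        ", simple.score(time.reshape(-1, 1), humidity))
print("R^2 (temperature+light):", multi.score(np.column_stack([temperature, light]), humidity))
```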

  18. The road maintenance funding models in Indonesia use earmarked tax

    NASA Astrophysics Data System (ADS)

    Gultom, Tiopan Henry M.; Tamin, Ofyar Z.; Sjafruddin, Ade; Pradono

    2017-11-01

    One solution for obtaining a sustainable road maintenance fund is to separate road-sector revenue from other accounts and then form a specific account for road maintenance. In 2001, Antameng and the Ministry of Public Works proposed a road fund model for Indonesia. The proposed source of the road fund was a tariff added to the nominal total tax. The road fund policy was proposed to finance road network maintenance at the district and provincial levels. This research aims to create a policy model for road maintenance funds in Indonesia using an earmarked tax mechanism. The research method is qualitative, with triangulation as the data collection technique. Interviews were semi-structured. The strengths, weaknesses, opportunities, and threats of every part of the models were shown on the survey form. Respondents were executives directly involved in the financing of road maintenance. Model validation was conducted by a discussion panel, the Focus Group Discussion (FGD), which involved all selected respondents. The road maintenance financing model most appropriate for Indonesia was one whose revenue sources are earmarked PBBKB, PKB and PPnBM. The revenue collection mechanism was an added tariff on the registered vehicle tax (PKB), the vehicle fuel tax (PBBKB) and the luxury vehicle sales tax (PPnBM). The funds are managed at the provincial level by a public service agency.

  19. Numerical Solution of the Electron Heat Transport Equation and Physics-Constrained Modeling of the Thermal Conductivity via Sequential Quadratic Programming Optimization in Nuclear Fusion Plasmas

    NASA Astrophysics Data System (ADS)

    Paloma, Cynthia S.

    The plasma electron temperature (Te) plays a critical role in a tokamak nuclear fusion reactor since temperatures on the order of 10^8 K are required to achieve fusion conditions. Many plasma properties in a tokamak nuclear fusion reactor are modeled by partial differential equations (PDEs) because they depend not only on time but also on space. In particular, the dynamics of the electron temperature is governed by a PDE referred to as the Electron Heat Transport Equation (EHTE). In this work, a numerical method is developed to solve the EHTE based on a custom finite-difference technique. The solution of the EHTE is compared to temperature profiles obtained by using TRANSP, a sophisticated plasma transport code, for specific discharges from the DIII-D tokamak, located at the DIII-D National Fusion Facility in San Diego, CA. The thermal conductivity (also called thermal diffusivity) of the electrons (Xe) is a plasma parameter that plays a critical role in the EHTE since it indicates how the electron temperature diffusion varies across the minor effective radius of the tokamak. TRANSP approximates Xe through a curve-fitting technique to match experimentally measured electron temperature profiles. While complex physics-based models have been proposed for Xe, there is a lack of a simple mathematical model for the thermal diffusivity that could be used for control design. In this work, a model for Xe is proposed based on a scaling law involving key plasma variables such as the electron temperature (Te), the electron density (ne), and the safety factor (q). An optimization algorithm is developed based on the Sequential Quadratic Programming (SQP) technique to optimize the scaling factors appearing in the proposed model so that the predicted electron temperature and magnetic flux profiles match predefined target profiles in the best possible way. A simulation study summarizing the outcomes of the optimization procedure is presented to illustrate the potential of the proposed modeling method.
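    The optimisation step described, tuning the scaling factors of a diffusivity model of the general form Xe = c0 * Te^a * ne^b * q^c so that predictions match target profiles, maps directly onto SciPy's SLSQP solver. The sketch below fits such exponents to synthetic diffusivity data; the functional form, bounds, and data are assumptions for illustration and do not come from the dissertation or from TRANSP.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
# Synthetic radial profiles standing in for Te (keV), ne (1e19 m^-3) and q.
Te = np.linspace(3.0, 0.5, 40)
ne = np.linspace(6.0, 2.0, 40)
q = np.linspace(1.0, 4.0, 40)
# "Measured" diffusivity generated from a known scaling law plus noise.
chi_target = 0.8 * Te ** -1.0 * ne ** -0.5 * q ** 1.5 * (1 + rng.normal(0, 0.05, 40))

def cost(p):
    c0, a, b, c = p
    chi_model = c0 * Te ** a * ne ** b * q ** c
    return np.sum((chi_model - chi_target) ** 2)   # least-squares mismatch

res = minimize(cost, x0=[1.0, 0.0, 0.0, 0.0], method="SLSQP",
               bounds=[(1e-3, 10.0), (-3.0, 3.0), (-3.0, 3.0), (-3.0, 3.0)])
print("fitted scaling factors (c0, a, b, c):", np.round(res.x, 2))
```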

  20. Creating photo-realistic works in a 3D scene using layers styles to create an animation

    NASA Astrophysics Data System (ADS)

    Avramescu, A. M.

    2015-11-01

    Creating realistic objects in a 3D scene is not easy work; the creation has to be very detailed. Even for those unfamiliar with photo-realistic work, using the right techniques and a good reference photo makes it possible to create an amazing amount of detail and realism. This article presents some of these detailed methods, from which the techniques necessary to make beautiful and realistic objects in a scene can be learned. More precisely, in this paper we present how to create a 3D animated scene, mainly using the Pen Tool and Blending Options. This work is based on teaching some simple ways of using Layer Styles to create convincing shadows, lights, textures and a realistic sense of three dimensions. The present work also shows how the illumination and rendering options can be used to create a realistic effect in a scene. Moreover, this article shows how to create photo-realistic 3D models from a digital image. We present how to use Illustrator paths, texturing, basic lighting and rendering, how to apply textures, and how to parent the building and object components. We also propose using this approach to recreate smaller details or 3D objects from a 2D image. After a critical review of the state of the art, we present the architecture of a design method for creating an animation. The aim is to provide a conceptual and methodological tutorial that addresses this issue both scientifically and in practice. This objective also includes proposing, on a strong scientific basis, a model that enables a better understanding of the techniques necessary to create a realistic animation.

  1. Detection of the KIT D816V mutation in peripheral blood of systemic mastocytosis: diagnostic implications.

    PubMed

    Jara-Acevedo, Maria; Teodosio, Cristina; Sanchez-Muñoz, Laura; Álvarez-Twose, Ivan; Mayado, Andrea; Caldas, Carolina; Matito, Almudena; Morgado, José M; Muñoz-González, Javier I; Escribano, Luis; Garcia-Montero, Andrés C; Orfao, Alberto

    2015-08-01

    Recent studies have found the KIT D816V mutation in peripheral blood of virtually all adult systemic mastocytosis patients once highly sensitive PCR techniques were used; thus, detection of the KIT D816V mutation in peripheral blood has been proposed to be included in the diagnostic work-up of systemic mastocytosis algorithms. However, the precise frequency of the mutation, the biological significance of peripheral blood-mutated cells and their potential association with involvement of bone marrow hematopoietic cells other than mast cells still remain to be investigated. Here, we determined the frequency of peripheral blood involvement by the KIT D816V mutation, as assessed by two highly sensitive PCR methods, and investigated its relationship with multilineage involvement of bone marrow hematopoiesis. Overall, our results confirmed the presence of the KIT D816V mutation in peripheral blood of most systemic mastocytosis cases (161/190; 85%)--with an increasing frequency from indolent systemic mastocytosis without skin lesions (29/44; 66%) to indolent systemic mastocytosis with skin involvement (124/135; 92%), and more aggressive disease subtypes (11/11; 100%)--as assessed by the allele-specific oligonucleotide-qPCR method, which was more sensitive (P<.0001) than the peptide nucleic acid-mediated PCR approach (84/190; 44%). Although the presence of the KIT mutation in peripheral blood, as assessed by the allele-specific oligonucleotide-qPCR technique, did not accurately predict for multilineage bone marrow involvement of hematopoiesis, the allele-specific oligonucleotide-qPCR allele burden and the peptide nucleic acid-mediated-PCR approach did. These results suggest that both methods provide clinically useful and complementary information through the identification and/or quantification of the KIT D816V mutation in peripheral blood of patients suspected of systemic mastocytosis.

  2. Medical physics personnel for medical imaging: requirements, conditions of involvement and staffing levels-French recommendations.

    PubMed

    Isambert, Aurélie; Le Du, Dominique; Valéro, Marc; Guilhem, Marie-Thérèse; Rousse, Carole; Dieudonné, Arnaud; Blanchard, Vincent; Pierrat, Noëlle; Salvat, Cécile

    2015-04-01

    The French regulations concerning the involvement of medical physicists in medical imaging procedures are relatively vague. In May 2013, the ASN and the SFPM issued recommendations regarding Medical Physics Personnel for Medical Imaging: Requirements, Conditions of Involvement and Staffing Levels. In these recommendations, the various areas of activity of medical physicists in radiology and nuclear medicine have been identified and described, and the time required to perform each task has been evaluated. Criteria for defining medical physics staffing levels are thus proposed. These criteria are defined according to the technical platform, the procedures and techniques practised on it, the number of patients treated and the number of persons in the medical and paramedical teams requiring periodic training. The result of this work is an aid available to each medical establishment to determine their own needs in terms of medical physics. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  3. Assessment of the upper motor neuron in amyotrophic lateral sclerosis.

    PubMed

    Huynh, William; Simon, Neil G; Grosskreutz, Julian; Turner, Martin R; Vucic, Steve; Kiernan, Matthew C

    2016-07-01

    Clinical signs of upper motor neuron (UMN) involvement are an important component in supporting the diagnosis of amyotrophic lateral sclerosis (ALS), but are often not easily appreciated in a limb that is concurrently affected by muscle wasting and lower motor neuron degeneration, particularly in the early symptomatic stages of ALS. Whilst recent criteria have been proposed to facilitate improved detection of lower motor neuron impairment through electrophysiological features that have improved diagnostic sensitivity, assessment of upper motor neuron involvement remains essentially clinical. As a result, there is often a significant diagnostic delay that in turn may impact institution of disease-modifying therapy and access to other optimal patient management. Biomarkers of pathological UMN involvement are also required to ensure patients with suspected ALS have timely access to appropriate therapeutic trials. The present review provides an analysis of current and recently developed assessment techniques, including novel imaging and electrophysiological approaches used to study corticomotoneuronal pathology in ALS. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  4. Key Issues Concerning Biolog Use for Aerobic and Anaerobic Freshwater Bacterial Community-Level Physiological Profiling

    NASA Astrophysics Data System (ADS)

    Christian, Bradley W.; Lind, Owen T.

    2006-06-01

    Bacterial heterotrophy in aquatic ecosystems is important in the overall carbon cycle. Biolog MicroPlates provide insight into the metabolic potential of bacteria involved in carbon cycling. Specifically, Biolog EcoPlates™ were developed with ecologically relevant carbon substrates to allow investigators to measure carbon substrate utilization patterns and develop community-level physiological profiles from natural bacterial assemblages. However, understanding of the functionality of these plates in freshwater research is limited. We explored several issues of EcoPlate use for freshwater bacterial assemblages, including inoculum density, incubation temperature, non-bacterial color development, and substrate selectivity. Each of these has various effects on plate interpretation. We offer suggestions and techniques to resolve these interpretation issues. Lastly, we propose a technique to allow EcoPlate use in anaerobic freshwater bacterial studies.

  5. Application of optical coherence tomography for noninvasive blood glucose monitoring during hyperglycemia

    NASA Astrophysics Data System (ADS)

    Larin, Kirill V.; Ashitkov, Taras V.; Motamedi, Massoud; Esenaliev, Rinat O.

    2003-10-01

    Approximately 14 million people in the USA and more than 140 million people worldwide suffer from Diabetes Mellitus. The current glucose sensing technique involves a finger puncture several times a day to obtain a droplet of blood for chemical analysis. Recently we proposed to use optical coherence tomography (OCT) for continuous noninvasive blood glucose sensing through skin. In this paper we tested the OCT technique for noninvasive monitoring of blood glucose concentration in lip tissue of New Zealand rabbits and Yucatan micropigs during glucose clamping experiments. Obtained results show good agreement with results obtained in skin studies, good correlation of changes in the OCT signal slope measured at the depth of 250 to 500 μm with changes in blood glucose concentration, and higher stability of the OCT data points than that obtained from skin.

  6. Sequencing proteins with transverse ionic transport in nanochannels.

    PubMed

    Boynton, Paul; Di Ventra, Massimiliano

    2016-05-03

    De novo protein sequencing is essential for understanding cellular processes that govern the function of living organisms and all sequence modifications that occur after a protein has been constructed from its corresponding DNA code. By obtaining the order of the amino acids that compose a given protein one can then determine both its secondary and tertiary structures through structure prediction, which is used to create models for protein aggregation diseases such as Alzheimer's Disease. Here, we propose a new technique for de novo protein sequencing that involves translocating a polypeptide through a synthetic nanochannel and measuring the ionic current of each amino acid through an intersecting perpendicular nanochannel. We find that the distribution of ionic currents for each of the 20 proteinogenic amino acids encoded by eukaryotic genes is statistically distinct, showing this technique's potential for de novo protein sequencing.

  7. Discrimination of dynamical system models for biological and chemical processes.

    PubMed

    Lorenz, Sönke; Diederichs, Elmar; Telgmann, Regina; Schütte, Christof

    2007-06-01

    In technical chemistry, systems biology and biotechnology, the construction of predictive models has become an essential step in process design and product optimization. Accurate modelling of the reactions requires detailed knowledge about the processes involved. However, when concerned with the development of new products and production techniques, for example, this knowledge often is not available due to the lack of experimental data. Thus, when one has to work with a selection of proposed models, one of the main tasks of early development is to discriminate between these models. In this article, a new statistical approach to model discrimination is described that ranks models with respect to the probability with which they reproduce the given data. The article introduces the new approach, discusses its statistical background, presents numerical techniques for its implementation and illustrates the application with examples from biokinetics.
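    The abstract does not spell out its ranking criterion; as a generic stand-in, the sketch below ranks two competing kinetic models by their BIC weights, a standard way of turning goodness of fit into approximate model probabilities. It illustrates the idea of probabilistic model discrimination only and is not the authors' method; the candidate models and data are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 50)
data = 2.0 * np.exp(-0.6 * t) + rng.normal(0, 0.05, t.size)   # synthetic decay data

# Two candidate process models with different parametrisations.
models = {
    "first-order decay": (lambda t, A, k: A * np.exp(-k * t), [1.0, 1.0]),
    "linear drift":      (lambda t, a, b: a * t + b,          [0.0, 1.0]),
}

def bic(f, p0):
    popt, _ = curve_fit(f, t, data, p0=p0)
    rss = np.sum((data - f(t, *popt)) ** 2)
    n, k = t.size, len(popt)
    return n * np.log(rss / n) + k * np.log(n)   # Bayesian information criterion

bics = {name: bic(f, p0) for name, (f, p0) in models.items()}
best = min(bics.values())
weights = {name: np.exp(-0.5 * (b - best)) for name, b in bics.items()}
total = sum(weights.values())
for name in models:
    print(f"{name}: approximate model probability {weights[name] / total:.3f}")
```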

  8. Potentiometric chip-based multipumping flow system for the simultaneous determination of fluoride, chloride, pH, and redox potential in water samples.

    PubMed

    Chango, Gabriela; Palacio, Edwin; Cerdà, Víctor

    2018-08-15

    A simple potentiometric chip-based multipumping flow system (MPFS) has been developed for the simultaneous determination of fluoride, chloride, pH, and redox potential in water samples. The proposed system was built around a poly(methyl methacrylate) microfluidic chip, combining the advantages of flow techniques with potentiometric detection. For this purpose, an automatic system has been designed and built by optimizing the variables involved in the process, such as pH, ionic strength, stirring, and sample volume. The system was applied successfully to water samples, yielding a versatile setup with an analysis frequency of 12 samples per hour. Good correlation between the chloride and fluoride concentrations measured with the ISEs and with the ion chromatography technique suggests satisfactory reliability of the system. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Cost reduction in space operations - Structuring a planetary program to minimize the annual funding requirement as opposed to minimizing the program runout cost

    NASA Technical Reports Server (NTRS)

    Herman, D. H.; Niehoff, J. C.; Spadoni, D. J.

    1980-01-01

    An approach is proposed for the structuring of a planetary mission set wherein the peak annual funding is minimized to meet the annual budget constraint. One aspect of the approach is to have a transportation capability that can launch a mission in any planetary opportunity; such capability can be provided by solar electric propulsion. Another cost reduction technique is to structure a mission set in a time-sequenced fashion that could utilize essentially the same spacecraft for the implementation of several missions. A third technique would be to fulfill a scientific objective in several sequential missions rather than attempt to accomplish all of the objectives with one mission. The application of the approach is illustrated by an example involving the Solar Orbiter Dual Probe mission.

  10. Multiple Replica Repulsion Technique for Efficient Conformational Sampling of Biological Systems

    PubMed Central

    Malevanets, Anatoly; Wodak, Shoshana J.

    2011-01-01

    Here, we propose a technique for sampling complex molecular systems with many degrees of freedom. The technique, termed “multiple replica repulsion” (MRR), does not suffer from poor scaling with the number of degrees of freedom associated with common replica exchange procedures and does not require sampling at high temperatures. The algorithm involves creation of multiple copies (replicas) of the system, which interact with one another through a repulsive potential that can be applied to the system as a whole or to portions of it. The proposed scheme prevents oversampling of the most populated states and provides accurate descriptions of conformational perturbations typically associated with sampling ground-state energy wells. The performance of MRR is illustrated for three systems of increasing complexity. A two-dimensional toy potential surface is used to probe the sampling efficiency as a function of key parameters of the procedure. MRR simulations of the Met-enkephalin pentapeptide, and the 76-residue protein ubiquitin, performed in the presence of explicit water molecules and totaling 32 ns each, investigate the ability of MRR to characterize the conformational landscape of the peptide, and the protein native basin, respectively. Results obtained for the enkephalin peptide reflect the extensive conformational flexibility of this peptide more closely than previously reported simulations. Those obtained for ubiquitin show that conformational ensembles sampled by MRR largely encompass structural fluctuations relevant to biological recognition, which occur on the microsecond timescale, or are observed in crystal structures of ubiquitin complexes with other proteins. MRR thus emerges as a very promising simple and versatile technique for modeling the structural plasticity of complex biological systems. PMID:21843487
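
    The core idea of coupling replicas through a repulsive potential can be sketched on a toy problem. The example below runs Metropolis Monte Carlo for several replicas on a one-dimensional double-well potential and adds a pairwise Gaussian repulsion between replicas; the potential, repulsion form, and all parameters are illustrative assumptions rather than the settings used in the paper.

      # Sketch of the multiple-replica-repulsion idea on a 1-D double-well toy
      # potential: replicas evolve by Metropolis Monte Carlo under the physical
      # potential plus a pairwise Gaussian repulsion between replicas.
      import numpy as np

      rng = np.random.default_rng(2)

      def potential(x):
          return (x ** 2 - 1.0) ** 2               # double well with minima at +/-1

      def repulsion(xs, strength=0.5, width=0.3):
          # sum of Gaussian penalties over all distinct replica pairs
          diff = xs[:, None] - xs[None, :]
          pair = strength * np.exp(-(diff ** 2) / (2 * width ** 2))
          return 0.5 * (pair.sum() - np.trace(pair))

      n_replicas, beta, steps = 8, 5.0, 20000
      xs = rng.normal(0.0, 0.1, n_replicas)        # all replicas start near the barrier
      total = potential(xs).sum() + repulsion(xs)

      samples = []
      for _ in range(steps):
          i = rng.integers(n_replicas)
          trial = xs.copy()
          trial[i] += rng.normal(0, 0.2)
          new = potential(trial).sum() + repulsion(trial)
          if new <= total or rng.random() < np.exp(-beta * (new - total)):
              xs, total = trial, new
          samples.append(xs.copy())

      samples = np.array(samples[steps // 2:])
      print("fraction of samples in each well:",
            np.mean(samples < 0), np.mean(samples > 0))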

  11. Hardware Neural Network for a Visual Inspection System

    NASA Astrophysics Data System (ADS)

    Chun, Seungwoo; Hayakawa, Yoshihiro; Nakajima, Koji

    The visual inspection of defects in products is heavily dependent on human experience and instinct. In this situation, it is difficult to reduce the production costs and to shorten the inspection time and hence the total process time. Consequently people involved in this area desire an automatic inspection system. In this paper, we propose a hardware neural network, which is expected to provide high-speed operation for automatic inspection of products. Since neural networks can learn, this is a suitable method for self-adjustment of criteria for classification. To achieve high-speed operation, we use parallel and pipelining techniques. Furthermore, we use a piecewise linear function instead of a conventional activation function in order to save hardware resources. Consequently, our proposed hardware neural network achieved 6GCPS and 2GCUPS, which in our test sample proved to be sufficiently fast.
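
    The resource-saving activation mentioned above can be illustrated by comparing a conventional sigmoid with a clamped straight-line ("hard sigmoid") approximation; the slope and breakpoints below are illustrative and are not taken from the reported design.

      # Sketch: piecewise linear ("hard") sigmoid versus the conventional sigmoid,
      # the kind of substitution used to save hardware resources.
      import numpy as np

      def sigmoid(x):
          return 1.0 / (1.0 + np.exp(-x))

      def piecewise_linear_sigmoid(x):
          # clamp a straight line of slope 1/4 through (0, 0.5) to the range [0, 1]
          return np.clip(0.25 * x + 0.5, 0.0, 1.0)

      x = np.linspace(-6, 6, 241)
      max_err = np.max(np.abs(sigmoid(x) - piecewise_linear_sigmoid(x)))
      print(f"max absolute deviation from the true sigmoid: {max_err:.3f}")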

  12. Detecting time-specific differences between temporal nonlinear curves: Analyzing data from the visual world paradigm

    PubMed Central

    Oleson, Jacob J; Cavanaugh, Joseph E; McMurray, Bob; Brown, Grant

    2015-01-01

    In multiple fields of study, time series measured at high frequencies are used to estimate population curves that describe the temporal evolution of some characteristic of interest. These curves are typically nonlinear, and the deviations of each series from the corresponding curve are highly autocorrelated. In this scenario, we propose a procedure to compare the response curves for different groups at specific points in time. The method involves fitting the curves, performing potentially hundreds of serially correlated tests, and appropriately adjusting the overall alpha level of the tests. Our motivating application comes from psycholinguistics and the visual world paradigm. We describe how the proposed technique can be adapted to compare fixation curves within subjects as well as between groups. Our results lead to conclusions beyond the scope of previous analyses. PMID:26400088
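
    A minimal sketch of the overall workflow (fit a nonlinear curve per subject, test the groups at every time point, and control the overall alpha level) is given below. Synthetic logistic fixation curves are used, and a plain Bonferroni correction stands in for the serial-correlation-aware adjustment developed in the paper.

      # Sketch: compare two groups' fixation curves at every time point, then apply
      # a Bonferroni adjustment as a simple stand-in for the alpha correction
      # proposed in the paper. Data and group difference are synthetic.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      time = np.linspace(0, 2.0, 100)                        # seconds

      def logistic(t, mid, slope, asym):
          return asym / (1.0 + np.exp(-(t - mid) / slope))

      # synthetic subjects: group B reaches asymptote slightly later than group A
      group_a = np.array([logistic(time, 0.80, 0.12, 0.9) + rng.normal(0, 0.05, time.size)
                          for _ in range(20)])
      group_b = np.array([logistic(time, 0.95, 0.12, 0.9) + rng.normal(0, 0.05, time.size)
                          for _ in range(20)])

      t_stat, p = stats.ttest_ind(group_a, group_b, axis=0)  # one test per time point
      alpha = 0.05 / time.size                               # Bonferroni-adjusted level
      significant = p < alpha
      print("time points with a detected group difference:",
            time[significant][:5], "..." if significant.sum() > 5 else "")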

  13. Spatial aliasing for efficient direction-of-arrival estimation based on steering vector reconstruction

    NASA Astrophysics Data System (ADS)

    Yan, Feng-Gang; Cao, Bin; Rong, Jia-Jia; Shen, Yi; Jin, Ming

    2016-12-01

    A new technique is proposed to reduce the computational complexity of the multiple signal classification (MUSIC) algorithm for direction-of-arrival (DOA) estimation using a uniform linear array (ULA). The steering vector of the ULA is reconstructed as the Kronecker product of two other steering vectors, and a new cost function that exhibits spatial aliasing is derived. Thanks to the estimation ambiguity of this spatial aliasing, mirror angles mathematically related to the true DOAs are generated, based on which the full spectral search involved in the MUSIC algorithm is compressed into a limited angular sector. Further complexity analysis and performance studies are conducted by computer simulations, which demonstrate that the proposed estimator requires a greatly reduced computational burden while showing accuracy similar to that of the standard MUSIC.
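
    For reference, the sketch below implements standard MUSIC for a ULA and evaluates the pseudospectrum only over a limited angular sector, which is the effect the proposed compression aims for. The Kronecker-product reconstruction and mirror-angle bookkeeping of the paper are not reproduced; the sector, array geometry, and signal model are illustrative assumptions.

      # Sketch: MUSIC pseudospectrum for a uniform linear array, evaluated only
      # over a limited angular sector to illustrate the reduced-search idea.
      import numpy as np
      from scipy.signal import find_peaks

      rng = np.random.default_rng(4)
      M, d, wavelength = 10, 0.5, 1.0            # sensors, spacing (in wavelengths)
      true_doas = np.deg2rad([-20.0, 15.0])
      snapshots = 200

      def steering(theta):
          n = np.arange(M)[:, None]
          return np.exp(2j * np.pi * (d / wavelength) * n
                        * np.sin(np.atleast_1d(theta))[None, :])

      # simulate array snapshots: two uncorrelated sources plus white noise
      S = rng.normal(size=(2, snapshots)) + 1j * rng.normal(size=(2, snapshots))
      X = steering(true_doas) @ S + 0.1 * (rng.normal(size=(M, snapshots))
                                           + 1j * rng.normal(size=(M, snapshots)))
      R = X @ X.conj().T / snapshots

      eigvals, eigvecs = np.linalg.eigh(R)
      En = eigvecs[:, :M - 2]                    # noise subspace (2 sources assumed)

      sector = np.deg2rad(np.arange(-30.0, 30.0, 0.1))   # compressed search sector
      A = steering(sector)
      spectrum = 1.0 / np.sum(np.abs(En.conj().T @ A) ** 2, axis=0)

      peak_idx, _ = find_peaks(spectrum)
      top = peak_idx[np.argsort(spectrum[peak_idx])[-2:]]
      print("estimated DOAs (deg):", np.sort(np.rad2deg(sector[top])).round(1))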

  14. A consensus least squares support vector regression (LS-SVR) for analysis of near-infrared spectra of plant samples.

    PubMed

    Li, Yankun; Shao, Xueguang; Cai, Wensheng

    2007-04-15

    Consensus modeling, which combines the results of multiple independent models to produce a single prediction, avoids the instability of a single model. Based on the principle of consensus modeling, a consensus least squares support vector regression (LS-SVR) method for calibrating near-infrared (NIR) spectra was proposed. In the proposed approach, NIR spectra of plant samples were first preprocessed using the discrete wavelet transform (DWT) to filter the spectral background and noise; then, the consensus LS-SVR technique was used for building the calibration model. With optimization of the parameters involved in the modeling, a satisfactory model was achieved for predicting the content of reducing sugar in plant samples. The predicted results show that the consensus LS-SVR model is more robust and reliable than the conventional partial least squares (PLS) and LS-SVR methods.
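
    LS-SVR has a closed-form dual solution, so a consensus version can be sketched compactly: several LS-SVR members are trained on random subsets of the calibration set and their predictions are averaged. The sketch below uses synthetic spectra-like data and hand-picked RBF-kernel hyperparameters, and it omits the DWT preprocessing step; none of these choices are taken from the paper.

      # Sketch: a consensus of least-squares SVR members, each trained on a random
      # subset of calibration samples, with predictions averaged at the end.
      import numpy as np

      rng = np.random.default_rng(5)

      def rbf_kernel(A, B, gamma=0.01):
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-gamma * d2)

      def lssvr_fit(X, y, gamma=0.01, C=10.0):
          # LS-SVR dual: [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y]
          n = X.shape[0]
          K = rbf_kernel(X, X, gamma) + np.eye(n) / C
          M = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                        [np.ones((n, 1)), K]])
          sol = np.linalg.solve(M, np.concatenate(([0.0], y)))
          return X, sol[0], sol[1:], gamma           # support data, bias, alphas

      def lssvr_predict(model, Xnew):
          Xtr, b, alpha, gamma = model
          return rbf_kernel(Xnew, Xtr, gamma) @ alpha + b

      # synthetic "spectra": 120 samples x 50 wavelengths; the property of interest
      # is a weighted band area plus noise
      X = rng.normal(size=(120, 50))
      y = X[:, 10:20].mean(axis=1) + rng.normal(0, 0.05, 120)
      train, test = np.arange(90), np.arange(90, 120)

      members = []
      for _ in range(15):                            # consensus of 15 members
          idx = rng.choice(train, size=60, replace=False)
          members.append(lssvr_fit(X[idx], y[idx]))

      pred = np.mean([lssvr_predict(m, X[test]) for m in members], axis=0)
      rmsep = np.sqrt(np.mean((pred - y[test]) ** 2))
      print(f"consensus LS-SVR RMSEP: {rmsep:.3f}")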

  15. Porosity estimation of aged mortar using a micromechanical model.

    PubMed

    Hernández, M G; Anaya, J J; Sanchez, T; Segura, I

    2006-12-22

    Degradation of concrete structures located in high-humidity atmospheres or under flowing water is a very important problem. In this study, a method for ultrasonic non-destructive characterization of aged mortar is presented. The proposed method predicts the behaviour of aged mortar with a three-phase micromechanical model using ultrasonic measurements. Mortar aging was accelerated by immersing the specimens in ammonium nitrate solution. Both destructive and non-destructive characterization of the mortar was performed. Destructive tests of porosity were performed using a vacuum saturation method, and non-destructive characterization was carried out using ultrasonic velocities. The aging experiments show that mortar degradation involves not only a porosity increase but also microstructural changes in the cement matrix. Experimental results show that the porosity estimated using the proposed non-destructive methodology had a performance comparable to classical destructive techniques.

  16. Invisible ink mark detection in the visible spectrum using absorption difference.

    PubMed

    Lee, Joong; Kong, Seong G; Kang, Tae-Yi; Kim, Byounghyun; Jeon, Oc-Yeub

    2014-03-01

    One popular technique in gambling fraud involves the use of invisible ink marks printed on the back surface of playing cards. Such covert patterns are transparent in the visible spectrum and therefore invisible to the unaided human eye. Invisible patterns can be made visible with ultraviolet (UV) illumination or a CCD camera fitted with an infrared (IR) filter, depending on the type of ink materials used. Cheating gamers often wear contact lenses or eyeglasses incorporating IR or UV filters to recognize the secret marks on the playing cards. This paper presents an image processing technique to reveal invisible ink patterns in the visible spectrum without the aid of special equipment such as UV lighting or IR filters. A printed invisible ink pattern leaves a thin coating on the surface whose refractive index differs for different wavelengths of light, which results in color dispersion or an absorption difference. The proposed method finds the differences between color components caused by this absorption difference to detect invisible ink patterns on the surface. Experimental results show that the proposed scheme is effective for both UV-active and IR-active invisible ink materials. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
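
    The channel-difference idea can be sketched in a few lines: a mark that absorbs slightly more in one colour channel than another becomes visible when the channels are subtracted and the contrast is stretched. The image, absorption offsets, and threshold below are synthetic stand-ins for real card photographs.

      # Sketch of the absorption-difference idea on a synthetic image: the hidden
      # mark absorbs slightly more blue than red, so the red-minus-blue image
      # reveals it after a contrast stretch and threshold.
      import numpy as np

      rng = np.random.default_rng(6)
      h, w = 200, 200
      mark = np.zeros((h, w))
      mark[80:120, 60:140] = 1.0                       # hidden rectangular mark

      # white card surface with texture; the mark absorbs 2% more in blue than red
      red = 0.90 - 0.005 * mark + rng.normal(0, 0.005, (h, w))
      blue = 0.90 - 0.025 * mark + rng.normal(0, 0.005, (h, w))

      diff = red - blue                                # absorption-difference image
      diff = (diff - diff.min()) / (np.ptp(diff) + 1e-9)   # contrast stretch to [0, 1]
      detected = diff > 0.6                            # simple threshold

      inside = detected[80:120, 60:140].mean()
      overall = detected.mean()
      print(f"detection rate inside mark: {inside:.2f}, overall: {overall:.2f}")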

  17. Integrating Retraction Modeling Into an Atlas-Based Framework for Brain Shift Prediction

    PubMed Central

    Chen, Ishita; Ong, Rowena E.; Simpson, Amber L.; Sun, Kay; Thompson, Reid C.

    2015-01-01

    In recent work, an atlas-based statistical model for brain shift prediction, which accounts for uncertainty in the intraoperative environment, has been proposed. Previous work reported in the literature using this technique did not account for local deformation caused by surgical retraction. It is challenging to precisely localize the retractor prior to surgery, and the retractor is often moved in the course of the procedure. This paper proposes a technique that involves computing the retractor-induced brain deformation in the operating room through an active model solve and linearly superposing the solution with the precomputed deformation atlas. As a result, the new method takes advantage of the atlas-based framework’s accounting for uncertainties while also incorporating the effects of retraction with minimal intraoperative computing. This new approach was tested using simulation and phantom experiments. The results showed an improvement in average shift correction from 50% (ranging from 14 to 81%) for the gravity atlas alone to 80% using the active solve retraction component (ranging from 73 to 85%). This paper presents a novel yet simple way to integrate retraction into the atlas-based brain shift computation framework. PMID:23864146

  18. Toward Model Building for Visual Aesthetic Perception

    PubMed Central

    Lughofer, Edwin; Zeng, Xianyi

    2017-01-01

    Several models of visual aesthetic perception have been proposed in recent years. Such models have drawn on investigations into the neural underpinnings of visual aesthetics, utilizing neurophysiological techniques and brain imaging techniques including functional magnetic resonance imaging, magnetoencephalography, and electroencephalography. The neural mechanisms underlying the aesthetic perception of the visual arts have been explained from the perspectives of neuropsychology, brain and cognitive science, informatics, and statistics. Although corresponding models have been constructed, the majority of these models contain elements that are difficult to simulate or quantify using simple mathematical functions. In this review, we discuss the hypotheses, conceptions, and structures of six typical models for human aesthetic appreciation in the visual domain: the neuropsychological, information processing, mirror, and quartet models, and two hierarchical feed-forward layered models. Additionally, the neural foundation of aesthetic perception, appreciation, or judgement for each model is summarized. The development of a unified framework for the neurobiological mechanisms underlying the aesthetic perception of visual art and the validation of this framework via mathematical simulation is an interesting challenge in neuroaesthetics research. This review aims to provide information regarding the most promising proposals for bridging the gap between visual information processing and the brain activity involved in aesthetic appreciation. PMID:29270194

  19. Ancient DNA in historical parchments - identifying a procedure for extraction and amplification of genetic material.

    PubMed

    Lech, T

    2016-05-06

    Historical parchments in the form of documents, manuscripts, books, or letters make up a large portion of cultural heritage collections. Their priceless historical value is associated not only with their content, but also with the information hidden in the DNA deposited on them. Analyses of ancient DNA (aDNA) retrieved from parchments can be used in various investigations, including, but not limited to, studies of their authenticity, tracing the development of culture, diplomacy, and technology, as well as obtaining information on the usage and domestication of animals. This article proposes and verifies a procedure for aDNA recovery from historical parchments and its appropriate preparation for further analyses. This study involved experimental selection of the aDNA extraction method with the highest efficiency and quality of extracted genetic material, from among the multi-stage phenol-chloroform extraction methods and the modern, column-based techniques that use selective DNA-binding membranes. Moreover, current techniques for amplifying the entire genetic material were critically examined, and the possibility of using mitochondrial DNA for species identification was analyzed. The usefulness of the proposed procedure was successfully confirmed in identification tests of historical parchments dating back to the 13th-16th centuries AD.

  20. A new user-assisted segmentation and tracking technique for an object-based video editing system

    NASA Astrophysics Data System (ADS)

    Yu, Hong Y.; Hong, Sung-Hoon; Lee, Mike M.; Choi, Jae-Gark

    2004-03-01

    This paper presents a semi-automatic segmentation method which can be used to generate video object planes (VOPs) for object-based coding schemes and multimedia authoring environments. Semi-automatic segmentation can be considered a user-assisted segmentation technique. A user initially marks objects of interest around the object boundaries, and then the user-guided and selected objects are continuously separated from the unselected areas through time evolution in the image sequences. The proposed segmentation method consists of two processing steps: partially manual intra-frame segmentation and fully automatic inter-frame segmentation. The intra-frame segmentation incorporates user assistance to define the meaningful, complete visual object of interest to be segmented and decides the precise object boundary. The inter-frame segmentation involves boundary and region tracking to obtain temporal coherence of the moving object based on the object boundary information of the previous frame. The proposed method shows stable, efficient results that could be suitable for many digital video applications such as multimedia content authoring, content-based coding, and indexing. Based on these results, we have developed an object-based video editing system with several convenient editing functions.

  1. Tracking Organs Composed of One or Multiple Regions Using Geodesic Active Region Models

    NASA Astrophysics Data System (ADS)

    Martínez, A.; Jiménez, J. J.

    In radiotherapy treatment it is very important to locate the target organs in the medical image sequence in order to determine and apply the proper dose. The techniques to achieve this goal can be classified into extrinsic and intrinsic. Intrinsic techniques use only image processing of the medical images associated with the radiotherapy treatment, as we discuss in this chapter. To accurately perform this organ tracking it is necessary to find segmentation and tracking models that can be applied to the several image modalities involved in a radiotherapy session (CT, MRI, etc.). The movements of the organs are mainly affected by two factors: breathing and involuntary movements associated with the internal organs or patient positioning. Among the several alternatives to track the organs of interest, a model based on geodesic active regions is proposed. This model has been tested on CT images from the pelvic, cardiac, and thoracic areas. A new model for the segmentation of organs composed of more than one region is also proposed.

  2. Testing the TPF Interferometry Approach before Launch

    NASA Technical Reports Server (NTRS)

    Serabyn, Eugene; Mennesson, Bertrand

    2006-01-01

    One way to directly detect nearby extra-solar planets is via their thermal infrared emission, and with this goal in mind, both NASA and ESA are investigating cryogenic infrared interferometers. Common to both agencies' approaches to faint off-axis source detection near bright stars is the use of a rotating nulling interferometer, such as the Terrestrial Planet Finder interferometer (TPF-I), or Darwin. In this approach, the central star is nulled, while the emission from off-axis sources is transmitted and modulated by the rotation of the off-axis fringes. Because of the high contrasts involved, and the novelty of the measurement technique, it is essential to gain experience with this technique before launch. Here we describe a simple ground-based experiment that can test the essential aspects of the TPF signal measurement and image reconstruction approaches by generating a rotating interferometric baseline within the pupil of a large single-aperture telescope. This approach can mimic potential space-based interferometric configurations, and allow the extraction of signals from off-axis sources using the same algorithms proposed for the space-based missions. This approach should thus allow for testing of the applicability of proposed signal extraction algorithms for the detection of single and multiple near-neighbor companions...

  3. Microbial production and chemical transformation of poly-γ-glutamate.

    PubMed

    Ashiuchi, Makoto

    2013-11-01

    Poly-γ-glutamate (PGA), a novel polyamide material with industrial applications, possesses a nylon-like backbone, is structurally similar to polyacrylic acid, is biodegradable and is safe for human consumption. PGA is frequently found in the mucilage of natto, a Japanese traditional fermented food. To date, three different types of PGA, namely a homo polymer of D-glutamate (D-PGA), a homo polymer of L-glutamate (L-PGA), and a random copolymer consisting of D- and L-glutamate (DL-PGA), are known. This review will detail the occurrence and physiology of PGA. The proposed reaction mechanism of PGA synthesis including its localization and the structure of the involved enzyme, PGA synthetase, are described. The occurrence of multiple carboxyl residues in PGA likely plays a role in its relative unsuitability for the development of bio-nylon plastics and thus, establishment of an efficient PGA-reforming strategy is of great importance. Aside from the potential applications of PGA proposed to date, a new technique for chemical transformation of PGA is also discussed. Finally, some techniques for PGA and its derivatives in advanced material technology are presented. © 2013 The Author. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.

  4. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

    DOE PAGES

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-04-24

    Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made `reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, cache and main memory etc. In addition, we discuss techniques proposed for non-volatile memory, GPUs and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system-architects and processor designers in gaining insights into the techniques for improving reliability of computing systems.

  6. Thermal management of VECSELs by front surface direct liquid cooling

    NASA Astrophysics Data System (ADS)

    Smyth, Conor J. C.; Mirkhanov, Shamil; Quarterman, Adrian H.; Wilcox, Keith G.

    2016-03-01

    Efficient thermal management is vital for VECSELs, affecting the output power and several aspects of performance of the device. Presently there exist two distinct methods of effective thermal management which both possess their merits and disadvantages. Substrate removal of the VECSEL gain chip has proved a successful method in devices emitting at a wavelength near 1μm. However for other wavelengths the substrate removal technique has proved less effective primarily due to the thermal impedance of the distributed Bragg reflectors. The second method of thermal management involves the use of crystalline heat spreaders bonded to the gain chip surface. Although this is an effective thermal management scheme, the disadvantages are additional loss and the etalon effect that filters the gain spectrum, making mode locking more difficult and normally resulting in multiple peaks in the spectrum. There are considerable disadvantages associated with both methods attributed to heatspreader cost and sample processing. It is for these reasons that a proposed alternative, front surface liquid cooling, has been investigated in this project. Direct liquid cooling involves flowing a temperature-controlled liquid over the sample's surface. In this project COMSOL was used to model surface liquid cooling of a VECSEL sample in order to investigate and compare its potential thermal management with current standard thermal management techniques. Based on modelling, experiments were carried out in order to evaluate the performance of the technique. While modelling suggests that this is potentially a mid-performance low cost alternative to existing techniques, experimental measurements to date do not reflect the performance predicted from modelling.

  7. Demonstration of pelvic anatomy by modified midline transection that maintains intact internal pelvic organs.

    PubMed

    Steinke, Hanno; Saito, Toshiyuki; Herrmann, Gudrun; Miyaki, Takayoshi; Hammer, Niels; Sandrock, Mara; Itoh, Masahiro; Spanel-Borowski, Katharina

    2010-01-01

    Gross dissection for demonstrating anatomy of the human pelvis has traditionally involved one of two approaches, each with advantages and disadvantages. Classic hemisection in the median plane through the pelvic ring transects the visceral organs but maintains two symmetric pelvic halves. An alternative paramedial transection compromises one side of the bony pelvis but leaves the internal organs intact. The authors propose a modified technique that combines advantages of both classical dissections. This novel approach involves dividing the pubic symphysis and sacrum in the median plane after shifting all internal organs to one side. The hemipelvis without internal organs is immediately available for further dissection of the lower limb. The hemipelvis with intact internal organs is ideal for showing the complex spatial relationships of the pelvic organs and vessels relative to the intact pelvic floor.

  8. A rapid boundary integral equation technique for protein electrostatics

    NASA Astrophysics Data System (ADS)

    Grandison, Scott; Penfold, Robert; Vanden-Broeck, Jean-Marc

    2007-06-01

    A new boundary integral formulation is proposed for the solution of electrostatic field problems involving piecewise uniform dielectric continua. Direct Coulomb contributions to the total potential are treated exactly and Green's theorem is applied only to the residual reaction field generated by surface polarisation charge induced at dielectric boundaries. The implementation shows significantly improved numerical stability over alternative schemes involving the total field or its surface normal derivatives. Although strictly respecting the electrostatic boundary conditions, the partitioned scheme does introduce a jump artefact at the interface. Comparison against analytic results in canonical geometries, however, demonstrates that simple interpolation near the boundary is a cheap and effective way to circumvent this characteristic in typical applications. The new scheme is tested in a naive model to successfully predict the ground state orientation of biomolecular aggregates comprising the soybean storage protein, glycinin.

  9. Cerebellar interaction with the acoustic reflex.

    PubMed

    Jastreboff, P J

    1981-01-01

    The involvement of the cerebellar vermis in the acoustic reflex was analyzed in 12 cats, decerebrate or under pentobarbital anesthesia. Anatomical data suggested the existence of a connection of lobule VIII with the ventral cochlear nucleus. Single cell recording and evoked potential techniques demonstrated the existence of an acoustic projection to lobule VIII. Electrical stimulation of this area changed the tension of the middle ear muscle and caused evoked potential responses in the caudal part of the ventral cochlear nucleus. Electrical stimulation of the motor nucleus of the facial nerve evoked a slow wave in the recording taken from the vicinity of the cochlear round window. A hypothesis is proposed which postulates the involvement of the acoustic reflex in the spatial localization of acoustic stimuli and the action of the cerebellar vermis in assuring the stability and plasticity of the acoustic reflex arc.

  10. A novel thermal management system for improving discharge/charge performance of Li-ion battery packs under abuse

    NASA Astrophysics Data System (ADS)

    Arora, Shashank; Kapoor, Ajay; Shen, Weixiang

    2018-02-01

    Parasitic load, which describes the electrical energy consumed by a battery thermal management system (TMS), is an important design criterion for battery packs. Passive TMSs using phase change materials (PCMs) are thus generating much interest. However, PCMs suffer from low thermal conductivities. Most current thermal conductivity enhancement techniques involve the addition of foreign particles to PCMs. Adding foreign particles increases the effective thermal conductivity of PCM systems, but at the expense of their latent heat capacity. This paper presents an alternative approach for improving the thermal performance of PCM-based TMSs. The introduced technique involves placing battery cells in a vertically inverted position within the battery pack. It is demonstrated through experiments that the inverted cell layout facilitates the build-up of convection currents in the pack, which in turn minimises thermal variations within the PCM matrix by enabling PCM mass transfer between the top and the bottom regions of the battery pack. The proposed system is found capable of maintaining tight control over battery cell temperature even during abusive usage, defined as high-rate repetitive cycling with minimal rest periods. In addition, this novel TMS can recover waste heat from the PCM matrix through thermoelectric devices, thereby resulting in a negative parasitic load for the TMS.

  11. Development of Mobile Mapping System for 3D Road Asset Inventory.

    PubMed

    Sairam, Nivedita; Nagarajan, Sudhagar; Ornitz, Scott

    2016-03-12

    Asset Management is an important component of an infrastructure project. A significant cost is involved in maintaining and updating the asset information. Data collection is the most time-consuming task in the development of an asset management system. In order to reduce the time and cost involved in data collection, this paper proposes a low cost Mobile Mapping System using an equipped laser scanner and cameras. First, the feasibility of low cost sensors for 3D asset inventory is discussed by deriving appropriate sensor models. Then, through calibration procedures, respective alignments of the laser scanner, cameras, Inertial Measurement Unit and GPS (Global Positioning System) antenna are determined. The efficiency of this Mobile Mapping System is experimented by mounting it on a truck and golf cart. By using derived sensor models, geo-referenced images and 3D point clouds are derived. After validating the quality of the derived data, the paper provides a framework to extract road assets both automatically and manually using techniques implementing RANSAC plane fitting and edge extraction algorithms. Then the scope of such extraction techniques along with a sample GIS (Geographic Information System) database structure for unified 3D asset inventory are discussed.

  12. Integrated HTA-FMEA/FMECA methodology for the evaluation of robotic system in urology and general surgery.

    PubMed

    Frosini, Francesco; Miniati, Roberto; Grillone, Saverio; Dori, Fabrizio; Gentili, Guido Biffi; Belardinelli, Andrea

    2016-11-14

    The following study proposes and tests an integrated methodology involving Health Technology Assessment (HTA) and Failure Modes, Effects and Criticality Analysis (FMECA) for the assessment of specific aspects of robotic surgery involving safety, process and technology. The integrated methodology consists of the application of specific techniques from HTA combined with typical models from reliability engineering such as FMEA/FMECA. The study also included on-site data collection and interviews with medical personnel. The total number of robotic procedures included in the analysis was 44: 28 for urology and 16 for general surgery. The main outcomes refer to the comparative evaluation between robotic, laparoscopic and open surgery. Risk analysis and mitigation interventions come from the FMECA application. The small sample size available for the study represents an important bias, especially for the reliability of the clinical outcomes. Despite this, the study seems to confirm a more favourable trend for robotic surgical times in comparison with the open technique, as well as confirming the clinical benefits of robotics in urology. The situation is more complex for general surgery, where the only directly measured clinical benefit of robotics is the lowest blood transfusion rate.

  14. Alternative Constraint Handling Technique for Four-Bar Linkage Path Generation

    NASA Astrophysics Data System (ADS)

    Sleesongsom, S.; Bureerat, S.

    2018-03-01

    This paper proposes an extension of a new concept for path generation from our previous work by adding a new constraint handling technique. The proposed technique was initially designed for problems without prescribed timing by avoiding the timing constraint, while the remaining constraints are handled with a new constraint handling technique, which is a kind of penalty technique. In the comparative study, path generation optimisation problems are solved using self-adaptive population size teaching-learning based optimization (SAP-TLBO) and the original TLBO. In this study, two traditional path generation test problems are used to test the proposed technique. The results show that the new technique can be applied to path generation problems without prescribed timing and gives better results than the previous technique. Furthermore, SAP-TLBO outperforms the original one.
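
    A generic penalty-based constraint handler of the kind referred to above can be sketched as follows: constraint violations are squared, weighted, and added to the objective before the problem is handed to an off-the-shelf global optimiser. The test problem, penalty weight, and the use of differential evolution instead of TLBO are illustrative choices, not the formulation used in the paper.

      # Sketch of penalty-based constraint handling: infeasibility is added to the
      # objective as a weighted penalty, and the penalised function is passed to a
      # generic global optimiser.
      import numpy as np
      from scipy.optimize import differential_evolution

      def objective(x):
          return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

      def constraints(x):
          # g(x) <= 0 form; here: stay inside a circle and above a line
          return np.array([x[0] ** 2 + x[1] ** 2 - 4.0,
                           1.0 - (x[0] + x[1])])

      def penalised(x, weight=1e3):
          violation = np.maximum(constraints(x), 0.0)   # only violations are penalised
          return objective(x) + weight * np.sum(violation ** 2)

      result = differential_evolution(penalised, bounds=[(-3, 3), (-3, 3)], seed=7)
      print("best point:", result.x.round(3), "violations:",
            np.maximum(constraints(result.x), 0.0).round(4))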

  15. Low-dose CT image reconstruction using gain intervention-based dictionary learning

    NASA Astrophysics Data System (ADS)

    Pathak, Yadunath; Arya, K. V.; Tiwari, Shailendra

    2018-05-01

    Computed tomography (CT) is extensively utilized in clinical diagnosis. However, the X-ray dose absorbed by the human body may cause somatic damage such as cancer. Owing to this radiation risk, research has focused on the radiation exposure delivered to patients through CT investigations. Therefore, low-dose CT has become a significant research area. Many researchers have proposed different low-dose CT reconstruction techniques, but these techniques suffer from various issues such as over-smoothing, artifacts, noise, etc. Therefore, in this paper, we have proposed a novel integrated low-dose CT reconstruction technique. The proposed technique utilizes global dictionary-based statistical iterative reconstruction (GDSIR) and adaptive dictionary-based statistical iterative reconstruction (ADSIR)-based reconstruction techniques. In case the dictionary (D) is predetermined, GDSIR can be used, and if D is adaptively defined, ADSIR is the appropriate choice. A gain intervention-based filter is also used as a post-processing technique for removing artifacts from low-dose CT reconstructed images. Experiments have been carried out on well-known benchmark CT images, comparing the proposed technique with other low-dose CT reconstruction techniques. Extensive experiments have shown that the proposed technique outperforms the available approaches.

  16. Charge transfer process at the Ag/MPH/TiO2 interface by SERS: alignment of the Fermi level.

    PubMed

    Zhang, Xiaolei; Sui, Huimin; Wang, Xiaolei; Su, Hongyang; Cheng, Weina; Wang, Xu; Zhao, Bing

    2016-11-02

    A nanoscale metal-molecule-semiconductor assembly (Ag/4-mercaptophenol/TiO2) has been fabricated over Au nanoparticle (NP) films as a model to study the interfacial charge transfer (CT) effects involved in Ag/MPH/TiO2. Due to the interaction between Au NPs and Ag NPs, some distinct differences occur in the SERS spectra. We also measured the SERS of Ag/MPH (4-mercaptophenol), Ag/MPH/TiO2, and Au/Ag/MPH/TiO2 assemblies at excitation wavelengths of 477, 514, 532, 633, and 785 nm. We found that the changes in the CT process, caused by the introduction of TiO2 and Au, can be reflected in SERS. Then in combination with other detection methods, we proposed a possible CT process involved in the Ag/MPH, Ag/MPH/TiO2, and Au/Ag/MPH/TiO2 assemblies. A Pt/Ag/MPH/TiO2 assembly was also constructed to verify our proposed CT mechanism. This work not only provides more details about CT between metal-molecule-semiconductor interfaces but also aids in constructing nanoscale models to study interfacial problems with the SERS technique.

  17. Exploring the simulation requirements for virtual regional anesthesia training

    NASA Astrophysics Data System (ADS)

    Charissis, V.; Zimmer, C. R.; Sakellariou, S.; Chan, W.

    2010-01-01

    This paper presents an investigation of the simulation requirements for virtual regional anaesthesia training. To this end we have developed a prototype human-computer interface designed to facilitate virtual reality (VR) augmented educational tactics for regional anaesthesia training. The proposed interface system aims to complement nerve blocking techniques. The system is designed to operate in a real-time 3D environment, presenting anatomical information and enabling the user to explore the spatial relation of different human parts without any physical constraints. Furthermore, the proposed system aims to assist trainee anaesthetists in building a mental, three-dimensional map of the anatomical elements and their depictive relationship to the ultrasound imaging which is used for navigation of the anaesthetic needle. Opting for a sophisticated approach to interaction, the interface elements are based on simplified visual representations of real objects, and can be operated through haptic devices and surround auditory cues. This paper discusses the challenges involved in the HCI design, introduces the visual components of the interface and presents a tentative plan of future work, which involves the development of realistic haptic feedback and various regional anaesthesia training scenarios.

  18. Strategies for an enzyme immobilization on electrodes: Structural and electrochemical characterizations

    NASA Astrophysics Data System (ADS)

    Ganesh, V.; Muthurasu, A.

    2012-04-01

    In this paper, we propose various strategies for enzyme immobilization on electrodes (both metal and semiconductor electrodes). In general, the proposed methodology involves two critical steps, viz., (1) chemical modification of the substrates using functional monolayers [Langmuir-Blodgett (LB) films and/or self-assembled monolayers (SAMs)] and (2) anchoring of a target enzyme using specific chemical and physical interactions by attacking the terminal functionality of the modified films. Basically, there are three ways to immobilize an enzyme on chemically modified electrodes. The first method consists of an electrostatic interaction between the enzyme and the terminal functional groups present within the chemically modified films. The second and third methods involve the introduction of nanomaterials followed by enzyme immobilization using both physical and chemical adsorption processes. As a proof of principle, in this work we demonstrate the sensing and catalytic activity of horseradish peroxidase (HRP) anchored onto SAM-modified indium tin oxide (ITO) electrodes towards hydrogen peroxide (H2O2). Structural characterization of such modified electrodes is performed using X-ray photoelectron spectroscopy (XPS), atomic force microscopy (AFM) and contact angle measurements. The binding events and the enzymatic reactions are monitored using electrochemical techniques, mainly cyclic voltammetry (CV).

  19. Sentence processing in the cerebral cortex.

    PubMed

    Sakai, K L; Hashimoto, R; Homae, F

    2001-01-01

    Human language is a unique faculty of the mind. It has been the ultimate mystery throughout the history of neuroscience. Despite many aphasia and functional imaging studies, the exact correlation between cortical language areas and subcomponents of the linguistic system has not been established. One notable drawback is that most functional imaging studies have tested language tasks at the word level, such as lexical decision and word generation tasks, thereby neglecting the syntactic aspects of the language faculty. As proposed by Chomsky, the critical knowledge of language involves universal grammar (UG), which governs the syntactic structure of sentences. In this article, we will review recent advances made by functional neuroimaging studies of language, focusing especially on sentence processing in the cerebral cortex. We also present the recent results of our functional magnetic resonance imaging (fMRI) study intended to identify cortical areas specifically involved in syntactic processing. A study of sentence processing that employs a newly developed technique, optical topography (OT), is also presented. Based on these findings, we propose a modular specialization of Broca's area, Wernicke's area, and the angular gyrus/supramarginal gyrus. The current direction of research in neuroscience is beginning to establish the existence of distinct modules responsible for our knowledge of language.

  20. Coalescence computations for large samples drawn from populations of time-varying sizes

    PubMed Central

    Polanski, Andrzej; Szczesna, Agnieszka; Garbulowski, Mateusz; Kimmel, Marek

    2017-01-01

    We present new results concerning probability distributions of times in the coalescence tree and expected allele frequencies for coalescent with large sample size. The obtained results are based on computational methodologies, which involve combining coalescence time scale changes with techniques of integral transformations and using analytical formulae for infinite products. We show applications of the proposed methodologies for computing probability distributions of times in the coalescence tree and their limits, for evaluation of accuracy of approximate expressions for times in the coalescence tree and expected allele frequencies, and for analysis of large human mitochondrial DNA dataset. PMID:28170404

  1. Far infrared polarizing grids for use at cryogenic temperatures

    NASA Technical Reports Server (NTRS)

    Novak, Giles; Sundwall, Jeffrey L.; Pernic, Robert J.

    1989-01-01

    A technique is proposed for the construction of free-standing wire grids for use as far-IR polarizers. The method involves wrapping a strand of wire around a single cylinder rather than around a pair of parallel rods, thus simplifying the problem of maintaining constant wire tension. The cylinder is composed of three separate pieces which are disassembled at a later stage in the grid-making process. Grids have been constructed using 8-micron-diameter stainless steel wire and a grid spacing of 25 microns. The grids are shown to be reliable under repeated cycling between room temperature and 1.5 K.

  2. Towards sound epistemological foundations of statistical methods for high-dimensional biology.

    PubMed

    Mehta, Tapan; Tanik, Murat; Allison, David B

    2004-09-01

    A sound epistemological foundation for biological inquiry comes, in part, from application of valid statistical procedures. This tenet is widely appreciated by scientists studying the new realm of high-dimensional biology, or 'omic' research, which involves multiplicity at unprecedented scales. Many papers aimed at the high-dimensional biology community describe the development or application of statistical techniques. The validity of many of these is questionable, and a shared understanding about the epistemological foundations of the statistical methods themselves seems to be lacking. Here we offer a framework in which the epistemological foundation of proposed statistical methods can be evaluated.

  3. Optimizing Input/Output Using Adaptive File System Policies

    NASA Technical Reports Server (NTRS)

    Madhyastha, Tara M.; Elford, Christopher L.; Reed, Daniel A.

    1996-01-01

    Parallel input/output characterization studies and experiments with flexible resource management algorithms indicate that adaptivity is crucial to file system performance. In this paper we propose an automatic technique for selecting and refining file system policies based on application access patterns and execution environment. An automatic classification framework allows the file system to select appropriate caching and pre-fetching policies, while performance sensors provide feedback used to tune policy parameters for specific system environments. To illustrate the potential performance improvements possible using adaptive file system policies, we present results from experiments involving classification-based and performance-based steering.
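
    A toy illustration of classification-based policy selection is sketched below: a window of recent file offsets is classified as sequential, strided, or random access, and each class is mapped to a caching/prefetching policy. The class names, thresholds, and policy table are invented for the example and are not the classification framework or policies described in the paper.

      # Sketch: classify a window of recent file offsets and map the access class
      # to a (hypothetical) caching/prefetching policy.
      import numpy as np

      POLICY_TABLE = {
          "sequential": {"prefetch_blocks": 8, "cache": "read-ahead"},
          "strided":    {"prefetch_blocks": 4, "cache": "stride-prefetch"},
          "random":     {"prefetch_blocks": 0, "cache": "lru"},
      }

      def classify_accesses(offsets, block=4096):
          steps = np.diff(np.asarray(offsets)) // block
          if np.all(steps == 1):
              return "sequential"
          if steps.size and np.all(steps == steps[0]) and steps[0] > 1:
              return "strided"
          return "random"

      def select_policy(offsets):
          return POLICY_TABLE[classify_accesses(offsets)]

      print(select_policy([0, 4096, 8192, 12288]))          # sequential scan
      print(select_policy([0, 32768, 65536, 98304]))        # fixed stride
      print(select_policy([8192, 0, 65536, 4096]))          # random accesses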

  4. Independent Component Analysis of Textures

    NASA Technical Reports Server (NTRS)

    Manduchi, Roberto; Portilla, Javier

    2000-01-01

    A common method for texture representation is to use the marginal probability densities over the outputs of a set of multi-orientation, multi-scale filters as a description of the texture. We propose a technique, based on Independent Components Analysis, for choosing the set of filters that yield the most informative marginals, meaning that the product over the marginals most closely approximates the joint probability density function of the filter outputs. The algorithm is implemented using a steerable filter space. Experiments involving both texture classification and synthesis show that compared to Principal Components Analysis, ICA provides superior performance for modeling of natural and synthetic textures.
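
    The sketch below illustrates the idea on a synthetic texture: a small filter bank is applied, and FastICA is used to find linear combinations of the filter outputs with the most independent (and hence most informative) marginals. Simple derivative filters stand in for the steerable filter space used in the paper, and the texture is artificial.

      # Sketch: filter-bank responses of a synthetic texture, followed by ICA to
      # obtain the most independent marginal distributions.
      import numpy as np
      from scipy.ndimage import convolve
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(8)
      texture = np.sign(rng.normal(size=(128, 128)))            # synthetic binary texture
      texture = convolve(texture, np.ones((1, 5)) / 5.0)        # horizontally correlated

      filters = [np.array([[1, -1]]), np.array([[1], [-1]]),    # simple derivative filters
                 np.array([[1, 0], [0, -1]]), np.array([[0, 1], [-1, 0]])]
      responses = np.stack([convolve(texture, f).ravel() for f in filters], axis=1)

      ica = FastICA(n_components=4, random_state=0)
      sources = ica.fit_transform(responses)                    # most independent marginals
      kurtosis = np.mean(sources ** 4, axis=0) / np.mean(sources ** 2, axis=0) ** 2 - 3
      print("excess kurtosis of ICA marginals:", kurtosis.round(2))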

  5. Physical Modeling of Microtubules Network

    NASA Astrophysics Data System (ADS)

    Allain, Pierre; Kervrann, Charles

    2014-10-01

    Microtubules (MT) are highly dynamic tubulin polymers that are involved in many cellular processes such as mitosis, intracellular cell organization and vesicular transport. Nevertheless, the modeling of cytoskeleton and MT dynamics based on physical properties is difficult to achieve. Using the Euler-Bernoulli beam theory, we propose to model the rigidity of microtubules on a physical basis using forces, mass and acceleration. In addition, we link microtubules growth and shrinkage to the presence of molecules (e.g. GTP-tubulin) in the cytosol. The overall model enables linking cytosol to microtubules dynamics in a constant state space thus allowing usage of data assimilation techniques.

  6. Percutaneous self-injury to the femoral region caused by bur breakage during surgical extraction of a patient's impacted third molar.

    PubMed

    Yu, Tae Hoon; Lee, Jun; Kim, Bong Chul

    2015-10-01

    Extraction of an impacted third molar is one of the most frequently performed techniques in oral and maxillofacial surgery. Surgeons can suffer numerous external injuries while extracting a tooth, with percutaneous injuries to the hand being the most commonly reported. In this article, we present a case involving a percutaneous injury of the surgeon's femoral region caused by breakage of the fissure bur connected to the handpiece during extraction of the third molar. We also propose precautions to prevent such injuries and steps to be undertaken when they occur.

  7. Corrosion-Indicating Pigment And Probes

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Bugga, Ratnakumar V.; Attia, Alan I.

    1993-01-01

    Proposed hydrogen-sensitive paint for metal structures changes color at the onset of corrosion, which involves emission of hydrogen as a result of electrochemical reactions. The pigment of a suitable paint includes the rhodium compound RhCl(PPh3)3, known as Wilkinson's catalyst. As a coating on critical parts of such structures as bridges and aircraft, the paint gives early warning of corrosion, so that parts can be repaired or replaced before failing catastrophically. It reveals corrosion before it becomes visible to the eye. Inspection for changes in color does not ordinarily necessitate removal of the structure from service, and costs less than inspection by x-ray or thermal-neutron radiography, ultrasonic, eddy-current, or acoustic-emission techniques.

  8. Reflections and meditations upon complex chromosomal exchanges.

    PubMed

    Savage, John R K

    2002-12-01

    The application of FISH chromosome painting techniques, especially the recent mFISH (and its equivalents) in which all 23 human chromosome pairs can be distinguished, has demonstrated that many chromosome-type structural exchanges are much more complicated (involving more "break-rejoins" and arms) than has hitherto been assumed. It is clear that we have been greatly under-estimating the damage produced in chromatin by such agents as ionising radiation. This article gives a brief historical summary of the observations leading up to this conclusion and, after outlining some of the problems surrounding the formation of complex chromosome exchanges, speculates about possible solutions currently being proposed.

  9. The bridge technique for pectus bar fixation: a method to make the bar un-rotatable.

    PubMed

    Park, Hyung Joo; Kim, Kyung Soo; Moon, Young Kyu; Lee, Sungsoo

    2015-08-01

    Pectus bar rotation is a major challenge in pectus repair. However, to date, no satisfactory technique to completely eliminate bar displacement has been introduced. Here, we propose a bar fixation technique using a bridge that makes the bar unmovable. The purpose of this study was to determine the efficacy of this bridge technique. A total of 80 patients underwent pectus bar repair of pectus excavatum with the bridge technique from July 2013 to July 2014. The technique involved connecting 2 parallel bars using plate-screws at the ends of the bars. To determine bar position change, the angles between the sternum and pectus bars were measured on postoperative day 5 (POD5) and at 4 months (POM4) and compared. The mean patient age was 17.5 years (range, 6-38 years). The mean differences between POD5 and POM4 were 0.23° (P=.602) and 0.35° (P=.338) for the upper and lower bars, respectively. Bar position was virtually unchanged during the follow-up, and there was no bar dislocation or reoperation. A "bridge technique" designed to connect 2 parallel bars using plates and screws was demonstrated as a method to avoid pectus bar displacement. This approach was easy to implement without using sutures or invasive devices. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. The modified Misgav-Ladach versus the Pfannenstiel-Kerr technique for cesarean section: a randomized trial.

    PubMed

    Xavier, Pedro; Ayres-De-Campos, Diogo; Reynolds, Ana; Guimarães, Mariana; Costa-Santos, Cristina; Patrício, Belmiro

    2005-09-01

    Modifications to the classic cesarean section technique described by Pfannenstiel and Kerr have been proposed in the last few years. The objective of this trial was to compare intraoperative and short-term postoperative outcomes between the Pfannenstiel-Kerr and the modified Misgav-Ladach (MML) techniques for cesarean section. This prospective randomized trial involved 162 patients undergoing transverse lower uterine segment cesarean section. Patients were allocated to one of the two arms: 88 to the MML technique and 74 to the Pfannenstiel-Kerr technique. Main outcome measures were defined as the duration of surgery, analgesic requirements, and bowel restitution by the second postoperative day. Additional outcomes evaluated were febrile morbidity, postoperative antibiotic use, postpartum endometritis, and wound complications. Student's t, Mann-Whitney, and Chi-square tests were used for statistical analysis of the results, and a p < 0.05 was considered as the probability level reflecting significant differences. No differences between groups were noted in the incidence of analgesic requirements, bowel restitution by the second postoperative day, febrile morbidity, antibiotic requirements, endometritis, or wound complications. The MML technique took on average 12 min less to complete (p = 0.001). The MML technique is faster to perform and similar in terms of febrile morbidity, time to bowel restitution, or need for postoperative medications. It is likely to be more cost-effective.

  11. Optimal neighborhood indexing for protein similarity search.

    PubMed

    Peterlongo, Pierre; Noé, Laurent; Lavenier, Dominique; Nguyen, Van Hoa; Kucherov, Gregory; Giraud, Mathieu

    2008-12-16

    Similarity inference, one of the main bioinformatics tasks, has to face an exponential growth of biological data. A classical approach used to cope with this data flow involves heuristics with large seed indexes. In order to speed up this technique, the index can be enhanced by storing additional information to limit the number of random memory accesses. However, this improvement leads to a larger index that may become a bottleneck. In the case of protein similarity search, we propose to decrease the index size by reducing the amino acid alphabet. The paper presents two main contributions. First, we show that an optimal neighborhood indexing combining an alphabet reduction and a longer neighborhood leads to a reduction of 35% of the memory involved in the process, without sacrificing the quality of results nor the computational time. Second, our approach led us to develop a new kind of substitution score matrices and their associated e-value parameters. In contrast to usual matrices, these matrices are rectangular since they compare amino acid groups from different alphabets. We describe the method used for computing those matrices and we provide some typical examples that can be used in such comparisons. Supplementary data can be found on the website http://bioinfo.lifl.fr/reblosum. We propose a practical index size reduction of the neighborhood data, that does not negatively affect the performance of large-scale search in protein sequences. Such an index can be used in any study involving large protein data. Moreover, rectangular substitution score matrices and their associated statistical parameters can have applications in any study involving an alphabet reduction.
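
    The alphabet-reduction idea can be sketched as follows: amino acids are mapped to a small number of groups, k-mer neighborhoods of the database are indexed in the reduced alphabet, and queries are reduced the same way before lookup, with candidate hits to be verified later against the full-alphabet sequences. The grouping, k-mer length, and toy sequences below are generic illustrations, not the reduction or the rectangular scoring matrices developed in the paper.

      # Sketch of neighborhood indexing over a reduced amino-acid alphabet.
      from collections import defaultdict

      GROUPS = {"A": "h", "V": "h", "L": "h", "I": "h", "M": "h", "F": "h", "W": "h",
                "C": "h", "P": "s", "G": "s", "S": "p", "T": "p", "N": "p", "Q": "p",
                "Y": "p", "H": "c", "K": "c", "R": "c", "D": "n", "E": "n"}

      def reduce_seq(seq):
          return "".join(GROUPS.get(a, "x") for a in seq)

      def build_index(database, k=4):
          index = defaultdict(list)                    # reduced k-mer -> (seq id, pos)
          for sid, seq in enumerate(database):
              red = reduce_seq(seq)
              for pos in range(len(red) - k + 1):
                  index[red[pos:pos + k]].append((sid, pos))
          return index

      def query(index, fragment, k=4):
          red = reduce_seq(fragment)
          hits = set()
          for pos in range(len(red) - k + 1):
              hits.update(index.get(red[pos:pos + k], []))
          return sorted(hits)

      database = ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
                  "MSLLTEVETYVLSIIPSGPLKAEIAQRLEDVFAGKN"]
      idx = build_index(database)
      print(query(idx, "AKQRQIS"))                     # candidate hits to verify later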

  12. Optimal neighborhood indexing for protein similarity search

    PubMed Central

    Peterlongo, Pierre; Noé, Laurent; Lavenier, Dominique; Nguyen, Van Hoa; Kucherov, Gregory; Giraud, Mathieu

    2008-01-01

    Background Similarity inference, one of the main bioinformatics tasks, has to face an exponential growth of the biological data. A classical approach used to cope with this data flow involves heuristics with large seed indexes. In order to speed up this technique, the index can be enhanced by storing additional information to limit the number of random memory accesses. However, this improvement leads to a larger index that may become a bottleneck. In the case of protein similarity search, we propose to decrease the index size by reducing the amino acid alphabet. Results The paper presents two main contributions. First, we show that an optimal neighborhood indexing combining an alphabet reduction and a longer neighborhood leads to a 35% reduction in the memory involved in the process, without sacrificing either the quality of results or the computational time. Second, our approach led us to develop a new kind of substitution score matrices and their associated e-value parameters. In contrast to usual matrices, these matrices are rectangular since they compare amino acid groups from different alphabets. We describe the method used for computing those matrices and provide some typical examples that can be used in such comparisons. Supplementary data can be found on the website http://bioinfo.lifl.fr/reblosum. Conclusion We propose a practical index size reduction of the neighborhood data that does not negatively affect the performance of large-scale search in protein sequences. Such an index can be used in any study involving large protein data. Moreover, rectangular substitution score matrices and their associated statistical parameters can have applications in any study involving an alphabet reduction. PMID:19087280

  13. Response prediction techniques and case studies of a path blocking system based on Global Transmissibility Direct Transmissibility method

    NASA Astrophysics Data System (ADS)

    Wang, Zengwei; Zhu, Ping; Zhao, Jianxuan

    2017-02-01

    In this paper, the prediction capabilities of the Global Transmissibility Direct Transmissibility (GTDT) method are further developed. Two path blocking techniques that use only the easily measured variables of the original system to predict the response of a path blocking system are generalized to finite element models of continuous systems. The proposed techniques are derived theoretically in a general form for the scenarios of setting the response of a subsystem to zero and of removing the link between two directly connected subsystems. The objective of this paper is to verify the reliability of the proposed techniques by finite element simulations. Two typical cases, the structural vibration transmission case and the structure-borne sound case, in two different configurations are employed to illustrate the validity of the proposed techniques. Points requiring attention in each case are discussed, and conclusions are drawn. It is shown that for the two cases of blocking a subsystem the proposed techniques are able to predict the new response using measured variables of the original system, even though the operational forces are unknown. For the structural vibration transmission case of removing a connector between two components, the proposed techniques are applicable only when the rotational responses of the connector are very small. The proposed techniques offer relative path measures and provide an alternative way to deal with NVH problems. The work in this paper provides guidance and reference for the engineering application of the GTDT prediction techniques.

  14. An Investigation of Proposed Techniques for Quantifying Confidence in Assurance Arguments

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.; Holloway, C. Michael

    2016-01-01

    The use of safety cases in certification raises the question of assurance argument sufficiency and the issue of confidence (or uncertainty) in the argument's claims. Some researchers propose to model confidence quantitatively and to calculate confidence in argument conclusions. We know of little evidence to suggest that any proposed technique would deliver trustworthy results when implemented by system safety practitioners. Proponents do not usually assess the efficacy of their techniques through controlled experiment or historical study. Instead, they present an illustrative example where the calculation delivers a plausible result. In this paper, we review current proposals, claims made about them, and evidence advanced in favor of them. We then show that proposed techniques can deliver implausible results in some cases. We conclude that quantitative confidence techniques require further validation before they should be recommended as part of the basis for deciding whether an assurance argument justifies fielding a critical system.

  15. Supervised multiblock sparse multivariable analysis with application to multimodal brain imaging genetics.

    PubMed

    Kawaguchi, Atsushi; Yamashita, Fumio

    2017-10-01

    This article proposes a procedure for describing the relationship between high-dimensional data sets, such as multimodal brain images and genetic data. We propose a supervised technique that incorporates the clinical outcome to determine a score, which is a linear combination of variables with hierarchical structures across modalities. This approach is expected to yield interpretable and predictive scores. The proposed method was applied to a study of Alzheimer's disease (AD). We propose a diagnostic method for AD that involves using whole-brain magnetic resonance imaging (MRI) and positron emission tomography (PET), and we select effective brain regions for the diagnostic probability and investigate the genome-wide association with these regions using single nucleotide polymorphisms (SNPs). The two-step dimension reduction method, which we previously introduced, was considered applicable to such a study and allows us to partially incorporate the proposed method. We show that the proposed method offers classification functions with feasibility and reasonable prediction accuracy based on receiver operating characteristic (ROC) analysis and yields reasonable regions of the brain and genome. Our simulation study based on a synthetic structured data set showed that the proposed method outperformed the original method and demonstrated the benefit of the supervised feature. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. Chromatic changes to artificial irises produced using different techniques

    NASA Astrophysics Data System (ADS)

    Bannwart, Lisiane Cristina; Goiato, Marcelo Coelho; dos Santos, Daniela Micheline; Moreno, Amália; Pesqueira, Aldiéris Alves; Haddad, Marcela Filié; Andreotti, Agda Marobo; de Medeiros, Rodrigo Antonio

    2013-05-01

    Ocular prostheses are important determinants of their users' aesthetic recovery and self-esteem. With use, the longevity of ocular prostheses is strongly affected by instability of the iris color caused by polymerization. The goal of this study is to examine how the color of the artificial iris button is affected by different fabrication techniques and by the application of varnish following polymerization of the colorless acrylic resin that covers the colored paint. We produced 60 samples (n=10 per group) according to the technique applied: conventional technique without varnish (PE); conventional technique with varnish (PEV); technique involving a prefabricated cap without varnish (CA); technique involving a prefabricated cap with varnish (CAV); technique involving inverted painting without varnish (PI); and technique involving inverted painting with varnish (PIV). Color readings using a spectrophotometer were taken before and after polymerization. The data obtained were submitted to analysis of variance and Tukey's test (P<0.05). The color test showed significant changes after polymerization in all groups. The PE and PI techniques have clinically acceptable values of ΔE, independent of whether varnish is applied to protect the paint. The PI technique produces the least color change, whereas the PE and CA techniques significantly improve color stability.
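
    For reference, the color change measured by the spectrophotometer is typically quantified with a CIELAB ΔE metric; a minimal sketch of that computation is shown below with made-up readings, and the commonly cited acceptability threshold of about 3.3 is used purely for illustration (the study's own criterion may differ).

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference between two CIELAB triplets (L*, a*, b*)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Hypothetical spectrophotometer readings before and after polymerization.
before = (62.1, 14.3, 18.7)
after = (60.4, 15.0, 20.1)

dE = delta_e_cie76(before, after)
verdict = "clinically acceptable" if dE < 3.3 else "perceptible shift"
print(f"dE = {dE:.2f} -> {verdict}")
```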

  17. Performance of dual inverter fed open end winding induction motor drive using carrier shift PWM techniques

    NASA Astrophysics Data System (ADS)

    Priya Darshini, B.; Ranjit, M.; Babu, V. Ramesh

    2018-04-01

    In this paper, different multicarrier PWM (MCPWM) techniques are proposed for a dual inverter fed open-end winding induction motor (IM) drive to achieve multilevel operation. To generate the switching pulses for the dual inverter, a sinusoidal modulating signal is compared with multiple carrier signals. The common mode voltage (CMV) has been analyzed in the proposed open-end winding induction motor drive. All the proposed techniques mitigate the CMV along with the harmonic distortion in the phase voltage. To validate the proposed work, simulations have been carried out using MATLAB/SIMULINK, and the corresponding results are presented and compared.
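
    As a sketch of the carrier-comparison step, the snippet below generates gating signals by comparing a sinusoidal reference with level-shifted triangular carriers; the frequencies, number of carriers and carrier arrangement are illustrative and are not taken from the paper.

```python
import numpy as np

f_mod, f_carrier, levels = 50.0, 2000.0, 4       # fundamental Hz, carrier Hz, carriers
t = np.linspace(0.0, 0.02, 20000)                # one fundamental period
reference = np.sin(2 * np.pi * f_mod * t)        # sinusoidal modulating signal

def triangle(t, f):
    """Unit triangle carrier in [0, 1]."""
    return 2.0 * np.abs(f * t - np.floor(f * t + 0.5))

# Level-shifted carriers stacked between -1 and +1.
band = 2.0 / levels
carriers = [-1.0 + band * k + band * triangle(t, f_carrier) for k in range(levels)]

# One gating signal per carrier: high while the reference exceeds that carrier.
gates = np.array([reference > c for c in carriers], dtype=int)
phase_voltage_level = gates.sum(axis=0)          # 0..levels, a multilevel waveform
print(phase_voltage_level[:10])
```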

  18. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Proposal analysis... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for proposal analysis. (2) For spare parts or support equipment, perform an analysis of— (i) Those line items...

  19. Statistical approach for selection of biologically informative genes.

    PubMed

    Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N

    2018-05-20

    Selection of informative genes from high dimensional gene expression data has emerged as an important research area in genomics. Many of the gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been adjudged through post-selection classification accuracy computed through a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biologically sufficient criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide to selecting statistical techniques for identifying informative genes from high dimensional expression data for breeding and systems biology studies. Published by Elsevier B.V.
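
    The authors' implementation is the BootMRMR R package linked above; purely as an illustration of the relevance-minus-redundancy idea combined with bootstrapping, a simplified Python sketch (correlation-based scores, synthetic data, not the paper's exact measure) could look like this:

```python
import numpy as np

def mrmr_scores(X, y):
    """Relevance minus redundancy for each gene (column of X).

    Relevance: |correlation with the class label|.
    Redundancy: mean |correlation| with the other genes.
    """
    Xc = (X - X.mean(0)) / X.std(0)
    yc = (y - y.mean()) / y.std()
    relevance = np.abs(Xc.T @ yc) / len(y)
    corr = np.abs(np.corrcoef(X, rowvar=False))
    redundancy = (corr.sum(1) - 1.0) / (X.shape[1] - 1)
    return relevance - redundancy

def bootstrap_selection(X, y, n_boot=50, top_k=20, seed=0):
    """Count how often each gene lands in the top-k over bootstrap resamples."""
    rng = np.random.default_rng(seed)
    counts = np.zeros(X.shape[1], dtype=int)
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), len(y))
        top = np.argsort(mrmr_scores(X[idx], y[idx]))[::-1][:top_k]
        counts[top] += 1
    return counts   # genes with high counts are stable, informative candidates

X = np.random.default_rng(1).normal(size=(60, 200))   # 60 subjects, 200 genes
y = np.r_[np.zeros(30), np.ones(30)]
print(bootstrap_selection(X, y).max())
```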

  20. Numerical model updating technique for structures using firefly algorithm

    NASA Astrophysics Data System (ADS)

    Sai Kubair, K.; Mohan, S. C.

    2018-03-01

    Numerical model updating is a technique used for updating existing numerical models of structures in civil, mechanical, automotive, marine, aerospace engineering, etc. The basic concept behind this technique is updating the numerical model to closely match experimental data obtained from real or prototype test structures. The present work involves the development of a numerical model using MATLAB as a computational tool, with mathematical equations that define the experimental model. The firefly algorithm is used as the optimization tool in this study. In this updating process a response parameter of the structure has to be chosen, which helps to correlate the numerical model with the experimental results obtained. The variables for the updating can be either material or geometrical properties of the model, or both. In this study, to verify the proposed technique, a cantilever beam is analyzed for its tip deflection and a space frame for its natural frequencies. Both models are updated with their respective response values obtained from experimental results. The numerical results after updating show that a close relationship can be established between the experimental and numerical models.
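
    To make the optimization step concrete, here is a compact, generic firefly-algorithm sketch applied to a hypothetical one-parameter updating problem (matching a cantilever tip deflection); the constants, bounds and "measured" value are invented for illustration and are not taken from the paper.

```python
import numpy as np

def firefly_minimize(objective, bounds, n_fireflies=15, n_iter=100,
                     beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    """Basic firefly algorithm on the unit hypercube: dimmer fireflies move toward brighter ones."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    scale = lambda u: lo + u * (hi - lo)             # map [0,1]^d onto the bounds
    u = rng.random((n_fireflies, len(lo)))
    f = np.array([objective(scale(ui)) for ui in u])
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if f[j] < f[i]:                      # j is brighter (lower cost)
                    beta = beta0 * np.exp(-gamma * np.sum((u[i] - u[j]) ** 2))
                    u[i] += beta * (u[j] - u[i]) + alpha * (rng.random(len(lo)) - 0.5)
                    u[i] = np.clip(u[i], 0.0, 1.0)
                    f[i] = objective(scale(u[i]))
        alpha *= 0.97                                # gradually damp the random walk
    best = np.argmin(f)
    return scale(u[best]), f[best]

# Hypothetical updating problem: find the flexural rigidity EI that reproduces a
# "measured" cantilever tip deflection delta = P * L^3 / (3 * EI).
P, L, delta_measured = 1000.0, 2.0, 0.004            # N, m, m (illustrative values)
cost = lambda p: (P * L**3 / (3.0 * p[0]) - delta_measured) ** 2
EI_best, err = firefly_minimize(cost, bounds=[(1e5, 1e7)])
print(EI_best, err)
```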

  1. A ROle-Oriented Filtering (ROOF) approach for collaborative recommendation

    NASA Astrophysics Data System (ADS)

    Ghani, Imran; Jeong, Seung Ryul

    2016-09-01

    In collaborative filtering (CF) recommender systems, existing techniques frequently focus on determining similarities among users' historical interests. This generally refers to situations in which each user normally plays a single role and his/her taste remains consistent over the long term. However, we note that existing techniques have not been significantly employed in a role-oriented context. This is especially so in situations where users may change their roles over time or play multiple roles simultaneously, while still expecting to access relevant information resources accordingly. Such systems include enterprise architecture management systems, e-commerce sites or journal management systems. In scenarios involving existing techniques, each user needs to build up very different profiles (preferences and interests) based on multiple roles which change over time. Should this not occur to a satisfactory degree, their previous information will either be lost or not utilised at all. To limit the occurrence of such issues, we propose a ROle-Oriented Filtering (ROOF) approach focusing on the manner in which multiple user profiles are obtained and maintained over time. We conducted a number of experiments using an enterprise architecture management scenario. In so doing, we observed that the ROOF approach performs better in comparison with other existing collaborative filtering-based techniques.

  2. Efficient use of mobile devices for quantification of pressure injury images.

    PubMed

    Garcia-Zapirain, Begonya; Sierra-Sosa, Daniel; Ortiz, David; Isaza-Monsalve, Mariano; Elmaghraby, Adel

    2018-01-01

    Pressure injuries are chronic wounds that form due to the compression of soft tissues against bony prominences. In order to assess these injuries, medical personnel carry out the evaluation and diagnosis using visual methods and manual measurements, which can be inaccurate and may cause discomfort to patients. By using segmentation techniques, pressure injuries can be extracted from an image and accurately parameterized, leading to a correct diagnosis. In general, these techniques are based on the solution of differential equations, and the numerical methods involved are demanding in terms of computational resources. In previous work, we proposed a technique developed using toroidal parametric equations for image decomposition and segmentation without solving differential equations. In this paper, we present the development of a mobile application useful for the non-contact assessment of pressure injuries based on the toroidal decomposition of images. The usage of this technique allows us to achieve an accurate segmentation almost 8 times faster than the Active Contours without Edges (ACWE) and Dynamic Contours methods. We describe the techniques and the implementation for Android devices using Python and Kivy. This application allows for the segmentation and parameterization of injuries, obtaining relevant information for diagnosis and tracking the evolution of patients' injuries.

  3. Growth and surface analysis of SiO2 on 4H-SiC for MOS devices

    NASA Astrophysics Data System (ADS)

    Kodigala, Subba Ramaiah; Chattopadhyay, Somnath; Overton, Charles; Ardoin, Ira; Gordon, B. J.; Johnstone, D.; Roy, D.; Barone, D.

    2015-03-01

    SiO2 layers have been grown on C-face and Si-face 4H-SiC substrates by two different techniques, wet thermal oxidation and sputtering. The deposition recipes of these techniques were carefully optimized by a trial-and-error method. The growth of SiO2 on the C-face and Si-face 4H-SiC substrates is thoroughly investigated by AFM analysis. A growth mechanism for the different species involved in the wet thermal oxidation of SiO2 is proposed by adopting two-body classical projectile scattering. This mechanism accounts for the growth of secondary phases such as α-CH nano-islands in the grown SiO2 layer. The effects of HF etching on the SiO2 layers grown by both techniques and on both the C-face and Si-face substrates are studied. The thicknesses of the layers determined by AFM and ellipsometry techniques are reported and compared. MOS capacitors were made on the Si-face 4H-SiC wafers by the wet oxidation and sputtering processes and studied by the capacitance versus voltage (CV) technique. From the CV measurements, the density of trap states as a function of trap level is estimated for the MOS devices.

  4. Jacobian projection reduced-order models for dynamic systems with contact nonlinearities

    NASA Astrophysics Data System (ADS)

    Gastaldi, Chiara; Zucca, Stefano; Epureanu, Bogdan I.

    2018-02-01

    In structural dynamics, the prediction of the response of systems with localized nonlinearities, such as friction dampers, is of particular interest. This task becomes especially cumbersome when high-resolution finite element models are used. While state-of-the-art techniques such as Craig-Bampton component mode synthesis are employed to generate reduced-order models, the interface (nonlinear) degrees of freedom must still be solved in full. For this reason, a new generation of specialized techniques capable of reducing linear and nonlinear degrees of freedom alike is emerging. This paper proposes a new technique that exploits spatial correlations in the dynamics to compute a reduction basis. The basis is composed of a set of vectors obtained using the Jacobian of partial derivatives of the contact forces with respect to nodal displacements. These basis vectors correspond to specifically chosen boundary conditions at the contacts over one cycle of vibration. The technique is shown to be effective in the reduction of several models studied using multiple harmonics with a coupled static solution. In addition, this paper addresses another challenge common to all reduction techniques: it presents and validates a novel a posteriori error estimate capable of evaluating the quality of the reduced-order solution without involving a comparison with the full-order solution.

  5. Development of metrology for freeform optics in reflection mode

    NASA Astrophysics Data System (ADS)

    Burada, Dali R.; Pant, Kamal K.; Mishra, Vinod; Bichra, Mohamed; Khan, Gufran S.; Sinzinger, Stefan; Shakher, Chandra

    2017-06-01

    The increased range of manufacturable freeform surfaces offered by new fabrication techniques is creating opportunities to incorporate them in optical systems. However, the success of these fabrication techniques depends on the capabilities of metrology procedures and on a feedback mechanism to CNC machines for optimizing the manufacturing process. Therefore, a precise and in-situ metrology technique for freeform optics is in demand. Though all the techniques available for aspheres have been extended to freeform surfaces, none has yet been incorporated into the manufacturing machine for in-situ measurement. The most obvious reason is the complexity of the optical setups to be integrated in the manufacturing platforms. The Shack-Hartmann sensor offers the potential to be incorporated into the machine environment due to its vibration insensitivity, compactness and 3D shape measurement capability from slope data. In the present work, a measurement scheme is reported in which a scanning Shack-Hartmann sensor is employed as a metrology tool for the measurement of freeform surfaces in reflection mode. Simulation studies are conducted to analyze the stitching accuracy in the presence of various misalignment errors. The proposed scheme is experimentally verified on a freeform surface with a cubic phase profile.

  6. Application of isotope dilution technique in vitamin A nutrition.

    PubMed

    Wasantwisut, Emorn

    2002-09-01

    The isotope dilution technique involving deuterated retinol has been developed to quantitatively estimate total body reserves of vitamin A in humans. The technique provided good estimates in comparison to hepatic vitamin A concentrations in Bangladeshi surgical patients. Kinetic studies in the United States, Bangladesh, and Guatemala indicated a mean equilibration time of 17 to 20 days irrespective of the size of hepatic reserves. Due to the controversy surrounding the efficacy of a carotene-rich diet in improving vitamin A status, the isotope dilution technique was proposed to pursue this research question further (IAEA's coordinated research program). In the Philippines, schoolchildren with low serum retinol concentrations showed significant improvement in total body vitamin A stores following intake of carotene-rich foods (orange fruits and vegetables), using a three-day deuterated-retinol-dilution procedure. When Chinese kindergarten children were fed green and yellow vegetables during the winter, their total body vitamin A stores were sustained, compared to a steady decline of vitamin A stores in the control children. Likewise, daily consumption of purified beta-carotene or a diet rich in provitamin A carotenoids was shown to prevent a loss in total body vitamin A stores among Thai lactating women during the rice-planting season. These studies demonstrate the potential of the isotope dilution technique to evaluate the impact of provitamin A carotenoid intervention programs.

  7. A new measure for gene expression biclustering based on non-parametric correlation.

    PubMed

    Flores, Jose L; Inza, Iñaki; Larrañaga, Pedro; Calvo, Borja

    2013-12-01

    One of the emerging techniques for analyzing DNA microarray data, known as biclustering, is the search for subsets of genes and conditions which are coherently expressed. These subgroups provide clues about the main biological processes. Until now, different approaches to this problem have been proposed. Most of them use the mean squared residue as a quality measure, but relevant and interesting patterns, such as shifting or scaling patterns, cannot be detected. Furthermore, recent papers show that there exist new coherence patterns involved in different kinds of cancer and tumors, such as inverse relationships between genes, which cannot be captured. The proposed measure is called Spearman's biclustering measure (SBM), which estimates the quality of a bicluster based on the non-linear correlation among genes and conditions simultaneously. The search for biclusters is performed using an evolutionary technique called estimation of distribution algorithms, which uses the SBM measure as its fitness function. This approach has been examined from different points of view using artificial and real microarrays. The assessment process has involved the use of quality indexes, a set of reference bicluster patterns including new patterns, and a set of statistical tests. The performance has also been examined using real microarrays and compared to different algorithmic approaches such as Bimax, CC, OPSM, Plaid and xMotifs. SBM shows several advantages, such as the ability to recognize more complex coherence patterns (shifting, scaling and inversion) and the capability to selectively marginalize genes and conditions depending on statistical significance. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
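
    A minimal sketch of a Spearman-based bicluster quality score in this spirit (not the exact SBM formulation) can be written with scipy; the toy data below plants one shifted-and-scaled gene and one inverted gene, so the score is high even though a mean-squared-residue measure would penalize these patterns.

```python
import numpy as np
from itertools import combinations
from scipy.stats import spearmanr

def spearman_bicluster_quality(expr, genes, conds):
    """Average |Spearman rho| over all gene pairs inside a candidate bicluster.

    |rho| is used so that inverted (negatively correlated) patterns also score highly.
    """
    sub = expr[np.ix_(genes, conds)]
    rhos = [abs(spearmanr(sub[i], sub[j])[0])
            for i, j in combinations(range(len(genes)), 2)]
    return float(np.mean(rhos))

rng = np.random.default_rng(0)
base = rng.normal(size=8)
expr = rng.normal(size=(20, 8))
expr[0] = 2.0 * base + 1.0      # shifted and scaled copy of the base profile
expr[1] = -base                 # inverted copy of the base profile
print(spearman_bicluster_quality(expr, genes=[0, 1], conds=list(range(8))))
```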

  8. An Indirect Method for Vapor Pressure and Phase Change Enthalpy Determination by Thermogravimetry

    NASA Astrophysics Data System (ADS)

    Giani, Samuele; Riesen, Rudolf; Schawe, Jürgen E. K.

    2018-07-01

    Vapor pressure is a fundamental property of a pure substance. This property is the pressure of a compound's vapor in thermodynamic equilibrium with its condensed phase (solid or liquid). When the phase equilibrium condition is met, phase coexistence of a pure substance involves a continuous interplay of vaporization or sublimation to gas and condensation back to the liquid or solid form, respectively. Thermogravimetric analysis (TGA) techniques are based on mass loss determination and are well suited for the study of such phenomena. In this work, it is shown that a TGA method using a reference substance is a suitable technique for vapor pressure determination. This method is easy and fast because it involves a series of isothermal segments. In contrast to Knudsen's original approach, where the use of high vacuum is mandatory, in the proposed method a given experimental setup is calibrated under ambient pressure conditions. The theoretical framework of this method is based on a generalization of the Langmuir equation of free evaporation. The real strength of the proposed method is the ability to determine the vapor pressure independently of the molecular mass of the vapor. A demonstration of this method has been performed using the Clausius-Clapeyron equation of state to derive the working equation. This algorithm, however, is adaptive and admits the use of other equations of state. The results of a series of experiments with organic molecules indicate that the average difference between the measured and the literature vapor pressure amounts to about 5%. The vapor pressure determined in this study spans from a few mPa up to several kPa. Once the p versus T diagram is obtained, the phase transition enthalpy can additionally be calculated from the data.
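
    To illustrate the last step, here is a short sketch of extracting the phase-change enthalpy from a p-T dataset via the Clausius-Clapeyron relation (ln p is linear in 1/T); the data are synthetic, not the paper's measurements.

```python
import numpy as np

R = 8.314  # J/(mol K)

# Synthetic (T, p) pairs; in practice these come from calibrated TGA mass-loss rates.
T = np.array([330.0, 345.0, 360.0, 375.0, 390.0])            # K
dH_true, A = 65e3, 22.0                                       # J/mol, intercept
p = np.exp(A - dH_true / (R * T))                             # Pa

# Clausius-Clapeyron: ln p = A - dH / (R T)  ->  linear in 1/T.
slope, intercept = np.polyfit(1.0 / T, np.log(p), 1)
dH_fit = -slope * R
print(f"phase-change enthalpy ~ {dH_fit / 1e3:.1f} kJ/mol")
```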

  9. Modeling and comparative study of linear and nonlinear controllers for rotary inverted pendulum

    NASA Astrophysics Data System (ADS)

    Lima, Byron; Cajo, Ricardo; Huilcapi, Víctor; Agila, Wilton

    2017-01-01

    The rotary inverted pendulum (RIP) is a system that is difficult to control, and several studies have been conducted in which different control techniques have been applied. The literature reports that, although the problem is nonlinear, classical PID controllers present appropriate performance when applied to the system. In this paper, a comparative study of the performance of linear and nonlinear PID structures is carried out. The control algorithms are evaluated on the RIP system using indices of performance and power consumption, which allow the categorization of control strategies according to their performance. This article also presents the modeling of the system, in which some of the parameters involved in the RIP system are estimated using computer-aided design (CAD) tools and experimental methods or techniques proposed by several authors. The results indicate better performance of the nonlinear controller, with increased robustness and a faster response than the linear controller.
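
    For readers unfamiliar with the baseline being compared, a textbook discrete PID loop has the following form; the plant below is a toy first-order system and the gains are arbitrary, i.e. this is not the RIP model or the controllers tuned in the paper.

```python
class PID:
    """Textbook discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_error = 0.0, 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy first-order plant x' = -x + u, regulated to a setpoint of 1.0.
dt, x, setpoint = 0.01, 0.0, 1.0
ctrl = PID(kp=6.0, ki=2.0, kd=0.05, dt=dt)
for _ in range(2000):
    u = ctrl.step(setpoint - x)
    x += dt * (-x + u)
print(round(x, 3))   # settles near 1.0 thanks to the integral action
```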

  10. Electrical Properties of an m × n Hammock Network

    NASA Astrophysics Data System (ADS)

    Tan, Zhen; Tan, Zhi-Zhong; Zhou, Ling

    2018-05-01

    Electrical properties constitute an important problem in natural science and physics, usually involving the potential, current and resistance in an electric circuit. We investigate the electrical properties of an arbitrary hammock network, which had not been resolved before, and propose the exact potential formula of an arbitrary m × n hammock network by means of the Recursion-Transform method with current parameters (RT-I) pioneered by one of us [Z. Z. Tan, Phys. Rev. E 91 (2015) 052122]; the branch currents and equivalent resistance of the network are then derived naturally. Our key technique is to set up matrix equations and perform matrix transformations. The potential formula derived is a meaningful discovery from which many novel applications follow. The discovery of the potential formula of the hammock network provides new theoretical tools and techniques for related scientific research. Supported by the Natural Science Foundation of Jiangsu Province under Grant No. BK20161278

  11. Pathology in Continuous Infusion Studies in Rodents and Non-Rodents and ITO (Infusion Technology Organisation)-Recommended Protocol for Tissue Sampling and Terminology for Procedure-Related Lesions

    PubMed Central

    Weber, Klaus; Mowat, Vasanthi; Hartmann, Elke; Razinger, Tanja; Chevalier, Hans-Jörg; Blumbach, Kai; Green, Owen P.; Kaiser, Stefan; Corney, Stephen; Jackson, Ailsa; Casadesus, Agustin

    2011-01-01

    Many variables may affect the outcome of continuous infusion studies. The results largely depend on the experience of the laboratory performing these studies, the technical equipment used, the choice of blood vessels and hence the surgical technique, as well as the quality of pathological evaluation. The latter is of major interest because the pathologist is in most cases not involved until necropsy, i.e. does not deal with the complicated surgical or in-life procedures of this study type. The technique of tissue sampling during necropsy and the histology processing procedures may influence the tissues presented for evaluation, and hence may be a source of misinterpretation for the pathologist. Therefore, ITO proposes a tissue sampling procedure and a standard nomenclature for pathological lesions for all sites and tissues in contact with the port-access and/or catheter system. PMID:22272050

  12. Evaluation of trade-offs in costs and environmental impacts for returnable packaging implementation

    NASA Astrophysics Data System (ADS)

    Jarupan, Lerpong; Kamarthi, Sagar V.; Gupta, Surendra M.

    2004-02-01

    The main thrust of returnable packaging these days is to provide logistical services through the transportation and distribution of products while being environmentally friendly. Returnable packaging and reverse logistics concepts have converged to mitigate the adverse effect of packaging materials entering the solid waste stream. Returnable packaging must be designed by considering the trade-offs between costs and environmental impact to satisfy manufacturers and environmentalists alike. The cost of returnable packaging entails such items as materials, manufacturing, collection, storage and disposal. Environmental impacts are explicitly linked with solid waste, air pollution, and water pollution. This paper presents a multi-criteria evaluation technique to assist decision-makers in evaluating the trade-offs in costs and environmental impact during the returnable packaging design process. The proposed evaluation technique involves a combination of multiple objective integer linear programming and the analytic hierarchy process. A numerical example is used to illustrate the methodology.
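
    As a sketch of the analytic hierarchy process half of such an evaluation, the snippet below derives criteria weights from a pairwise comparison matrix via its principal eigenvector and checks consistency; the comparison values are invented for illustration and are not the paper's.

```python
import numpy as np

# Hypothetical pairwise comparisons among three criteria:
# cost, solid waste, air/water pollution (Saaty 1-9 scale).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # normalized criteria weights

# Consistency index and ratio (random index RI = 0.58 for a 3x3 matrix).
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("weights:", np.round(weights, 3), "CR:", round(ci / 0.58, 3))
```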

  13. Multi-layer holographic bifurcative neural network system for real-time adaptive EOS data analysis

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang; Huang, K. S.; Diep, J.

    1993-01-01

    Optical data processing techniques have the inherent advantages of high data throughput and low weight and power requirements. These features are particularly desirable for onboard spacecraft in-situ real-time data analysis and data compression applications. The proposed multi-layer optical holographic neural net pattern recognition technique will utilize nonlinear photorefractive devices for real-time adaptive learning to classify input data content and recognize unexpected features. Information can be stored either in analog or digital form in a nonlinear photorefractive device. The recording can be accomplished on time scales ranging from milliseconds to microseconds. When a system consisting of these devices is organized in a multi-layer structure, a feedforward neural net with bifurcating data classification capability is formed. The interdisciplinary research will involve collaboration with top digital computer architecture experts at the University of Southern California.

  14. Significance of clustering and classification applications in digital and physical libraries

    NASA Astrophysics Data System (ADS)

    Triantafyllou, Ioannis; Koulouris, Alexandros; Zervos, Spiros; Dendrinos, Markos; Giannakopoulos, Georgios

    2015-02-01

    Applications of clustering and classification techniques can prove very significant in both digital and physical (paper-based) libraries. The most essential application, document classification and clustering, is crucial for the content that is produced and maintained in digital libraries, repositories, databases, social media, blogs etc., based on various tags and ontology elements, transcending the traditional library-oriented classification schemes. Other applications with a very useful and beneficial role in the new digital library environment involve document routing, summarization and query expansion. Paper-based libraries can benefit as well, since classification combined with advanced material characterization techniques such as FTIR (Fourier Transform InfraRed spectroscopy) can be vital for the study and prevention of material deterioration. An improved two-level self-organizing clustering architecture is proposed in order to enhance the discrimination capacity of the learning space prior to classification, yielding promising results when applied to the above-mentioned library tasks.

  15. A symbolic/subsymbolic interface protocol for cognitive modeling

    PubMed Central

    Simen, Patrick; Polk, Thad

    2009-01-01

    Researchers studying complex cognition have grown increasingly interested in mapping symbolic cognitive architectures onto subsymbolic brain models. Such a mapping seems essential for understanding cognition under all but the most extreme viewpoints (namely, that cognition consists exclusively of digitally implemented rules; or instead, involves no rules whatsoever). Making this mapping reduces to specifying an interface between symbolic and subsymbolic descriptions of brain activity. To that end, we propose parameterization techniques for building cognitive models as programmable, structured, recurrent neural networks. Feedback strength in these models determines whether their components implement classically subsymbolic neural network functions (e.g., pattern recognition), or instead, logical rules and digital memory. These techniques support the implementation of limited production systems. Though inherently sequential and symbolic, these neural production systems can exploit principles of parallel, analog processing from decision-making models in psychology and neuroscience to explain the effects of brain damage on problem solving behavior. PMID:20711520

  16. Optical trapping for complex fluid microfluidics

    NASA Astrophysics Data System (ADS)

    Vestad, Tor; Oakey, John; Marr, David W. M.

    2004-10-01

    Many proposed applications of microfluidics involve the manipulation of complex fluid mixtures such as blood or bacterial suspensions. To sort and handle the constituent particles within these suspensions, we have developed a miniaturized automated cell sorter using optical traps. This microfluidic cell sorter offers the potential to perform chip-top microbiology more rapidly and with less associated hardware and preparation time than other techniques currently available. To realize the potential of this technology in practical clinical and consumer lab-on-a-chip devices however, microscale control of not only particulates but also the fluid phase must be achieved. To address this, we have developed a mechanical fluid control scheme that integrates well with our optical separations approach. We demonstrate here a combined technique, one that employs both mechanical actuation and optical trapping for the precise control of complex suspensions. This approach enables both cell and particle separations as well as the subsequent fluid control required for the completion of complex analyses.

  17. Wind-instrument reflection function measurements in the time domain.

    PubMed

    Keefe, D H

    1996-04-01

    Theoretical and computational analyses of wind-instrument sound production in the time domain have emerged as useful tools for understanding musical instrument acoustics, yet there exist few experimental measurements of the air-column response directly in the time domain. A new experimental, time-domain technique is proposed to measure the reflection function response of woodwind and brass-instrument air columns. This response is defined at the location of sound regeneration in the mouthpiece or double reed. A probe assembly comprising an acoustic source and a microphone is inserted directly into the air column entryway using a foam plug to ensure a leak-free fit. An initial calibration phase involves measurements on a single cylindrical tube of known dimensions. Measurements are presented on an alto saxophone and a euphonium. The technique has promise for testing any musical instrument air column using a single probe assembly and foam plugs over a range of diameters typical of air-column entryways.

  18. [Surgical correction of cleft palate].

    PubMed

    Kimura, F T; Pavia Noble, A; Soriano Padilla, F; Soto Miranda, A; Medellín Rodríguez, A

    1990-04-01

    This study presents a statistical review of corrective surgery for cleft palate, based on cases treated at the maxillo-facial surgery units of the Pediatrics Hospital of the Centro Médico Nacional and at Centro Médico La Raza of the National Institute of Social Security of Mexico, over a five-year period. Interdisciplinary management as performed at the Cleft-Palate Clinic is amply described: an integrated approach involving specialists in maxillo-facial surgery, maxillary orthopedics, genetics, social work and mental hygiene, aiming to reestablish the stomatological and psychological functions of children afflicted by cleft palate. The frequency and classification of the various techniques practiced in that service are described, as well as surgical statistics for 188 patients, comprising a total of 256 palate surgeries performed from March 1984 to March 1989, applying three different techniques and proposing a combination of them in a single operation, in order to avoid complementary surgery.

  19. Video Observations Encompassing the 2002 Leonid Storm: First Results and a Revised Photometric Procedure for Video Meteor Analysis

    NASA Technical Reports Server (NTRS)

    Cooke, William J.; Suggs, Robert; Swift, Wesley; Gural, Peter S.; Brown, Peter; Ellis, Jim (Technical Monitor)

    2002-01-01

    During the 2001 Leonid storm, Marshall Space Flight Center, with the cooperation of the University of Western Ontario and the United States Air Force, deployed 6 teams of observers equipped with intensified video systems to sites located in North America, the Pacific, and Mongolia. The campaign was extremely successful, with the entire period of enhanced Leonid activity (over 16 hours) captured on video tape in a consistent manner. We present the first results from the analysis of this unique, 2 terabyte data set and discuss the problems involved in reducing large amounts of video meteor data. In particular, the question of how to determine meteor masses through photometric analysis will be re-examined, and new techniques will be proposed that eliminate some of the deficiencies suffered by the techniques currently employed in video meteor analysis.

  20. Identification and addressing reduction-related misconceptions

    NASA Astrophysics Data System (ADS)

    Gal-Ezer, Judith; Trakhtenbrot, Mark

    2016-07-01

    Reduction is one of the key techniques used for problem-solving in computer science. In particular, in the theory of computation and complexity (TCC), mapping and polynomial reductions are used for the analysis of decidability and computational complexity of problems, including the core concept of NP-completeness. Reduction is a highly abstract technique that involves revealing close non-trivial connections between problems that often seem to have nothing in common. As a result, proper understanding and application of reduction is a serious challenge for students and a source of numerous misconceptions. The main contribution of this paper is the detection of such misconceptions, the analysis of their roots, and a proposed way to address them in an undergraduate TCC course. Our observations suggest that the main source of the misconceptions is the false intuitive rule "the bigger a set/problem is, the harder it is to solve". Accordingly, we developed a series of exercises for proactive prevention of these misconceptions.

  1. NIR hyperspectral compressive imager based on a modified Fabry–Perot resonator

    NASA Astrophysics Data System (ADS)

    Oiknine, Yaniv; August, Isaac; Blumberg, Dan G.; Stern, Adrian

    2018-04-01

    The acquisition of hyperspectral (HS) image datacubes with available 2D sensor arrays involves a time consuming scanning process. In the last decade, several compressive sensing (CS) techniques were proposed to reduce the HS acquisition time. In this paper, we present a method for near-infrared (NIR) HS imaging which relies on our rapid CS resonator spectroscopy technique. Within the framework of CS, and by using a modified Fabry–Perot resonator, a sequence of spectrally modulated images is used to recover NIR HS datacubes. Owing to the innovative CS design, we demonstrate the ability to reconstruct NIR HS images with hundreds of spectral bands from an order of magnitude fewer measurements, i.e. with a compression ratio of about 10:1. This high compression ratio, together with the high optical throughput of the system, facilitates fast acquisition of large HS datacubes.

  2. Searching fundamental information in ordinary differential equations. Nondimensionalization technique.

    PubMed

    Sánchez Pérez, J F; Conesa, M; Alhama, I; Alhama, F; Cánovas, M

    2017-01-01

    Classical dimensional analysis and nondimensionalization are assumed to be two similar approaches in the search for dimensionless groups. Both techniques simplify the study of many problems. The first approach does not require knowledge of the mathematical model, a deep understanding of the physical phenomenon involved being sufficient, while the second one begins with the governing equations and reduces them to their dimensionless form by simple mathematical manipulations. In this work, a formal protocol is proposed for applying the nondimensionalization process to ordinary differential equations, linear or not, leading to dimensionless normalized equations from which the resulting dimensionless groups have two inherent properties: on the one hand, they are physically interpreted as balances between counteracting quantities in the problem, and on the other hand, they are of the order of magnitude unity. The solutions provided by nondimensionalization are more precise in every case than those from dimensional analysis, as illustrated by the applications studied in this work.
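
    As a worked illustration of the protocol (on a standard damped oscillator, not one of the applications in the paper), the normalization might run as follows:

```latex
% Damped oscillator: m\ddot{x} + c\dot{x} + kx = 0, with x(0) = x_0, \dot{x}(0) = 0.
% Introduce dimensionless variables u = x/x_0 and \tau = t/t_c:
\frac{m x_0}{t_c^{2}}\, u'' + \frac{c x_0}{t_c}\, u' + k x_0\, u = 0 .
% Dividing by k x_0 and choosing the reference time t_c = \sqrt{m/k} normalizes
% the first and last coefficients to unity:
u'' + \underbrace{\frac{c}{\sqrt{m k}}}_{\pi_1}\, u' + u = 0 .
```

    The single dimensionless group π1 = c/√(mk) (twice the damping ratio) is read off directly from the normalized equation: it expresses the balance between damping and the inertia-stiffness pair and is of order unity in the regime where damping matters.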

  3. Low-rank matrix decomposition and spatio-temporal sparse recovery for STAP radar

    DOE PAGES

    Sen, Satyabrata

    2015-08-04

    We develop space-time adaptive processing (STAP) methods by leveraging the advantages of sparse signal processing techniques in order to detect a slowly-moving target. We observe that the inherent sparse characteristics of a STAP problem can be formulated as the low-rankness of the clutter covariance matrix when compared to the total adaptive degrees-of-freedom, and also as the sparse interference spectrum on the spatio-temporal domain. By exploiting these sparse properties, we propose two approaches for estimating the interference covariance matrix. In the first approach, we consider a constrained matrix rank minimization problem (RMP) to decompose the sample covariance matrix into a low-rank positive semidefinite and a diagonal matrix. The solution of the RMP is obtained by applying the trace minimization technique and the singular value decomposition with a matrix shrinkage operator. Our second approach deals with the atomic norm minimization problem to recover the clutter response-vector that has a sparse support on the spatio-temporal plane. We use convex relaxation based standard sparse-recovery techniques to find the solutions. With extensive numerical examples, we demonstrate the performance of the proposed STAP approaches with respect to both ideal and practical scenarios, involving Doppler-ambiguous clutter ridges and spatial and temporal decorrelation effects. As a result, the low-rank matrix decomposition based solution requires as many secondary measurements as twice the clutter rank to attain a near-ideal STAP performance, whereas the spatio-temporal sparsity based approach needs a considerably small number of secondary data.
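
    As a toy sketch of the first approach's core operation (not the constrained RMP solver described above), a sample covariance can be split into a low-rank part plus a diagonal part by soft-thresholding its singular values; the data, rank and threshold below are synthetic and illustrative only.

```python
import numpy as np

def lowrank_plus_diag(R_hat, tau):
    """Split a sample covariance into low-rank + diagonal via singular value shrinkage."""
    U, s, Vt = np.linalg.svd(R_hat)
    s_shrunk = np.maximum(s - tau, 0.0)            # soft-threshold the singular values
    low_rank = (U * s_shrunk) @ Vt
    diag = np.diag(np.maximum(np.diag(R_hat - low_rank), 1e-6))
    return low_rank, diag

rng = np.random.default_rng(0)
# Synthetic clutter: a rank-3 component buried in white noise, 16 space-time channels.
G = rng.normal(size=(16, 3))
R_true = G @ G.T + 0.1 * np.eye(16)
samples = rng.multivariate_normal(np.zeros(16), R_true, size=64)
R_hat = samples.T @ samples / 64

L, D = lowrank_plus_diag(R_hat, tau=1.0)
print("estimated clutter rank:", np.linalg.matrix_rank(L, tol=1e-6))
R_est = L + D   # R_est would then be inverted to form the adaptive STAP weights
```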

  4. Achieve Location Privacy-Preserving Range Query in Vehicular Sensing

    PubMed Central

    Lu, Rongxing; Ma, Maode; Bao, Haiyong

    2017-01-01

    Modern vehicles are equipped with a plethora of on-board sensors and large on-board storage, which enables them to gather and store various local-relevant data. However, the wide application of vehicular sensing has its own challenges, among which location-privacy preservation and data query accuracy are two critical problems. In this paper, we propose a novel range query scheme, which helps the data requester to accurately retrieve the sensed data from the distributive on-board storage in vehicular ad hoc networks (VANETs) with location privacy preservation. The proposed scheme exploits structured scalars to denote the locations of data requesters and vehicles, and achieves the privacy-preserving location matching with the homomorphic Paillier cryptosystem technique. Detailed security analysis shows that the proposed range query scheme can successfully preserve the location privacy of the involved data requesters and vehicles, and protect the confidentiality of the sensed data. In addition, performance evaluations are conducted to show the efficiency of the proposed scheme, in terms of computation delay and communication overhead. Specifically, the computation delay and communication overhead are not dependent on the length of the scalar, and they are only proportional to the number of vehicles. PMID:28786943

  5. Achieve Location Privacy-Preserving Range Query in Vehicular Sensing.

    PubMed

    Kong, Qinglei; Lu, Rongxing; Ma, Maode; Bao, Haiyong

    2017-08-08

    Modern vehicles are equipped with a plethora of on-board sensors and large on-board storage, which enables them to gather and store various local-relevant data. However, the wide application of vehicular sensing has its own challenges, among which location-privacy preservation and data query accuracy are two critical problems. In this paper, we propose a novel range query scheme, which helps the data requester to accurately retrieve the sensed data from the distributive on-board storage in vehicular ad hoc networks (VANETs) with location privacy preservation. The proposed scheme exploits structured scalars to denote the locations of data requesters and vehicles, and achieves the privacy-preserving location matching with the homomorphic Paillier cryptosystem technique. Detailed security analysis shows that the proposed range query scheme can successfully preserve the location privacy of the involved data requesters and vehicles, and protect the confidentiality of the sensed data. In addition, performance evaluations are conducted to show the efficiency of the proposed scheme, in terms of computation delay and communication overhead. Specifically, the computation delay and communication overhead are not dependent on the length of the scalar, and they are only proportional to the number of vehicles.
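
    The privacy-preserving location matching rests on the additive homomorphism of the Paillier cryptosystem: multiplying ciphertexts adds the underlying plaintexts. A toy, textbook Paillier sketch (insecure key sizes, illustration only, and not the paper's full range-query protocol) is shown below.

```python
import math
import random

def paillier_keygen(p=2357, q=2551):
    """Toy Paillier keys (tiny primes, insecure, for illustration only)."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1                                    # standard simple choice of generator
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    return ((pow(c, lam, n * n) - 1) // n * mu) % n

pub, priv = paillier_keygen()
c1, c2 = encrypt(pub, 123), encrypt(pub, 456)
c_sum = (c1 * c2) % (pub[0] ** 2)                # additive homomorphism: E(a)*E(b) = E(a+b)
print(decrypt(pub, priv, c_sum))                 # -> 579
```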

  6. Low power femtosecond tip-based nanofabrication with advanced control

    NASA Astrophysics Data System (ADS)

    Liu, Jiangbo; Guo, Zhixiong; Zou, Qingze

    2018-02-01

    In this paper, we propose an approach to enable the use of a low power femtosecond laser in tip-based nanofabrication (TBN) without thermal damage. One major challenge in laser-assisted TBN is maintaining precise control of the tip-surface positioning throughout the fabrication process. An advanced iterative learning control technique is exploited to overcome this challenge and achieve high-quality patterning of arbitrary shapes on a metal surface. The experimental results are analyzed to understand the ablation mechanism involved. Specifically, the near-field radiation enhancement is examined via the surface-enhanced Raman scattering effect, revealing a near-field-enhanced, plasma-mediated ablation. Moreover, a silicon nitride tip is utilized to alleviate adverse thermal damage. Experimental results, including line patterns fabricated under different writing speeds and an "R" pattern, are presented. The fabrication quality with regard to line width, depth, and uniformity is characterized to demonstrate the efficacy of the proposed approach.
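
    The paper's positioning controller is an advanced iterative learning control (ILC) law; purely to illustrate the learning-over-trials idea, a minimal P-type ILC on a toy first-order stage (invented plant and gain, not the system or law used in the paper) looks like this:

```python
import numpy as np

def plant(u, a=0.8, b=0.2):
    """Toy discrete first-order lag between commanded and actual tip position."""
    y = np.zeros_like(u)
    for k in range(1, len(u)):
        y[k] = a * y[k - 1] + b * u[k - 1]
    return y

T = 200
desired = np.sin(np.linspace(0, 2 * np.pi, T))       # desired tracking trajectory
u = np.zeros(T)
gain = 0.8                                           # learning gain L

# P-type ILC: u_{k+1}(t) = u_k(t) + L * e_k(t+1), repeated over trials.
for trial in range(30):
    e = desired - plant(u)
    u[:-1] += gain * e[1:]                           # the shift compensates the one-step delay

print("final RMS tracking error:", round(float(np.sqrt(np.mean(e ** 2))), 4))
```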

  7. Architecture of a Diels-Alderase ribozyme with a preformed catalytic pocket.

    PubMed

    Keiper, Sonja; Bebenroth, Dirk; Seelig, Burckhard; Westhof, Eric; Jäschke, Andres

    2004-09-01

    Artificial ribozymes catalyze a variety of chemical reactions. Their structures and reaction mechanisms are largely unknown. We have analyzed a ribozyme catalyzing Diels-Alder cycloaddition reactions by comprehensive mutation analysis and a variety of probing techniques. New tertiary interactions involving base pairs between nucleotides of the 5' terminus and a large internal loop forming a pseudoknot fold were identified. The probing data indicate a preformed tertiary structure that shows no major changes on substrate or product binding. Based on these observations, a molecular architecture featuring a Y-shaped arrangement is proposed. The tertiary structure is formed in a rather unusual way; that is, the opposite sides of the asymmetric internal loop are clamped by the four 5'-terminal nucleotides, forming two adjacent two base-pair helices. It is proposed that the catalytic pocket is formed by a wedge within one of these helices.

  8. Intelligent control based on fuzzy logic and neural net theory

    NASA Technical Reports Server (NTRS)

    Lee, Chuen-Chien

    1991-01-01

    In the conception and design of intelligent systems, one promising direction involves the use of fuzzy logic and neural network theory to enhance such systems' capability to learn from experience and adapt to changes in an environment of uncertainty and imprecision. Here, an intelligent control scheme is explored by integrating these multidisciplinary techniques. A self-learning system is proposed as an intelligent controller for dynamical processes, employing a control policy which evolves and improves automatically. One key component of the intelligent system is a fuzzy logic-based system which emulates human decision making behavior. It is shown that the system can solve a fairly difficult control learning problem. Simulation results demonstrate that improved learning performance can be achieved in relation to previously described systems employing bang-bang control. The proposed system is relatively insensitive to variations in the parameters of the system environment.

  9. A study of overproduction and enhanced secretion of enzymes. Quarterly report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dashek, W.V.

    1993-09-01

    Wood decay within forests, a significant renewable photosynthetic energy resource, is caused primarily by Basidiomycetous fungi, e.g., white rot fungi. These organisms possess the ability to degrade lignin, cellulose and hemicellulose, the main organic polymers of wood. In the case of the white rot fungi, e.g., Coriolus versicolor, the capacity results from the fungus' ability to elaborate extracellular cellulolytic and ligninolytic enzymes. With regard to the latter, at least one of the enzymes, polyphenol oxidase (PPO), appears within a defined growth medium. This proposal focuses on the over-production and enhanced secretion of PPO, cellulase and lignin peroxidase. There are two major sections to the proposal: (1) overproduction of lignocellulolytic enzymes by genetic engineering methodologies and hyper-production and enhanced secretion of these enzymes by biochemical/electron microscopical techniques and (2) the biochemical/electron microscopical method involves substrate induction and the time-dependent addition of respiration and PPO enzymes.

  10. Solid-perforated panel layout optimization by topology optimization based on unified transfer matrix.

    PubMed

    Kim, Yoon Jae; Kim, Yoon Young

    2010-10-01

    This paper presents a numerical method for optimizing the sequencing of solid panels, perforated panels and air gaps, and their respective thicknesses, for maximizing sound transmission loss and/or absorption. For the optimization, a method based on the topology optimization formulation is proposed. It is difficult to employ only the commonly-used material interpolation technique because the layers involved exhibit fundamentally different acoustic behavior. Thus, an optimization formulation using a so-called unified transfer matrix is newly proposed. The key idea is to form the elements of the transfer matrix such that elements interpolated by the layer design variables can be those of air, perforated-panel and solid-panel layers. The problem related to the interpolation is addressed, and benchmark-type problems such as sound transmission or absorption maximization problems are solved to check the efficiency of the developed method.

  11. Extinction-ratio-independent electrical method for measuring chirp parameters of Mach-Zehnder modulators using frequency-shifted heterodyne.

    PubMed

    Zhang, Shangjian; Wang, Heng; Zou, Xinhai; Zhang, Yali; Lu, Rongguo; Liu, Yong

    2015-06-15

    An extinction-ratio-independent electrical method is proposed for measuring chirp parameters of Mach-Zehnder electric-optic intensity modulators based on frequency-shifted optical heterodyne. The method utilizes the electrical spectrum analysis of the heterodyne products between the intensity modulated optical signal and the frequency-shifted optical carrier, and achieves the intrinsic chirp parameters measurement at microwave region with high-frequency resolution and wide-frequency range for the Mach-Zehnder modulator with a finite extinction ratio. Moreover, the proposed method avoids calibrating the responsivity fluctuation of the photodiode in spite of the involved photodetection. Chirp parameters as a function of modulation frequency are experimentally measured and compared to those with the conventional optical spectrum analysis method. Our method enables an extinction-ratio-independent and calibration-free electrical measurement of Mach-Zehnder intensity modulators by using the high-resolution frequency-shifted heterodyne technique.

  12. In Vivo Pattern Classification of Ingestive Behavior in Ruminants Using FBG Sensors and Machine Learning.

    PubMed

    Pegorini, Vinicius; Karam, Leandro Zen; Pitta, Christiano Santos Rocha; Cardoso, Rafael; da Silva, Jean Carlos Cardozo; Kalinowski, Hypolito José; Ribeiro, Richardson; Bertotti, Fábio Luiz; Assmann, Tangriani Simioni

    2015-11-11

    Pattern classification of ingestive behavior in grazing animals has extreme importance in studies related to animal nutrition, growth and health. In this paper, a system to classify chewing patterns of ruminants in in vivo experiments is developed. The proposal is based on data collected by optical fiber Bragg grating sensors (FBG) that are processed by machine learning techniques. The FBG sensors measure the biomechanical strain during jaw movements, and a decision tree is responsible for the classification of the associated chewing pattern. In this study, patterns associated with food intake of dietary supplement, hay and ryegrass were considered. Additionally, two other important events for ingestive behavior were monitored: rumination and idleness. Experimental results show that the proposed approach for pattern classification is capable of differentiating the five patterns involved in the chewing process with an overall accuracy of 94%.

  13. In Vivo Pattern Classification of Ingestive Behavior in Ruminants Using FBG Sensors and Machine Learning

    PubMed Central

    Pegorini, Vinicius; Karam, Leandro Zen; Pitta, Christiano Santos Rocha; Cardoso, Rafael; da Silva, Jean Carlos Cardozo; Kalinowski, Hypolito José; Ribeiro, Richardson; Bertotti, Fábio Luiz; Assmann, Tangriani Simioni

    2015-01-01

    Pattern classification of ingestive behavior in grazing animals is of great importance in studies related to animal nutrition, growth and health. In this paper, a system to classify chewing patterns of ruminants in in vivo experiments is developed. The proposal is based on data collected by optical fiber Bragg grating (FBG) sensors and processed by machine learning techniques. The FBG sensors measure the biomechanical strain during jaw movements, and a decision tree is responsible for the classification of the associated chewing pattern. In this study, patterns associated with the intake of dietary supplement, hay and ryegrass were considered. Additionally, two other important events for ingestive behavior were monitored: rumination and idleness. Experimental results show that the proposed approach for pattern classification is capable of differentiating the five patterns involved in the chewing process with an overall accuracy of 94%. PMID:26569250

  14. Protecting Database Centric Web Services against SQL/XPath Injection Attacks

    NASA Astrophysics Data System (ADS)

    Laranjeiro, Nuno; Vieira, Marco; Madeira, Henrique

    Web services represent a powerful interface for back-end database systems and are increasingly being used in business-critical applications. However, field studies show that a large number of web services are deployed with security flaws (e.g., having SQL Injection vulnerabilities). Although several techniques for the identification of security vulnerabilities have been proposed, developing non-vulnerable web services is still a difficult task. In fact, security-related concerns are hard to apply as they involve adding complexity to already complex code. This paper proposes an approach to secure web services against SQL and XPath Injection attacks by transparently detecting and aborting service invocations that try to take advantage of potential vulnerabilities. Our mechanism was applied to secure several web services specified by the TPC-App benchmark, proving to be 100% effective in stopping attacks, non-intrusive, and very easy to use.
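
    One common way to realize this kind of transparent detection is to compare the structure of each runtime statement against structures learned from legitimate invocations. The Python sketch below is a generic illustration of that idea under stated assumptions (literal masking with regular expressions, a whitelist learned offline); it is not the mechanism implemented by the authors.

        import re

        def skeleton(sql: str) -> str:
            """Reduce a SQL statement to its structure by masking literal values."""
            s = sql.lower()
            s = re.sub(r"'[^']*'", "?", s)          # mask string literals
            s = re.sub(r"\b\d+(\.\d+)?\b", "?", s)  # mask numeric literals
            return re.sub(r"\s+", " ", s).strip()

        # Skeletons learned during a hypothetical training phase with valid workloads
        valid = {skeleton("SELECT * FROM items WHERE id = 42")}

        def check(sql: str) -> bool:
            """Return False (abort the invocation) if the statement deviates structurally."""
            return skeleton(sql) in valid

        print(check("SELECT * FROM items WHERE id = 7"))               # True: structure unchanged
        print(check("SELECT * FROM items WHERE id = 7 OR '1' = '1'"))  # False: injected clause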

  15. A novel framework for virtual prototyping of rehabilitation exoskeletons.

    PubMed

    Agarwal, Priyanshu; Kuo, Pei-Hsin; Neptune, Richard R; Deshpande, Ashish D

    2013-06-01

    Human-worn rehabilitation exoskeletons have the potential to make therapeutic exercises increasingly accessible to disabled individuals while reducing the cost and labor involved in rehabilitation therapy. In this work, we propose a novel human-model-in-the-loop framework for virtual prototyping (design, control and experimentation) of rehabilitation exoskeletons by merging computational musculoskeletal analysis with simulation-based design techniques. The framework allows the design and control algorithm of an exoskeleton to be optimized iteratively in simulation. We introduce biomechanical, morphological, and controller measures to quantify the performance of the device for the optimization study. Furthermore, the framework allows one to carry out virtual experiments for testing specific "what-if" scenarios to quantify device performance and recovery progress. To illustrate the application of the framework, we present a case study wherein the design and analysis of an index-finger exoskeleton is carried out using the proposed framework.

  16. Role of weakest links and system-size scaling in multiscale modeling of stochastic plasticity

    NASA Astrophysics Data System (ADS)

    Ispánovity, Péter Dusán; Tüzes, Dániel; Szabó, Péter; Zaiser, Michael; Groma, István

    2017-02-01

    Plastic deformation of crystalline and amorphous matter often involves intermittent local strain burst events. To understand the physical background of the phenomenon, a minimal stochastic mesoscopic model was introduced in which details of the microstructure evolution are statistically represented in terms of a fluctuating local yield threshold. In the present paper we propose a method for determining the corresponding yield stress distribution for the case of crystal plasticity from lower-scale discrete dislocation dynamics simulations, which we combine with weakest-link arguments. The success of scale linking is demonstrated by comparing stress-strain curves obtained from the resulting mesoscopic and the underlying discrete dislocation models in the microplastic regime. As shown by various scaling relations, they are statistically equivalent and behave identically in the thermodynamic limit. The proposed technique is expected to be applicable to different microstructures and also to amorphous materials.
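
    A minimal Monte Carlo sketch of the weakest-link argument, under assumed Weibull-distributed local thresholds, is given below in Python: the yield stress of a system of N links is the minimum of N local thresholds, so the yield distribution shifts downward as the system grows. The parameters are illustrative and unrelated to the dislocation-dynamics results of the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        shape, scale = 2.0, 1.0        # assumed Weibull parameters of the local yield thresholds
        n_samples = 20000

        for n_links in (1, 10, 100):
            # System yield stress = weakest of n_links local thresholds
            local = scale * rng.weibull(shape, size=(n_samples, n_links))
            system_yield = local.min(axis=1)
            print("%4d links -> mean yield stress %.3f" % (n_links, system_yield.mean()))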

  17. A new phase correction method in NMR imaging based on autocorrelation and histogram analysis.

    PubMed

    Ahn, C B; Cho, Z H

    1987-01-01

    A new statistical approach to phase correction in NMR imaging is proposed. The proposed scheme consists of first- and zero-order phase corrections, each performed by inverse multiplication of the estimated phase error. The first-order error is estimated from the phase of the autocorrelation calculated from the complex-valued, phase-distorted image, while the zero-order correction factor is extracted from the histogram of the phase distribution of the first-order-corrected image. Since all the correction procedures are performed in the spatial domain after completion of data acquisition, no prior adjustments or additional measurements are required. The algorithm is applicable to most phase-involved NMR imaging techniques, including inversion recovery imaging, quadrature modulated imaging, spectroscopic imaging, and flow imaging. Some experimental results with inversion recovery imaging as well as quadrature spectroscopic imaging are shown to demonstrate the usefulness of the algorithm.
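
    The two correction stages can be sketched compactly; the Python example below estimates the linear phase from the lag-one spatial autocorrelation and the constant phase from a magnitude-weighted phase histogram, then removes both by inverse multiplication. The test image and distortion are synthetic placeholders, and the sketch is a simplified reading of the scheme rather than a faithful reimplementation.

        import numpy as np

        def phase_correct(img):
            """img: complex 2-D image with linear plus constant phase error (a sketch)."""
            # First order: phase of the lag-1 autocorrelation along the read direction
            ac = np.sum(img[:, 1:] * np.conj(img[:, :-1]))
            slope = np.angle(ac)                      # estimated phase increment per pixel
            x = np.arange(img.shape[1])
            img1 = img * np.exp(-1j * slope * x)      # inverse multiplication

            # Zero order: peak of the magnitude-weighted phase histogram of the corrected image
            hist, edges = np.histogram(np.angle(img1), bins=64, weights=np.abs(img1))
            phi0 = 0.5 * (edges[:-1] + edges[1:])[np.argmax(hist)]
            return img1 * np.exp(-1j * phi0)

        # Hypothetical test: a real-valued object with an artificial phase distortion
        obj = np.outer(np.hanning(64), np.hanning(64))
        distorted = obj * np.exp(1j * (0.05 * np.arange(64) + 0.7))
        corrected = phase_correct(distorted)
        print("residual mean phase: %.4f rad" % np.angle(corrected.sum()))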

  18. A review of active learning approaches to experimental design for uncovering biological networks

    PubMed Central

    2017-01-01

    Various types of biological knowledge describe networks of interactions among elementary entities. For example, transcriptional regulatory networks consist of interactions among proteins and genes. Current knowledge about the exact structure of such networks is highly incomplete, and laboratory experiments that manipulate the entities involved are conducted to test hypotheses about these networks. In recent years, various automated approaches to experiment selection have been proposed. Many of these approaches can be characterized as active machine learning algorithms. Active learning is an iterative process in which a model is learned from data, hypotheses are generated from the model to propose informative experiments, and the experiments yield new data that is used to update the model. This review describes the various models, experiment selection strategies, validation techniques, and successful applications described in the literature; highlights common themes and notable distinctions among methods; and identifies likely directions of future research and open problems in the area. PMID:28570593
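
    The iterative loop described above can be made concrete with a small, generic uncertainty-sampling example in Python; the synthetic data, the logistic-regression model and the least-confident selection rule are illustrative assumptions and do not correspond to any particular method surveyed in the review.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression

        X, y = make_classification(n_samples=500, n_features=10, random_state=0)
        # Small initial "experiment" set containing both classes
        labeled = list(np.where(y == 0)[0][:5]) + list(np.where(y == 1)[0][:5])
        pool = [i for i in range(len(y)) if i not in labeled]

        model = LogisticRegression(max_iter=1000)
        for _ in range(20):                           # select experiment / observe / update model
            model.fit(X[labeled], y[labeled])
            proba = model.predict_proba(X[pool])
            uncertainty = 1.0 - proba.max(axis=1)     # least-confident sampling
            pick = pool.pop(int(np.argmax(uncertainty)))
            labeled.append(pick)                      # "run the experiment" and record its outcome

        print("final training-set size:", len(labeled))
        print("accuracy on the remaining pool:", model.score(X[pool], y[pool]))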

  19. The EM Method in a Probabilistic Wavelet-Based MRI Denoising

    PubMed Central

    2015-01-01

    Human body heat emission and other external causes can interfere with magnetic resonance image acquisition and produce noise. In this kind of image, the noise, when no signal is present, is Rayleigh distributed and its wavelet coefficients can be approximately modeled by a Gaussian distribution. Noiseless magnetic resonance images can be modeled by a Laplacian distribution in the wavelet domain. This paper proposes a new magnetic resonance image denoising method that exploits these statistical properties. The method performs shrinkage of wavelet coefficients based on the conditional probability of being noise or detail. The parameters involved in this filtering approach are calculated by means of the expectation maximization (EM) method, which avoids the need to use an estimator of noise variance. The efficiency of the proposed filter is studied and compared with other important filtering techniques, such as Nowak's, Donoho-Johnstone's, Awate-Whitaker's, and nonlocal means filters, in different 2D and 3D images. PMID:26089959
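
    A simplified one-dimensional sketch of this idea is given below in Python: synthetic wavelet-like coefficients are modelled as a mixture of a zero-mean Gaussian (noise) and a Laplacian (detail), EM estimates the mixture parameters, and each coefficient is shrunk by its posterior probability of being detail. The data and the exact update rules are illustrative assumptions, not the published filter.

        import numpy as np

        rng = np.random.default_rng(0)
        # Synthetic "wavelet coefficients": mostly Gaussian noise plus sparse Laplacian detail
        w = np.concatenate([rng.normal(0, 0.5, 900), rng.laplace(0, 2.0, 100)])

        pi, sigma, b = 0.5, 1.0, 1.0                  # initial mixture parameters
        for _ in range(50):                           # EM iterations
            g = (1 - pi) * np.exp(-0.5 * (w / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
            l = pi * np.exp(-np.abs(w) / b) / (2 * b)
            r = l / (g + l)                           # posterior P(detail | w)
            pi = r.mean()
            sigma = np.sqrt(np.sum((1 - r) * w ** 2) / np.sum(1 - r))
            b = np.sum(r * np.abs(w)) / np.sum(r)

        w_denoised = r * w                            # shrink by the posterior detail probability
        print("estimated noise sigma: %.3f, detail fraction: %.3f" % (sigma, pi))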

  20. The EM Method in a Probabilistic Wavelet-Based MRI Denoising.

    PubMed

    Martin-Fernandez, Marcos; Villullas, Sergio

    2015-01-01

    Human body heat emission and other external causes can interfere with magnetic resonance image acquisition and produce noise. In this kind of image, the noise, when no signal is present, is Rayleigh distributed and its wavelet coefficients can be approximately modeled by a Gaussian distribution. Noiseless magnetic resonance images can be modeled by a Laplacian distribution in the wavelet domain. This paper proposes a new magnetic resonance image denoising method that exploits these statistical properties. The method performs shrinkage of wavelet coefficients based on the conditional probability of being noise or detail. The parameters involved in this filtering approach are calculated by means of the expectation maximization (EM) method, which avoids the need to use an estimator of noise variance. The efficiency of the proposed filter is studied and compared with other important filtering techniques, such as Nowak's, Donoho-Johnstone's, Awate-Whitaker's, and nonlocal means filters, in different 2D and 3D images.

  1. 45 CFR 46.118 - Applications and proposals lacking definite plans for involvement of human subjects.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Applications and proposals lacking definite plans for involvement of human subjects. 46.118 Section 46.118 Public Welfare Department of Health and Human... Research Subjects § 46.118 Applications and proposals lacking definite plans for involvement of human...

  2. 45 CFR 46.118 - Applications and proposals lacking definite plans for involvement of human subjects.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Applications and proposals lacking definite plans for involvement of human subjects. 46.118 Section 46.118 Public Welfare DEPARTMENT OF HEALTH AND HUMAN... Research Subjects § 46.118 Applications and proposals lacking definite plans for involvement of human...

  3. 45 CFR 46.118 - Applications and proposals lacking definite plans for involvement of human subjects.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Applications and proposals lacking definite plans for involvement of human subjects. 46.118 Section 46.118 Public Welfare DEPARTMENT OF HEALTH AND HUMAN... Research Subjects § 46.118 Applications and proposals lacking definite plans for involvement of human...

  4. 45 CFR 46.118 - Applications and proposals lacking definite plans for involvement of human subjects.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Applications and proposals lacking definite plans for involvement of human subjects. 46.118 Section 46.118 Public Welfare DEPARTMENT OF HEALTH AND HUMAN... Research Subjects § 46.118 Applications and proposals lacking definite plans for involvement of human...

  5. Experimental and theoretical identification of a four- acoustic-inputs/two-vibration-outputs hearing system

    NASA Astrophysics Data System (ADS)

    Balaji, P. A.

    1999-07-01

    A cricket's ear is a directional acoustic sensor. It has a remarkable level of sensitivity to the direction of sound propagation in a narrow frequency bandwidth of 4-5 kHz. Because of its complexity, the directional sensitivity has long intrigued researchers. The cricket's ear is a four-acoustic-inputs/two-vibration-outputs system. In this dissertation, this system is examined in depth, both experimentally and theoretically, with the primary goal of understanding the mechanics involved in directional hearing. Experimental identification of the system is done using random signal processing techniques. Theoretical identification of the system is accomplished by analyzing sound transmission through the complex trachea of the ear. Finally, a description of how the cricket achieves directional hearing sensitivity is proposed. The fundamental principle involved in the directional hearing of the cricket has been utilized to design a device that obtains a directional signal from non-directional inputs.

  6. Improved retention of phosphorus donors in germanium using a non-amorphizing fluorine co-implantation technique

    NASA Astrophysics Data System (ADS)

    Monmeyran, Corentin; Crowe, Iain F.; Gwilliam, Russell M.; Heidelberger, Christopher; Napolitani, Enrico; Pastor, David; Gandhi, Hemi H.; Mazur, Eric; Michel, Jürgen; Agarwal, Anuradha M.; Kimerling, Lionel C.

    2018-04-01

    Co-doping with fluorine is a potentially promising method for defect passivation to increase the donor electrical activation in highly doped n-type germanium. However, regular high-dose donor-fluorine co-implants, followed by conventional thermal treatment of the germanium, typically result in a dramatic loss of the fluorine, as a result of its extremely large diffusivity at elevated temperatures, partly mediated by the solid phase epitaxial regrowth. To circumvent this problem, we propose and experimentally demonstrate two non-amorphizing co-implantation methods: one involving consecutive low-dose fluorine implants interleaved with rapid thermal annealing, and a second involving heating of the target wafer during implantation. Our study confirms that the fluorine solubility in germanium is defect-mediated, and we reveal the extent to which both of these strategies can be effective in retaining large fractions of both the implanted fluorine and, critically, the phosphorus donors.

  7. Stress Recovery and Error Estimation for Shell Structures

    NASA Technical Reports Server (NTRS)

    Yazdani, A. A.; Riggs, H. R.; Tessler, A.

    2000-01-01

    The Penalized Discrete Least-Squares (PDLS) stress recovery (smoothing) technique developed for two-dimensional linear elliptic problems is adapted here to three-dimensional shell structures. The surfaces are restricted to those which have a 2-D parametric representation, or which can be built up from such surfaces. The proposed strategy involves mapping the finite element results to the 2-D parametric space which describes the geometry, and smoothing is carried out in the parametric space using the PDLS-based Smoothing Element Analysis (SEA). Numerical results for two well-known shell problems are presented to illustrate the performance of SEA/PDLS for these problems. The recovered stresses are used in the Zienkiewicz-Zhu a posteriori error estimator. The estimated errors are used to demonstrate the performance of SEA-recovered stresses in automated adaptive mesh refinement of shell structures. The numerical results are encouraging. Further testing involving more complex, practical structures is necessary.

  8. Landsat D Thematic Mapper image dimensionality reduction and geometric correction accuracy

    NASA Technical Reports Server (NTRS)

    Ford, G. E.

    1986-01-01

    To characterize and quantify the performance of the Landsat thematic mapper (TM), techniques for dimensionality reduction by linear transformation have been studied and evaluated, and the accuracy of the correction of geometric errors in TM images analyzed. Theoretical evaluations and comparisons of existing methods for designing linear transformations for dimensionality reduction are presented. These methods include the discrete Karhunen-Loeve (KL) expansion, Multiple Discriminant Analysis (MDA), the Thematic Mapper (TM)-Tasseled Cap Linear Transformation and Singular Value Decomposition (SVD). A unified approach to these design problems is presented in which each method involves optimizing an objective function with respect to the linear transformation matrix. From these studies, four modified methods are proposed, referred to as the Space Variant Linear Transformation, the KL Transform-MDA hybrid method, and the First and Second Versions of the Weighted MDA method. The modifications involve the assignment of weights to classes to achieve improvements in the class-conditional probability of error for classes with high weights. Experimental evaluations of the existing and proposed methods have been performed using the six reflective bands of the TM data. It is shown that, in terms of probability of classification error and the percentage of the cumulative eigenvalues, the six reflective bands of the TM data require only a three-dimensional feature space. It is also shown experimentally that, for the proposed methods, the classes with high weights show the expected improvements in class-conditional probability of error estimates.

  9. Validation of an image-based technique to assess the perceptual quality of clinical chest radiographs with an observer study

    NASA Astrophysics Data System (ADS)

    Lin, Yuan; Choudhury, Kingshuk R.; McAdams, H. Page; Foos, David H.; Samei, Ehsan

    2014-03-01

    We previously proposed a novel image-based quality assessment technique1 to assess the perceptual quality of clinical chest radiographs. In this paper, an observer study was designed and conducted to systematically validate this technique. Ten metrics were involved in the observer study, i.e., lung grey level, lung detail, lung noise, rib-lung contrast, rib sharpness, mediastinum detail, mediastinum noise, mediastinum alignment, subdiaphragm-lung contrast, and subdiaphragm area. For each metric, three tasks were successively presented to the observers. In each task, six ROI images were randomly presented in a row and observers were asked to rank the images based only on a designated quality, disregarding the other qualities. A range slider above the images was used by observers to indicate the acceptable range based on the corresponding perceptual attribute. Five board-certified radiologists from Duke participated in this observer study on a DICOM-calibrated diagnostic display workstation under low ambient lighting conditions. The observer data were analyzed in terms of the correlations between the observer ranking orders and the algorithmic ranking orders. Based on the collected acceptable ranges, quality consistency ranges were statistically derived. The observer study showed that, for each metric, the averaged ranking orders of the participating observers were strongly correlated with the algorithmic orders. For the lung grey level, the observer ranking orders accorded completely with the algorithmic ranking orders. The quality consistency ranges derived from this observer study were close to those derived from our previous study. The observer study indicates that the proposed image-based quality assessment technique provides a robust reflection of the perceptual image quality of clinical chest radiographs. The derived quality consistency ranges can be used to automatically predict the acceptability of a clinical chest radiograph.

  10. Identifying indicators of illegal behaviour: carnivore killing in human-managed landscapes.

    PubMed

    St John, Freya A V; Keane, Aidan M; Edwards-Jones, Gareth; Jones, Lauren; Yarnell, Richard W; Jones, Julia P G

    2012-02-22

    Managing natural resources often depends on influencing people's behaviour; however, effectively targeting interventions to discourage environmentally harmful behaviours is challenging because those involved may be unwilling to identify themselves. Non-sensitive indicators of sensitive behaviours are therefore needed. Previous studies have investigated people's attitudes, assuming attitudes reflect behaviour. There has also been interest in using people's estimates of the proportion of their peers involved in sensitive behaviours to identify those involved, since people tend to assume that others behave like themselves. However, there has been little attempt to test the potential of such indicators. We use the randomized response technique (RRT), designed for investigating sensitive behaviours, to estimate the proportion of farmers in north-eastern South Africa killing carnivores, and use a modified logistic regression model to explore relationships between our best estimates of true behaviour (from RRT) and our proposed non-sensitive indicators (including farmers' attitudes and estimates of peer behaviour). Farmers' attitudes towards carnivores, question sensitivity and estimates of peers' behaviour predict the likelihood of farmers killing carnivores. Attitude and estimates of peer behaviour are useful indicators of involvement in illicit behaviours and may be used to identify groups of people to engage in interventions aimed at changing behaviour.
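
    For readers unfamiliar with the RRT, the classic Warner design gives a simple worked example of how a prevalence estimate is recovered from randomized answers; the Python sketch below uses that design with hypothetical survey numbers, and the study itself may have used a different RRT variant.

        def warner_estimate(n_yes, n_total, p=0.7):
            """Warner (1965) randomized response: with probability p the sensitive question
            is asked, otherwise its negation.  Returns the prevalence estimate and variance."""
            lam = n_yes / n_total                     # observed proportion of 'yes' answers
            pi_hat = (lam + p - 1) / (2 * p - 1)      # unbiased prevalence estimate
            var = lam * (1 - lam) / (n_total * (2 * p - 1) ** 2)
            return pi_hat, var

        # Hypothetical survey: 320 'yes' answers from 1000 farmers with p = 0.7
        pi_hat, var = warner_estimate(320, 1000)
        print("estimated prevalence: %.3f (SE %.3f)" % (pi_hat, var ** 0.5))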

  11. Robust Hidden Markov Model based intelligent blood vessel detection of fundus images.

    PubMed

    Hassan, Mehdi; Amin, Muhammad; Murtza, Iqbal; Khan, Asifullah; Chaudhry, Asmatullah

    2017-11-01

    In this paper, we consider the challenging problem of detecting retinal vessel networks. Precise detection of retinal vessel networks is vital for accurate eye disease diagnosis. Most blood vessel tracking techniques may not properly track vessels in the presence of vessel occlusion. Owing to limitations in sensor resolution or in the acquisition of fundus images, part of a vessel may be occluded. In this scenario, it becomes a challenging task to accurately trace these vital vessels. For this purpose, we have proposed a new robust and intelligent retinal vessel detection technique based on a Hidden Markov Model. The proposed model is able to successfully track vessels in the presence of occlusion. The effectiveness of the proposed technique is evaluated on the publicly available standard DRIVE dataset of fundus images. The experiments show that the proposed technique not only outperforms other state-of-the-art methodologies for retinal blood vessel segmentation, but is also capable of accurate occlusion handling in retinal vessel networks. The proposed technique offers better average classification accuracy, sensitivity, specificity, and area under the curve (AUC) of 95.7%, 81.0%, 97.0%, and 90.0% respectively, which shows its usefulness. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Techniques and recommendations for the inclusion of users with autism in the design of assistive technologies.

    PubMed

    Francis, Peter; Mellor, David; Firth, Lucy

    2009-01-01

    The increasing numbers of technology platforms offer opportunities to develop new visual assistive aids for people with autism. However, their involvement in the design of such aids is critical to their short-term uptake and longer term use. Using a three-round Delphi study involving seven Australian psychologists specializing in treating people with autism, the authors explored the utility of four techniques that might be implemented to involve users with autism in the design process. The authors found that individual users from the target group would be likely to respond differently to the techniques and that no technique was clearly better than any other. Recommendations for using these techniques to involve individuals with autism in the design of assistive technologies are suggested.

  13. Delayed Monocular SLAM Approach Applied to Unmanned Aerial Vehicles.

    PubMed

    Munguia, Rodrigo; Urzua, Sarquis; Grau, Antoni

    2016-01-01

    In recent years, many researchers have addressed the issue of making Unmanned Aerial Vehicles (UAVs) more and more autonomous. In this context, the state estimation of the vehicle position is a fundamental necessity for any application involving autonomy. However, the problem of position estimation cannot be solved in some scenarios, even when a GPS signal is available, for instance, in applications requiring precision manoeuvres in a complex environment. Therefore, additional sensory information should be integrated into the system in order to improve accuracy and robustness. In this work, a novel vision-based simultaneous localization and mapping (SLAM) method with application to unmanned aerial vehicles is proposed. One contribution of this work is the design and development of a novel technique for estimating feature depth based on a stochastic triangulation technique. In the proposed method the camera is mounted on a servo-controlled gimbal that counteracts the changes in attitude of the quadcopter. Under this assumption, the overall problem is simplified and focused on the position estimation of the aerial vehicle. Also, the tracking of visual features is made easier by the stabilized video. Another contribution of this work is to demonstrate that integrating very noisy GPS measurements into the system for an initial short period of time is enough to initialize the metric scale. The performance of the proposed method is validated by means of experiments with real data carried out in unstructured outdoor environments. A comparative study shows that, when compared with related methods, the proposed approach performs better in terms of accuracy and computational time.

  14. Noise-robust unsupervised spike sorting based on discriminative subspace learning with outlier handling.

    PubMed

    Keshtkaran, Mohammad Reza; Yang, Zhi

    2017-06-01

    Spike sorting is a fundamental preprocessing step for many neuroscience studies which rely on the analysis of spike trains. Most of the feature extraction and dimensionality reduction techniques that have been used for spike sorting give a projection subspace which is not necessarily the most discriminative one. Therefore, the clusters which appear inherently separable in some discriminative subspace may overlap if projected using conventional feature extraction approaches, leading to poor sorting accuracy, especially when the noise level is high. In this paper, we propose a noise-robust and unsupervised spike sorting algorithm based on learning discriminative spike features for clustering. The proposed algorithm uses discriminative subspace learning to extract low-dimensional and most discriminative features from the spike waveforms and performs clustering with automatic detection of the number of clusters. The core part of the algorithm involves iterative subspace selection using linear discriminant analysis and clustering using a Gaussian mixture model with outlier detection. A statistical test in the discriminative subspace is proposed to automatically detect the number of clusters. Comparative results on publicly available simulated and real in vivo datasets demonstrate that our algorithm achieves substantially improved cluster distinction, leading to higher sorting accuracy and more reliable detection of clusters which are highly overlapping and not detectable using conventional feature extraction techniques such as principal component analysis or wavelets. By providing more accurate information about the activity of a larger number of individual neurons with high robustness to neural noise and outliers, the proposed unsupervised spike sorting algorithm facilitates more detailed and accurate analysis of single- and multi-unit activities in neuroscience and brain machine interface studies.
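
    A stripped-down sketch of the iterative core (alternating a discriminative projection with Gaussian-mixture clustering) is shown below in Python; it omits the outlier handling and the statistical test for the number of clusters, and the synthetic "waveform features" and fixed cluster count are assumptions for illustration only.

        import numpy as np
        from sklearn.datasets import make_blobs
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.mixture import GaussianMixture

        # Synthetic "spike waveform features": 3 units in a 30-dimensional space
        X, _ = make_blobs(n_samples=600, n_features=30, centers=3,
                          cluster_std=2.5, random_state=0)

        n_clusters = 3
        labels = GaussianMixture(n_clusters, random_state=0).fit_predict(X)   # initial clustering
        for _ in range(10):
            # Project onto the subspace that best discriminates the current clusters
            Z = LinearDiscriminantAnalysis(n_components=n_clusters - 1).fit_transform(X, labels)
            new_labels = GaussianMixture(n_clusters, random_state=0).fit_predict(Z)
            if np.array_equal(new_labels, labels):
                break
            labels = new_labels

        print("cluster sizes:", np.bincount(labels))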

  15. Noise-robust unsupervised spike sorting based on discriminative subspace learning with outlier handling

    NASA Astrophysics Data System (ADS)

    Keshtkaran, Mohammad Reza; Yang, Zhi

    2017-06-01

    Objective. Spike sorting is a fundamental preprocessing step for many neuroscience studies which rely on the analysis of spike trains. Most of the feature extraction and dimensionality reduction techniques that have been used for spike sorting give a projection subspace which is not necessarily the most discriminative one. Therefore, the clusters which appear inherently separable in some discriminative subspace may overlap if projected using conventional feature extraction approaches, leading to poor sorting accuracy, especially when the noise level is high. In this paper, we propose a noise-robust and unsupervised spike sorting algorithm based on learning discriminative spike features for clustering. Approach. The proposed algorithm uses discriminative subspace learning to extract low-dimensional and most discriminative features from the spike waveforms and performs clustering with automatic detection of the number of clusters. The core part of the algorithm involves iterative subspace selection using linear discriminant analysis and clustering using a Gaussian mixture model with outlier detection. A statistical test in the discriminative subspace is proposed to automatically detect the number of clusters. Main results. Comparative results on publicly available simulated and real in vivo datasets demonstrate that our algorithm achieves substantially improved cluster distinction, leading to higher sorting accuracy and more reliable detection of clusters which are highly overlapping and not detectable using conventional feature extraction techniques such as principal component analysis or wavelets. Significance. By providing more accurate information about the activity of a larger number of individual neurons with high robustness to neural noise and outliers, the proposed unsupervised spike sorting algorithm facilitates more detailed and accurate analysis of single- and multi-unit activities in neuroscience and brain machine interface studies.

  16. Extraction of honey from underground bee nests by central African chimpanzees (Pan troglodytes troglodytes) in Loango National Park, Gabon: Techniques and individual differences.

    PubMed

    Estienne, Vittoria; Stephens, Colleen; Boesch, Christophe

    2017-08-01

    A detailed analysis of tool use behaviors can disclose the underlying cognitive traits of the users. We investigated the technique used by wild chimpanzees to extract the underground nests of stingless bees (Meliplebeia lendliana), which represent a hard-to-reach resource given their highly undetectable location. Using remote-sensor camera trap footage, we analyzed 151 visits to 50 different bee nests by 18 adult chimpanzees of both sexes. We quantified the degree of complexity and flexibility of this technique by looking at the behavioral repertoire and at its structural organization. We used Generalized Linear Mixed Models to test whether individuals differed in their action repertoire sizes and in their action sequencing patterns, as well as in their preferences of use of different behavioral elements (namely, actions, and grip types). We found that subjects showed non-randomly organized sequences of actions and that the occurrence of certain actions was predicted by the type of the previous action in the sequences. Subjects did not differ in their repertoire sizes, and all used extractive actions involving tools more often than manual digging. As for the type of grip employed, the grip involving the coordinated use of hands and feet together was most frequently used by all subjects when perforating, and we detected significant individual preferences in this domain. Overall, we describe a highly complex and flexible extractive technique, and propose the existence of inter-individual variation in it. We discuss our results in the light of the evolution of higher cognitive abilities in the human lineage. © 2017 Wiley Periodicals, Inc.

  17. Novel Technique for Hepatic Fiducial Marker Placement for Stereotactic Body Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarraya, Hajer, E-mail: h-jarraya@o-lambret.fr; Chalayer, Chloé; Tresch, Emmanuelle

    2014-09-01

    Purpose: To report experience with fiducial marker insertion and describe an advantageous, novel technique for fiducial placement in the liver for stereotactic body radiation therapy with respiratory tracking. Methods and Materials: We implanted 1444 fiducials (single: 834; linked: 610) in 328 patients with 424 hepatic lesions. Two methods of implantation were compared: the standard method (631 single fiducials) performed on 153 patients from May 2007 to May 2010, and the cube method (813 fiducials: 610 linked/203 single) applied to 175 patients from April 2010 to March 2013. The standard method involved implanting a single marker at a time. The novel technique entailed implanting 2 pairs of linked markers when possible in a way to occupy the perpendicular edges of a cube containing the tumor inside. Results: Mean duration of the cube method was shorter than the standard method (46 vs 61 minutes; P<.0001). Median numbers of skin and subcapsular entries were significantly smaller with the cube method (2 vs 4, P<.0001, and 2 vs 4, P<.0001, respectively). The rate of overall complications (total, major, and minor) was significantly lower in the cube method group compared with the standard method group (5.7% vs 13.7%; P=.013). Major complications occurred while using single markers only. The success rate was 98.9% for the cube method and 99.3% for the standard method. Conclusions: We propose a new technique of hepatic fiducial implantation that makes use of linked fiducials and involves fewer skin entries and shorter time of implantation. The technique is less complication-prone and is migration-resistant.

  18. Collectors on illicit collecting: Higher loyalties and other techniques of neutralization in the unlawful collecting of rare and precious orchids and antiquities

    PubMed Central

    Mackenzie, Simon; Yates, Donna

    2015-01-01

    Trafficking natural objects and trafficking cultural objects have been treated separately both in regulatory policy and in criminological discussion. The former is generally taken to be ‘wildlife crime’ while the latter has come to be considered under the auspices of a debate on ‘illicit art and antiquities’. In this article we study the narrative discourse of high-end collectors of orchids and antiquities. The illicit parts of these global trades are subject to this analytical divide between wildlife trafficking and art trafficking, and this has resulted in quite different regulatory structures for each of these markets. However, the trafficking routines, the types and levels of harm involved, and the supply–demand dynamics in the trafficking of orchids and antiquities are actually quite similar, and in this study we find those structural similarities reflected in substantial common ground in the way collectors talk about their role in each market. Collectors of rare and precious orchids and antiquities valorize their participation in markets that are known to be illicit to a quite considerable degree, appealing to ‘higher loyalties’ such as preservation, appreciation of aesthetic beauty and cultural edification. These higher loyalties, along with other techniques of neutralization, deplete the force of law as a guide to appropriate action. We propose that the appeal to higher loyalties is difficult to categorize as a technique of neutralization in this study as it appears to be a motivational explanation for the collectors involved. The other classic techniques of neutralization are deflective, guilt- and critique-reducing narrative mechanisms, while higher loyalties drive illicit behaviour in collecting markets for orchids and antiquities in ways that go significantly beyond the normal definition of neutralization. PMID:28066153

  19. Collectors on illicit collecting: Higher loyalties and other techniques of neutralization in the unlawful collecting of rare and precious orchids and antiquities.

    PubMed

    Mackenzie, Simon; Yates, Donna

    2016-08-01

    Trafficking natural objects and trafficking cultural objects have been treated separately both in regulatory policy and in criminological discussion. The former is generally taken to be 'wildlife crime' while the latter has come to be considered under the auspices of a debate on 'illicit art and antiquities'. In this article we study the narrative discourse of high-end collectors of orchids and antiquities. The illicit parts of these global trades are subject to this analytical divide between wildlife trafficking and art trafficking, and this has resulted in quite different regulatory structures for each of these markets. However, the trafficking routines, the types and levels of harm involved, and the supply-demand dynamics in the trafficking of orchids and antiquities are actually quite similar, and in this study we find those structural similarities reflected in substantial common ground in the way collectors talk about their role in each market. Collectors of rare and precious orchids and antiquities valorize their participation in markets that are known to be illicit to a quite considerable degree, appealing to 'higher loyalties' such as preservation, appreciation of aesthetic beauty and cultural edification. These higher loyalties, along with other techniques of neutralization, deplete the force of law as a guide to appropriate action. We propose that the appeal to higher loyalties is difficult to categorize as a technique of neutralization in this study as it appears to be a motivational explanation for the collectors involved. The other classic techniques of neutralization are deflective, guilt- and critique-reducing narrative mechanisms, while higher loyalties drive illicit behaviour in collecting markets for orchids and antiquities in ways that go significantly beyond the normal definition of neutralization.

  20. Aesthetic treatment of pectus excavatum: a new endoscopic technique using a porous polyethylene implant.

    PubMed

    Grappolini, Simone; Fanzio, Paolo M; D'Addetta, Pierluca G C; Todde, Alberto; Infante, Maurizio

    2008-01-01

    Pectus excavatum is the most frequent malformation of the rib cage. Functional symptoms associated with this malformation are often absent, even in adults not involved in competitive sports activities. Overall, these patients often live with extreme psychological discomfort even when the malformations are minor. Traditionally, the correction of these malformations has been geared toward interventions that modify the architecture of the rib cage. However, all these interventions, even the most recent, involve considerably invasive major surgery. In fact, optimal results are not always achieved with corrective surgery using the insertion of a silicone prosthesis, and patients often experience complications. To correct intermediate and modest pectus excavatum in a stable manner and with the least amount of invasiveness, the authors developed a camouflage technique that uses porous prostheses made from high-density linear polyethylene. This material is generally used for reconstruction of the brain case. Between February 2001 and March 2006, in the I Unit of Plastic Surgery of the authors' Institute, 11 adult pectus excavatum patients with no previous cardiorespiratory symptoms underwent the authors' surgical technique. The average patient age was 29 years. Surgical repair was successful in all cases, and the average hospital stay was short. There were no complications during the follow-up period. The described approach repairs nonfunctional pectus excavatum in the adult with satisfying aesthetic and stable results, a short hospital stay, and high patient satisfaction ratings. The best therapeutic option for pectus excavatum, especially of intermediate or moderate severity, is still controversial: thoracic surgery or camouflage surgery with an implant? To address those issues, we propose a new technique based on a multidisciplinary, non-aggressive approach using a high-density linear polyethylene implant and an omental flap, and present an early analysis of our data.

  1. Global Design Optimization for Aerodynamics and Rocket Propulsion Components

    NASA Technical Reports Server (NTRS)

    Shyy, Wei; Papila, Nilay; Vaidyanathan, Rajkumar; Tucker, Kevin; Turner, James E. (Technical Monitor)

    2000-01-01

    Modern computational and experimental tools for aerodynamics and propulsion applications have matured to a stage where they can provide substantial insight into engineering processes involving fluid flows, and can be fruitfully utilized to help improve the design of practical devices. In particular, rapid and continuous development in aerospace engineering demands that new design concepts be regularly proposed to meet goals for increased performance, robustness and safety while concurrently decreasing cost. To date, the majority of the effort in design optimization of fluid dynamics has relied on gradient-based search algorithms. Global optimization methods can utilize the information collected from various sources and by different tools. These methods offer multi-criterion optimization, handle the existence of multiple design points and trade-offs via insight into the entire design space, can easily perform tasks in parallel, and are often effective in filtering the noise intrinsic to numerical and experimental data. However, a successful application of the global optimization method needs to address issues related to data requirements with an increase in the number of design variables, and methods for predicting the model performance. In this article, we review recent progress made in establishing suitable global optimization techniques employing neural network and polynomial-based response surface methodologies. Issues addressed include techniques for construction of the response surface, design of experiment techniques for supplying information in an economical manner, optimization procedures and multi-level techniques, and assessment of relative performance between polynomials and neural networks. Examples drawn from wing aerodynamics, turbulent diffuser flows, gas-gas injectors, and supersonic turbines are employed to help demonstrate the issues involved in an engineering design context. Both the usefulness of the existing knowledge to aid current design practices and the need for future research are identified.
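
    As a minimal illustration of the response-surface idea discussed above, the Python sketch below fits a quadratic polynomial to a handful of sampled design points and optimizes the cheap surrogate instead of the expensive evaluation; the one-variable objective is a hypothetical stand-in for a CFD or experimental response.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def expensive_objective(x):
            """Stand-in for a costly simulation or experiment (hypothetical)."""
            return (x - 0.3) ** 2 + 0.05 * np.sin(8 * x)

        # Design of experiments: a few sample points across the design range
        x_doe = np.linspace(0.0, 1.0, 7)
        y_doe = expensive_objective(x_doe)

        # Quadratic polynomial response surface fitted by least squares
        surrogate = np.poly1d(np.polyfit(x_doe, y_doe, deg=2))

        # Optimize the surrogate rather than the expensive model
        res = minimize_scalar(surrogate, bounds=(0.0, 1.0), method="bounded")
        print("surrogate optimum at x = %.3f, true objective there = %.4f"
              % (res.x, expensive_objective(res.x)))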

  2. Alternative oil extraction methods from Echium plantagineum L. seeds using advanced techniques and green solvents.

    PubMed

    Castejón, Natalia; Luna, Pilar; Señoráns, Francisco J

    2018-04-01

    The edible oil processing industry involves large losses of organic solvent to the atmosphere and long extraction times. In this work, fast and environmentally friendly alternatives for the production of echium oil using green solvents are proposed. Advanced extraction techniques such as Pressurized Liquid Extraction (PLE), Microwave Assisted Extraction (MAE) and Ultrasound Assisted Extraction (UAE) were evaluated to efficiently extract omega-3-rich oil from Echium plantagineum seeds. Extractions were performed with ethyl acetate, ethanol, water and ethanol:water to develop a hexane-free processing method. Optimal PLE conditions with ethanol at 150 °C for 10 min produced an oil yield (31.2%) very similar to that of Soxhlet extraction using hexane for 8 h (31.3%). The optimized UAE method with ethanol under mild conditions (55 °C) produced a high oil yield (29.1%). Consequently, the advanced extraction techniques showed good lipid yields and, furthermore, the echium oil produced had the same omega-3 fatty acid composition as traditionally extracted oil. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. In silico modeling techniques for predicting the tertiary structure of human H4 receptor.

    PubMed

    Zaid, Hilal; Raiyn, Jamal; Osman, Midhat; Falah, Mizied; Srouji, Samer; Rayan, Anwar

    2016-01-01

    First cloned in 2000, the human Histamine H4 Receptor (hH4R) is the last member of the histamine receptor family discovered so far. It belongs to the GPCR super-family and is involved in a wide variety of immunological and inflammatory responses. Potential hH4R antagonists are proposed to have therapeutic potential for the treatment of allergies, inflammation, asthma and colitis. So far, no hH4R ligands have been successfully introduced to the pharmaceutical market, which creates a strong demand for new selective ligands to be developed. In silico techniques and structure-based modeling are likely to facilitate the achievement of this goal. In this review paper we attempt to cover the fundamental concepts of hH4R structure modeling and its implementations in drug discovery and development, especially those that have been experimentally tested, and to highlight some ideas currently being discussed on the dynamic nature of hH4R and GPCRs with regard to computational techniques for 3-D structure modeling.

  4. Automated radial basis function neural network based image classification system for diabetic retinopathy detection in retinal images

    NASA Astrophysics Data System (ADS)

    Anitha, J.; Vijila, C. Kezi Selva; Hemanth, D. Jude

    2010-02-01

    Diabetic retinopathy (DR) is a chronic eye disease for which early detection is highly essential to avoid fatal results. Image processing of retinal images emerges as a feasible tool for this early diagnosis. Digital image processing techniques involve image classification, which is a significant technique for detecting abnormality in the eye. Various automated classification systems have been developed in recent years, but most of them lack high classification accuracy. Artificial neural networks are the widely preferred artificial intelligence technique since they yield superior results in terms of classification accuracy. In this work, a Radial Basis Function (RBF) neural network based bi-level classification system is proposed to differentiate abnormal DR images and normal retinal images. The results are analyzed in terms of classification accuracy, sensitivity and specificity. A comparative analysis is performed with the results of a probabilistic classifier, namely the Bayesian classifier, to show the superior nature of the neural classifier. Experimental results show promising results for the neural classifier in terms of the performance measures.
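
    A minimal RBF-network classifier along these general lines is sketched below in Python: Gaussian hidden units centred on k-means prototypes and output weights obtained by regularized least squares. The synthetic features, the number of centres and the width heuristic are assumptions for illustration and not the network configuration used in the paper.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.datasets import make_classification

        X, y = make_classification(n_samples=400, n_features=8, random_state=1)

        # Hidden layer: Gaussian units centred on k-means prototypes
        centers = KMeans(n_clusters=20, n_init=10, random_state=1).fit(X).cluster_centers_
        width = np.mean(np.linalg.norm(X[:, None, :] - centers[None], axis=2))

        def activations(X):
            d2 = ((X[:, None, :] - centers[None]) ** 2).sum(axis=2)
            return np.exp(-d2 / (2 * width ** 2))

        # Output weights by regularized least squares on +/-1 targets
        H = activations(X)
        T = np.where(y == 1, 1.0, -1.0)
        W = np.linalg.solve(H.T @ H + 1e-3 * np.eye(H.shape[1]), H.T @ T)

        pred = (activations(X) @ W > 0).astype(int)
        print("training accuracy:", (pred == y).mean())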

  5. Systematic development and optimization of chemically defined medium supporting high cell density growth of Bacillus coagulans.

    PubMed

    Chen, Yu; Dong, Fengqing; Wang, Yonghong

    2016-09-01

    With determined components and experimental reproducibility, the chemically defined medium (CDM) and the minimal chemically defined medium (MCDM) are used in many metabolism and regulation studies. This research aimed to develop a chemically defined medium supporting high cell density growth of Bacillus coagulans, which is a promising producer of lactic acid and other bio-chemicals. In this study, a systematic methodology combining experimental techniques with flux balance analysis (FBA) was proposed to design and simplify a CDM. The single-omission technique and single-addition technique were employed to determine the essential and stimulatory compounds before the optimization of their concentrations by statistical methods. In addition, to improve growth rationally, in silico omission and addition were performed by FBA based on the construction of a medium-size metabolic model of B. coagulans 36D1. Thus, CDMs were developed that support considerable biomass production of at least five B. coagulans strains, including the two model strains B. coagulans 36D1 and ATCC 7050.
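
    Flux balance analysis itself reduces to a linear program; the Python sketch below solves a toy, entirely hypothetical three-reaction network (uptake, biomass, maintenance) by maximizing the biomass flux subject to steady-state mass balance and flux bounds, which conveys the computation without reproducing the B. coagulans 36D1 model.

        import numpy as np
        from scipy.optimize import linprog

        # Toy stoichiometric matrix (1 metabolite x 3 reactions):
        #   R1: uptake -> A,  R2: A -> biomass,  R3: A -> maintenance
        S = np.array([[1.0, -1.0, -1.0]])
        bounds = [(0, 10), (0, None), (1, 1)]   # uptake limit, free biomass, fixed maintenance
        c = [0.0, -1.0, 0.0]                    # maximize biomass flux v2 (linprog minimizes)

        res = linprog(c, A_eq=S, b_eq=[0.0], bounds=bounds, method="highs")
        print("optimal flux distribution:", res.x)   # expected roughly [10, 9, 1]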

  6. Fast algorithm for spectral processing with application to on-line welding quality assurance

    NASA Astrophysics Data System (ADS)

    Mirapeix, J.; Cobo, A.; Jaúregui, C.; López-Higuera, J. M.

    2006-10-01

    A new technique is presented in this paper for the analysis of welding process emission spectra to accurately estimate in real-time the plasma electronic temperature. The estimation of the electronic temperature of the plasma, through the analysis of the emission lines from multiple atomic species, may be used to monitor possible perturbations during the welding process. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, sub-pixel algorithms are used to more accurately estimate the central wavelength of the peaks. Three different sub-pixel algorithms will be analysed and compared, and it will be shown that the LPO (linear phase operator) sub-pixel algorithm is a better solution within the proposed system. Experimental tests during TIG-welding using a fibre optic to capture the arc light, together with a low cost CCD-based spectrometer, show that some typical defects associated with perturbations in the electron temperature can be easily detected and identified with this technique. A typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
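
    The temperature-estimation step referred to above is commonly done with a Boltzmann plot, where ln(I*lambda/(g*A)) is linear in the upper-level energy with slope -1/(k_B*T_e); the Python sketch below fits that line to hypothetical line data and is only a generic illustration of the computation, not the paper's processing chain.

        import numpy as np

        k_B = 8.617e-5                      # Boltzmann constant in eV/K

        # Hypothetical emission lines: intensity, wavelength (nm), g*A (1/s), upper-level energy (eV)
        I   = np.array([1200.0, 800.0, 350.0, 150.0])
        lam = np.array([516.0, 521.0, 510.0, 515.0])
        gA  = np.array([6.0e8, 5.5e8, 4.0e8, 3.5e8])
        E_k = np.array([3.8, 4.2, 4.9, 5.4])

        # Boltzmann plot: ln(I*lambda/(g*A)) versus E_k has slope -1/(k_B*T_e)
        yvals = np.log(I * lam / gA)
        slope, intercept = np.polyfit(E_k, yvals, 1)
        T_e = -1.0 / (k_B * slope)
        print("estimated electron temperature: %.0f K" % T_e)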

  7. Two-dimensional fuzzy fault tree analysis for chlorine release from a chlor-alkali industry using expert elicitation.

    PubMed

    Renjith, V R; Madhu, G; Nayagam, V Lakshmana Gomathi; Bhasi, A B

    2010-11-15

    The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas releases. Of these, toxic gas release is the worst, as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of these hazards related to chemical industries. Fault tree analysis (FTA) is an established technique in hazard identification. This technique has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. This paper outlines the estimation of the probability of release of chlorine from the storage and filling facility of a chlor-alkali industry using FTA. An attempt has also been made to arrive at the probability of chlorine release using expert elicitation and a proven fuzzy logic technique for Indian conditions. Sensitivity analysis has been done to evaluate the percentage contribution of each basic event that could lead to chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation. Copyright © 2010 Elsevier B.V. All rights reserved.
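
    For orientation, a crisp (non-fuzzy) fault-tree evaluation with hypothetical basic-event probabilities is sketched below in Python; the proposed TDFFTA replaces such point values with fuzzy numbers derived from expert elicitation, so this is only a baseline illustration of the gate arithmetic and the sensitivity idea.

        def or_gate(*p):   # at least one of the independent input events occurs
            q = 1.0
            for pi in p:
                q *= (1.0 - pi)
            return 1.0 - q

        def and_gate(*p):  # all independent input events occur
            prod = 1.0
            for pi in p:
                prod *= pi
            return prod

        # Hypothetical basic-event probabilities for a chlorine release scenario
        base = dict(valve_fail=1e-3, sensor_fail=5e-4, operator_error=2e-3, hose_rupture=1e-4)

        def top_event(p):
            # Top event: (valve fails AND alarm sensor fails) OR operator error OR hose rupture
            return or_gate(and_gate(p["valve_fail"], p["sensor_fail"]),
                           p["operator_error"], p["hose_rupture"])

        top = top_event(base)
        print("P(chlorine release) = %.2e" % top)

        # Crude sensitivity: contribution of each basic event to the top-event probability
        for name in base:
            reduced = dict(base, **{name: 0.0})
            print("%-15s %.1f%%" % (name, 100 * (top - top_event(reduced)) / top))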

  8. Methodologies for extraction of dissolved inorganic carbon for stable carbon isotope studies : evaluation and alternatives

    USGS Publications Warehouse

    Hassan, Afifa Afifi

    1982-01-01

    The gas evolution and the strontium carbonate precipitation techniques to extract dissolved inorganic carbon (DIC) for stable carbon isotope analysis were investigated. Theoretical considerations, involving thermodynamic calculations and computer simulation, pointed out several possible sources of error in delta carbon-13 measurements of the DIC and demonstrated the need for experimental evaluation of the magnitude of the error. An alternative analytical technique, equilibration with an out-gassed vapor phase, is proposed. The experimental studies on a 0.01 molar NaHCO3 solution revealed agreement within 0.1 per mil in the delta carbon-13 of the DIC extracted by the precipitation technique, and an increase of only 0.27 per mil in that extracted by the gas evolution technique. The efficiency of extraction of DIC decreased with sulfate concentration in the precipitation technique but was independent of sulfate concentration in the gas evolution technique. Both the precipitation and gas evolution techniques were found to be satisfactory for extraction of DIC from different kinds of natural water for stable carbon isotope analysis, provided appropriate precautions are observed in handling the samples. For example, it was found that diffusion of atmospheric carbon dioxide does alter the delta carbon-13 of samples contained in polyethylene bottles; filtration and drying in the air change the delta carbon-13 in the precipitation technique; and hot manganese dioxide purification changes the delta carbon-13 of carbon dioxide. (USGS)

  9. Surrogate assisted multidisciplinary design optimization for an all-electric GEO satellite

    NASA Astrophysics Data System (ADS)

    Shi, Renhe; Liu, Li; Long, Teng; Liu, Jian; Yuan, Bin

    2017-09-01

    State-of-the-art all-electric geostationary earth orbit (GEO) satellites use electric thrusters to execute all propulsive duties, which significantly differ from the traditional all-chemical ones in orbit-raising, station-keeping, radiation damage protection, and power budget, etc. The design optimization task of an all-electric GEO satellite is therefore a complex multidisciplinary design optimization (MDO) problem involving unique design considerations. However, solving the all-electric GEO satellite MDO problem faces big challenges in disciplinary modeling techniques and efficient optimization strategy. To address these challenges, we present a surrogate assisted MDO framework consisting of several modules, i.e., MDO problem definition, multidisciplinary modeling, multidisciplinary analysis (MDA), and a surrogate assisted optimizer. Based on the proposed framework, the all-electric GEO satellite MDO problem is formulated to minimize the total mass of the satellite system under a number of practical constraints. Then considerable effort is spent on multidisciplinary modeling involving the geosynchronous transfer, GEO station-keeping, power, thermal control, attitude control, and structure disciplines. Since the orbit dynamics models and the finite element structural model are computationally expensive, an adaptive response surface surrogate based optimizer is incorporated in the proposed framework to solve the satellite MDO problem with moderate computational cost, where a response surface surrogate is gradually refined to represent the computationally expensive MDA process. After optimization, the total mass of the studied GEO satellite is decreased by 185.3 kg (i.e., 7.3% of the total mass). Finally, the optimal design is further discussed to demonstrate the effectiveness of the proposed framework in coping with all-electric GEO satellite system design optimization problems. The proposed surrogate assisted MDO framework can also provide valuable references for the design of other all-electric spacecraft systems.

  10. 2D DOST based local phase pattern for face recognition

    NASA Astrophysics Data System (ADS)

    Moniruzzaman, Md.; Alam, Mohammad S.

    2017-05-01

    A new two-dimensional (2-D) Discrete Orthogonal Stockwell Transform (DOST) based Local Phase Pattern (LPP) technique has been proposed for efficient face recognition. The proposed technique uses the 2-D DOST as preliminary preprocessing and the local phase pattern to form a robust feature signature which can effectively accommodate various 3D facial distortions and illumination variations. The S-transform, an extension of the ideas of the continuous wavelet transform (CWT), is also known for its local spectral phase properties in time-frequency representation (TFR). It provides a frequency-dependent resolution of the time-frequency space and absolutely referenced local phase information while maintaining a direct relationship with the Fourier spectrum, which is unique in TFR. Utilizing the 2-D S-transform as the preprocessing step and building the local phase pattern from the extracted phase information yields a fast and efficient technique for face recognition. The proposed technique shows better correlation discrimination compared to alternative pattern recognition techniques such as wavelet- or Gabor-based face recognition. The performance of the proposed method has been tested using the Yale and extended Yale facial databases under different environments such as illumination variation and 3D changes in facial expressions. Test results show that the proposed technique yields better performance compared to alternative time-frequency representation (TFR) based face recognition techniques.

  11. Scene-based nonuniformity correction technique that exploits knowledge of the focal-plane array readout architecture.

    PubMed

    Narayanan, Balaji; Hardie, Russell C; Muse, Robert A

    2005-06-10

    Spatial fixed-pattern noise is a common and major problem in modern infrared imagers owing to the nonuniform response of the photodiodes in the focal plane array of the imaging system. In addition, the nonuniform response of the readout and digitization electronics, which are involved in multiplexing the signals from the photodiodes, causes further nonuniformity. We describe a novel scene-based nonuniformity correction algorithm that treats the aggregate nonuniformity in separate stages. First, the nonuniformity from the readout amplifiers is corrected by use of knowledge of the readout architecture of the imaging system. Second, the nonuniformity resulting from the individual detectors is corrected with a nonlinear filter-based method. We demonstrate the performance of the proposed algorithm by applying it to simulated imagery and real infrared data. Quantitative results in terms of the mean absolute error and the signal-to-noise ratio are also presented to demonstrate the efficacy of the proposed algorithm. One advantage of the proposed algorithm is that it requires only a few frames to obtain high-quality corrections.

  12. An Improved TA-SVM Method Without Matrix Inversion and Its Fast Implementation for Nonstationary Datasets.

    PubMed

    Shi, Yingzhong; Chung, Fu-Lai; Wang, Shitong

    2015-09-01

    Recently, a time-adaptive support vector machine (TA-SVM) was proposed for handling nonstationary datasets. While attractive performance has been reported, and the new classifier is distinctive in simultaneously solving several SVM subclassifiers locally and globally by using an elegant SVM formulation in an alternative kernel space, the coupling of subclassifiers brings in the computation of a matrix inversion, so the method suffers from a high computational burden in large nonstationary dataset applications. To overcome this shortcoming, an improved TA-SVM (ITA-SVM) is proposed using a common vector shared by all the SVM subclassifiers involved. ITA-SVM not only keeps an SVM formulation, but also avoids the computation of matrix inversion. Thus, we can realize its fast version, the improved time-adaptive core vector machine (ITA-CVM), for large nonstationary datasets by using the CVM technique. ITA-CVM has the merit of asymptotic linear time complexity for large nonstationary datasets and inherits the advantages of TA-SVM. The effectiveness of the proposed classifiers ITA-SVM and ITA-CVM is also experimentally confirmed.

  13. Single-shot real-time three dimensional measurement based on hue-height mapping

    NASA Astrophysics Data System (ADS)

    Wan, Yingying; Cao, Yiping; Chen, Cheng; Fu, Guangkai; Wang, Yapin; Li, Chengmeng

    2018-06-01

    A single-shot three-dimensional (3D) measurement method based on hue-height mapping is proposed. The color fringe pattern is encoded by three sinusoidal fringes with the same frequency but different phase shifts placed in the red (R), green (G) and blue (B) color channels, respectively. It is found that the hue of the captured color fringe pattern on the reference plane remains monotonic within one period even in the presence of color crosstalk. Thus, unlike the traditional color phase shifting technique, the proposed method uses the hue information to decode the color fringe pattern and map it to the fringe displacement in pixels. Because the monotonicity of the hue is limited to one period, displacement unwrapping is proposed to obtain the continuous displacement, which is finally mapped to the height distribution. The method directly utilizes the hue under the effect of color crosstalk for mapping the height, so no color calibration is involved. Also, as it requires only a single shot of the deformed color fringe pattern, the method can be applied to real-time or dynamic 3D measurements.
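
    A minimal sketch of the hue-based decoding step (illustrative; the use of rgb_to_hsv, the reference hue, the fringe period and the linear height calibration factor are assumptions, not details from the paper; color_fringe is an H x W x 3 array with values in [0, 1]):

      import numpy as np
      from matplotlib.colors import rgb_to_hsv

      def hue_displacement(color_fringe, reference_hue, period_px):
          # hue of the captured color fringe pattern, in [0, 1)
          hue = rgb_to_hsv(color_fringe)[..., 0]
          # wrapped displacement (fraction of one fringe period) relative to the reference plane
          wrapped = (hue - reference_hue) % 1.0
          # unwrap along each row so the displacement stays continuous beyond one period
          unwrapped = np.unwrap(wrapped * 2 * np.pi, axis=1) / (2 * np.pi)
          return unwrapped * period_px              # displacement in pixels

      def displacement_to_height(displacement_px, k_mm_per_px):
          # hypothetical linear displacement-to-height calibration
          return k_mm_per_px * displacement_px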

  14. A trace ratio maximization approach to multiple kernel-based dimensionality reduction.

    PubMed

    Jiang, Wenhao; Chung, Fu-lai

    2014-01-01

    Most dimensionality reduction techniques are based on one metric or one kernel; hence it is necessary to select an appropriate kernel for kernel-based dimensionality reduction. Multiple kernel learning for dimensionality reduction (MKL-DR) has recently been proposed to learn a kernel from a set of base kernels, which are seen as different descriptions of the data. As MKL-DR does not involve regularization, it might be ill-posed under some conditions and consequently its applications are hindered. This paper proposes a multiple kernel learning framework for dimensionality reduction based on a regularized trace ratio, termed MKL-TR. Our method aims at learning a transformation into a space of lower dimension and a corresponding kernel from the given base kernels, among which some may not be suitable for the given data. The solutions for the proposed framework can be found based on trace ratio maximization. The experimental results demonstrate its effectiveness on benchmark datasets, which include text, image and sound datasets, for supervised, unsupervised as well as semi-supervised settings. Copyright © 2013 Elsevier Ltd. All rights reserved.
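
    The trace ratio maximization underlying such methods can be solved with the standard eigen-decomposition based iteration; a minimal sketch, assuming A and B are the (regularized) between- and within-class scatter matrices built in the learned kernel space:

      import numpy as np

      def trace_ratio(A, B, d, n_iter=50, tol=1e-8):
          # Maximize tr(W.T A W) / tr(W.T B W) over orthonormal W (n x d):
          # repeatedly take the top-d eigenvectors of A - lam * B, then update lam.
          n = A.shape[0]
          lam = 0.0
          W = np.eye(n)[:, :d]
          for _ in range(n_iter):
              _, vecs = np.linalg.eigh(A - lam * B)
              W = vecs[:, -d:]                      # eigenvectors of the d largest eigenvalues
              new_lam = np.trace(W.T @ A @ W) / np.trace(W.T @ B @ W)
              if abs(new_lam - lam) < tol:
                  break
              lam = new_lam
          return W, lam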

  15. Based on interval type-2 fuzzy-neural network direct adaptive sliding mode control for SISO nonlinear systems

    NASA Astrophysics Data System (ADS)

    Lin, Tsung-Chih

    2010-12-01

    In this paper, a novel direct adaptive interval type-2 fuzzy-neural tracking controller equipped with sliding mode and a Lyapunov synthesis approach is proposed to handle training data corrupted by noise or rule uncertainties for SISO nonlinear systems involving external disturbances. By employing adaptive fuzzy-neural control theory, update laws are derived for approximating the uncertain nonlinear dynamical system. Meanwhile, the sliding mode control method and the Lyapunov stability criterion are incorporated into the adaptive fuzzy-neural control scheme so that the derived controller is robust with respect to unmodeled dynamics, external disturbances and approximation errors. In comparison with conventional methods, the advocated approach not only guarantees closed-loop stability but also ensures that the output tracking error of the overall system converges to zero asymptotically without prior knowledge of the upper bound of the lumped uncertainty. Furthermore, the chattering effect of the control input is substantially reduced by the proposed technique. Finally, a simulation example is given to illustrate the performance of the proposed method.

  16. A proposal for measuring the degree of public health–sensitivity of patent legislation in the context of the WTO TRIPS Agreement

    PubMed Central

    Chaves, Gabriela Costa

    2007-01-01

    Abstract Objective This study aims to propose a framework for measuring the degree of public health-sensitivity of patent legislation reformed after the World Trade Organization’s TRIPS (Trade-Related Aspects of Intellectual Property Rights) Agreement entered into force. Methods The methodology for establishing and testing the proposed framework involved three main steps: (1) a literature review on TRIPS flexibilities related to the protection of public health and provisions considered “TRIPS-plus”; (2) content validation through consensus techniques (an adaptation of the Delphi method); and (3) an analysis of patent legislation from nineteen Latin American and Caribbean countries. Findings The results show that the framework detected relevant differences in countries’ patent legislation, allowing for country comparisons. Conclusion The framework’s potential usefulness in monitoring patent legislation changes arises from its clear parameters for measuring patent legislation’s degree of health sensitivity. Nevertheless, it can be improved by including indicators related to government and organized society initiatives that minimize free-trade agreements’ negative effects on access to medicines. PMID:17242758

  17. Securing mobile code.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas

    2004-10-01

    If software is designed so that it can issue functions that move it from one computing platform to another, the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinion regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates in decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program, or a data segment on which a program depends, incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and its execution. The field of obfuscation is still in a state of development, with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in-depth analysis of a technique called 'white-boxing'. We put forth some new attacks and improvements on this method as well as demonstrating its implementation for various algorithms. We also examine cryptographic techniques to achieve obfuscation, including encrypted functions, and offer a new application to digital signature algorithms. To better understand the lack of security proofs for obfuscation techniques, we examine in detail general theoretical models of obfuscation. We explain the need for formal models in order to obtain provable security and the progress made in this direction thus far. Finally we tackle the problem of verifying remote execution. We introduce some methods of verifying remote exponentiation computations and offer some insight into generic computation checking.

  18. A new technique for solving puzzles.

    PubMed

    Makridis, Michael; Papamarkos, Nikos

    2010-06-01

    This paper proposes a new technique for solving jigsaw puzzles. The novelty of the proposed technique is that it provides an automatic jigsaw puzzle solution without any initial restriction about the shape of pieces, the number of neighbor pieces, etc. The proposed technique uses both curve- and color-matching similarity features. A recurrent procedure is applied, which compares and merges puzzle pieces in pairs, until the original puzzle image is reformed. Geometrical and color features are extracted on the characteristic points (CPs) of the puzzle pieces. CPs, which can be considered as high curvature points, are detected by a rotationally invariant corner detection algorithm. The features which are associated with color are provided by applying a color reduction technique using the Kohonen self-organized feature map. Finally, a postprocessing stage checks and corrects the relative position between puzzle pieces to improve the quality of the resulting image. Experimental results prove the efficiency of the proposed technique, which can be further extended to deal with even more complex jigsaw puzzle problems.

  19. Neuroimaging studies of the striatum in cognition Part I: healthy individuals

    PubMed Central

    Provost, Jean-Sebastien; Hanganu, Alexandru; Monchi, Oury

    2015-01-01

    The striatum has traditionally been associated mainly with a key role in the modulation of motor functions. Indeed, lesion studies in animals and studies of some neurological conditions in humans have brought further evidence to this idea. However, better methods of investigation have raised concerns about this notion, and it was proposed that the striatum could also be involved in different types of functions, including cognitive ones. Although the notion was originally a matter of debate, it is now well accepted that the caudate nucleus contributes to cognition, while the putamen could be involved in motor functions, and to some extent in cognitive functions as well. With the arrival of modern neuroimaging techniques in the early 1990s, knowledge supporting the cognitive role of the striatum has greatly increased, and a substantial number of scientific papers have been published studying the role of the striatum in healthy individuals. For the first time, it was possible to assess the contribution of specific areas of the brain during the execution of a cognitive task. Neuroanatomical studies have described functional loops involving the striatum and the prefrontal cortex, suggesting a specific interaction between these two structures. This review examines the data available to date and provides strong evidence for a specific contribution of the fronto-striatal regions to different cognitive processes, such as set-shifting, self-initiated responses, rule learning, action-contingency, and planning. Finally, a new two-level functional model involving the prefrontal cortex and the dorsal striatum is proposed, suggesting an essential role of the dorsal striatum in selecting between competing potential responses or actions, and in resolving a high level of ambiguity. PMID:26500513

  20. Neuroimaging studies of the striatum in cognition Part I: healthy individuals.

    PubMed

    Provost, Jean-Sebastien; Hanganu, Alexandru; Monchi, Oury

    2015-01-01

    The striatum has traditionally been associated mainly with a key role in the modulation of motor functions. Indeed, lesion studies in animals and studies of some neurological conditions in humans have brought further evidence to this idea. However, better methods of investigation have raised concerns about this notion, and it was proposed that the striatum could also be involved in different types of functions, including cognitive ones. Although the notion was originally a matter of debate, it is now well accepted that the caudate nucleus contributes to cognition, while the putamen could be involved in motor functions, and to some extent in cognitive functions as well. With the arrival of modern neuroimaging techniques in the early 1990s, knowledge supporting the cognitive role of the striatum has greatly increased, and a substantial number of scientific papers have been published studying the role of the striatum in healthy individuals. For the first time, it was possible to assess the contribution of specific areas of the brain during the execution of a cognitive task. Neuroanatomical studies have described functional loops involving the striatum and the prefrontal cortex, suggesting a specific interaction between these two structures. This review examines the data available to date and provides strong evidence for a specific contribution of the fronto-striatal regions to different cognitive processes, such as set-shifting, self-initiated responses, rule learning, action-contingency, and planning. Finally, a new two-level functional model involving the prefrontal cortex and the dorsal striatum is proposed, suggesting an essential role of the dorsal striatum in selecting between competing potential responses or actions, and in resolving a high level of ambiguity.

  1. Searching for consensus among physicians involved in the management of sick-listed workers in the Belgian health care sector: a qualitative study among practitioners and stakeholders.

    PubMed

    Vanmeerbeek, Marc; Govers, Patrick; Schippers, Nathalie; Rieppi, Stéphane; Mortelmans, Katrien; Mairiaux, Philippe

    2016-02-17

    In Belgium, the management of sick leave involves general practitioners (GPs), occupational health physicians (OPs) and social insurance physicians (SIPs). A dysfunctional relationship among these physicians can impede a patient's ability to return to work. The objective of this study was to identify ways to improve these physicians' mutual collaboration. Two consensus techniques were successively performed among the three professional groups. Eight nominal groups (NGs) gathered 74 field practitioners, and a two-round Delphi process involved 32 stakeholders. From the results, it appears that two areas (reciprocal knowledge and evolution of the legal and regulatory framework) are objects of consensus among the three medical groups that were surveyed. Information transfer, particularly electronic transfer, was stressed as an important avenue for improvement. The consensual proposals regarding interdisciplinary collaboration indicate specific and practical changes to be implemented when professionals are managing workers who are on sick leave. The collaboration process itself appeared to be more problematic at present, but the participants correctly identified the need for common training. The three physician groups all agree on several inter-physician collaboration proposals. The study also revealed a latent conflict situation among the analysed professionals that can arise from a lack of mutual recognition. Practical changes or improvements must be included in an extended framework that involves the different determinants of interdisciplinary collaboration that are highlighted by theoretical models. Collaboration is a product of the actions and behaviours of various partners, which requires reciprocal knowledge and trust; collaboration also implies political and economic structures that are led by public health authorities.

  2. Geant4 Developments for the Radon Electric Dipole Moment Search at TRIUMF

    NASA Astrophysics Data System (ADS)

    Rand, E. T.; Bangay, J. C.; Bianco, L.; Dunlop, R.; Finlay, P.; Garrett, P. E.; Leach, K. G.; Phillips, A. A.; Sumithrarachchi, C. S.; Svensson, C. E.; Wong, J.

    2011-09-01

    An experiment is being developed at TRIUMF to search for a time-reversal violating electric dipole moment (EDM) in odd-A isotopes of Rn. Extensive simulations of the experiment are being performed with GEANT4 to study the backgrounds and sensitivity of the proposed measurement technique involving the detection of γ rays emitted following the β decay of polarized Rn nuclei. GEANT4 developments for the RnEDM experiment include both realistic modelling of the detector geometry and full tracking of the radioactive β, γ, internal conversion, and x-ray processes, including the γ-ray angular distributions essential for measuring an atomic EDM.

  3. Examination of charge transfer in Au/YSZ for high-temperature optical gas sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baltrus, John P.; Ohodnicki, Paul R.

    2014-01-01

    Au-nanoparticle incorporated oxide thin film materials demonstrate significant promise as functional sensor materials for high temperature optical gas sensing in severe environments relevant for fossil and nuclear based power generation. The Au/yttria-stabilized zirconia (YSZ) system has been extensively studied in the literature and serves as a model system for fundamental investigations that seek to better understand the mechanistic origin of the plasmonic gas sensing response. In this work, X-ray photoelectron spectroscopy techniques are applied to Au/YSZ films in an attempt to provide further experimental evidence for a proposed sensing mechanism involving a change in free carrier density of Au nanoparticles due to charge transfer.

  4. [Calcium in the developing skeletal muscles of the chick embryo].

    PubMed

    Samosudova, N V; Enenko, S O; Larin, Iu S; Shungskaia, V E

    1982-07-01

    The osmium-pyroantimonate technique was used for the ultrastructural study of Ca2+ localization in two types of chick embryo skeletal muscles: m. pectoralis and m. soleus. In 8- and 12-day-old embryos the pyroantimonate precipitate was found on the plasmalemma, condensed chromatin and ribosomes, and in the N-lines of the I-band. During myogenesis (15- and 21-day-old embryos) the calcium precipitate is redistributed from the above-mentioned sites to the terminal cisternae and the N-line of the I-band. It is proposed that the calcium of the N-lines may be involved in glycogenolysis, its association with muscle contraction occurring particularly at early developmental stages.

  5. Fiber Optic Sensor Components and Systems for Smart Materials and Structures

    NASA Technical Reports Server (NTRS)

    Lyons, R.

    1999-01-01

    The general objective of the funded research effort has been the development of discrete and distributed fiber sensors and fiber optic centered opto-electronic networks for the intelligent monitoring of phenomena in various aerospace structures related to NASA Marshall specific applications. In particular, we have proposed and have been developing technologies that we believe to be readily transferable and which involve new fabrication techniques. The sensors developed can be incorporated into the matrix or on the surfaces of structures for the purpose of sensing stress, strain, temperature (both low and high), pressure field variations, phase changes, and the presence of various chemical constituents.

  6. Music, Mechanism, and the “Sonic Turn” in Physical Diagnosis

    PubMed Central

    Pesic, Peter

    2016-01-01

    The sonic diagnostic techniques of percussion and mediate auscultation advocated by Leopold von Auenbrugger and R. T. H. Laennec developed within larger musical contexts of practice, notation, and epistemology. Earlier, François-Nicolas Marquet proposed a musical notation of pulse that connected felt pulsation with heard music. Though contemporary vitalists rejected Marquet's work, mechanists such as Albrecht von Haller included it into the larger discourse about the physiological manifestations of bodily fluids and fibers. Educated in that mechanistic physiology, Auenbrugger used musical vocabulary to present his work on thoracic percussion; Laennec's musical experience shaped his exploration of the new timbres involved in mediate auscultation. PMID:26349757

  7. A laser technique for characterizing the geometry of plant canopies

    NASA Technical Reports Server (NTRS)

    Vanderbilt, V. C.; Silva, L. F.; Bauer, M. E.

    1977-01-01

    The interception of solar power by the canopy is investigated as a function of solar zenith angle (time), component of the canopy, and depth into the canopy. The projected foliage area, cumulative leaf area, and view factors within the canopy are examined as a function of the same parameters. Two systems are proposed that are capable of describing the geometrical aspects of a vegetative canopy and of operation in an automatic mode. Either system would provide sufficient data to yield a numerical map of the foliage area in the canopy. Both systems would involve the collection of large data sets in a short time period using minimal manpower.

  8. APPLICATION OF SPATIAL INFORMATION TECHNOLOGY TO PETROLEUM RESOURCE ASSESSMENT ANALYSIS.

    USGS Publications Warehouse

    Miller, Betty M.; Domaratz, Michael A.

    1984-01-01

    Petroleum resource assessment procedures require the analysis of a large volume of spatial data. The US Geological Survey (USGS) has developed and applied spatial information handling procedures and digital cartographic techniques to a recent study involving the assessment of oil and gas resource potential for 74 million acres of designated and proposed wilderness lands in the western United States. The part of the study which dealt with the application of spatial information technology to petroleum resource assessment procedures is reviewed. A method was designed to expedite the gathering, integrating, managing, manipulating and plotting of spatial data from multiple data sources that are essential in modern resource assessment procedures.

  9. A new neural network model for solving random interval linear programming problems.

    PubMed

    Arjmandzadeh, Ziba; Safi, Mohammadreza; Nazemi, Alireza

    2017-05-01

    This paper presents a neural network model for solving random interval linear programming problems. The original problem involving random interval variable coefficients is first transformed into an equivalent convex second order cone programming problem. A neural network model is then constructed for solving the obtained convex second order cone problem. Employing a Lyapunov function approach, it is also shown that the proposed neural network model is stable in the sense of Lyapunov and is globally convergent to an exact satisfactory solution of the original problem. Several illustrative examples are solved in support of this technique. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Speaker emotion recognition: from classical classifiers to deep neural networks

    NASA Astrophysics Data System (ADS)

    Mezghani, Eya; Charfeddine, Maha; Nicolas, Henri; Ben Amar, Chokri

    2018-04-01

    Speaker emotion recognition has been considered among the most challenging tasks in recent years. In fact, automatic systems for security, medicine or education can be improved when the affective state of speech is considered. In this paper, a twofold approach to speech emotion classification is proposed. First, a relevant set of features is adopted; second, numerous supervised training techniques, involving classic methods as well as deep learning, are evaluated experimentally. Experimental results indicate that deep architectures can improve classification performance on two affective databases, the Berlin Dataset of Emotional Speech and the SAVEE (Surrey Audio-Visual Expressed Emotion) dataset.

  11. Detection of OH on photolysis of styrene oxide at 193 nm in gas phase

    NASA Astrophysics Data System (ADS)

    Kumar, Awadhesh; SenGupta, Sumana; Pushpa, K. K.; Naik, P. D.; Bajaj, P. N.

    2006-10-01

    Photodissociation of styrene oxide at 193 nm in the gas phase generates OH, as detected by the laser-induced fluorescence technique. Under similar conditions, OH was not observed from ethylene and propylene oxides, primarily because of their low absorption cross-sections at 193 nm. The mechanism of OH formation involves first the opening of the three-membered ring on the ground electronic state via cleavage of either of the two C–O bonds, followed by isomerization to the enolic forms of phenylacetaldehyde and acetophenone, and finally scission of the C–OH bond of the enols. Ab initio molecular orbital calculations support the proposed mechanism.

  12. A Gradient-Field Pulsed Eddy Current Probe for Evaluation of Hidden Material Degradation in Conductive Structures Based on Lift-Off Invariance

    PubMed Central

    Li, Yong; Jing, Haoqing; Zainal Abidin, Ilham Mukriz; Yan, Bei

    2017-01-01

    Coated conductive structures are widely adopted in engineering fields such as aerospace, nuclear energy, etc. The hostile and corrosive environment leaves in-service coated conductive structures vulnerable to Hidden Material Degradation (HMD) occurring under the protective coating. There is a strong demand for non-intrusive assessment of HMD using non-destructive evaluation techniques. In light of the advantages of the Gradient-field Pulsed Eddy Current technique (GPEC) over other non-destructive evaluation methods in corrosion evaluation, in this paper the GPEC probe for quantitative evaluation of HMD is intensively investigated. Closed-form expressions of GPEC responses to HMD are formulated via analytical modeling. The Lift-off Invariance (LOI) in GPEC signals, which makes the HMD evaluation immune to variation in the thickness of the protective coating, is introduced and analyzed through simulations involving HMD with variable depths and conductivities. A fast inverse method employing the magnitude and time of the LOI point in GPEC signals for simultaneously evaluating the conductivity and thickness of the HMD region is proposed, and subsequently verified by finite element modeling and experiments. It has been found from the results that, along with the proposed inverse method, the GPEC probe is applicable to evaluation of HMD in coated conductive structures without much loss in accuracy. PMID:28441328

  13. Advanced engineering tools for design and fabrication of a custom nasal prosthesis

    NASA Astrophysics Data System (ADS)

    Oliveira, Inês; Leal, Nuno; Silva, Pedro; da Costa Ferreira, A.; Neto, Rui J.; Lino, F. Jorge; Reis, Ana

    2012-09-01

    Unexpected external defects resulting from neoplasms, burns, congenital malformations, trauma or other diseases, particularly when involving partial or total loss of an external organ, can be emotionally devastating. These defects can be restored with prostheses, obtained by different techniques, materials and methods. Increasing patient numbers and cost constraints lead to the need to explore new techniques that can increase efficiency. The main goal of this project was to develop a fully engineering-based manufacturing process for soft-tissue prostheses that could provide faster and less expensive options in the manufacturing of customized prostheses, while at the same time being able to reproduce the highest degree of detail with the maximum comfort for the patient. Design/methodology/approach - This case report describes treatment using a silicone prosthesis with anatomic retention for an 80-year-old woman with a rhinectomy. The proposed methodology integrates non-contact structured light scanning, CT and reverse engineering with CAD/CAM and additive manufacturing technology. Findings - The proposed protocol showed encouraging results, since it proves to be a better solution than conventional approaches for fabricating custom-made facial prostheses for asymmetrical organs. The process allows the prosthesis to be obtained with minimum contact and discomfort for the patient, with excellent results in terms of aesthetics, prosthesis retention, and the time and resources consumed.

  14. Quantifying biopsychosocial aspects in everyday contexts: an integrative methodological approach from the behavioral sciences

    PubMed Central

    Portell, Mariona; Anguera, M Teresa; Hernández-Mendo, Antonio; Jonsson, Gudberg K

    2015-01-01

    Contextual factors are crucial for evaluative research in psychology, as they provide insights into what works, for whom, in what circumstances, in what respects, and why. Studying behavior in context, however, poses numerous methodological challenges. Although a comprehensive framework for classifying methods seeking to quantify biopsychosocial aspects in everyday contexts was recently proposed, this framework does not contemplate contributions from observational methodology. The aim of this paper is to justify and propose a more general framework that includes observational methodology approaches. Our analysis is rooted in two general concepts: ecological validity and methodological complementarity. We performed a narrative review of the literature on research methods and techniques for studying daily life and describe their shared properties and requirements (collection of data in real time, on repeated occasions, and in natural settings) and classification criteria (eg, variables of interest and level of participant involvement in the data collection process). We provide several examples that illustrate why, despite their higher costs, studies of behavior and experience in everyday contexts offer insights that complement findings provided by other methodological approaches. We urge that observational methodology be included in classifications of research methods and techniques for studying everyday behavior and advocate a renewed commitment to prioritizing ecological validity in behavioral research seeking to quantify biopsychosocial aspects. PMID:26089708

  15. A Gradient-Field Pulsed Eddy Current Probe for Evaluation of Hidden Material Degradation in Conductive Structures Based on Lift-Off Invariance.

    PubMed

    Li, Yong; Jing, Haoqing; Zainal Abidin, Ilham Mukriz; Yan, Bei

    2017-04-25

    Coated conductive structures are widely adopted in engineering fields such as aerospace, nuclear energy, etc. The hostile and corrosive environment leaves in-service coated conductive structures vulnerable to Hidden Material Degradation (HMD) occurring under the protective coating. There is a strong demand for non-intrusive assessment of HMD using non-destructive evaluation techniques. In light of the advantages of the Gradient-field Pulsed Eddy Current technique (GPEC) over other non-destructive evaluation methods in corrosion evaluation, in this paper the GPEC probe for quantitative evaluation of HMD is intensively investigated. Closed-form expressions of GPEC responses to HMD are formulated via analytical modeling. The Lift-off Invariance (LOI) in GPEC signals, which makes the HMD evaluation immune to variation in the thickness of the protective coating, is introduced and analyzed through simulations involving HMD with variable depths and conductivities. A fast inverse method employing the magnitude and time of the LOI point in GPEC signals for simultaneously evaluating the conductivity and thickness of the HMD region is proposed, and subsequently verified by finite element modeling and experiments. It has been found from the results that, along with the proposed inverse method, the GPEC probe is applicable to evaluation of HMD in coated conductive structures without much loss in accuracy.

  16. Classified and clustered data constellation: An efficient approach of 3D urban data management

    NASA Astrophysics Data System (ADS)

    Azri, Suhaibah; Ujang, Uznir; Castro, Francesc Antón; Rahman, Alias Abdul; Mioc, Darka

    2016-03-01

    The growth of urban areas has resulted in massive urban datasets and difficulties in handling and managing issues related to urban areas. Huge datasets can degrade data retrieval and information analysis performance. In addition, the urban environment is very difficult to manage because it involves various types of data, such as multiple types of zoning themes in the case of urban mixed-use development. Thus, a special technique for efficient handling and management of urban data is necessary. This paper proposes a structure called Classified and Clustered Data Constellation (CCDC) for urban data management. CCDC operates on the basis of two filters: classification and clustering. To boost the performance of information retrieval, CCDC offers a minimal percentage of overlap among nodes and of coverage area, to avoid repetitive data entry and multipath queries. The results of tests conducted on several urban mixed-use development datasets using CCDC verify that it efficiently retrieves their semantic and spatial information. Further, comparisons conducted between CCDC and existing clustering and data constellation techniques, from the aspect of preservation of minimal overlap and coverage, confirm that the proposed structure is capable of preserving the minimum overlap and coverage area among nodes. Our overall results indicate that CCDC is efficient in handling and managing urban data, especially urban mixed-use development applications.

  17. Information hiding based on double random-phase encoding and public-key cryptography.

    PubMed

    Sheng, Yuan; Xin, Zhou; Alam, Mohammed S; Xi, Lu; Xiao-Feng, Li

    2009-03-02

    A novel information hiding method based on double random-phase encoding (DRPE) and the Rivest-Shamir-Adleman (RSA) public-key cryptosystem is proposed. In the proposed technique, the inherent diffusion property of DRPE is utilized to make up for the diffusion insufficiency of RSA public-key cryptography, while the RSA cryptosystem is utilized for simultaneous transmission of the cipher text and the two phase masks, which is not possible under the DRPE technique alone. This technique combines the complementary advantages of the DRPE and RSA encryption techniques and brings security and convenience for efficient information transmission. Extensive numerical simulation results are presented to verify the performance of the proposed technique.
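
    A minimal sketch of the DRPE stage in Python (illustrative; in the full scheme the two phase masks would additionally be encrypted and transmitted with RSA, which is omitted here):

      import numpy as np

      rng = np.random.default_rng(0)

      def drpe_encrypt(img, m1, m2):
          # two independent random phase masks: m1 in the input plane, m2 in the Fourier plane
          p1, p2 = np.exp(2j * np.pi * m1), np.exp(2j * np.pi * m2)
          return np.fft.ifft2(np.fft.fft2(img * p1) * p2)

      def drpe_decrypt(cipher, m1, m2):
          # reverse the encoding with the conjugate phase masks
          p1, p2 = np.exp(2j * np.pi * m1), np.exp(2j * np.pi * m2)
          return np.fft.ifft2(np.fft.fft2(cipher) * np.conj(p2)) * np.conj(p1)

      img = rng.random((64, 64))
      m1, m2 = rng.random((64, 64)), rng.random((64, 64))
      cipher = drpe_encrypt(img, m1, m2)
      recovered = np.real(drpe_decrypt(cipher, m1, m2))
      assert np.allclose(recovered, img)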

  18. Estimation of bladder wall location in ultrasound images.

    PubMed

    Topper, A K; Jernigan, M E

    1991-05-01

    A method of automatically estimating the location of the bladder wall in ultrasound images is proposed. Obtaining this estimate is intended to be the first stage in the development of an automatic bladder volume calculation system. The first step in the bladder wall estimation scheme involves globally processing the images using standard image processing techniques to highlight the bladder wall. Separate processing sequences are required to highlight the anterior bladder wall and the posterior bladder wall. The sequence to highlight the anterior bladder wall involves Gaussian smoothing and second differencing followed by zero-crossing detection. Median filtering followed by thresholding and gradient detection is used to highlight as much of the rest of the bladder wall as was visible in the original images. Then a 'bladder wall follower'--a line follower with rules based on the characteristics of ultrasound imaging and the anatomy involved--is applied to the processed images to estimate the bladder wall location by following the portions of the bladder wall which are highlighted and filling in the missing segments. The results achieved using this scheme are presented.
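
    A minimal sketch of the two highlighting stages described above (illustrative; the filter sizes, threshold fraction and axis conventions are assumptions, not the authors' parameters):

      import numpy as np
      from scipy import ndimage

      def highlight_anterior_wall(img, sigma=3.0):
          # Gaussian smoothing, second differencing along the beam (axial) direction,
          # then zero-crossing detection
          smooth = ndimage.gaussian_filter(img.astype(float), sigma)
          second_diff = np.diff(smooth, n=2, axis=0)
          zero_cross = np.signbit(second_diff[:-1]) != np.signbit(second_diff[1:])
          return zero_cross                         # boolean map, offset by two rows

      def highlight_posterior_wall(img, size=5, frac=0.5):
          # median filtering, thresholding, then gradient detection
          med = ndimage.median_filter(img.astype(float), size=size)
          mask = med > frac * med.max()
          grad = ndimage.sobel(mask.astype(float), axis=0)
          return np.abs(grad) > 0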

  19. Neurophysiological mechanisms involved in language learning in adults

    PubMed Central

    Rodríguez-Fornells, Antoni; Cunillera, Toni; Mestres-Missé, Anna; de Diego-Balaguer, Ruth

    2009-01-01

    Little is known about the brain mechanisms involved in word learning during infancy and in second language acquisition and about the way these new words become stable representations that sustain language processing. In several studies we have adopted the human simulation perspective, studying the effects of brain-lesions and combining different neuroimaging techniques such as event-related potentials and functional magnetic resonance imaging in order to examine the language learning (LL) process. In the present article, we review this evidence focusing on how different brain signatures relate to (i) the extraction of words from speech, (ii) the discovery of their embedded grammatical structure, and (iii) how meaning derived from verbal contexts can inform us about the cognitive mechanisms underlying the learning process. We compile these findings and frame them into an integrative neurophysiological model that tries to delineate the major neural networks that might be involved in the initial stages of LL. Finally, we propose that LL simulations can help us to understand natural language processing and how the recovery from language disorders in infants and adults can be accomplished. PMID:19933142

  20. Nonlinear ultrasonic fatigue crack detection using a single piezoelectric transducer

    NASA Astrophysics Data System (ADS)

    An, Yun-Kyu; Lee, Dong Jun

    2016-04-01

    This paper proposes a new nonlinear ultrasonic technique for fatigue crack detection using a single piezoelectric transducer (PZT). The proposed technique identifies a fatigue crack using linear (α) and nonlinear (β) parameters obtained from only a single PZT mounted on a target structure. Based on the different physical characteristics of α and β, a fatigue crack-induced feature can be effectively isolated from the inherent nonlinearity of the target structure and data acquisition system. The proposed technique requires a much simpler test setup and lower processing costs than existing nonlinear ultrasonic techniques, yet remains fast and powerful. To validate the proposed technique, a real fatigue crack is created in an aluminum plate, and then false positive and false negative tests are carried out under varying temperature conditions. The experimental results reveal that the fatigue crack is successfully detected, and no false positive alarm is indicated.
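
    One common way to extract a linear and a relative nonlinear parameter from a single-transducer response is from the fundamental and second-harmonic spectral amplitudes; the exact definitions of α and β used in the paper may differ, so the sketch below is only illustrative:

      import numpy as np

      def alpha_beta(signal, fs, f0):
          # amplitudes of the fundamental (f0) and second harmonic (2*f0) from the spectrum
          spec = np.abs(np.fft.rfft(signal * np.hanning(len(signal)))) / len(signal)
          freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
          a1 = spec[np.argmin(np.abs(freqs - f0))]
          a2 = spec[np.argmin(np.abs(freqs - 2 * f0))]
          alpha = a1                 # linear parameter: fundamental response
          beta = a2 / a1 ** 2        # relative nonlinear parameter: harmonic generation
          return alpha, beta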

  1. Induction motor broken rotor bar fault location detection through envelope analysis of start-up current using Hilbert transform

    NASA Astrophysics Data System (ADS)

    Abd-el-Malek, Mina; Abdelsalam, Ahmed K.; Hassan, Ola E.

    2017-09-01

    Robustness, low running cost and reduced maintenance have allowed Induction Motors (IMs) to penetrate industrial drive systems widely. Broken rotor bars (BRBs) can be considered an important fault that needs to be assessed early to minimize maintenance cost and labor time. The majority of recent BRB fault diagnostic techniques focus on differentiating between a healthy and a faulty rotor cage. In this paper, a new technique is proposed for detecting the location of the broken bar in the rotor. The proposed technique relies on monitoring certain statistical parameters estimated from the analysis of the start-up stator current envelope. The envelope of the signal is obtained using the Hilbert Transform (HT). The proposed technique offers a non-invasive, computationally fast and accurate location diagnostic process. Various simulation scenarios are presented that validate the effectiveness of the proposed technique.
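
    A minimal sketch of the envelope extraction via the Hilbert transform, with a few example statistical descriptors (illustrative; the specific statistics monitored in the paper are not reproduced here):

      import numpy as np
      from scipy.signal import hilbert

      def startup_envelope(current):
          # analytic signal of the start-up stator current; its magnitude is the envelope
          return np.abs(hilbert(current))

      def envelope_statistics(envelope):
          # simple descriptors a BRB-location classifier could monitor (illustrative choice)
          mean, std = envelope.mean(), envelope.std()
          skewness = float(((envelope - mean) ** 3).mean() / std ** 3)
          return {"mean": mean, "std": std, "skewness": skewness}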

  2. Exploring machine-learning-based control plane intrusion detection techniques in software defined optical networks

    NASA Astrophysics Data System (ADS)

    Zhang, Huibin; Wang, Yuqiao; Chen, Haoran; Zhao, Yongli; Zhang, Jie

    2017-12-01

    In software defined optical networks (SDON), the centralized control plane may encounter numerous intrusion threats which compromise the security level of provisioned services. In this paper, the issue of control plane security is studied and two machine-learning-based control plane intrusion detection techniques are proposed for SDON with properly selected features such as bandwidth, route length, etc. We validate the feasibility and efficiency of the proposed techniques by simulations. Results show that an accuracy of 83% for intrusion detection can be achieved with the proposed machine-learning-based control plane intrusion detection techniques.
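
    A minimal sketch of how such a feature-based detector could be trained and evaluated (illustrative; the synthetic features, labels and the random-forest classifier are stand-ins, since the abstract does not name the learning algorithms):

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import accuracy_score

      # hypothetical connection-request features: requested bandwidth, route length,
      # holding time, source/destination node ids; label 1 = intrusion, 0 = legitimate
      X = np.random.rand(5000, 5)
      y = (X[:, 0] + 0.5 * X[:, 1] > 1.0).astype(int)   # synthetic labels, illustration only

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
      print("detection accuracy:", accuracy_score(y_te, clf.predict(X_te)))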

  3. Using Agent-Based Models to Develop Public Policy about Food Behaviours: Future Directions and Recommendations.

    PubMed

    Giabbanelli, Philippe J; Crutzen, Rik

    2017-01-01

    Most adults are overweight or obese in many western countries. Several population-level interventions on the physical, economic, political, or sociocultural environment have thus attempted to achieve a healthier weight. These interventions have involved different weight-related behaviours, such as food behaviours. Agent-based models (ABMs) have the potential to help policymakers evaluate food behaviour interventions from a systems perspective. However, fully realizing this potential involves a complex procedure starting with obtaining and analyzing data to populate the model and eventually identifying more efficient cross-sectoral policies. Current procedures for ABMs of food behaviours are mostly rooted in one technique, often ignore the food environment beyond home and work, and underutilize rich datasets. In this paper, we address some of these limitations to better support policymakers through two contributions. First, via a scoping review, we highlight readily available datasets and techniques to deal with these limitations independently. Second, we propose a three-step process to tackle all limitations together and discuss its use in developing future models of food behaviours. We acknowledge that this integrated process is a leap forward in ABMs. However, this long-term objective is well worth addressing as it can generate robust findings to effectively inform the design of food behaviour interventions.

  4. In vivo ultrasound imaging of the bone cortex

    NASA Astrophysics Data System (ADS)

    Renaud, Guillaume; Kruizinga, Pieter; Cassereau, Didier; Laugier, Pascal

    2018-06-01

    Current clinical ultrasound scanners cannot be used to image the interior morphology of bones because these scanners fail to address the complicated physics involved in exact image reconstruction. Here, we show that if the physics is properly addressed, the bone cortex can be imaged using a conventional transducer array and a programmable ultrasound scanner. We provide in vivo proof for this technique by scanning the radius and tibia of two healthy volunteers and comparing the thickness of the radius bone with high-resolution peripheral x-ray computed tomography. Our method assumes a medium that is composed of different homogeneous layers with unique elastic anisotropy and ultrasonic wave-speed values. The applicable values of these layers are found by optimizing image sharpness and intensity over a range of relevant values. In the image reconstruction algorithm we take wave refraction between the layers into account using a ray-tracing technique. The estimated values of the ultrasonic wave speed and anisotropy in cortical bone are in agreement with ex vivo studies reported in the literature. These parameters are of interest since they have been proposed as biomarkers for cortical bone quality. In this paper we discuss the physics involved with ultrasound imaging of bone and provide an algorithm to successfully image the first segment of cortical bone.
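
    The refraction-aware travel times needed for such layered-medium reconstruction can be computed from Fermat's principle; a minimal sketch for a single flat soft-tissue/bone interface (illustrative; the paper's model handles anisotropy and multiple layers, which are not reproduced here):

      import numpy as np
      from scipy.optimize import brentq

      def two_layer_travel_time(x_src, pix, d_interface, c_soft, c_bone):
          # One-way travel time from a transducer element at (x_src, 0) to an image
          # point pix = (x, z) below a flat interface at depth d_interface; the
          # refraction point follows from stationary travel time (Snell's law).
          x_p, z_p = pix
          def dt_dxi(xi):                           # derivative of travel time w.r.t. crossing point
              t1 = (xi - x_src) / (c_soft * np.hypot(xi - x_src, d_interface))
              t2 = (xi - x_p) / (c_bone * np.hypot(xi - x_p, z_p - d_interface))
              return t1 + t2
          lo, hi = min(x_src, x_p) - 1e-6, max(x_src, x_p) + 1e-6
          xi = brentq(dt_dxi, lo, hi)               # refraction point on the interface
          return (np.hypot(xi - x_src, d_interface) / c_soft
                  + np.hypot(xi - x_p, z_p - d_interface) / c_bone)

      # e.g. element at x = 0 mm, pixel 2 mm into bone under 4 mm of soft tissue
      t = two_layer_travel_time(0.0, (3.0e-3, 6.0e-3), 4.0e-3, 1540.0, 3200.0)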

  5. Assessment of Robotic Patient Simulators for Training in Manual Physical Therapy Examination Techniques

    PubMed Central

    Ishikawa, Shun; Okamoto, Shogo; Isogai, Kaoru; Akiyama, Yasuhiro; Yanagihara, Naomi; Yamada, Yoji

    2015-01-01

    Robots that simulate patients suffering from joint resistance caused by biomechanical and neural impairments are used to aid the training of physical therapists in manual examination techniques. However, there are few methods for assessing such robots. This article proposes two types of assessment measures based on typical judgments of clinicians. One of the measures involves the evaluation of how well the simulator presents different severities of a specified disease. Experienced clinicians were requested to rate the simulated symptoms in terms of severity, and the consistency of their ratings was used as a performance measure. The other measure involves the evaluation of how well the simulator presents different types of symptoms. In this case, the clinicians were requested to classify the simulated resistances in terms of symptom type, and the average ratios of their answers were used as performance measures. For both types of assessment measures, a higher index implied higher agreement among the experienced clinicians that subjectively assessed the symptoms based on typical symptom features. We applied these two assessment methods to a patient knee robot and achieved positive appraisals. The assessment measures have potential for use in comparing several patient simulators for training physical therapists, rather than as absolute indices for developing a standard. PMID:25923719

  6. Solutions and debugging for data consistency in multiprocessors with noncoherent caches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernstein, D.; Mendelson, B.; Breternitz, M. Jr.

    1995-02-01

    We analyze two important problems that arise in shared-memory multiprocessor systems. The stale data problem involves ensuring that data items in local memory of individual processors are current, independent of writes done by other processors. False sharing occurs when two processors have copies of the same shared data block but update different portions of the block. The false sharing problem involves guaranteeing that subsequent writes are properly combined. In modern architectures these problems are usually solved in hardware, by exploiting mechanisms for hardware controlled cache consistency. This leads to more expensive and nonscalable designs. Therefore, we are concentrating on software methods for ensuring cache consistency that would allow for affordable and scalable multiprocessing systems. Unfortunately, providing software control is nontrivial, both for the compiler writer and for the application programmer. For this reason we are developing a debugging environment that will facilitate the development of compiler-based techniques and will help the programmer to tune his or her application using explicit cache management mechanisms. We extend the notion of a race condition for IBM Shared Memory System POWER/4, taking into consideration its noncoherent caches, and propose techniques for detection of false sharing problems. Identification of the stale data problem is discussed as well, and solutions are suggested.

  7. Modeling and control of distributed energy systems during transition between grid connected and standalone modes

    NASA Astrophysics Data System (ADS)

    Arafat, Md Nayeem

    Distributed generation systems (DGs) have been penetrating our energy networks with the advancement of renewable energy sources and energy storage elements. These systems can operate in synchronism with the utility grid, referred to as the grid connected (GC) mode of operation, or work independently, referred to as the standalone (SA) mode of operation. There is a need to ensure continuous power flow during the transition between GC and SA modes, referred to as the transition mode, when operating DGs. In this dissertation, efficient and effective transition control algorithms are developed for DGs operating either independently or collectively with other units. Three techniques are proposed in this dissertation to manage proper transition operations. In the first technique, a new control algorithm is proposed for an independent DG which can operate in SA and GC modes. The proposed transition control algorithm ensures low total harmonic distortion (THD) and less voltage fluctuation during mode transitions compared to other techniques. In the second technique, a transition control is suggested for a collective of DGs operating in a microgrid system architecture to improve the reliability of the system, reduce the cost, and provide better performance. In this technique, one of the DGs in the microgrid system, referred to as a dispatch unit, takes on the additional responsibility of mode transitioning to ensure a smooth transition and supply/demand balance in the microgrid. In the third technique, an alternative transition technique is proposed by hybridizing the current and droop controllers. The proposed hybrid transition control technique has higher reliability compared to the dispatch unit concept. During the GC mode, the proposed hybrid controller uses current control. During the SA mode, the hybrid controller uses droop control. During the transition mode, both controllers participate in formulating the inverter output voltage but with different weights or coefficients. Voltage source inverters interfacing the DGs, as well as the proposed transition control algorithms, have been modeled to analyze the stability of the algorithms in different configurations. The performance of the proposed algorithms is verified through simulation and experimental studies. It has been found that the proposed control techniques can provide smooth power flow to the local loads during the GC, SA and transition modes.

  8. Sensitivity analysis and approximation methods for general eigenvalue problems

    NASA Technical Reports Server (NTRS)

    Murthy, D. V.; Haftka, R. T.

    1986-01-01

    Optimization of dynamic systems involving complex non-hermitian matrices is often computationally expensive. Major contributors to the computational expense are the sensitivity analysis and reanalysis of a modified design. The present work seeks to alleviate this computational burden by identifying efficient sensitivity analysis and approximate reanalysis methods. For the algebraic eigenvalue problem involving non-hermitian matrices, algorithms for sensitivity analysis and approximate reanalysis are classified, compared and evaluated for efficiency and accuracy. Proper eigenvector normalization is discussed. An improved method for calculating derivatives of eigenvectors is proposed based on a more rational normalization condition and taking advantage of matrix sparsity. Important numerical aspects of this method are also discussed. To alleviate the problem of reanalysis, various approximation methods for eigenvalues are proposed and evaluated. Linear and quadratic approximations are based directly on the Taylor series. Several approximation methods are developed based on the generalized Rayleigh quotient for the eigenvalue problem. Approximation methods based on trace theorem give high accuracy without needing any derivatives. Operation counts for the computation of the approximations are given. General recommendations are made for the selection of appropriate approximation technique as a function of the matrix size, number of design variables, number of eigenvalues of interest and the number of design points at which approximation is sought.
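
    For example, the generalized-Rayleigh-quotient approximation can be sketched as follows (illustrative; the baseline left and right eigenvectors are reused with the modified matrices so a reanalysis estimate needs no new eigensolve; the random matrices stand in for the system matrices):

      import numpy as np
      from scipy.linalg import eig

      def rq_estimate(A_new, B_new, x0, y0):
          # generalized Rayleigh quotient reusing the baseline right (x0) and left (y0) eigenvectors
          return (y0.conj() @ A_new @ x0) / (y0.conj() @ B_new @ x0)

      rng = np.random.default_rng(1)
      n = 20
      A = rng.standard_normal((n, n))               # non-hermitian system matrix
      B = np.eye(n) + 0.05 * np.diag(rng.random(n))
      vals, vl, vr = eig(A, B, left=True, right=True)
      i = np.argmin(vals.real)
      x0, y0 = vr[:, i], vl[:, i]                   # baseline right and left eigenvectors

      dA = 0.01 * rng.standard_normal((n, n))       # a small design modification
      lam_approx = rq_estimate(A + dA, B, x0, y0)   # cheap reanalysis, no new eigensolve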

  9. A highly efficient approach to protein interactome mapping based on collaborative filtering framework.

    PubMed

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-09

    The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for gaining deep insights into both fundamental cell biology processes and the pathology of diseases. Finely-set small-scale experiments are not only very expensive but also inefficient for identifying numerous interactomes, despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data as an interactome weight matrix, from which the feature vectors of the involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among the involved proteins, which is then used to carry out the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly.
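
    A minimal sketch of a neighborhood-based CF predictor over an interactome weight matrix (illustrative; the rescaling applied to the cosine similarity below is an assumption for demonstration, not the exact coefficient defined in the paper):

      import numpy as np

      def rescaled_cosine(u, v, gamma=5.0):
          # cosine similarity between two proteins' interaction profiles, down-weighted
          # when they share few observed partners (this rescaling is illustrative only)
          co = np.count_nonzero(u * v)
          cos = u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)
          return cos * co / (co + gamma)

      def predict_interaction(W, i, j, k=10):
          # neighborhood-based CF: score protein pair (i, j) from the k proteins most
          # similar to i, weighted by their observed interaction with j
          sims = np.array([rescaled_cosine(W[i], W[m]) if m != i else -1.0
                           for m in range(W.shape[0])])
          nbrs = np.argsort(sims)[-k:]
          return sims[nbrs] @ W[nbrs, j] / (np.abs(sims[nbrs]).sum() + 1e-12)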

  10. A Highly Efficient Approach to Protein Interactome Mapping Based on Collaborative Filtering Framework

    PubMed Central

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-01

    The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for gaining deep insights into both fundamental cell biology processes and the pathology of diseases. Finely-set small-scale experiments are not only very expensive but also inefficient for identifying numerous interactomes, despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data as an interactome weight matrix, from which the feature vectors of the involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among the involved proteins, which is then used to carry out the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly. PMID:25572661

  11. A Highly Efficient Approach to Protein Interactome Mapping Based on Collaborative Filtering Framework

    NASA Astrophysics Data System (ADS)

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-01

    The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for gaining deep insights into both fundamental cell biology processes and the pathology of diseases. Finely-set small-scale experiments are not only very expensive but also inefficient for identifying numerous interactomes, despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data as an interactome weight matrix, from which the feature vectors of the involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among the involved proteins, which is then used to carry out the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly.

  12. Laser etching of polymer masked leadframes

    NASA Astrophysics Data System (ADS)

    Ho, C. K.; Man, H. C.; Yue, T. M.; Yuen, C. W.

    1997-02-01

    A typical electroplating production line for the deposition of a silver pattern on copper leadframes in the semiconductor industry involves twenty to twenty-five steps of cleaning, pickling, plating, stripping, etc. This complex production process occupies a large floor space and also suffers from a number of problems, such as difficulty in producing and aligning the rubber masks, generation of toxic fumes, high water consumption costs and occasional uncertainty about the cleanliness of the surfaces to be plated. A novel laser patterning process is proposed in this paper which can replace many steps in the existing electroplating line. The proposed process involves the application of high-speed laser etching techniques to leadframes protected with a polymer coating. The desired pattern for silver electroplating is produced by laser ablation of the polymer coating. The excimer laser was found to be the most effective for this process, as it can expose a pattern of clean copper substrate that can be silver plated successfully. Previous work on Nd:YAG laser ablation showed that 1.06 μm radiation was not suitable for this etching process because a thin, transparent organic film remained on the laser-etched region. The effect of excimer pulse frequency and energy density on the removal rate of the polymer coating was studied.

  13. X-ray based extensometry

    NASA Technical Reports Server (NTRS)

    Jordan, E. H.; Pease, D. M.

    1988-01-01

    A totally new method of extensometry using an X-ray beam was proposed. The intent of the method is to provide a non-contacting technique that is immune to the problems associated with density variations in gaseous environments that plague optical methods; X-rays undergo virtually no refraction, even in solids. The new method utilizes X-ray induced X-ray fluorescence or X-ray induced optical fluorescence of targets that have melting temperatures of over 3000 F. Many different variations of the basic approaches are possible. In the year just completed, preliminary experiments were performed which strongly suggest that the method is feasible. The X-ray induced optical fluorescence method appears to be limited to temperatures below roughly 1600 F because of the overwhelming thermal optical radiation. The X-ray induced X-ray fluorescence scheme appears feasible up to very high temperatures. In this system there will be a tradeoff between frequency response, cost, and accuracy; the exact tradeoff can only be estimated. It appears that for thermomechanical tests with cycle times on the order of minutes, a very reasonable system may be feasible. The intended applications involve very high temperatures in both materials testing and the monitoring of component testing. Gas turbine engines, rocket engines, and hypersonic vehicles (NASP) all involve measurement needs that could partially be met by the proposed technology.

  14. Mapping care processes within a hospital: from theory to a web-based proposal merging enterprise modelling and ISO normative principles.

    PubMed

    Staccini, Pascal; Joubert, Michel; Quaranta, Jean-François; Fieschi, Marius

    2005-03-01

    Today, the economic and regulatory environment, involving activity-based and prospective payment systems, healthcare quality and risk analysis, traceability of the acts performed and evaluation of care practices, accounts for the current interest in clinical and hospital information systems. The structured gathering of information relative to users' needs and system requirements is fundamental when installing such systems. This stage takes time and is generally misconstrued by caregivers and is of limited efficacy to analysts. We used a modelling technique designed for manufacturing processes (IDEF0/SADT). We enhanced the basic model of an activity with descriptors extracted from the Ishikawa cause-and-effect diagram (methods, men, materials, machines, and environment). We proposed an object data model of a process and its components, and programmed a web-based tool in an object-oriented environment. This tool makes it possible to extract the data dictionary of a given process from the description of its elements and to locate documents (procedures, recommendations, instructions) according to each activity or role. Aimed at structuring needs and storing information provided by directly involved teams regarding the workings of an institution (or at least part of it), the process-mapping approach has an important contribution to make in the analysis of clinical information systems.

  15. A robust star identification algorithm with star shortlisting

    NASA Astrophysics Data System (ADS)

    Mehta, Deval Samirbhai; Chen, Shoushun; Low, Kay Soon

    2018-05-01

    A star tracker provides the most accurate attitude solution in terms of arc seconds compared to the other existing attitude sensors. When no prior attitude information is available, it operates in "Lost-In-Space (LIS)" mode. Star pattern recognition, also known as star identification algorithm, forms the most crucial part of a star tracker in the LIS mode. Recognition reliability and speed are the two most important parameters of a star pattern recognition technique. In this paper, a novel star identification algorithm with star ID shortlisting is proposed. Firstly, the star IDs are shortlisted based on worst-case patch mismatch, and later stars are identified in the image by an initial match confirmed with a running sequential angular match technique. The proposed idea is tested on 16,200 simulated star images having magnitude uncertainty, noise stars, positional deviation, and varying size of the field of view. The proposed idea is also benchmarked with the state-of-the-art star pattern recognition techniques. Finally, the real-time performance of the proposed technique is tested on the 3104 real star images captured by a star tracker SST-20S currently mounted on a satellite. The proposed technique can achieve an identification accuracy of 98% and takes only 8.2 ms for identification on real images. Simulation and real-time results depict that the proposed technique is highly robust and achieves a high speed of identification suitable for actual space applications.
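
    As a generic illustration of the angular-matching step used by such algorithms, and not the specific shortlisting or sequential-match scheme of this paper, the sketch below compares a measured inter-star angle against a tiny hypothetical catalogue within an assumed tolerance.

      import numpy as np

      def inter_star_angle(u, v):
          """Angle (radians) between two star direction vectors."""
          u = np.asarray(u, float) / np.linalg.norm(u)
          v = np.asarray(v, float) / np.linalg.norm(v)
          return np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))

      def angular_match(measured_angles, catalog_pairs, tol=1e-3):
          """Return catalogue star-ID pairs whose angle matches a measured angle."""
          return [ids for ang_m in measured_angles
                  for ids, ang_c in catalog_pairs if abs(ang_m - ang_c) < tol]

      # Hypothetical star direction vectors extracted from an image.
      stars = {0: [0.0, 0.01, 1.0], 1: [0.01, 0.0, 1.0]}
      measured = [inter_star_angle(stars[0], stars[1])]
      # Tiny hypothetical catalogue of inter-star angles (star IDs, radians).
      catalog = [((101, 205), 0.0141), ((101, 330), 0.0311), ((205, 330), 0.0450)]
      print(angular_match(measured, catalog))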

  16. Adaptive Conditioning of Multiple-Point Geostatistical Facies Simulation to Flow Data with Facies Probability Maps

    NASA Astrophysics Data System (ADS)

    Khodabakhshi, M.; Jafarpour, B.

    2013-12-01

    Characterization of complex geologic patterns that create preferential flow paths in certain reservoir systems requires higher-order geostatistical modeling techniques. Multipoint statistics (MPS) provides a flexible grid-based approach for simulating such complex geologic patterns from a conceptual prior model known as a training image (TI). In this approach, a stationary TI that encodes the higher-order spatial statistics of the expected geologic patterns is used to represent the shape and connectivity of the underlying lithofacies. While MPS is quite powerful for describing complex geologic facies connectivity, the nonlinear and complex relation between the flow data and facies distribution makes flow data conditioning quite challenging. We propose an adaptive technique for conditioning facies simulation from a prior TI to nonlinear flow data. Non-adaptive strategies for conditioning facies simulation to flow data can involve many forward flow model solutions, which can be computationally very demanding. To improve the conditioning efficiency, we develop an adaptive sampling approach through a data feedback mechanism based on the sampling history. In this approach, after a short sampling burn-in period in which unconditional samples are generated and passed through an acceptance/rejection test, an ensemble of accepted samples is identified and used to generate a facies probability map. This facies probability map contains the common features of the accepted samples and provides conditioning information about facies occurrence in each grid block, which is used to guide the conditional facies simulation process. As the sampling progresses, the initial probability map is updated according to the collective information about the facies distribution in the chain of accepted samples to increase the acceptance rate and efficiency of the conditioning. This conditioning process can be viewed as an optimization approach where each new sample is proposed based on the sampling history to improve the data mismatch objective function. We extend the application of this adaptive conditioning approach to the case where multiple training images are proposed to describe the geologic scenario in a given formation. We discuss the advantages and limitations of the proposed adaptive conditioning scheme and use numerical experiments from fluvial channel formations to demonstrate its applicability and performance compared to non-adaptive conditioning techniques.
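
    The probability-map bookkeeping described above can be sketched in a few lines, assuming binary (channel/background) facies realizations stored as 0/1 arrays; the random draws below merely stand in for MPS realizations, and the acceptance test is a placeholder threshold on a made-up data-mismatch value.

      import numpy as np

      rng = np.random.default_rng(0)
      nx, ny = 20, 20
      accepted = []

      def data_mismatch(realization):
          # Placeholder for a forward flow simulation plus misfit computation.
          return rng.random()

      for _ in range(200):
          realization = (rng.random((nx, ny)) < 0.3).astype(float)  # stand-in MPS draw
          if data_mismatch(realization) < 0.25:                     # acceptance/rejection test
              accepted.append(realization)

      # Facies probability map: per-cell frequency of the channel facies over the
      # chain of accepted realizations; it guides subsequent conditional draws.
      prob_map = np.mean(accepted, axis=0) if accepted else np.full((nx, ny), np.nan)
      print(prob_map.shape, round(float(prob_map.max()), 2))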

  17. Poisson and negative binomial item count techniques for surveys with sensitive question.

    PubMed

    Tian, Guo-Liang; Tang, Man-Lai; Wu, Qin; Liu, Yin

    2017-04-01

    Although the item count technique is useful in surveys with sensitive questions, the privacy of respondents who possess the sensitive characteristic of interest may not be well protected due to a defect in its original design. In this article, we propose two new survey designs (namely the Poisson item count technique and the negative binomial item count technique) which replace the several independent Bernoulli random variables required by the original item count technique with a single Poisson or negative binomial random variable, respectively. The proposed models not only provide a closed-form variance estimate and a confidence interval within [0, 1] for the sensitive proportion, but also simplify the survey design of the original item count technique. Most importantly, the new designs do not leak respondents' privacy. Empirical results show that the proposed techniques perform satisfactorily in the sense that they yield accurate parameter estimates and confidence intervals.
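
    As an illustrative toy simulation only, and not the authors' derivation, assume the control group reports a Poisson count while the treatment group reports the same kind of count plus a Bernoulli indicator of the sensitive trait; a method-of-moments estimate of the sensitive proportion is then the difference of the two group means. All numbers below are invented.

      import numpy as np

      rng = np.random.default_rng(1)
      n, lam, true_pi = 5000, 3.0, 0.15   # assumed group size, Poisson mean, true proportion

      control = rng.poisson(lam, n)                                   # innocuous count only
      treatment = rng.poisson(lam, n) + rng.binomial(1, true_pi, n)   # count + sensitive indicator

      pi_hat = treatment.mean() - control.mean()                      # method-of-moments estimate
      se = np.sqrt(treatment.var(ddof=1) / n + control.var(ddof=1) / n)
      print(f"pi_hat = {pi_hat:.3f} +/- {1.96 * se:.3f}")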

  18. Visibility enhancement of color images using Type-II fuzzy membership function

    NASA Astrophysics Data System (ADS)

    Singh, Harmandeep; Khehra, Baljit Singh

    2018-04-01

    Images taken in poor environmental conditions suffer from reduced visibility and obscured detail. Therefore, image enhancement techniques are necessary for improving the significant details of these images. An extensive review has shown that histogram-based enhancement techniques greatly suffer from over-/under-enhancement issues, while fuzzy-based enhancement techniques suffer from over-/under-saturated pixels. In this paper, a novel Type-II fuzzy-based image enhancement technique is proposed for improving the visibility of images. The Type-II fuzzy logic can automatically extract the local atmospheric light and roughly eliminate the atmospheric veil during local detail enhancement. The proposed technique has been evaluated on 10 well-known weather-degraded color images and compared with four well-known existing image enhancement techniques. The experimental results reveal that the proposed technique outperforms the others regarding visible edge ratio, color gradients and number of saturated pixels.

  19. Rapid customization system for 3D-printed splint using programmable modeling technique - a practical approach.

    PubMed

    Li, Jianyou; Tanaka, Hiroya

    2018-01-01

    Traditional splinting processes are skill dependent and irreversible, and patient satisfaction levels during rehabilitation are invariably lowered by the heavy structure and poor ventilation of splints. To overcome these drawbacks, use of 3D-printing technology has been proposed in recent years, and public awareness of it has increased. However, application of 3D-printing technologies is limited by the low CAD proficiency of clinicians as well as unforeseen scan flaws within anatomic models. A programmable modeling tool has been employed to develop a semi-automatic design system for generating a printable splint model. The modeling process was divided into five stages, and the detailed steps involved in the construction of the proposed system, as well as the automatic thickness calculation, the lattice structure, and the assembly method, are thoroughly described. The proposed approach allows clinicians to verify the state of the splint model at every stage, thereby facilitating adjustment of the input content and/or other parameters to help solve possible modeling issues. A finite element analysis simulation was performed to evaluate the structural strength of the generated models. A fit investigation was carried out with fabricated splints and volunteers to assess the wearing experience. The manual modeling steps involved in complex splint designs have been programmed into the proposed automatic system. Clinicians define the splinting region by drawing two curves, thereby obtaining the final model within minutes. The proposed system is capable of automatically patching up minor flaws within the limb model as well as calculating the thickness and lattice density of various splints. Large splints can be divided into three parts for simultaneous multiple printing. This study highlights the advantages, limitations, and possible strategies concerning the application of programmable modeling tools in clinical processes, thereby helping clinicians with lower CAD proficiency to become adept with the splint design process and improving the overall design efficiency of 3D-printed splints.

  20. Proposal for a new trajectory for subaxial cervical lateral mass screws.

    PubMed

    Amhaz-Escanlar, Samer; Jorge-Mora, Alberto; Jorge-Mora, Teresa; Febrero-Bande, Manuel; Diez-Ulloa, Maximo-Alberto

    2018-06-20

    Lateral mass screws combined with rods are the standard method for posterior subaxial cervical spine fixation. Several techniques have been described, among which the most used are those of Roy-Camille, Magerl, Anderson and An. All of them are based on three-dimensional angles, and the reliability of freehand angle estimation remains poorly investigated. We propose a new technique based on on-site spatial references and compare it with previously described ones, assessing screw length and potential neurovascular complications. Four different lateral mass screw insertion techniques (Magerl, Anderson, An and the newly described technique) were performed bilaterally, from C3 to C6, in ten human spine specimens. A drill tip guide wire was inserted as originally described for each trajectory, and screw length was measured. The exit point was examined, and potential vertebral artery or nerve root injury was assessed. Mean screw length was 14.05 mm using Magerl's technique, 13.47 mm using Anderson's, 12.8 mm using An's and 17.03 mm using the new technique. Data analysis showed significantly longer lateral mass screw length using the new technique (p value < 0.00001). Potential nerve injury occurred 37 times using Magerl's technique, 28 times using Anderson's, 13 times using An's and twice using the new technique. Potential vertebral artery injury occurred once using Magerl's technique, 8 times using Anderson's and never using either An's or the newly proposed technique. The risk of neurovascular complication was significantly lower using the new technique (p value < 0.01). The newly proposed technique allows for longer screws, maximizing purchase and stability, while lowering the complication rate.

  1. Wavelet Transform Based Filter to Remove the Notches from Signal Under Harmonic Polluted Environment

    NASA Astrophysics Data System (ADS)

    Das, Sukanta; Ranjan, Vikash

    2017-12-01

    This work proposes to eliminate the notches, caused by the switching of semiconductor devices connected to the system in a harmonic-polluted environment, that appear in the synchronizing signal required for converter operation. The disturbances in the signal are suppressed by a novel wavelet-based filtering technique. In the proposed technique, the notches in the signal are detected and eliminated by a wavelet-based multi-rate filter using 'Daubechies4' (db4) as the mother wavelet. The computational complexity of the adopted technique is much lower than that of conventional notch filtering techniques. The proposed technique is developed in MATLAB/Simulink and finally validated with a dSPACE-1103 interface. The recovered signal thus obtained is almost free of notches.
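
    The general idea of wavelet-based notch suppression can be sketched with PyWavelets: decompose with db4, suppress the large detail coefficients where the switching transients concentrate, and reconstruct. This is only a rough stand-in for the paper's multi-rate filter, and the signal, decomposition level and threshold are assumptions.

      import numpy as np
      import pywt

      fs = 5000
      t = np.arange(0, 0.2, 1 / fs)
      clean = np.sin(2 * np.pi * 50 * t)           # 50 Hz synchronizing signal
      notched = clean.copy()
      notched[::100] -= 0.8                        # crude stand-in for switching notches

      coeffs = pywt.wavedec(notched, 'db4', level=4)                       # multilevel decomposition
      details = [np.where(np.abs(c) > 0.3, 0.0, c) for c in coeffs[1:]]    # suppress notch energy
      recovered = pywt.waverec([coeffs[0]] + details, 'db4')[:len(t)]      # reconstruction

      print(np.max(np.abs(notched - clean)), np.max(np.abs(recovered - clean)))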

  2. Techniques for Computation of Frequency Limited H∞ Norm

    NASA Astrophysics Data System (ADS)

    Haider, Shafiq; Ghafoor, Abdul; Imran, Muhammad; Fahad Mumtaz, Malik

    2018-01-01

    The traditional H∞ norm represents the peak system gain over an infinite frequency range, but many applications, such as filter design, model order reduction and controller design, require the computation of the peak system gain over a specific frequency interval rather than the infinite range. In the present work, new computationally efficient techniques for computing the H∞ norm over a limited frequency interval are proposed. The proposed techniques link the norm computation to the maximum singular value of the system within the limited frequency interval. Numerical examples are included to validate the proposed concept.
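
    The link between the frequency-limited norm and the maximum singular value can be checked by brute force: evaluate the transfer matrix on a dense grid inside the interval [w1, w2] and take the largest singular value. The sketch below is only a naive reference computation on an assumed example system, not the computationally efficient technique proposed in the paper.

      import numpy as np

      def freq_limited_hinf(A, B, C, D, w1, w2, n_grid=2000):
          """Approximate sup over [w1, w2] of sigma_max(C (jwI - A)^-1 B + D) by gridding."""
          n = A.shape[0]
          worst = 0.0
          for w in np.linspace(w1, w2, n_grid):
              G = C @ np.linalg.solve(1j * w * np.eye(n) - A, B) + D
              worst = max(worst, np.linalg.svd(G, compute_uv=False)[0])
          return worst

      # Assumed example: a lightly damped second-order system.
      A = np.array([[0.0, 1.0], [-4.0, -0.2]])
      B = np.array([[0.0], [1.0]])
      C = np.array([[1.0, 0.0]])
      D = np.zeros((1, 1))
      print(freq_limited_hinf(A, B, C, D, w1=1.0, w2=3.0))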

  3. EXTENSION METHODS IDEAS FOR RURAL CIVIL DEFENSE.

    ERIC Educational Resources Information Center

    Department of Agriculture, Washington, DC.

    Techniques for involving the rural population in civil defense planning are the subject of this document. An initial step involves determining the various communication skills to be used. Methods of working with community organizations, mass media techniques, and the construction of exhibits are described. Small group discussion techniques explained…

  4. Process techniques of charge transfer time reduction for high speed CMOS image sensors

    NASA Astrophysics Data System (ADS)

    Zhongxiang, Cao; Quanliang, Li; Ye, Han; Qi, Qin; Peng, Feng; Liyuan, Liu; Nanjian, Wu

    2014-11-01

    This paper proposes pixel process techniques to reduce the charge transfer time in high-speed CMOS image sensors. These techniques increase the lateral conductivity of the photo-generated carriers in the pinned photodiode (PPD) and the voltage difference between the PPD and the floating diffusion (FD) node by controlling and optimizing the N doping concentration in the PPD and the threshold voltage of the reset transistor, respectively. The techniques effectively shorten the charge transfer time from the PPD to the FD node. The proposed process techniques do not need extra masks and do not harm the fill factor. A sub-array of 32 × 64 pixels was designed and implemented in a 0.18 μm CIS process with five split implantation conditions for the N region in the PPD. The simulation and measurement results demonstrate that the charge transfer time can be decreased by using the proposed techniques. Comparing pixels fabricated with the different N-region implantation conditions, a charge transfer time of 0.32 μs is achieved and image lag is reduced by 31% using the proposed process techniques.

  5. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials.

    PubMed

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-12

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.
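
    For context on the underlying geometry only, and not the paper's improved sectional analysis, the out-of-plane inclination of a circular fiber can be recovered from the axes of its elliptical cross-section, since the minor axis equals the fiber diameter while the major axis grows as 1/cos(theta); the axis lengths below are hypothetical.

      import numpy as np

      def fiber_inclination_deg(major_axis, minor_axis):
          """Inclination (degrees, measured from the section normal) of a circular fiber.

          A cylindrical fiber of diameter d cut by a plane shows an elliptical section
          with minor axis d and major axis d / cos(theta), so theta = arccos(minor/major).
          """
          ratio = np.clip(np.asarray(minor_axis) / np.asarray(major_axis), 0.0, 1.0)
          return np.degrees(np.arccos(ratio))

      # Hypothetical axis lengths (in pixels) measured for three fiber sections.
      print(fiber_inclination_deg([10.0, 14.1, 20.0], [10.0, 10.0, 10.0]))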

  6. Application of hierarchical cascading technique to finite element method simulation in bulk acoustic wave devices

    NASA Astrophysics Data System (ADS)

    Li, Xinyi; Bao, Jingfu; Huang, Yulin; Zhang, Benfeng; Omori, Tatsuya; Hashimoto, Ken-ya

    2018-07-01

    In this paper, we propose the use of the hierarchical cascading technique (HCT) for the finite element method (FEM) analysis of bulk acoustic wave (BAW) devices. First, the implementation of this technique is presented for the FEM analysis of BAW devices. It is shown that the traveling-wave excitation sources proposed by the authors are fully compatible with the HCT. Furthermore, an HCT-based absorbing mechanism is also proposed to replace the perfectly matched layer (PML). Finally, it is demonstrated that the technique is much more efficient in terms of memory consumption and execution time than the full FEM analysis.

  7. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials

    PubMed Central

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-01

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis. PMID:28787839

  8. Sparsity-aware multiple relay selection in large multi-hop decode-and-forward relay networks

    NASA Astrophysics Data System (ADS)

    Gouissem, A.; Hamila, R.; Al-Dhahir, N.; Foufou, S.

    2016-12-01

    In this paper, we propose and investigate two novel techniques to perform multiple relay selection in large multi-hop decode-and-forward relay networks. The two proposed techniques exploit sparse signal recovery theory to select multiple relays using the orthogonal matching pursuit algorithm and outperform state-of-the-art techniques in terms of outage probability and computation complexity. To reduce the amount of collected channel state information (CSI), we propose a limited-feedback scheme where only a limited number of relays feedback their CSI. Furthermore, a detailed performance-complexity tradeoff investigation is conducted for the different studied techniques and verified by Monte Carlo simulations.
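
    A hand-rolled orthogonal matching pursuit illustrates the sparse-recovery step: the columns of a made-up matrix H stand for per-relay contributions, and the support returned by OMP stands for the selected relays. The data, dimensions, and sparsity level are assumptions, and this is not the authors' exact system model.

      import numpy as np

      def omp(H, y, k):
          """Pick the k columns of H that best explain y (orthogonal matching pursuit)."""
          residual, support = y.copy(), []
          for _ in range(k):
              corr = np.abs(H.T @ residual)
              corr[support] = -np.inf                       # never pick the same relay twice
              support.append(int(np.argmax(corr)))
              Hs = H[:, support]
              x_ls, *_ = np.linalg.lstsq(Hs, y, rcond=None)
              residual = y - Hs @ x_ls
          return sorted(support)

      rng = np.random.default_rng(2)
      n_obs, n_relays, k = 30, 50, 3
      H = rng.standard_normal((n_obs, n_relays))
      y = H[:, [4, 17, 42]] @ np.array([1.0, -0.8, 0.6]) + 0.01 * rng.standard_normal(n_obs)
      print("selected relays:", omp(H, y, k))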

  9. Enhanced Multiobjective Optimization Technique for Comprehensive Aerospace Design. Part A

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John N.

    1997-01-01

    A multidisciplinary design optimization procedure which couples formal multiobjective techniques and complex analysis procedures (such as computational fluid dynamics (CFD) codes) has been developed. The procedure has been demonstrated on a specific high-speed flow application involving aerodynamics and acoustics (sonic boom minimization). In order to account for multiple design objectives arising from complex performance requirements, multiobjective formulation techniques are used to formulate the optimization problem. Techniques to enhance the existing Kreisselmeier-Steinhauser (K-S) function multiobjective formulation approach have been developed. The K-S function procedure used in the proposed work transforms a constrained multiple-objective-function problem into an unconstrained problem, which is then solved using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm. Weight factors are applied to each objective function during the transformation process. This enhanced procedure provides the designer with the capability to emphasize specific design objectives during the optimization process. The demonstration of the procedure utilizes a computational fluid dynamics (CFD) code which solves the three-dimensional parabolized Navier-Stokes (PNS) equations for the flow field, along with an appropriate sonic boom evaluation procedure, thus introducing both aerodynamic performance and sonic boom as the design objectives to be optimized simultaneously. Sensitivity analysis is performed using a discrete differentiation approach. An approximation technique has been used within the optimizer to improve the overall computational efficiency of the procedure in order to make it suitable for design applications in an industrial setting.
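
    A minimal sketch of the Kreisselmeier-Steinhauser aggregation described above: weighted objectives are collapsed into a single smooth envelope function, which is then minimized with BFGS. The two toy quadratic objectives, the weights, and the draw-down parameter rho are assumptions and merely stand in for the aerodynamic and sonic-boom objectives.

      import numpy as np
      from scipy.optimize import minimize

      def ks(values, rho=50.0):
          """Kreisselmeier-Steinhauser envelope: a smooth, conservative max of the entries."""
          m = np.max(values)
          return m + np.log(np.sum(np.exp(rho * (values - m)))) / rho

      def composite(x, weights=(1.0, 2.0)):
          # Two toy weighted objectives standing in for aerodynamic and sonic-boom metrics.
          f1 = (x[0] - 1.0) ** 2 + x[1] ** 2
          f2 = x[0] ** 2 + (x[1] + 2.0) ** 2
          return ks(np.array([weights[0] * f1, weights[1] * f2]))

      result = minimize(composite, x0=np.zeros(2), method='BFGS')
      print(np.round(result.x, 3), round(result.fun, 3))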

  10. Trend extraction using empirical mode decomposition and statistical empirical mode decomposition: Case study: Kuala Lumpur stock market

    NASA Astrophysics Data System (ADS)

    Jaber, Abobaker M.

    2014-12-01

    Two nonparametric methods for the prediction and modeling of financial time series signals are proposed. The proposed techniques are designed to handle non-stationary and non-linear behavior and to extract meaningful signals for reliable prediction. Using the Fourier transform (FT), the methods select the significant decomposed signals to be employed for prediction. The proposed techniques are developed by coupling the Holt-Winters method with empirical mode decomposition (EMD) and with its smoothing-based extension, statistical empirical mode decomposition (SEMD). To show the performance of the proposed techniques, we analyze the daily closing prices of the Kuala Lumpur stock market index.
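
    A rough sketch of the coupling described above, assuming the PyEMD (EMD-signal) and statsmodels packages: the series is decomposed by EMD, the slowest component is taken as the trend, and Holt-Winters exponential smoothing is fitted to that trend for prediction. The synthetic price series and all parameters are invented, and this is not the authors' exact SEMD procedure.

      import numpy as np
      from PyEMD import EMD                                   # pip package "EMD-signal"
      from statsmodels.tsa.holtwinters import ExponentialSmoothing

      rng = np.random.default_rng(3)
      t = np.arange(500)
      price = 0.02 * t + np.sin(2 * np.pi * t / 60) + 0.3 * rng.standard_normal(500)

      imfs = EMD().emd(price)        # empirical mode decomposition of the series
      trend = imfs[-1]               # slowest component, taken here as the extracted trend

      # Couple the extracted trend with Holt-Winters smoothing to forecast 20 steps ahead.
      forecast = ExponentialSmoothing(trend, trend='add').fit().forecast(20)
      print(imfs.shape, np.round(forecast[:3], 3))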

  11. One-Step Sub-micrometer-Scale Electrohydrodynamic Inkjet Three-Dimensional Printing Technique with Spontaneous Nanoscale Joule Heating.

    PubMed

    Zhang, Bin; Seong, Baekhoon; Lee, Jaehyun; Nguyen, VuDat; Cho, Daehyun; Byun, Doyoung

    2017-09-06

    A one-step sub-micrometer-scale electrohydrodynamic (EHD) inkjet three-dimensional (3D) printing technique is proposed that is based on drop-on-demand (DOD) operation and does not require an additional post-sintering process. Both numerical simulation and experimental observations proved that nanoscale Joule heating occurs at the interfaces between the charged silver nanoparticles (Ag-NPs) because of the high electrical contact resistance during the printing process; this is the reason why an additional post-sintering process is not required. Sub-micrometer-scale 3D structures with aspect ratios above 35 were printed with the proposed technique; furthermore, designed 3D structures such as bridge-like shapes can also be printed, allowing for the cost-effective fabrication of a 3D touch sensor and an ultrasensitive air flow-rate sensor. It is believed that the proposed one-step printing technique may, because of its economic efficiency, replace conventional 3D conductive-structure printing techniques that require a post-sintering process.

  12. Field Calibration of Wind Direction Sensor to the True North and Its Application to the Daegwanryung Wind Turbine Test Sites

    PubMed Central

    Lee, Jeong Wan

    2008-01-01

    This paper proposes a field calibration technique for aligning a wind direction sensor to the true north. The proposed technique uses synchronized measurements of images captured by a camera and the output voltage of the wind direction sensor. The true wind direction was evaluated from the captured pictures of the sensor using image processing techniques in the least-squares sense, and the evaluated true value was then compared with the measured output voltage of the sensor. This technique solves the misalignment problem of the wind direction sensor that arises when the meteorological mast is installed. Uncertainty analyses for the proposed technique are presented and the calibration accuracy is discussed. Finally, the proposed technique was applied to the real meteorological mast at the Daegwanryung test site, and statistical analysis of the experimental tests estimated the stable misalignment value and the uncertainty level. It is confirmed that the misalignment error with respect to true north can be expected to remain within the stated confidence level. PMID:27873957
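
    In the same spirit as the alignment step described above, a constant misalignment can be estimated as the circular mean of the differences between image-derived reference directions and the sensor's reported directions; the sample angles below are invented.

      import numpy as np

      def misalignment_deg(reference_deg, sensor_deg):
          """Circular-mean offset (degrees) between reference and sensor wind directions."""
          diff = np.deg2rad(np.asarray(reference_deg) - np.asarray(sensor_deg))
          return np.rad2deg(np.arctan2(np.mean(np.sin(diff)), np.mean(np.cos(diff))))

      reference = [10.0, 95.2, 183.7, 271.1, 355.9]   # from image processing (hypothetical)
      sensor = [6.1, 91.0, 179.5, 267.3, 352.2]       # sensor output converted to degrees
      print(f"estimated misalignment: {misalignment_deg(reference, sensor):.2f} deg")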

  13. Unified multiphase modeling for evolving, acoustically coupled systems consisting of acoustic, elastic, poroelastic media and septa

    NASA Astrophysics Data System (ADS)

    Lee, Joong Seok; Kang, Yeon June; Kim, Yoon Young

    2012-12-01

    This paper presents a new modeling technique that can represent acoustically coupled systems in a unified manner. The proposed unified multiphase (UMP) modeling technique uses Biot's equations that are originally derived for poroelastic media to represent not only poroelastic media but also non-poroelastic ones ranging from acoustic and elastic media to septa. To recover the original vibro-acoustic behaviors of non-poroelastic media, material parameters of a base poroelastic medium are adjusted depending on the target media. The real virtue of this UMP technique is that interface coupling conditions between any media can be automatically satisfied, so no medium-dependent interface condition needs to be imposed explicitly. Thereby, the proposed technique can effectively model any acoustically coupled system having locally varying medium phases and evolving interfaces. A typical situation can occur in an iterative design process. Because the proposed UMP modeling technique needs theoretical justifications for further development, this work is mainly focused on how the technique recovers the governing equations of non-poroelastic media and expresses their interface conditions. We also address how to describe various boundary conditions of the media in the technique. Some numerical studies are carried out to demonstrate the validity of the proposed modeling technique.

  14. New potentiometric and spectrophotometric methods for the determination of dextromethorphan in pharmaceutical preparations.

    PubMed

    Elmosallamy, Mohamed A F; Amin, Alaa S

    2014-01-01

    New, simple and convenient potentiometric and spectrophotometric methods are described for the determination of dextromethorphan hydrobromide (DXM) in pharmaceutical preparations. The potentiometric technique is based on a potentiometric sensor incorporating the dextromethorphan tetrakis(p-chlorophenyl)borate ion-pair complex as an electroactive species in a plasticized PVC matrix membrane with o-nitrophenyl octyl ether or dioctyl phthalate. The sensor shows a rapid, near-Nernstian response over the range 1 × 10(-5) - 1 × 10(-2) mol L(-1) dextromethorphan in the pH range of 3.0 - 9.0. The detection limit is 2 × 10(-6) mol L(-1) DXM and the response time is essentially instantaneous (2 s). The proposed spectrophotometric technique involves the reaction of DXM with eriochrome black T (EBT) to form an ion-associate complex. Solvent extraction is used to improve the selectivity of the method. The optimal extraction and reaction conditions have been studied, and the analytical characteristics of the method have been obtained. Linearity is obeyed in the range of 7.37 - 73.7 × 10(-5) mol L(-1) DXM, and the detection limit of the method is 1.29 × 10(-5) mol L(-1). The relative standard deviation (RSD) and relative error for six replicate measurements of 3.685 × 10(-4) mol L(-1) are 0.672% and 0.855%, respectively. The interference effect of some excipients has also been tested. The drug contents in pharmaceutical preparations were successfully determined by the proposed methods using the standard-addition technique.

  15. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  16. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  17. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  18. Quantitative determination of salbutamol sulfate impurities using achiral supercritical fluid chromatography.

    PubMed

    Dispas, Amandine; Desfontaine, Vincent; Andri, Bertyl; Lebrun, Pierre; Kotoni, Dorina; Clarke, Adrian; Guillarme, Davy; Hubert, Philippe

    2017-02-05

    In recent years, supercritical fluid chromatography (SFC) has been widely acknowledged as a distinctive and high-performing technique in the field of separation sciences. Recent studies have highlighted the interest of SFC for the quality control of pharmaceuticals, especially for the determination of the active pharmaceutical ingredient (API). Nevertheless, quality control also requires the determination of impurities. The objectives of the present work were (i) to demonstrate the interest of SFC as a reference technique for the determination of impurities in salbutamol sulfate API and (ii) to propose an alternative to a reference HPLC method from the European Pharmacopoeia (EP) involving an ion-pairing reagent. Firstly, a screening was carried out to select the most adequate and selective stationary phase. Secondly, in the context of a robust optimization strategy, the method was developed using the design space methodology. The separation of salbutamol sulfate and its related impurities was achieved in 7 min, which is seven times faster than the LC-UV method proposed by the European Pharmacopoeia (total run time of 50 min). Finally, full validation using the accuracy profile approach was successfully achieved for the determination of impurities B, D, F and G in salbutamol sulfate raw material. The validated dosing range covered 50 to 150% of the targeted concentration (corresponding to a 0.3% concentration level), and LODs close to 0.5 μg/mL were estimated. The SFC method proposed in this study can be presented as a suitable, fast alternative to the EP LC method for the quantitative determination of salbutamol impurities. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. A Novel Feature Extraction Method with Feature Selection to Identify Golgi-Resident Protein Types from Imbalanced Data

    PubMed Central

    Yang, Runtao; Zhang, Chengjin; Gao, Rui; Zhang, Lina

    2016-01-01

    The Golgi Apparatus (GA) is a major collection and dispatch station for numerous proteins destined for secretion, plasma membranes and lysosomes. The dysfunction of GA proteins can result in neurodegenerative diseases. Therefore, accurate identification of protein sub-Golgi localizations may assist in drug development and in understanding the mechanisms of the GA involved in various cellular processes. In this paper, a new computational method is proposed for distinguishing cis-Golgi proteins from trans-Golgi proteins. Based on the concept of Common Spatial Patterns (CSP), a novel feature extraction technique is developed to extract evolutionary information from protein sequences. To deal with the imbalanced benchmark dataset, the Synthetic Minority Over-sampling Technique (SMOTE) is adopted. A feature selection method called Random Forest-Recursive Feature Elimination (RF-RFE) is employed to search for the optimal features among the CSP-based features and the g-gap dipeptide composition. Based on the optimal features, a Random Forest (RF) module is used to distinguish cis-Golgi proteins from trans-Golgi proteins. Through jackknife cross-validation, the proposed method achieves a promising performance with a sensitivity of 0.889, a specificity of 0.880, an accuracy of 0.885, and a Matthews Correlation Coefficient (MCC) of 0.765, which remarkably outperforms previous methods. Moreover, when tested on a common independent dataset, our method also achieves a significantly improved performance. These results highlight the promising performance of the proposed method for identifying Golgi-resident protein types. Furthermore, the CSP-based feature extraction method may provide guidelines for protein function predictions. PMID:26861308
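
    The SMOTE / RF-RFE / random-forest pipeline can be re-created schematically with scikit-learn and imbalanced-learn on synthetic features; the feature dimensions, class imbalance, and hyperparameters below are placeholders, and the CSP-based features themselves are not reproduced.

      from imblearn.over_sampling import SMOTE
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.feature_selection import RFE
      from sklearn.model_selection import cross_val_score

      # Synthetic imbalanced stand-in for the CSP plus g-gap dipeptide features.
      X, y = make_classification(n_samples=600, n_features=120, n_informative=20,
                                 weights=[0.85, 0.15], random_state=0)

      X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)          # balance the classes

      rf = RandomForestClassifier(n_estimators=200, random_state=0)
      selector = RFE(rf, n_features_to_select=30).fit(X_res, y_res)    # RF-based recursive elimination
      X_sel = selector.transform(X_res)

      # Note: resampling before cross-validation is a simplification kept for brevity.
      print(f"mean CV accuracy: {cross_val_score(rf, X_sel, y_res, cv=5).mean():.3f}")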

  20. Application of a real-space three-dimensional image reconstruction method in the structural analysis of noncrystalline biological macromolecules enveloped by water in coherent x-ray diffraction microscopy.

    PubMed

    Kodama, Wataru; Nakasako, Masayoshi

    2011-08-01

    Coherent x-ray diffraction microscopy is a novel technique for the structural analysis of particles that are difficult to crystallize, such as the biological particles composing living cells. As water is indispensable for maintaining particles in their functional structures, sufficient hydration of the targeted particles is required during sample preparation for diffraction microscopy experiments. However, the water enveloping the particles also contributes significantly to the diffraction patterns and reduces the electron-density contrast of the sample particles. In this study, we propose a protocol for the structural analysis of particles in water by applying a three-dimensional reconstruction method in real space to the projection images phase-retrieved from diffraction patterns, together with a newly developed density modification technique. We examined the feasibility of the protocol through three simulations involving a protein molecule in a vacuum, in a water droplet, and in a cube-shaped water envelope. The simulations were carried out for the diffraction patterns in reciprocal planes normal to the incident x-ray beam. This assumption and the simulation conditions correspond to experiments using x-ray wavelengths shorter than 0.03 Å. The analyses demonstrated that our protocol provides an interpretable electron-density map. Based on the results, we discuss the advantages and limitations of the proposed protocol and its practical application to experimental data. In particular, we examined the influence of Poisson noise in the diffraction patterns on the three-dimensional electron density reconstructed with the proposed protocol.
